The following link documents how to publish a Git repository as an Azure DevOps wiki:
https://learn.microsoft.com/en-us/azure/devops/project/wiki/publish-repo-to-wiki?view=azure-devops&tabs=browser
I have a project written in C++/C, hosted in an Azure repo.
I'm trying to understand how to create wiki pages that document the code, with links to a specific location in the code (a function, say). What happens when the repo changes, or the function moves? The wiki doesn't update the link.
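For example, this is the kind of link I would put in a wiki page (the org/project/file names here are made up):

https://dev.azure.com/myorg/myproject/_git/myrepo?path=/src/parser.cpp&version=GBmain

Pinning version=GC<commit-sha> instead of a branch keeps the link from breaking, but then it no longer points at the latest code.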
I understand that publishing the repo as a wiki should fix this, since the wiki is then Git-based.
When I do it, the pages are empty; as I understand it, that's because Azure only renders .md files.
But I have over 100 .cpp files, and creating an .md for each one would be messy and would clutter my project heavily.
Is there a way to do this without the conversion mess, or without cluttering my project with so many files, while still staying consistent with changes?
I don't understand why this is even required. I have a Git project in Azure, and to publish documentation I have to convert code to .md files (which are basically text files) just to push them back into Git. This seems unnecessary.
Any help will be appreciated
I'm using the code from a GitHub repo for my project, and I've been trying to understand some of the development history around a problem I'm having. However, I'm unable to list the commits from one particular developer. Usually, on the page of any single commit, there is a link to list all of that author's commits. But for this one developer, that link isn't there. The GitHub docs say this can happen when the author no longer has a GitHub account, but that isn't the case here. I can also manually append commits?author=username to the URL, but that doesn't work either. The only way I can see the commits is to load the network graph, scroll the timeline, and mouse over each commit dot one at a time, which is a difficult way to scan through dozens of commits.
This is best done locally. Clone the repo and then use log or shortlog:
git log --author=author@email.com
or
git shortlog --author=author@email.com
gitk also supports the --author option to filter commits by author.
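Since you are digging into the history around one particular problem, note that these commands also accept a path filter, which narrows the output further (the email and path here are placeholders):

git log --oneline --author=author@email.com -- path/to/file.c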
New user here...
Installed D8 + CiviCRM by building a Composer-based Git repo for the platform, then stamping out a few test sites.
It worked really well.
But now I am at the point of realizing I missed a few modules and I want to add some themes to apply to the sites.
I can easily do it in the Git repo that defines the platform. But what is the proper way to manage the central platform data and files that are then used by the X number of sites?
I know the docs try to discuss this, but a tutorial walk-through would be very helpful.
As a guess, I could make the central platform files a Git clone and pull down changes for the new stuff. But if a database update were needed, that wouldn't get done.
Ideas?
Thanks
It's not clear what you mean by "central platform data".
If you mean assets that are relevant for the entire platform, that can apply to all of the sites, you would do the following:
Add anything new to Git and push it (this step is sketched below, after the list).
Create a new platform to match the latest code in Git.
Run a Migrate task on the old platform to migrate the sites to the new one.
Database schema updates happen automatically.
The sites will now be running on the new codebase.
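As a sketch of that first step, assuming a Composer-based platform repo like the one described (the module, theme, and branch names here are made up):

cd my-platform-repo
composer require drupal/admin_toolbar   # a module that was missed
composer require drupal/bootstrap       # a theme to apply to the sites
git add composer.json composer.lock
git commit -m "Add admin_toolbar module and bootstrap theme"
git push origin master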
If you're talking about site-specific assets that you don't want to be included in the platform's code, then you can enable Git for sites with the Aegir Hosting Git module.
It allows you to deploy site-specific Git repositories.
However, I don't recommend using that module for platforms, just sites, because it allows you to git pull on Production sites, which is a terrible idea. For that, see Aegir Deploy.
Both of these modules ship with Aegir so you won't need to install them. Some of the Hosting Git features may need to be enabled, however.
We are using Build Pipeline in Azure DevOps to create a Deployment Artifact. Typical steps in such pipeline are:
Build Solution / Project
Copy the DLL output into $(Build.ArtifactStagingDirectory)
Publish the artifact from $(Build.ArtifactStagingDirectory)
I just wonder whether I can rely on the fact that Build.ArtifactStagingDirectory is empty at the start of each build, or whether I should clean the folder as a first step to be sure.
In my experience the folder has always been empty, but I'm not sure I can rely on that. Is that specific to the Azure-hosted agent, so that with custom build agents I would have to clean this folder up manually? Could some old files from the last build remain there? I could not find this information in the documentation.
Thanks.
I think the main idea of the $(Build.ArtifactStagingDirectory) variable is to be a clean area where you can stage what you're publishing from your repo. As far as I know, there is no explicit statement in the documentation that this folder is empty at every new build, but there are a few "clues":
You can see in Microsoft's Build Variables documentation that Build.StagingDirectory (which points to the same folder) is purged before each new build, so you get a fresh start every build.
The same documentation explicitly notes a few folders or files that are not cleaned on a new build, like the Build.BinariesDirectory variable.
I've run a few builds and releases pointing at my Web App on Azure, and I never saw an unwanted file or folder that wasn't related to my build pipeline.
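That said, if you want to be defensive on a custom agent, a one-line script step at the start of the pipeline costs little. A minimal sketch, assuming a Bash-capable agent (the pipeline variable is exposed to scripts as an environment variable):

# defensively empty the artifact staging directory before the build
rm -rf "$BUILD_ARTIFACTSTAGINGDIRECTORY"/*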
I hope that helps.
I'll probably end up re-inventing parts of the GitHub REST API for my own repo server. But maybe a server script that does this already exists out there? Or maybe you have other suggestions?
This is my use case:
I am developing a Firefox extension that should display the output of
git log -- <path>
I could always write a little server script that uses the well-developed JGit library and runs the "git log" command there. But then the Firefox extension would depend on that server script ;(
I was wondering whether something like the GitHub REST API exists for non-GitHub repos, something more standard than my little server script?
I also thought about a Git JS client, like Git.JS (apparently the only JS client; it works with Node.js; unfortunately the project is no longer active and has no documentation). However, I don't need a full client. I just want to retrieve some information, read-only, from the remote master repo.
Although I am late to the party, I have noticed a few options that might contribute to the answer:
Orion Git API (Orion is an Eclipse project)
RESTFul Git, from Hulu, on GitHub
If you haven't tried it, GitBlit is a VERY cool option. I have multiple installations on a few dedicated Windows servers that I pull together using a REST API. I had it up and running in 5 minutes, on Windows, using the "GO: Single-Stack Solution".
Gitblit GO is an integrated, single-stack solution based on Jetty.
You do not need Apache httpd, Perl, Git, or Gitweb. Should you want to use some or all of those, you still can; Gitblit plays nice with the other kids on the block.
This is what you should download if you want to go from zero to Git in less than 5 mins.
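To give a flavor of pulling data out of Gitblit over HTTP, here is the kind of call I mean (the host is made up, and the request name is from memory, so treat it as an assumption and check the Gitblit RPC documentation):

# ask a Gitblit instance for its repository list as JSON (hypothetical host)
curl "https://git.example.com/rpc?req=LIST_REPOSITORIES"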
I would say you definitely need to implement some kind of server-side code on your own.
You can choose any server-side language you like; I believe Ruby or Python would work fine. Then create a simple website with one page that embeds the output of git log according to the given parameters.
I believe none of the other options will work for you. You cannot remotely access a Git repository's history, due to the distributed nature of Git: you can only read the history of your local repository.
Your extension can then read that web page and parse the output to get what you need.
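For the git log part of that page, a pretty-format string gets you most of the way to machine-readable output. A minimal sketch (the path is a placeholder, and real commit messages would still need proper JSON escaping):

# emit one JSON-like object per commit touching the given path
git log --pretty=format:'{"hash":"%H","author":"%an","date":"%ad","subject":"%s"}' -- src/some/file.c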
Question 1:
I am about to deploy my first Django website, and I was wondering which tools are recommended for gathering all your Django files.
For example, I don't need my Sass and CoffeeScript files; I just want the compiled CSS and JS files. I also want to use the correct production settings file.
Question 2:
Do I put these deployment-ready files into their own version control repository? I guess the advantage is that you can easily roll back changes?
Question 3:
Do I run my tests before gathering the files or before deploying?
Shell scripts could be a solution, but maybe there is a better way? I looked at Jenkins/Hudson, but that seems more like a tool that sits on top of the tools I'm looking for.
For questions one and two, I'd recommend using a version control system for this. I'm sure you're already using some sort of version control, so you can just say which branch of your repository you would like to deploy. And yes, this makes rollbacks incredibly easy. Probably the most popular method for Django deployment is to package your files using Git, and then deploy those files and run any deployment scripts using Fabric.
Using Git, packaging the files from your local repository would look something like:
git archive --format=tar HEAD | gzip > my_repo.tar.gz
Alternatively, you can first push your changes to a GitHub repository, and then have your deployment script clone the repository onto your production server.
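That clone-from-the-server variant is a one-liner (the repository URL and branch name are placeholders):

# shallow clone of just the deployment branch, run on the production server
git clone --depth 1 --branch production https://github.com/you/yourproject.git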
For your third question: if you use this version control method for packaging your files, just make sure you have the deployment branch checked out when you run your tests.
I'll typically use Fabric for deploying most Django projects:
http://docs.fabfile.org/en/1.0.0/?redir
It has a decent api for communicating with remote servers and it's all in Python – bonus!
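As a plain-shell sketch of what a typical Fabric deploy task ends up running on the server (the paths, branch, and service name are assumptions, not a complete fabfile):

cd /srv/myproject
git pull origin production                  # update the code
pip install -r requirements.txt            # install any new dependencies
python manage.py migrate                    # apply database schema changes
python manage.py collectstatic --noinput    # gather static files for serving
sudo service myproject restart              # restart the application server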
You don't need to store your concatenated media files in a separate repo. They're only needed for production. In that case I've found libraries like django-mediasync and django-compress to be useful. They both provide template tags/settings that can concatenate and cache your static files for you depending on the DEBUG setting/environments (production vs development).
You can run your tests whenever you like. Some people run them as a version control hook, to prevent broken code from being checked in, or during deployment, stopping the deployment if the tests fail.
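A minimal sketch of the hook approach, as a local pre-commit hook (assumes a standard Django layout where manage.py test runs the suite):

#!/bin/sh
# .git/hooks/pre-commit: abort the commit if the test suite fails
python manage.py test || exit 1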