Development files and production files for each branch

I have a directory structure like:
src => contains all the files used to develop the app (gulpfile, components, etc.).
build => contains the final files (styles/styles.css, js/app.js, index.html, etc.), compressed and minified, ready to deploy to the server.
How can I handle/ignore files on each branch, like:
master branch => all files/directories (src, build, etc.).
production branch => only build directory to deploy on server.
I tried with .gitignore, but I get a conflict with the .gitignore files when merging branches.
I tried editing the config file inside the .git directory, but nothing happened.
By the way, my project does not include Node.js.
Is it possible to do this?

In general, what you are trying to do isn't a best practice. To really do what you want, you would need a separate git repository for your build artifacts.
A better approach is to have a CI server that builds the branch and archives the results. Usually you would have the build zip up the artifacts and store them somewhere you know is immutable.
Alternatively, you could commit your build artifacts but change your deployment scripts to only deploy the build directory.
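A related option is git subtree, which can publish just the build directory as its own branch. The sketch below is self-contained for illustration (the file names and the "production" branch name are assumptions; in a real project you would run only the final command inside your existing repo):

```shell
# Demo repo so the sketch is runnable end to end.
cd "$(mktemp -d)"
git init -q
mkdir -p src build
echo 'body { margin: 0; }' > build/styles.css
git add .
git -c user.name=demo -c user.email=demo@example.com commit -qm "add build output"

# Split build/ out as the root of a local "production" branch;
# pushing that branch deploys only the build files.
git subtree split --prefix build -b production
```

Pushing the resulting branch (e.g. `git push origin production`) gives the server a branch whose root contains only the built files.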

Related

How do I view the contents of my build artifact folder in Azure Dev Ops?

I am trying to modify my configuration file, dataSettings.json, located somewhere inside the build artifacts folder. Figuring out the correct access path to it is like working in the dark. Using "**/dataSettings.json" as a path doesn't work in my task, since I don't know the artifact's folder structure, nor whether dataSettings.json even exists.
Is there a way to quickly view the contents of a build artifacts folder within DevOps?
Add a script step in your shell scripting language of choice (Bash, PowerShell, Windows command prompt, etc.) that recursively outputs the directory structure. Specific commands are easy to Google: PowerShell would be gci -rec, DOS would be dir /s, and Bash would be ls -R.
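For instance, a Bash script step along these lines prints the layout and finds the file. ARTIFACT_DIR is a stand-in for your pipeline's artifact path (e.g. $SYSTEM_ARTIFACTSDIRECTORY in a release pipeline); adjust to your setup:

```shell
# Recursively list the artifact folder and locate the config file.
# ARTIFACT_DIR stands in for the pipeline's artifact path.
ARTIFACT_DIR="${ARTIFACT_DIR:-.}"
ls -R "$ARTIFACT_DIR"                             # full folder structure
find "$ARTIFACT_DIR" -name 'dataSettings.json'    # just the file you need
```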
You can quickly view the contents of the artifacts in many of the tasks in your release pipeline.
For example, if you are using the File transform task or the Azure App Service deploy task, you can click the 3 dots at the right end of the Package or folder field to view the contents and folder structure of the artifacts.
The Source Folder field of Copy files tasks for example:
If the artifact is a zip file, you can navigate to its corresponding build pipeline run and download the artifacts locally to check their contents. You can download the build artifacts from the Build summary page.

Google Container Registry build trigger on folder change

I can set up a build trigger on GCR to build my Docker image every time my Git repository gets updated. However, I have a single repository with multiple folders, and a Dockerfile in each folder.
Ex:
my_app
-- service-1
Dockerfile-1
-- service-2
Dockerfile-2
How do I only build Dockerfile-1 when the service-1 folder gets updated?
This is a variation on this GitHub feature request -- in your case, differential behavior based on the changed files (folders) rather than the branch.
We are considering this feature as part of the development of support for more advanced workflow control and will post back on that GitHub issue when it becomes available.
The work-around available to you today is to use a bash script that conditionally builds (or doesn't) based on an inspection of the files changed in the $COMMIT_SHA that triggered the build. Note that the git builder can be used to get the list of files changed via git diff-tree --no-commit-id --name-only -r $COMMIT_SHA.
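A sketch of such a script (the service path, image name, and Dockerfile location are illustrative; $COMMIT_SHA and $PROJECT_ID are assumed to be provided by the trigger environment):

```shell
# Returns success when the given commit touched files under DIR/.
changed_in() {
  git diff-tree --no-commit-id --name-only -r "$2" | grep -q "^$1/"
}

# Guard the docker build on the files changed in the triggering commit.
if [ -n "${COMMIT_SHA:-}" ] && changed_in service-1 "$COMMIT_SHA"; then
  docker build -t "gcr.io/$PROJECT_ID/service-1" -f service-1/Dockerfile-1 service-1/
else
  echo "service-1 unchanged; skipping build"
fi
```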

Using Bitbucket for existing project

I have an existing Django project on my local machine (in virtualenvwrapper). How do I add it to Bitbucket?
Let say my django project is like this:
project
--manage.py
--apps
--db.sqlite3
...
So should I do 'git init' under 'project' directory?
Since it is developed in virtualenvwrapper, I think only the project files will be pushed to Bitbucket; is that right? If I want to develop the project on a different computer and pull the project files from Bitbucket, how should I do it? I mean, should I create another virtual environment on my new machine and install Django and the necessary packages before importing the files from Bitbucket?
I am new to git, so I don't know what is the best to do it.
So should I do 'git init' under 'project' directory?
Yes, but after that, don't add everything.
Create a .gitignore file first, where you declare the files that shouldn't be versioned (the ones that are generated).
Then add and commit: that updates the local repo.
You can then easily link it to an existing empty Bitbucket repo:
git remote add origin ssh://git@bitbucket.org/username/myproject.git
git push -u origin master # to push changes for the first time
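A minimal .gitignore for a Django project might look like this (the entries are typical generated files, not from the question; adjust to your project):

```
# Typical generated files in a Django project
*.pyc
__pycache__/
db.sqlite3
.env
/staticfiles/
```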
Normally, you wouldn't store a binary like db.sqlite3 in a source repo.
But this blog post suggests a way to do so through a custom diff driver:
In a .gitattributes or .git/info/attributes file, give Git a filename pattern and the name of a diff driver, which we'll define next. In my case, I added:
db.sqlite3 diff=sqlite3
Then in .git/config or $HOME/.gitconfig, define the diff driver. Mine looks like:
[diff "sqlite3"]
textconv = dumpsqlite3
I chose to define an external dumpsqlite3 script, since this can be useful elsewhere.
It just dumps SQL to stdout for the filename given by its first argument:
#!/bin/sh
sqlite3 "$1" .dump

How do I push git commits in different Django directories to GitHub without having to keep doing pull merges?

I am pushing one set of commits to GitHub from my Django app directory and another set of commits from my templates directory. Because each is a different directory, I had to set up a git remote in each one so that it knows to push to GitHub.
Here's the problem: Each set of commits has its own "master" branch that isn't aware of the other. First, I push the commits from the Django app directory's master branch to GitHub's master branch and that works fine. But then, if I push the set of commits from the template directory's master branch to the same GitHub master branch, I get an error saying that I need to do a merge before I can push. The push goes through after I do the merge, but it's annoying to have to keep merging the GitHub master to the different master branches of my different Django directories.
My question is: is it possible to set one master branch for all the different Django directories I need to work with, so that I can do one push for the files in all my directories? It seems that I need to initialize a .git repository for each directory I want to work with, which consequently gives each directory its own master branch.
That is very strange. I use a single master branch for Django projects that are hosted remotely on GitHub.
You need a single repository rooted in the topmost directory to have a single remote master branch.
For example, my project name is: BigCoolApp
If you do: cd /user/BigCoolApp
You should find a .git folder in there, among all the other folders. If done correctly, you will now have a single master branch for your Django project.
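A sketch of collapsing the nested repos into a single top-level one (directory names are illustrative, and the first lines only simulate the situation from the question):

```shell
# Simulate a project whose subdirectories each became their own repo.
cd "$(mktemp -d)"
mkdir -p BigCoolApp/app BigCoolApp/templates
git init -q BigCoolApp/app
git init -q BigCoolApp/templates

# The fix: drop the nested repos and keep one at the project root.
cd BigCoolApp
rm -rf app/.git templates/.git
git init -q
git add .
# then commit, add your GitHub remote, and push as usual
```

After this there is one master branch and one remote for the whole project, so a single push covers every directory.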

Jenkins CI: how to select the artifacts to be archived

I'm using Jenkins to build a Maven 2 project. As part of the build, a couple of jar files get generated in the target directory. I would like Jenkins to archive/copy a specific jar from the target location to a custom folder.
How can I achieve this? I've tried using the 'Archive the artifacts' post-build option, but it does not allow me to select the file under target. I get an error message saying such a location does not exist.
I'm new to Jenkins so any help is appreciated.
You may have your file specification or the base directory for the artifacts wrong. From the help text:
Can use wildcards like 'module/dist/**/*.zip'. See the includes attribute of Ant fileset for the exact format. The base directory is the workspace.
So you'll need to figure out where your target directory is relative to the workspace directory.
The archive feature copies/saves your build artifacts out of the workspace into the build's individual directory. You cannot specify where it puts them. That said, I would probably leave archiving turned on if you'll ever need to refer back to a previous version.
You can use a script build step to create the dir if it does not exist and perform the copy.
But you have not said why you want to move the artifacts around. If it is to make them available to other projects, you should look instead at the Copy Artifact build step.
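If all you need is a copy in a custom folder, a shell build step is enough. A minimal sketch (the jar names and folder are illustrative, and the first lines only simulate Maven's output so the example is self-contained):

```shell
# Simulate the target/ directory a Maven build would produce.
cd "$(mktemp -d)"
mkdir -p target
touch target/myapp-1.0.jar target/myapp-1.0-sources.jar

# Copy only the jar you care about into a custom folder.
mkdir -p custom-artifacts
cp target/myapp-1.0.jar custom-artifacts/
```

In a real job you would drop the simulation lines and run the mkdir/cp step after the Maven build, from the workspace root.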