Assign Global Variable/Argument for Any Build to Use

I have several (15 or so) builds which all reference the same string of text in their respective build process templates. Every 90 days that text expires and needs to be updated in each of the templates. Is there a way to create a central variable or argument that all of the builds can reference, so the value only has to be updated in one place?

One solution would be to create an environment variable on your build machine and then reference that variable in all of your builds. When you need to update the value, you only have to set it in one place.
How to: Use Environment Variables in a Build
If you have more than one build machine, though, this could itself become a maintenance issue.
Another solution would be to use MSBuild response files: you create an .rsp file that holds the property value, and MSBuild picks the value up from the command line.

You need to place the value somewhere all of your builds can access it, then customize your build process template to read it from there (build definitions - as you know - do not have a mechanism for sharing data between definitions).
Some examples would be a file checked into TFS, a file in a known location (a file share), a web page, a web service, etc.
You could even make a custom activity that knows how to read it and outputs the result as an OutArgument (e.g. a custom activity that reads the string from a hardcoded URL).

Related

Can a user define an automatic macro variable?

I have two different SAS platforms and I run a macro on each platform to create a report.
I use
%include "&MACRO_Path.";
What I want is to create an automatic macro variable "MACRO_PATH" which contains the file location, so that when I start my SAS session I do not need to define the file location in that MACRO_PATH variable myself.
I know one way is to create an autoexec file in that location containing the relevant line of code, but I am wondering whether I can create an automatic macro variable instead.
I also know that automatic macro variables are the ones created by the macro processor. But is there any workaround?
If you just have two hosts and you know which ones they are, then you might be able to use SYSSCP or SYSHOSTNAME. For example, if one is a Windows host and the other a Unix host, you might do this:
%let macro_path=%sysfunc(ifc(&sysscp=WIN,c:\mymacros,~/mymacros));
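If both machines run the same operating system, &SYSHOSTNAME can be used the same way. A minimal sketch, where DEVBOX and both paths are placeholders you would replace with your own values:
/* &SYSHOSTNAME is set automatically at session startup; DEVBOX and the paths below are placeholders */
%let macro_path=%sysfunc(ifc(&syshostname=DEVBOX,/sasdev/macros/report.sas,/sasprod/macros/report.sas));
%include "&macro_path.";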
If you're using Enterprise Guide, you can do this at the project level by using the Autoexec process flow. Name one process flow "Autoexec" and select the option that automatically runs that process flow when you open the project.
What I typically do for projects where I have, say, Dev, Test, and Prod environments is to have linked files stored in the project directory but not synced in version control (you need to enable Relative File Paths in the Project Properties under File References). That file contains the values for the macro variable(s) that are specific to the environment.
Then, that shortcut is stored in the autoexec process flow. The project itself is in version control, so when I sync my dev changes with test or prod it has the correct local references.
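As a rough sketch, the environment-specific file that the Autoexec process flow points to can be as small as this (the file name and path here are hypothetical placeholders):
/* env_paths.sas - one copy per environment, deliberately kept out of version control */
%let MACRO_PATH=C:\projects\report\dev\macros\report_macro.sas;  /* placeholder path */
Because the Autoexec process flow runs this file when the project opens, the rest of the project can keep using %include "&MACRO_PATH."; unchanged.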

Change stored macro SAS

In SAS, using the SASMSTORE option, I can specify the place where the SASMACR catalog will exist. Some macros will reside in this catalog.
At some point I may need to change one of those macros, and that moment may occur while the macro, and therefore the catalog, is in use by another user. The catalog will then be locked and unavailable for modification.
How can I avoid such a situation?
If you're using a SAS Macro catalog as a public catalog that is shared among colleagues, a few options exist.
First, use SVN or a similar source control tool so that you and your colleagues each have a local copy of the macro catalog. This is my preferred option. I'd do this, and also probably not use stored compiled macros (SCMs) - I'd just set it up as autocall macros, personally - because having a separate source file for each macro makes it easy to resolve conflicts. With a compiled SCM catalog you won't be able to resolve conflicts, so you'll have to make sure everyone is very well behaved about always downloading the newest copy before making any changes, and discusses any changes so you don't have two competing changes made at about the same time. If SCMs are important for your particular use case, you could version control the macro sources that create the SCMs and rebuild the SCM yourself every time you refresh your local copy of the sources.
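As a minimal sketch of that last point, assuming the macro sources sit in a hypothetical local working copy and each source file defines its macro with the / store option:
libname mymacs "C:\sasmacros\catalog";   /* placeholder: local copy of the compiled catalog  */
options mstored sasmstore=mymacs;
filename macsrc "C:\svn\macros";         /* placeholder: checked-out macro source directory  */
%include macsrc(report_macro);           /* recompiles %macro report_macro ... / store;      */
%include macsrc(load_data);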
Second, you could and should separate development from production here. Even if you have a shared library located on a shared network folder, you should have a development copy as well that is explicitly not locked by anyone except when developing a new macro for it (or updating a currently used macro). Then make your changes there, and on a consistent schedule push them out once they've been tested and verified (preferably in a test environment, so you have the classic three: dev, test, and prod environments). Something like this:
Changes in Dev are pushed to Test on Wednesdays. Anyone who's got something ready to go by Wednesday 3pm puts it in a folder (the macro source code, that is), and it's compiled into the test SCM automatically.
Test is then verified Thursday and Friday. Anything that is verified in Test by 3pm Friday is pushed to the Prod source code folder at that time, paying attention to any potential conflicts with other new code in Test (nothing is pushed to Prod if something currently in Test but not yet verified could conflict with it).
Production then is run at 3pm Friday. Everyone has to be out of the SCM by then.
I suggest not using Friday for prod if you have something that runs over the weekend, of course, as it risks you having to fix something over the weekend.
Create two folders, e.g. maclib1 and maclib2, and a dataset which stores the current library number.
When you want to rebuild your library, query the current number, increment (or reset to 1 if it's already 2), assign your macro library path to the corresponding folder, compile your macros, and then update the dataset with the new library number.
When it comes to assigning your library, query the current library number from the dataset, and assign the library path accordingly.
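A minimal sketch of the rebuild step, assuming hypothetical paths and a control dataset ctrl.libinfo with a numeric variable libnum:
libname ctrl "C:\maclib\control";                   /* placeholder control library         */
proc sql noprint;                                   /* which folder is currently in use?   */
   select libnum into :curlib trimmed from ctrl.libinfo;
quit;
%let newlib=%sysfunc(ifc(&curlib=1,2,1));           /* switch to the folder not in use     */
libname mymacs "C:\maclib\maclib&newlib.";
options mstored sasmstore=mymacs;
%include "C:\svn\macros\report_macro.sas";          /* recompile the / store macros here   */
data ctrl.libinfo;                                  /* point new sessions at the new copy  */
   libnum=&newlib;
run;
Sessions that only use the macros run the same PROC SQL query at startup and issue the matching LIBNAME and OPTIONS SASMSTORE= statements for whichever folder is current.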

Linked directory not found

I have the following scenario:
The main software I wrote uses a database created by a simulator. This database is around 10 GB at the moment, so I want to keep only one copy of that data per system.
Assume I have the following projects:
Main Software using the data, located at /SimData
DLL using the data for debugging, searching for data at /SimData
Debugging tool to parse the image database, searching for the data at /SimData
I do not want all of those programs to have their own copy of SimData (not only to save disk space, but also to ensure that every program always uses up-to-date simulation data).
So I created, for the DLL and the debugging utility, a link named SimData pointing to MainSoftware/SimData, but when opening a file with "SimData\MyFile.data" it cannot be found; only MainSoftware, with the ACTUAL SimData folder, can find it.
How can I use the MainSoftware/SimData folder without setting absolute paths?
This is on Windows 7 x64
I agree with Peter about adding the DB location as a configurable parameter. A common place to store that is in the registry.
However, if you want a link that your software will follow transparently, use a filesystem-level link rather than a shortcut: hard links created with fsutil work for individual files (as described here), while a directory such as SimData needs a junction or directory symbolic link (mklink /J or mklink /D).
You need a way to configure the database location. You could use an INI or other configuration file, a registry setting, a command-line argument, or an environment variable. Or you could write your program to search a directory hierarchy... for example, if the various modules are usually siblings of each other in your directory tree, you could search for SimData/MyFile.data, ../SimData/MyFile.data, and ../../MainSoftware/SimData/MyFile.data, and use the first one found.
Which answer is the "right one" depends on your situation.

Referencing information in builds specified in a run parameter [Hudson]

Day 1 of using Hudson for our CI build. Slowly but surely getting up to speed.
My question is about run parameters. I've seen that I can use them to reference a particular run of a particular project - that's all fine.
What I don't understand (and can't find any documentation on - there's nothing at Parameterized Build) is how I refer to anything in the run defined by the run parameter.
Essentially I want to reference the %BUILD_NUMBER% and %SVN_REVISION% of the run that is selected in the run parameter.
How can I do that?
Do you really need to add extra property values or extra parameters for your job?
Since BUILD_NUMBER and SVN_REVISION are already defined as environment variables (see Building a software project), you can use those in your job.
When a Hudson job executes, it sets some environment variables that you may use in your shell script, batch command, or Ant script
This illustrates that you already have those values at your disposal.
You can then use them to define other environment variables/properties within your shell or Ant script.
When it comes to passing a variable value from one job to another, the Parameterized Trigger Plugin should do the trick:
The parameters section can contain a combination of one or more of the following:
a set of predefined properties
properties from a properties file read from the workspace of the triggering build
the parameters of the current build
"Subversion revision": makes sure the triggered projects are built with the same revision(s) of the triggering build.
You still have to make sure those projects are actually configured to check out the right Subversion URLs.
Note: there might be an issue with the Join Plugin, which might not work when the Parameterized Trigger is in action.

How to store the Visual C++ debug settings?

The debug settings are stored in a .user file, which should not be added to source control. However, this file does contain useful information. Right now I need to set everything up again each time I build from a fresh checkout.
Is there some workaround to make this less cumbersome?
Edit: It contains the debug launch parameters. This is often not really a per-user setting. The default is $(TargetPath), but I often set it to something like $(SolutionDir)TestApp\test.exe with a few command line arguments. So it isn't a local machine setting per se.
Well, I believe this file is human readable (XML format, I think?), so you could create a template that is put into source control and that everyone would check out, for instance settings.user.template. Each developer would then copy this to settings.user (or whatever the name is) and modify the contents to what they need.
It's been a while since I've looked at that file, but I've done similar things numerous times.
Set the debug launch parameters in a batch file and add the batch file to source control. Then set the startup path in VS to startup.bat $(TargetPath).