I want to ask if anyone knows how to build XML files that can be used in Build Forge as an adaptor. Is there any reference on how to write this type of XML file? Also, how does the XML file deal with variables? Sometimes I see occurrences of the pattern $1, $2, etc., and I don't know what those patterns refer to.
Appreciate your help and thank you...
Unfortunately, the documentation included with the Build Forge application isn't the most complete. I often have to use their online documentation.
You are most likely looking at the Command or ResultBlock section of the adaptor.
IBM Build Forge adaptors use Perl syntax inside the XML.
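Those $1, $2 patterns are Perl regular-expression capture variables: when a line of command output matches a pattern in the adaptor's ResultBlock, $1 holds whatever the first parenthesised group matched, $2 the second, and so on. Here's the same idea expressed in Python, purely to illustrate what the adaptor patterns are doing (the sample output line is made up):

import re

# A made-up line of build output, like what an adaptor's ResultBlock scans.
line = "Checked in element foo.c version /main/42"

# Two parenthesised groups; in Perl these would populate $1 and $2.
match = re.search(r"element (\S+) version (\S+)", line)
if match:
    print(match.group(1))  # $1 -> "foo.c"
    print(match.group(2))  # $2 -> "/main/42"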
I would suggest watching this video IBM posted online.
https://mediacenter.ibm.com/media/Using+Build+Forge+adaptors/0_9g8f456l/33943992
The accepted answer is a bit stale and some of the links have changed. I'm deep in this particular Hell myself, so this is for those who have to deal with Build Forge, its quirky API, and its spotty documentation.
The video has moved. It's now found here:
https://mediacenter.ibm.com/media/Using+Build+Forge+adaptors/0_9g8f456l/33943992
After you watch the video, there is a cookbook-style set of instructions on how to make a basic ClearCase adaptor for Build Forge here:
http://www.ibm.com/developerworks/rational/library/continuous-integration-build-forge-clearcase/
I have managed to write a Perl CLI script that will fire a Build Forge project build, but it doesn't get the tagging correct yet. So far, the best method I've found is to schedule the build job within Build Forge; the deploy then runs separately via a cron job that fires off another Perl CLI script. I played games with the tagging to hand it off smoothly from the build to the deploy by having the build process write the tag to a file outside of Build Forge, where the Perl API deploy script picks it up. That is a broken kludge, but it's the best I've got so far. I'm hoping that the adaptor can make that process more precise. I'll update this entry once I have a full solution, but this is the crankiest tool set I have ever had to deal with for continuous integration.
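For what it's worth, that tag hand-off is nothing fancier than the sketch below (shown in Python for brevity; my actual scripts are Perl). The shared path and the way the build writes the tag are placeholders for whatever your build step actually does:

import sys
from pathlib import Path

# Hypothetical drop file: a build step writes the Build Forge tag here,
# e.g. a shell step along the lines of `echo $BF_TAG > /shared/last_build_tag.txt`.
TAG_FILE = Path("/shared/last_build_tag.txt")

def read_build_tag():
    """Read the tag the build job left behind for the deploy job."""
    if not TAG_FILE.exists():
        sys.exit("No build tag found; has a build completed?")
    return TAG_FILE.read_text().strip()

if __name__ == "__main__":
    print(f"Deploying build tagged {read_build_tag()}")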
I'm admittedly new to NetSuite, so this may be obvious, although I've been unable to find anything specific one way or the other. In fact, I don't even attend any training until next week, but I'm trying to get part of my development environment set up with one of the editors/IDEs I prefer. I know that NetSuite offers an Eclipse plugin, but I'm not an Eclipse fan. I'd prefer to use either WebStorm or TextMate. (I'm on macOS Sierra.)
I tried installing the WebStorm plugin, but it's throwing an exception and is not functional. I submitted a bug on GitHub, but what I'd really like to know is whether it's possible for me to write my own script to upload/download files to the cabinet, so I could just roll my own feature in TextMate. Is this possible, and if so, how? (Just a link to the docs is perfectly fine.)
In other words, is it possible, via their API, to submit changes to a script I've been working on in another editor/IDE? Or to interact with our cabinet? (Not sure if I'm using the proper NS verbiage, but hopefully you get my intent.) I'm thinking about writing a Python script that accepts a local script path as a parameter and then submits it to our cabinet - rough sketch below. Thanks for the help in advance.
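To make the idea concrete, this is roughly what I have in mind. Everything here is hypothetical, since I don't yet know NetSuite's actual API: the RESTlet URL, script/deploy IDs, and auth header are all placeholders, and I'm assuming I'd deploy a small SuiteScript RESTlet that writes the posted contents into the File Cabinet:

import sys
import requests  # pip install requests

# Hypothetical endpoint of a RESTlet deployed in our account.
RESTLET_URL = ("https://ACCOUNT.restlets.api.netsuite.com"
               "/app/site/hosting/restlet.nl?script=XXX&deploy=1")
HEADERS = {"Authorization": "PLACEHOLDER-TOKEN-AUTH"}

def upload_script(local_path):
    with open(local_path) as f:
        payload = {"filename": local_path.split("/")[-1], "contents": f.read()}
    resp = requests.post(RESTLET_URL, json=payload, headers=HEADERS)
    resp.raise_for_status()
    print("Uploaded:", resp.json())

if __name__ == "__main__":
    upload_script(sys.argv[1])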
I wrote a plugin for JetBrains IDEs (I use WebStorm specifically) that mimics NetSuite's Eclipse plugin. Feel free to take a look. It is open source and has ~1500 downloads at the moment.
https://plugins.jetbrains.com/plugin/8305?pr=
If you are the same person who opened this issue (https://github.com/Topher84/NetSuite-Tools-For-WebStorm/issues/7), it has been closed; it was due to using an older version of WebStorm.
I don't like Eclipse personally, so I just write my scripts in whatever editor I like and use NetSuite's script backend to upload them as 'new' when I'm done. If I want to change them, I simply use the backend again to 'edit' the script. You'll see a simple editor where you can change things, or you can just copy and paste what you have into it. It's a little more work than something integrated, but it does work.
I'm interested in setting up a remote build system at work, initially for internal use, potentially for some customers going forward. We need to compile library code on several different machines (PC, Mac) and with multiple compilers, and it can be a real pain trying to get access to a full set. This is not our main build system, which is Jenkins-based and uses an approach that is not easily modified for the purpose envisaged here.
The idea would be that you could post your source to a website with some basic build parameters, it would compile the code and you could then download the generated code. Ideally users could pick which version of the underlying software they compiled their libraries against. I envisage it being supported by a virtual machine.
The reason I'm posting is that I don't want to roll my own any more than necessary - longer term that has maintenance implications - and I would prefer something as pre-existing as possible. Obviously one would expect some adaptation in terms of scripting.
Any suggestions? It would have to be supported on Mac and PC at absolute minimum.
This sounds like something you could do by creating a parameterized Jenkins job (the build params given as input to your web frontend could be passed on to the job, perhaps via the Jenkins API). Personally, I would see if you could skip the step of creating a new web frontend and have users pass their build params directly to Jenkins.
To support downloading the resulting compiled code, you could have the Jenkins job archive the build as an artifact. Users could then download the files from the result page for that individual build.
As for how to make a Jenkins job accept source code to compile as input, perhaps you could use branches in your CM system: your users could push their code to a branch and then pass the branch name as a build param. Otherwise, you might be able to use Jenkins' file parameter feature.
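For the triggering side: Jenkins exposes a buildWithParameters endpoint on every parameterized job, so a thin frontend (or the users themselves) can queue builds over HTTP. A minimal sketch, with the Jenkins URL, job name, credentials, and parameter names all placeholders you'd adapt:

import requests  # pip install requests

JENKINS = "https://jenkins.example.com"   # placeholder instance
JOB = "library-build"                     # a parameterized job
AUTH = ("builduser", "api-token")         # user + API token

# Parameter names must match the job's configured build parameters.
params = {"BRANCH": "feature/foo", "PLATFORM": "win32", "COMPILER": "msvc14"}

resp = requests.post(f"{JENKINS}/job/{JOB}/buildWithParameters",
                     params=params, auth=AUTH)
resp.raise_for_status()  # a 201 response means the build was queued
print("Queued at:", resp.headers.get("Location"))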
For the last few days I have been searching for a static code analysis tool for ColdFusion, and I have not found a good one yet. I found two:
YASCA
cf-metrics: https://code.google.com/p/cf-metrics/
From YASCA I was getting only XSS alerts and some alerts for session management, nothing more than that, and I tried it against my entire project.
I was not even able to properly install cf-metrics with ColdFusion 10. After putting the required jar file in the lib folder, I could no longer access any of my IIS sites because of an ISAPI redirect issue.
Any other tools available?
If you're still looking for a ColdFusion Linter, I would recommend CFLint. It's hosted on GitHub and Maven. The parser was updated to use ANTLR4, so it's much faster than previous editions. We're also making it easier to customize than JSLint.
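For reference, CFLint ships as a runnable "all" jar. A minimal invocation, wrapped in Python here, might look like the sketch below - the jar name and the flags are from memory, so treat them as assumptions and check the -help output for your version:

import subprocess

CFLINT_JAR = "CFLint-1.5.0-all.jar"  # assumed name; download from the releases page

subprocess.run(
    ["java", "-jar", CFLINT_JAR,
     "-folder", "./src",          # directory of .cfm/.cfc files to scan
     "-html",                     # emit an HTML report (assumed flag)
     "-htmlfile", "cflint.html"],
    check=True,
)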
I've looked at this a couple of times in the past, as I maintain a large CF application.
Each time I looked, I was unable to find anything suitable. I spent a while looking into using the Railo CFML parser (because it's open source) to build something ourselves, and concluded at the time that it was possible, but no small task.
You may be able to re-examine the Railo approach, but feed the AST from Railo into an existing code analysis tool. I never got that far, but it may be possible to an extent.
I'd love to hear different, but the short answer is that there's not much out there.
We are moving our build and testing system to Jenkins, and are looking for an easy way (where we don't have to code all the logic ourselves) to manage the build artifacts.
Basically we need an organised way to store them, ordered by the type of the build, the user who built it, and so on. For example:
johnd/nightly/r543241/win32/program.zip
johnd/nightly/trunk/lin64/program.tar.gz
master/release/2.1/win32/program.zip
This way we can upload when a build is complete, and retrieve the required artifact in the testing stage with ease.
Until now we have simply stored the files in directories on NFS, but we recently started considering an artifact manager. I've looked at Artifactory, Archiva, and Nexus, but they all seem very Java-centered, or at least to require Maven. Since I don't want to introduce more complexity (we mainly work with Python; SCons is our build tool) and I don't want to bring Maven into the mix, I'm looking for something with an easy command-line interface (or better, a REST/Python interface) to upload, download, and manage artifacts.
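That said, from what I can tell both Artifactory and Nexus will accept a plain authenticated HTTP PUT into a generic/raw repository, which would cover the upload side from Python. Something like this is the level of simplicity I'm after (server, repo, and credentials are made up, and I've mirrored the path scheme from above):

import requests  # pip install requests

BASE = "https://artifacts.example.com/repository/builds"  # placeholder repo
AUTH = ("ci-user", "secret")                              # placeholder creds

def upload(local_file, user, build_type, tag, platform):
    # e.g. johnd/nightly/r543241/win32/program.zip
    url = f"{BASE}/{user}/{build_type}/{tag}/{platform}/{local_file}"
    with open(local_file, "rb") as f:
        requests.put(url, data=f, auth=AUTH).raise_for_status()

upload("program.zip", "johnd", "nightly", "r543241", "win32")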
If you don't use an artifact manager, but use some other clever method to manage your C++ artifacts for release/test needs I'd be happy to hear that also.
I have exactly the same issue. I didn't succeed in integrating Artifactory easily, so I dropped it.
For the moment I use the Copy Artifact plugin to copy the files, by scp, to an NFS share. But this is not a good solution, since shares may be mounted differently on two machines, and Windows cannot access them directly.
But a real artifact manager would be such a good thing for our project.
In my previous job, I used a documentation CMS (LiveLink) which provided good services:
- self-organising folders
- persistent URLs (we could move and rename folders and files, and a public URL to a given file never changed; that was SO great: "http://server/go/123456789123456789" always pointed to the same file, so you could just give that URL to wget and get the file)
- file versioning
- metadata and advanced search
I'm still searching for such a solution in open-source software and haven't found it.
If you just need to share artifacts between two jobs, simply use the Copy Artifact plugin ("copy artifacts from another build"). I use it to split my job into a build job and a test job (actually there are three test jobs: a very fast one run after each commit, a slower one run every day, and a very slow one run each weekend).
I would love to hear ideas on how to best move code from development server to production server.
A list of gotchas and don'ts would be helpful.
Are there any tools to help automate steps like these?
Making backups of existing code, given a list of files
Recording the deployment of those files from dev to production
Allowing easier rollback if the deployment or the app fails in any way...
I have never worked at a company that had a deployment process beyond manually FTPing files from dev to production.
What have you done in your companies, departments, etc?
Thank you...
Yes, I am a ColdFusion programmer, but files are files, and this should be a language-agnostic question.
OK, I'll bite. There's the technology aspect of this problem, which other answers have already covered. But the real issue is a process problem: the real focus should be on ensuring a meaningful software development life cycle (SDLC) - planning, development, validation, and deployment. I'll cover each in turn. What you want is a repeatable activity at each phase.
Planning
Articulate and record what's to be delivered. Often tickets or user stories are enough. Sometimes you do more, like a written requirements document that a customer signs off on and that is translated into various artifacts such as written use cases. Ultimately, what you want is something recorded in an electronic system where you can associate code changes with it. Which leads me to...
Development
Remember that electronic system? Good. Now when you make changes to code (you're committing to source control, right?), you associate those changes with something in that electronic system - typically tickets. I like Trac, but have also heard good things about Atlassian's suite. This gives you traceability, so you can assert what's been done and how. Then you can use this system and source control to create a build - all the bits needed for whatever's changed - and tag that build in source control; that's your list of what's changed. Even better, have a build contain everything, so that it's a standalone entity that can easily be deployed on its own. The build is then delivered for...
Validation
Perhaps the most important step, and one that many shops ignore at their own peril. Defects found in production are exponentially more expensive to fix than those discovered earlier in the process, and validation is often the only step where they can be caught earlier - so make sure your shop does it.
This should not be done by the programmer! That's like the fox watching the hen house. And whoever does it should be following some sort of plan; we use TestLink. This means each build is validated the same way, so you can identify regression bugs. And the build should be deployed the same way you would deploy it into production.
If all goes well (we usually need a minimum of three builds), the build is validated. And it goes on to...
Deployment
This should be a non-event, because you're taking a validated build and following the same steps you used in testing. The build might first hit a staging server with an automated copying process, but the point is that it shouldn't be an issue at this stage, because you validated with the same process.
Conclusion
In terms of knowing what's where, what you really want is a logical way to group changes together. This is where the idea of a build comes in. It's really the unit that should segue between steps in the SDLC. If you already have that, then the ability to understand the state of a given system becomes trivial.
Check out Ant or Maven - these are build and deployment tools used in the Java world which can help you copy/FTP files, make backups, and even check out code from SVN.
You can automate your deployment steps using these tools; Ant, for example, will allow you to declare a set of tasks as part of your deployment. So you could, for example:
Check out a revision using SVNAnt or similar to a directory
Copy (and perhaps zip first) these files to a backup directory
FTP all the files to your web server(s)
Create a report to email to the team illustrating the deployment
Really, you can do almost anything you wish to put time into using Ant. Maven is a little more structured (and newer); you can see a discussion of the differences here.
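If you'd rather not adopt Ant at all, the same sequence can be scripted directly. Here's a rough Python equivalent of the task list above, offered as an alternative rather than the Ant way - the repository URL, paths, and FTP details are placeholders, and svn must be on the PATH:

import os
import shutil
import subprocess
import ftplib

REPO = "svn://svn.example.com/project/trunk"  # placeholder
WORKDIR = "build_out"
BACKUP = "backups/deploy"

# 1. Check out a clean copy of the code to deploy.
subprocess.run(["svn", "checkout", REPO, WORKDIR], check=True)

# 2. Zip the checkout into a backup archive before touching production.
os.makedirs(BACKUP, exist_ok=True)
shutil.make_archive(os.path.join(BACKUP, "release"), "zip", WORKDIR)

# 3. FTP the files to the web server (flat directory, for brevity).
ftp = ftplib.FTP("ftp.example.com", "user", "password")  # placeholders
for name in os.listdir(WORKDIR):
    path = os.path.join(WORKDIR, name)
    if os.path.isfile(path):
        with open(path, "rb") as f:
            ftp.storbinary(f"STOR {name}", f)
ftp.quit()

# 4. A deployment report could then be emailed to the team with smtplib.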
Hope that helps!
In a nutshell...
You should start with some source control solution - probably Subversion or Git. Once that's in place you can create a script that generates a clean build of your source code and deploys it to your production server(s).
You could do this with a simple batch script or use something like Ant for more control. Here is a simple example of a batch file using Subversion:
svn copy svn://path/to/your/project/trunk -r HEAD svn://path/to/your/project/tags/%version% -m "Tagging version %version%"
svn checkout svn://path/to/your/project/trunk -r HEAD //path/to/target/directory
Ant makes it easy to do things like automatically run unit tests and sync directories. For example:
<sync todir="//path/to/target/directory" includeEmptyDirs="true" overwrite="true">
  <fileset dir="${basedir}">
    <exclude name="**/*.svn"/>
    <exclude name="**/test/"/>
  </fileset>
</sync>
This is really just a starting point. A next step might be a continuous integration solution like Hudson. I would also recommend reading "Pragmatic Project Automation: How to Build, Deploy, and Monitor Java Applications".
One ColdFusion-specific gotcha is to make sure you clear the Application scope when required (to update any singleton components). A common approach here is to use a URL parameter that causes onRequestStart() to call onApplicationStart(). You may also have to clear the trusted cache.
We use a system called AnthillPro: http://www.anthillpro.com
It's commercial software, but it allows us to completely automate our deployment process across multiple servers and operating systems. (We currently use it for both ColdFusion and Java, but it can be used for most languages.) It has a ton of third-party integrations:
http://www.anthillpro.com/html/products/anthillpro/tool-integrations.html