Is it possible for a server to run scripts on a schedule, server-side?

So I realize this question is broad, as I'm looking for a place to start.
Is it possible to have a server that, for example, runs PHP scripts every day, or every 2 minutes?
Basically, I don't want to put these tasks in the user-facing scripts, because I'm worried they would get run far more often than necessary, and I'd prefer to keep them separate.
I'm looking to keep it to JavaScript, PHP, or SQL, as I don't have time to learn a new language at the moment, so if any of you know of good places for me to read up on this, I would greatly appreciate it.

There might be some way to run JavaScript on the server side, but I'd certainly stick to PHP. SQL is a query language, not a scripting language, by the way...
You can execute your PHP scripts every x minutes / hours / days using cron jobs. Just google "crontab". You need (root) access to the server, though.
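For example, crontab entries along these lines would do it (the PHP binary and script paths are placeholders):

# edit the current user's crontab with: crontab -e
*/2 * * * * /usr/bin/php /path/to/task.php    # every 2 minutes
0 3 * * * /usr/bin/php /path/to/task.php      # every day at 03:00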

JavaScript will not run server-side in the way you are expecting. What I would suggest is that you look into writing a cron job that runs a bash script. It would take minimal research to complete.
Cron Intro
Bash Intro

Yes, it's completely possible. On Linux (and on macOS as well), you use the cron job scheduler to run your scripts.
For Windows, you can try the Windows Task Scheduler. It provides similar functionality.
You can run just about any script with cron, but not JavaScript (unless you use a server-side JavaScript runtime).

ColdFusion continuous integration

Let me begin by saying I'm a ColdFusion newbie.
I'm trying to research whether it's possible to do the following, and what would be the best approach to achieve it.
Whenever a developer checks code into SVN, I would like to get all the new changes/files and do an automatic build to check whether the code can be deployed successfully to the production server. I guess there are two parts to it: first, syntax checking, and second, integration testing (whether functionality is working as expected). For the latter part, some unit testing tools would have to be used.
Can someone comment on their experience doing something similar for ColdFusion?
Sorry for being a bit vague... I know it's a very open-ended question, but any feedback would be appreciated.
Thanks
There's a project called "Cloudy With A Chance of Tests" that purports to do what you require. In particular, it brings together a number of other CFML code analysis projects (VarScope & QueryParam) to check code, as well as unit testing. I am not currently using it myself, but I did have a look at it some time ago (more than 12 months ago) and it appeared to be quite good.
https://github.com/mhenke/Cloudy-With-A-Chance-Of-Tests
Personally I run MXUnit tests in Jenkins using the instructions from the MXUnit site - available here:
http://wiki.mxunit.org/display/default/Continuous+Integration+--+Running+tests+with+Jenkins
Essentially this is set up as an Ant task in Jenkins, which executes the MXUnit tests and reports back the results.
We're not doing fully continuous integration, but we have a process which automates some of the drudgery of our builds:
replace the site's application.cf(m|c) with one that tells users that the app is being deployed (we had QA staff raising defects that were due to re-deployments)
read a database manifest XML that lists all the SQL scripts making up the current release. We concatenate the scripts into a single upgrade script, suitable for shipping (see the sketch after this list)
execute the SQL script against the server's DB, noting any errors. The concatenation process also adds a line of SQL after each imported script that writes to a runlog table, so we can see what ran, how long it took, and which build it was associated with. If you're looking to replicate this step, take a look at Liquibase
deploy the latest code
make an http call to a ?reset=true type URL to tell the app to re-initialize
execute any tests
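By way of illustration, here is a rough Python sketch of that manifest/concatenation step; the manifest format, file paths, runlog columns, and GETDATE() (SQL Server) are assumptions, not the actual setup described above:

import xml.etree.ElementTree as ET

# hypothetical manifest: <release build="123"><script>001_create.sql</script>...</release>
manifest = ET.parse('db/manifest.xml').getroot()
build = manifest.get('build')

with open('upgrade.sql', 'w') as out:
    for script in manifest.findall('script'):
        with open('db/' + script.text) as f:
            out.write(f.read() + '\n')
        # after each imported script, record what ran and for which build
        out.write("INSERT INTO runlog (script, build, ran_at) "
                  "VALUES ('%s', '%s', GETDATE());\n" % (script.text, build))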
The build is requested manually through the build servers we have, but you click a button, make tea and it's done.
We've just extended the above to cope with multiple servers in a cluster and it ticks along nicely. I think the above suggestion of using the Jenkins SVN plugin to automate the process sounds like the way to go.

Rather than using crontab, can Django execute something automatically at a predefined time?

How can I make Django execute something automatically at a particular time?
For example, my Django application has to FTP-upload files to remote servers at predefined times. The FTP server addresses, usernames, passwords, times, days, and frequencies are defined in a Django model.
I want to run the file uploads automatically based on the values stored in the model.
One way to do this is to write a Python script and add it to the crontab. The script would run every minute and keep an eye on the time values defined in the model.
The other thing I can roughly think of is Django signals. I'm not sure if they can handle this. Is there a way to generate signals at predefined times? (I haven't read in depth about them yet.)
Just for the record - there is also Celery, which allows you to schedule messages for future dispatch. It is, however, a different beast than cron, as it requires/uses RabbitMQ and is meant for message queues.
I have been thinking about this recently and have found django-cron, which seems as though it would do what you want.
Edit: Also, if you are not specifically looking for a Django-based solution, I have recently used scheduler.py, a small single-file script which works well and is simple to use.
I've had really good experiences with django-chronograph.
You need to set up just one crontab task: calling the chronograph Python management command, which then runs other custom management commands based on an admin-tweakable schedule.
The problem you're describing is best solved using cron, not Django directly. Since it seems that you need to store data about your FTP uploads in your database (using Django to access it for logs or graphs or whatever), you can write a Python script that uses Django and runs via cron.
James Bennett wrote a great article on how to do this which you can read in full here: http://www.b-list.org/weblog/2007/sep/22/standalone-django-scripts/
The main gist of it is that you can write standalone Django scripts that cron can launch and run periodically, and these scripts can fully utilize your Django database, models, and anything else they want to. This gives you the flexibility to run whatever code you need and populate your database, while not trying to make Django do something it wasn't meant to do (Django is a web framework, and is event-driven, not time-driven).
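For illustration, here is a minimal sketch of such a standalone script. The project, app, and model names are hypothetical, and note that recent Django versions use django.setup() rather than the setup_environ() approach shown in the (older) article:

# standalone_upload.py -- launched by cron, e.g.: * * * * * /usr/bin/python /path/to/standalone_upload.py
import os

import django

# point at your settings module before importing any models (name is a placeholder)
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
django.setup()

from myapp.models import UploadSchedule  # hypothetical model holding server/time data

def main():
    for entry in UploadSchedule.objects.all():
        if entry.is_due():      # is_due()/run_upload() are hypothetical helpers
            entry.run_upload()

if __name__ == '__main__':
    main()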
Best of luck!

Is it possible to run a Django function automatically on certain dates?

In my Django project I would like to be able to delete certain entries in the database automatically if they're too old. I can write a function that checks the creation_date and, if it's too old, deletes the entry, but I want this function to run automatically at regular intervals. Is it possible to do this?
Thanks
This is what cron is for.
You will be better off reading this section of the Django docs: http://docs.djangoproject.com/en/1.2/howto/custom-management-commands/#howto-custom-management-commands
Then you can create your function as a Django management command and use it in conjunction with cron on *nix (or Scheduled Tasks on Windows) to run it on a schedule.
See this for a good intro guide to cron http://www.unixgeeks.org/security/newbie/unix/cron-1.html
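As a rough sketch of such a management command (the app name, model, and 30-day cutoff are placeholders, not anything from the question):

# myapp/management/commands/purge_old_entries.py
from datetime import timedelta

from django.core.management.base import BaseCommand
from django.utils import timezone  # Django 1.4+; use datetime.datetime.now() on older versions

from myapp.models import Entry  # hypothetical model with a creation_date field

class Command(BaseCommand):
    help = "Delete entries that are too old"

    def handle(self, *args, **options):
        cutoff = timezone.now() - timedelta(days=30)
        old = Entry.objects.filter(creation_date__lt=cutoff)
        self.stdout.write("Deleting %d entries" % old.count())
        old.delete()

You would then schedule it with a crontab entry such as 0 4 * * * cd /path/to/project && python manage.py purge_old_entries.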
What you require is a cron job.
A cron job is a time-based job scheduler. Most web hosting companies provide this feature, which lets you run a service or a script at a time of your choosing. Most Unix-based OSes have this feature.
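For reference, each crontab entry consists of five time fields followed by the command to run (the script path below is a placeholder):

# minute  hour  day-of-month  month  day-of-week  command
0 2 * * * /usr/bin/python /path/to/cleanup.py    # every day at 02:00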
You would get better help asking this question on serverfault.com, the sister site of Stack Overflow.

Continuous Integration: PowerShell vs. CI Server (CC.NET or Hudson)

So, a friend and I have been discussing continuous integration and bat/PowerShell scripts versus CI servers like CruiseControl.NET or Hudson.
The following PowerShell pseudo-script updates from SVN, builds using MSBuild, deploys/copies out, updates a build/revision number in the app, and emails on failed builds. The next step would be to add calls to MSTest and email the results when not successful.
# update the working copy and build, capturing MSBuild output in a log file
svn update
msbuild > build_deploy_development_out_msbuild
# record the SVN revision and a timestamp for the app to display
([xml](svn info --xml)).info.entry.commit.revision + [char]13 + [char]10 + (Get-Date) > build_revision_number.html
# scan the MSBuild log for failures and email if any were found
$linenumber = Select-String -Path build_deploy_development_out_msbuild -Pattern "Build Failed" | Select-Object LineNumber
if ($linenumber) {
    $smtp = New-Object System.Net.Mail.SmtpClient -ArgumentList localhost
    $smtp.Send("From:Email", "To:Email", "build failed", "build failed... some one must die!")
}
This has led me to question the value of CI servers, when you can write your own shell scripts to accomplish the same goal using the specific tools of the project (build tool, source control, unit testing), i.e. msbuild, nant, svn, git, nunit, mstest, etc.
I have not experienced the maintenance cost as of yet. I wanted to get others' opinions on rolling your own shell script versus a CruiseControl.NET or Hudson. Please note, I do not have experience with CI servers, hence the question, so please don't take this as criticism of CI servers; I simply don't know the best answer, and thought I would ask the community.
Best wishes!
Pete Gordon
CI Servers give you several advantages:
Web access, usually with ability to integrate with existing authentication mechanisms (see Hudson's ActiveDirectory/LDAP support)
Tons of existing support for unit testing, zip archive creation, etc.
Hudson (and others) supports slave build nodes, for doing distributed CI tasks.
No need to maintain it yourself.
Some of these may not be things you need now but are you sure they aren't things you might need in the future?
I installed Hudson a couple of weeks ago to replace the current CruiseControl server. The greatest advantage I see in Hudson is that pretty much anybody can use it, while launching a parametrized build with CruiseControl (or a batch file) is still scary for a lot of people.
Usually I tend to write all my build scripts with Ant (because it's portable), insert a couple of parameters, and invoke them from Hudson.
Hudson gives your scripts great visibility (everything can be seen on the front page) and makes them self-explanatory. With a bash script, you usually need to write a readme (that nobody reads) and remember where the scripts are located.
... or have both. Ayende (the creator of Rhino Mocks) did exactly that recently: he wrote a CI server using PowerShell. Perhaps this provides new insights for your discussion.
For a year I tried to maintain custom-written Python scripts to do basic CI stuff: receiving notifications of commits via e-mail, checking out and building, sending back blames and congrats. Then, when it came to publishing this for use by everyone else on my team, it turned out rather unusable without monitoring, web access, etc., etc.
Then I dived into Buildbot and found it truly beautiful. I set up basically the same process in a couple of days. The build script is a true Python object that is customizable at the master, from where it gets transferred to the slaves and executed there. It's built on the Twisted framework, so there's lots of stuff available out of the box ;)
The web UI is minimalistic, though sufficient.
Well, this one is unpublished too, though I'm close to it this time %)
Below are my thoughts on a CI server versus PowerShell scripts:
Highly configurable: plugins are available for all different kinds of version control, notifications, and testing.
Logs: these are maintained wonderfully. Failed and successful build logs are at your fingertips.
Scheduling: you can set up all kinds of scheduling, including triggering based on another successful build.
Security: you can set up different groups to be able to execute builds, view only, or see only certain projects.
Visibility: you can use a web dashboard or CCTray for different audiences.
Scalability: easy to scale when needed.
Bottom line: if you have to maintain lots of builds for different environments and team projects, then a CI server is the way to go. Other than that, a simple PowerShell script is enough for small projects. Once the project grows, you can just hook the existing PowerShell script up to a CI server.

How do you deploy cron jobs to production?

How do people deploy/version-control cron jobs to production? I'm more curious about the conventions/standards people use than about any particular solution, but I happen to be using git for revision control, and the cron job runs a Python/Django script.
If you are using Fabric for deployment, you could add a function that edits your crontab.
from fabric.api import run

def add_cronjob():
    # dump the current crontab, append an entry, and load it back
    run('crontab -l > /tmp/crondump')
    run('echo "@daily /path/to/dostuff.sh 2> /dev/null" >> /tmp/crondump')  # "@daily" -- a "#" here would comment the line out
    run('crontab /tmp/crondump')
This would append a job to your crontab (disclaimer: totally untested and not very idempotent).
Save the crontab to a tempfile.
Append a line to the tmpfile.
Write the crontab back.
This is probably not exactly what you want to do, but along those lines you could think about checking the crontab into git and overwriting it on the server with every deploy (if there's a dedicated user for your project).
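If the lack of idempotence bothers you, here is a rough sketch (equally untested, same placeholder paths) that only appends the entry when it is missing:

from fabric.api import run

def add_cronjob_once():
    # don't fail if the user has no crontab yet
    run('crontab -l > /tmp/crondump || true')
    # append the entry only if it isn't already present
    run('grep -qF "/path/to/dostuff.sh" /tmp/crondump || '
        'echo "@daily /path/to/dostuff.sh 2> /dev/null" >> /tmp/crondump')
    run('crontab /tmp/crondump')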
Using Fabric, I prefer to keep a pristine version of my crontab locally; that way I know exactly what is on production and can easily edit entries in addition to adding them.
The fabric script I use looks something like this (some code redacted e.g. taking care of backups):
from fabric.api import put, sudo

def deploy_crontab():
    # upload the local crontab file and install it on the server
    put('crontab', '/tmp/crontab')
    sudo('crontab < /tmp/crontab')
You can also take a look at:
http://django-fab-deploy.readthedocs.org/en/0.7.5/_modules/fab_deploy/crontab.html#crontab_update
The django-fab-deploy module has a number of convenient scripts, including crontab_set and crontab_update.
You can probably use something like CFEngine/Chef for deployment (they can deploy everything, including cron jobs).
However, if you are asking this question, it could be that you have many production servers, each running a large number of scheduled jobs.
If this is the case, you probably want a tool that can not only deploy jobs, but also track success and failure, allow you to easily look at logs from the last run, collect run statistics, and allow you to easily change the schedule for many jobs and servers at once (due to planned maintenance...), etc.
I use a commercial tool called "UC4". I don't really recommend it, so I hope you can find a better program that can solve the same problem. I'm just saying that administration of jobs doesn't end when you deploy them.
There are really three options for manually deploying a crontab if you cannot connect your system up to a configuration management system like CFEngine/Puppet.
You could simply use crontab -u user -e, but you run the risk of someone making an error in their copy/paste.
You could also copy the file into the cron directory, but there is no syntax checking of the file, and on Linux you must run touch /var/spool/cron in order for crond to pick up the changes.
Note: everyone will forget the touch command at some point.
In my experience, the following is my favorite manual way of deploying a crontab:
# review the changes before installing
diff /var/spool/cron/<user> /var/tmp/<user>.new
# install the new file via crontab, which syntax-checks it
crontab -u <user> /var/tmp/<user>.new
I think this method is the best because you don't run the risk of copy/paste errors, which helps you maintain consistency with your version-controlled file. It performs syntax checking of the cron tasks inside the file, and you won't need to run the touch command as you would if you were to simply copy the file.
Having your project under version control, including your crontab.txt, is what I prefer. Then, with Fabric, it is as simple as this:
from fabric.api import task, run

@task
def crontab():
    # install the version-controlled crontab for the user Fabric connects as
    run('crontab deployment/crontab.txt')
This will install the contents of deployment/crontab.txt as the crontab of the user you connect to the server as. If you don't have your complete project on the server, you'd want to put the crontab file there first.
If you're using Django, take a look at the jobs system from django-command-extensions.
The benefits are that you can keep your jobs inside your project structure, with version control, write everything in Python and configure crontab only once.
I use Buildout to manage my Django projects. With Buildout, I use z3c.recipe.usercrontab to install cron jobs in deploy or update.
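For reference, a buildout part using that recipe looks roughly like this; the schedule and command are placeholders, and the option names are from the recipe's documentation as I remember it:

[usercrontab]
recipe = z3c.recipe.usercrontab
times = 0 2 * * *
command = ${buildout:bin-directory}/django cleanup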
You said:
I'm more curious about conventions/standards people use than any particular solution
But, to be fair, the particular solution will depend on your environment, and there is no universal, elegant silver bullet. Given that you happen to be using Python/Django, I recommend Celery. It is an asynchronous task queue for Python, which integrates nicely with Django. And, on top of the features it provides as an asynchronous task queue, it also has specific features for periodic tasks.
I have personally used the django-celery-beat integration; it integrates perfectly with Django settings and behaves correctly in distributed environments. If your periodic tasks are related to Django stuff, I strongly recommend taking a look at Celery. I started using it only for certain asynchronous mailing and ended up using it for a lot of asynchronous tasks, plus periodic sanity checks and other web application maintenance stuff.
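As a minimal sketch of a periodic task with Celery beat (the broker URL and the task body are assumptions):

# tasks.py
from celery import Celery
from celery.schedules import crontab

app = Celery('proj', broker='redis://localhost:6379/0')  # broker URL is an assumption

@app.task
def nightly_cleanup():
    # periodic maintenance work goes here
    pass

app.conf.beat_schedule = {
    'nightly-cleanup': {
        'task': 'tasks.nightly_cleanup',
        'schedule': crontab(hour=3, minute=0),  # every day at 03:00
    },
}

A worker with an embedded beat scheduler can then be started with celery -A tasks worker --beat.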