Has anyone written, or does anyone know the whereabouts of, an Airflow operator for Alteryx? I'm trying to automate the execution of an existing Alteryx workflow that is typically executed manually. If it helps at all, both Airflow and Alteryx are behind the same firewall in an on-prem datacenter.
(For reasons beyond the scope of this question, I recognize there are better ways of starting an Alteryx workflow than Airflow.)
There is currently no Airflow operator for directly executing Alteryx workflows on Windows. However, with the existing PythonOperator and Python libraries like pypsrp and pywinrm, you can remote into the Windows box that runs Alteryx and execute PowerShell commands to run Alteryx via the command line; a sketch follows below.
pypsrp (Python library)
pywinrm (Python library)
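To make that concrete, here is a minimal sketch of a PythonOperator task that uses pywinrm to run a workflow remotely. The host name, credentials, workflow path, and the AlteryxEngineCmd.exe install path are all placeholders you would substitute for your environment, and the DAG wiring assumes Airflow 2.x:

import winrm  # pip install pywinrm
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def run_alteryx_workflow():
    # Open a WinRM session to the Windows box that runs Alteryx
    session = winrm.Session('alteryx-host.example.internal',
                            auth=('svc_airflow', 'secret'))
    # Invoke the Alteryx engine CLI against the workflow file
    result = session.run_ps(
        r'& "C:\Program Files\Alteryx\bin\AlteryxEngineCmd.exe" '
        r'"C:\workflows\my_workflow.yxmd"'
    )
    if result.status_code != 0:
        raise RuntimeError(result.std_err.decode())

with DAG('alteryx_workflow', start_date=datetime(2023, 1, 1),
         schedule_interval=None, catchup=False) as dag:
    PythonOperator(task_id='run_alteryx',
                   python_callable=run_alteryx_workflow)

Credentials are hard-coded here only for brevity; in practice you would pull them from an Airflow Connection.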
I have a requirement to modify ~/.ssh/authorized_keys to add custom public keys for login. I found this article, but it is for Python jobs:
https://beam.apache.org/documentation/sdks/python-pipeline-dependencies/
How can we do the same for a Java Dataflow job?
You could create a custom container, potentially with a custom entrypoint.
See:
https://cloud.google.com/dataflow/docs/guides/using-custom-containers
There is also https://github.com/apache/beam/blob/master/sdks/java/core/src/main/java/org/apache/beam/sdk/harness/JvmInitializer.java, which may help here, though it was written for a different purpose.
I am using cloud-init to configure nodes as they come up. Now I also need a decommissioning script to clear up some entries from remote servers. Is there an easy way to specify the decommission script, like we specify the init script, on node launch?
That's not actually a cloud-init function so much as a Linux function generally. This question should help you run scripts on shutdown, which is what you seek.
In my Django project I would like to be able to delete certain entries in the database automatically if they're too old. I can write a function that checks the creation_date and, if it's too old, deletes the entry, but I want this function to run automatically at regular intervals. Is it possible to do this?
Thanks
This is what cron is for.
You will be better off reading this section of the Django docs: http://docs.djangoproject.com/en/1.2/howto/custom-management-commands/#howto-custom-management-commands
Then you can create your function as a Django management command and use it in conjunction with cron on *nix (or scheduled tasks on Windows) to run it on a schedule.
See this for a good intro guide to cron http://www.unixgeeks.org/security/newbie/unix/cron-1.html
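To make the management-command suggestion concrete, here is a minimal sketch, assuming a modern Django and a hypothetical myapp.Entry model with a creation_date field. Save it as myapp/management/commands/delete_old_entries.py:

from datetime import timedelta

from django.core.management.base import BaseCommand
from django.utils import timezone

from myapp.models import Entry  # hypothetical app and model

class Command(BaseCommand):
    help = 'Delete entries whose creation_date is older than 30 days'

    def handle(self, *args, **options):
        cutoff = timezone.now() - timedelta(days=30)
        deleted, _ = Entry.objects.filter(creation_date__lt=cutoff).delete()
        self.stdout.write('Deleted %d old entries' % deleted)

A single crontab entry along the lines of 0 3 * * * /path/to/project/manage.py delete_old_entries then runs it every night at 03:00.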
What you require is a cron job.
A cron job is a time-based job scheduler. Most web hosting companies provide this feature, which lets you run a service or a script at a time of your choosing. Most Unix-based OSes have this feature.
You would get better help asking this question on serverfault.com, the sister site of Stack Overflow.
How do people deploy/version control cronjobs to production? I'm more curious about conventions/standards people use than any particular solution, but I happen to be using git for revision control, and the cronjob is running a python/django script.
If you are using Fabric for deployment, you could add a function that edits your crontab.
def add_cronjob():
    # Dump the current crontab, append the job, and load it back
    run('crontab -l > /tmp/crondump')
    run('echo "@daily /path/to/dostuff.sh 2> /dev/null" >> /tmp/crondump')
    run('crontab /tmp/crondump')
This would append a job to your crontab (disclaimer: totally untested and not very idempotent).
1. Save the crontab to a tempfile.
2. Append a line to the tempfile.
3. Write the crontab back.
This is probably not exactly what you want to do, but along those lines you could think about checking the crontab into git and overwriting it on the server with every deploy (if there's a dedicated user for your project).
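To address the idempotency caveat above, a guarded variant might look like this (equally untested; the job line and path are placeholders):

def add_cronjob():
    entry = '@daily /path/to/dostuff.sh 2> /dev/null'
    # '|| true' tolerates the error crontab -l raises when no crontab exists yet
    run('crontab -l > /tmp/crondump || true')
    # Only append the entry if it is not already present
    run('grep -qF "%s" /tmp/crondump || echo "%s" >> /tmp/crondump'
        % (entry, entry))
    run('crontab /tmp/crondump')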
Using Fabric, I prefer to keep a pristine version of my crontab locally, that way I know exactly what is on production and can easily edit entries in addition to adding them.
The Fabric script I use looks something like this (some code redacted, e.g. taking care of backups):
def deploy_crontab():
    put('crontab', '/tmp/crontab')
    sudo('crontab < /tmp/crontab')
You can also take a look at:
http://django-fab-deploy.readthedocs.org/en/0.7.5/_modules/fab_deploy/crontab.html#crontab_update
The django-fab-deploy module has a number of convenient scripts, including crontab_set and crontab_update.
You can probably use something like CFEngine/Chef for deployment (they can deploy everything, including cron jobs).
However, if you are asking this question, it could be that you have many production servers, each running a large number of scheduled jobs.
If this is the case, you probably want a tool that can not only deploy jobs, but also track success and failure, let you easily look at logs from the last run and at run statistics, and let you easily change the schedule for many jobs and servers at once (due to planned maintenance, etc.).
I use a commercial tool called "UC4". I don't really recommend it, so I hope you can find a better program that can solve the same problem. I'm just saying that administration of jobs doesn't end when you deploy them.
There are really three options for manually deploying a crontab if you cannot connect your system to a configuration management system like CFEngine/Puppet.
You could simply use crontab -u user -e, but you run the risk of someone making an error in their copy/paste.
You could also copy the file into the cron directory, but there is no syntax checking for the file, and on Linux you must run touch /var/spool/cron in order for crond to pick up the changes.
Note: everyone will forget the touch command at some point.
In my experience, this third method is my favorite manual way of deploying a crontab:
# Review the changes against the live crontab first
diff /var/spool/cron/<user> /var/tmp/<user>.new
# Install the new file; crontab syntax-checks it as it loads
crontab -u <user> /var/tmp/<user>.new
I think the method above is the best: you don't run the risk of copy/paste errors, which helps you maintain consistency with your version-controlled file; crontab performs syntax checking of the cron entries inside the file; and you won't need the touch command, as you would if you simply copied the file.
Having your project under version control, including your crontab.txt, is what I prefer. Then, with Fabric, it is as simple as this:
@task
def crontab():
    run('crontab deployment/crontab.txt')
This will install the contents of deployment/crontab.txt into the crontab of the user you connect to the server as. If you don't have your complete project on the server, you'd want to put the crontab file there first.
If you're using Django, take a look at the jobs system from django-command-extensions.
The benefits are that you can keep your jobs inside your project structure, with version control, write everything in Python and configure crontab only once.
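For illustration, a daily job using django-extensions (where django-command-extensions now lives) might look like this; the app name and cleanup logic are placeholders. Save it as myapp/jobs/daily/cleanup_job.py:

from django_extensions.management.jobs import DailyJob

class Job(DailyJob):
    help = 'Remove stale database entries'

    def execute(self):
        # actual cleanup logic goes here
        pass

A single cron entry running manage.py runjobs daily then executes every daily job in the project.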
I use Buildout to manage my Django projects. With Buildout, I use z3c.recipe.usercrontab to install cron jobs on deploy or update.
You said:
I'm more curious about conventions/standards people use than any particular solution
But, to be fair, the particular solution will depend on your environment, and there is no universal elegant silver bullet. Given that you happen to be using Python/Django, I recommend Celery. It is an asynchronous task queue for Python which integrates nicely with Django, and on top of its asynchronous task queue features it also has specific support for periodic tasks.
I have personally used the django-celery-beat integration; it integrates perfectly with Django settings and behaves correctly in distributed environments. If your periodic tasks are related to Django stuff, I strongly recommend taking a look at Celery. I started using it only for certain asynchronous mailing and ended up using it for a lot of asynchronous tasks plus periodic sanity checks and other web application maintenance.
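As a rough illustration of the periodic-task support, here is a minimal sketch; the broker URL, schedule, and task body are placeholders:

from celery import Celery
from celery.schedules import crontab

app = Celery('myproject', broker='redis://localhost:6379/0')

@app.task
def delete_old_entries():
    pass  # e.g. call the cleanup logic from your Django app

app.conf.beat_schedule = {
    'delete-old-entries-nightly': {
        'task': delete_old_entries.name,
        'schedule': crontab(hour=3, minute=0),  # every night at 03:00
    },
}

Running a worker with the beat scheduler enabled (celery -A myproject worker -B) then triggers the task on schedule, with no crontab on the host at all.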