Using a plugin in Google Composer makes it crash - google-cloud-platform

I wrote a small plugin for Apache Airflow that runs fine on my local deployment. However, when I use it on Google Composer, the user interface hangs and becomes unresponsive. Is there any way to restart the webserver in Google Composer?

(Note: This answer is currently more suggestive than finalized.)
As far as restarting the webserver goes...
What doesn't work:
I reviewed the Airflow Web Interface page in the docs, which describes using the webserver but not accessing it from a CLI or restarting it.
While you can also run Airflow CLI commands on Composer, I don't see a command for restarting the webserver in the Airflow CLI today.
I checked the gcloud CLI in the Google Cloud SDK but didn't find a restart-related command.
Here are a few ideas that may work for restarting the Airflow webserver on Composer:
In the gcloud CLI, there's an update command for changing environment properties. I would assume that it restarts the scheduler and webserver (in new containers) after you change one of these, in order to apply the new setting. You could set an arbitrary environment variable to force such a change, though just running the update command with no changes may also work.
gcloud beta composer environments update ...
Alternatively, you can update environment properties excluding environment variables in the GCP Console.
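A sketch of the environment-variable trick via the CLI, using a placeholder environment name and location; bumping a throwaway variable forces a configuration change, which should recreate the Airflow containers:

```shell
# Any changing value works; the timestamp just guarantees a difference.
gcloud beta composer environments update my-composer-env \
  --location us-central1 \
  --update-env-variables=RESTART_MARKER="$(date +%s)"
```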
I think re-running the import plugins command would cause a scheduler/webserver restart as well.
gcloud beta composer environments storage plugins import ...
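Filled out with placeholder environment, location, and plugin file names, that command would look like:

```shell
# Re-uploading the plugin should make Composer sync it to the Airflow
# workers and webserver again.
gcloud beta composer environments storage plugins import \
  --environment my-composer-env \
  --location us-central1 \
  --source ./plugins/my_plugin.py
```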
In a more advanced setup, Composer supports deploying a self-managed Airflow web server. Following the linked guide, you can: connect to your Composer instance's GKE cluster, create Deployment and Service Kubernetes configuration files for the webserver, and deploy both with kubectl create. Then you could run kubectl replace or kubectl delete on the pod to trigger a fresh start.
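If you do run a self-managed webserver this way, a restart becomes a one-liner, sketched here assuming the Deployment's pods carry a hypothetical app=airflow-webserver label:

```shell
# Find the webserver pod, then delete it; the Deployment's ReplicaSet
# immediately recreates it, which is effectively a restart.
kubectl get pods -l app=airflow-webserver
kubectl delete pod -l app=airflow-webserver
```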
This all feels like a bit much, so hopefully documentation or a simpler way to achieve webserver restarts emerges to supersede these workarounds.

Related

Can I use Cloud Build to perform CI/CD tasks in VM instances?

I'm using Google Cloud Platform and exploring its CI/CD tools.
I have an app deployed in a VM instance, and I'm wondering if I can use GCP tools such as Cloud Build to do CI/CD instead of using Jenkins.
From what I've learned over several resources, Cloud Build seems to be a nice tool for Cloud Run (deploying Docker images) and Cloud Functions.
Can I use it for apps deployed in VM instances?
When you create a job in Cloud Build, you set up a cloudbuild.yaml file in which you specify the build steps. How do I write a step so that it will SSH into a Linux VM, log in as a particular user, cd into a directory, pull the master branch of the project repo, and start running main.py (say it's a Python project)?
You can do it like this:
- name: gcr.io/cloud-builders/gcloud
  entrypoint: "bash"
  args:
    - "-c"
    - |
      gcloud compute ssh --zone us-central1-a my_user@oracle -- "whoami; ls -la; echo cool"
Note that the remote command is quoted; otherwise bash splits it at the semicolons and only whoami runs on the VM, with the rest running inside the Cloud Build container.
However, this isn't a cloud-native way to deploy an app. VMs aren't "pets" but "cattle": when you no longer need one, kill it, no emotion!
So the modern way to use the cloud is to create a new VM with the new version of your app. Optionally, you can keep the previous VM stopped (so you pay nothing for it) in case you need to roll back. To achieve this, add a startup script that installs all the required packages and libraries plus your app on the VM, and starts it.
An even easier way is to use a container. That way, all the system dependencies are inside the container and the VM doesn't need any customization: it simply downloads the container and runs it.
Cloud Build allows you to create a VM with a startup script via the gcloud CLI, and you can also stop the previous one. Do you have a persistent disk to reuse (to share data between versions)? With Cloud Build you can also clone it and attach it to the new VM, or detach it from the previous VM and attach it to the new one!
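A sketch of that rollout flow with the gcloud CLI, using placeholder instance names and a hypothetical startup.sh:

```shell
# Create the new VM; the startup script installs dependencies and starts the app.
gcloud compute instances create my-app-v2 \
  --zone us-central1-a \
  --metadata-from-file startup-script=./startup.sh

# Keep the old VM around (stopped, so it costs nothing) in case of rollback.
gcloud compute instances stop my-app-v1 --zone us-central1-a
```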

Running a code from an instance in Google Cloud Composer

I am new to Google Cloud Composer. I have some code on a Google Compute Engine instance -
e.g. test.py
Currently I am using Jenkins as my scheduler, and I'm running the code like below:
echo "cd /home/user/src/digital_platform && /home/user/venvs/bdp/bin/python -m test.test.test" | ssh user@instance-dp
I want to run the same code from google cloud composer.
How can I do that?
Basically I need to ssh to an instance in google cloud and run the code in an automated way using google cloud composer.
It seems that the SSHOperator might work for you. This operator is an Airflow feature, not a Cloud Composer feature per se.
The other operator that you might want to take a look at before making your final decision is the BashOperator.
You need to create a DAG (workflow). Cloud Composer schedules only the DAGs that are in the DAGs folder of the environment's Cloud Storage bucket. Each Cloud Composer environment has a web server running the Airflow web interface, which you can use to manage DAGs.
The BashOperator is useful for running command-line programs. I suggest you follow the Cloud Composer Quickstart, which shows how to create a Cloud Composer environment in the Google Cloud Console and run a simple Apache Airflow DAG.
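As a concrete stepping stone: the command you currently pipe through SSH from Jenkins is what you would put in the operator inside a DAG file, and that file is then deployed by copying it into the environment's DAGs bucket. Environment name, location, and the DAG file name below are placeholders:

```shell
# run_remote_test.py (hypothetical) is a DAG that wraps the existing
# remote command in an SSHOperator or BashOperator.
# Upload it to the Composer environment's Cloud Storage DAGs folder:
gcloud composer environments storage dags import \
  --environment my-composer-env \
  --location us-central1 \
  --source ./run_remote_test.py
```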

Schedule restart of applications on Pivotal Cloud Foundry

I want to schedule a restart for an app. Is there any way to have applications restarted automatically after a specific time limit in PCF?
I am not sure if there is anything within PCF that can execute cf commands. My suggestion is to configure a CI/CD job (a Jenkins job, for example) that executes cf restart <app_name> at scheduled intervals.
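For example, a scheduled Jenkins job (or any cron-driven runner) could log in with the cf CLI and restart the app; the API endpoint, org, space, and app names below are placeholders, with credentials injected as environment variables:

```shell
# Authenticate non-interactively, then restart the app.
cf login -a https://api.my-pcf.example.com \
  -u "$CF_USER" -p "$CF_PASSWORD" -o my-org -s my-space
cf restart my-app
```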
I've been working on a scheduler service that you can register in Cloud Foundry; it includes service plans for restarting apps, as well as other plans such as triggering an arbitrary HTTP endpoint. I'd be happy if you tried it out and gave me feedback. Check it out on GitHub: https://github.com/grimmpp/cloud-foundry-resource-scheduler
I've also started to describe what it provides and how it can be installed and configured. To use it, you just create service instances in the Cloud Foundry marketplace and specify some parameters, e.g. how often or when it should be called.

How to access deployed project in gcloud shell

I deployed a Django project to the flexible environment from my local machine using gcloud app deploy.
The changes are getting reflected in the live URL.
I am trying to access the deployed Django project folder through the gcloud shell, but I'm not able to find it.
What am I doing wrong?
Extended from a discussion with @babygameover.
Google App Engine (GAE) is a PaaS. With GAE, you just code locally and deploy the project, while the scaling of instances and related resources is taken care of by gcloud.
To have control over instances, the project should be moved to Google Compute Engine (GCE), where you get finer control over instance configuration.

how to restart Django server without terminating database and server instance - AWS

I am hosting a Django application on AWS Elastic Beanstalk. I recently made changes to my urls.py and apparently (according to this thread: Django ignoring changes made to URLS.py file - Amazon AWS) I need to 'reload the django process / restart the thread'. I figured that meant I should run
eb stop
and then
eb start
again but when I ran
eb stop
it needed to first terminate my database as well as my EC2 instance, CloudWatch alarm, etc. Is there any way for me to restart the Django process so that it picks up the urls.py changes without me having to run
eb stop
eb start
?
You do not need to stop and start your environment. From what I understand, you need to update your environment with your updated source code. Did you try git commit followed by git aws.push?
Take a look here:
http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/command-reference-get-started.html
Let me know if you run into any issues with git aws.push.
You can also try restarting the app server in your environment using the AWS CLI:
http://docs.aws.amazon.com/cli/latest/reference/elasticbeanstalk/restart-app-server.html
But as far as I can tell, git aws.push will suffice.
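For reference, the restart-app-server call looks like this, with a placeholder environment name; it restarts the application server on every instance without touching the database or the environment's other resources:

```shell
aws elasticbeanstalk restart-app-server --environment-name my-env
```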
I've had trouble with my Django files not updating after using:
$ eb deploy
The eb CLI tool does not have a restart command; however, you can log in to the AWS console and restart your environment through the Actions menu on your environment's dashboard.
This generally fixes any issues that I have. However, sometimes I've had to SSH directly into the instance and enable debugging through the settings.
The other command that Rohit referenced is from a different aws cli tool. I haven't personally tried it but here is more documentation on the command and how to install it:
http://docs.aws.amazon.com/cli/latest/userguide/installing.html