This is about GCP Deployment Manager Templates. Based on this documentation, a Python-based Deployment Manager template must use Python 2.7.
Want to know:
Does this mean no support for Python 3?
Is a Python version upgrade planned?
Is there an ETA for Python 3 support?
1. Yeah, Python 3 is not supported in templates.
2. In some courses, some Google people say that it is planned, but I think there is not currently even a beta version.
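For context, a Python template here is just a module that exposes a GenerateConfig(context) (or generate_config) function returning a dict of resources, written against Python 2.7. A minimal sketch (the resource name, type and properties are illustrative; a real compute.v1.instance needs more properties, such as machineType and disks):

def GenerateConfig(context):
    # context.properties holds values passed in from the deployment's YAML config.
    resources = [{
        'name': 'example-vm',           # illustrative name
        'type': 'compute.v1.instance',  # illustrative resource type
        'properties': {
            'zone': context.properties['zone'],
        },
    }]
    return {'resources': resources}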
Related
I am a scientist who is exploring the use of Dask on Amazon Web Services. I have some experience with Dask, but none with AWS. I have a few large custom task graphs to execute, and a few colleagues who may want to do the same if I can show them how. I believe that I should be using Kubernetes with Helm because I fall into the "Try out Dask for the first time on a cloud-based system like Amazon, Google, or Microsoft Azure" category.
I also fall into the "Dynamically create a personal and ephemeral deployment for interactive use" category. Should I be trying native Dask-Kubernetes instead of Helm? It seems simpler, but it's hard to judge the trade-offs.
In either case, how do you provide Dask workers a uniform environment that includes your own Python packages (not on any package index)? The solution I've found suggests that packages need to be on a pip or conda index.
Thanks for any help!
Use Helm or Dask-Kubernetes?
You can use either. Generally starting with Helm is simpler.
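If you do end up trying native Dask-Kubernetes, the entry point looks roughly like this (a sketch assuming the dask_kubernetes package and a worker-spec.yml pod spec that you provide; the file name and worker count are placeholders):

from dask.distributed import Client
from dask_kubernetes import KubeCluster

cluster = KubeCluster.from_yaml('worker-spec.yml')  # your worker pod spec
cluster.scale(10)                                   # request ten workers
client = Client(cluster)                            # point Dask at the new cluster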
How to include custom packages
You can install custom software using pip or conda. Packages don't need to be on PyPI or the default Anaconda channel; you can point pip at other indexes and conda at other channels. Here is an example installing software with pip directly from GitHub:
pip install git+https://github.com/username/repository@branch
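And, hypothetically, pointing conda at a non-default channel (the package name is a placeholder):

conda install -c conda-forge packagename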
For small custom files you can also use the Client.upload_file method.
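A minimal sketch of that (the scheduler address and module file name are placeholders):

from dask.distributed import Client

client = Client('scheduler-address:8786')  # placeholder scheduler address
client.upload_file('my_module.py')         # ships the file to all current workers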
I am using Cloud Composer and I noticed that it selects the version of Apache Airflow and Python (2.7.x) for me. I want to use a different version of Airflow and/or Python. How can I change this?
Cloud Composer deploys the latest stable build of Airflow. New versions of Airflow are usually deployed by Composer within a few weeks of their stable release. The Airflow version deployed and the Python version installed cannot be changed at this time. A future release of Cloud Composer may offer the ability to select the Airflow and/or Python version for new environments.
If you want to deploy a specific version of Airflow, you will need to use the gcloud CLI tool to specify it; it is not currently possible to do this from the web front end.
Have a look at the following page to see the available versions: https://cloud.google.com/composer/docs/concepts/versioning/composer-versions
If you would like to deploy, say, Airflow 1.10 and Python 3 to your environment, you would use the
--image-version
--python-version
flags to set this. For example, the following would create an environment with Composer 1.4.1, Airflow 1.10 and Python 3:
gcloud beta composer environments create ENV_NAME --image-version composer-1.4.1-airflow-1.10.0 --python-version 3
You will need to specify all the other parameters and arguments required for the environment as well. The above only shows the two arguments to set the Airflow and Python versions.
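For instance, a fuller hypothetical invocation might look like this (the location value is a placeholder; --location is one of the required arguments not shown above):

gcloud beta composer environments create ENV_NAME --location us-central1 --image-version composer-1.4.1-airflow-1.10.0 --python-version 3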
The latest runtime version, 1.4, of the Google Cloud ML Engine supports Python 3.5:
https://cloud.google.com/ml-engine/docs/runtime-version-list
How do I use Python 3.5 for training models in the ML engine?
Thanks
Setting pythonVersion works for me. Please note that pythonVersion is a string, and it only works with runtime version 1.4:
trainingInput:
  pythonVersion: '3.5'
  runtimeVersion: '1.4'  # pythonVersion only works with runtime 1.4
  scaleTier: BASIC
You can set the field pythonVersion to '3.5' in the TrainingInput. If you use gcloud, you can put these settings in a YAML file and pass the flag --config=config.yaml, assuming config.yaml is your YAML file name.
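For example, a hypothetical submission command (the job name, module, package path, bucket and region are placeholders):

gcloud ml-engine jobs submit training my_job --module-name trainer.task --package-path trainer/ --job-dir gs://my-bucket/output --region us-central1 --config=config.yaml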
I am writing after a lot of searching and trial and error with no luck.
I am trying to deploy a service on App Engine.
You might be aware that deploying on App Engine is usually a two-step process:
1. Deploy on the local dev app server
2. If step 1 succeeds, deploy to the cloud
My problems are with step 1, when I include third-party Python libraries such as numpy, sklearn, gcloud, etc.
I am trying to deploy a service on the local dev app server. When I import numpy or any other third-party library in my main.py script, it throws an error saying it is unable to find the module.
I am using the Cloud SDK and have two Python distributions: the default Python 2.7 and Anaconda with Python 2.7. When I change the path to look for modules in the Anaconda distribution, it fails to find the module ‘setup’ required by the Cloud SDK.
Is there a way to install the Cloud SDK for the Anaconda distribution?
Any help/pointers will be much appreciated!
When using the App Engine Python standard environment, you can install pure Python third-party libs using pip by vendoring them, as explained here.
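A minimal sketch of that vendoring setup (requests is used purely as an example of a pure Python lib, and lib/ is the conventional directory name):

pip install -t lib/ requests

# appengine_config.py (lives next to app.yaml)
from google.appengine.ext import vendor

# Add any libraries installed in the "lib" folder to the import path.
vendor.add('lib')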
There are also a number of libraries included in the python27 runtime which can be requested using the libraries directive in your app.yaml as explained here.
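For example, numpy is on that list and can be requested like this in app.yaml:

libraries:
- name: numpy
  version: latest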
If there's a lib which is not pure Python (i.e. it uses C extensions) that you want to use in your project, and it's not part of this list, then your only option is to use a flexible VM. If you want to use Anaconda, you should consider customizing the runtime for your flexible VM.
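A hedged sketch of that customization (the conda-based base image and install steps are illustrative, not an official recipe; note that a custom flexible runtime must serve HTTP on port 8080):

# app.yaml
runtime: custom
env: flex

# Dockerfile
FROM continuumio/miniconda2
COPY . /app
WORKDIR /app
RUN conda install -y numpy scikit-learn
CMD ["python", "main.py"]  # main.py must listen on port 8080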
Was anyone able to create a DIY cartridge in OpenShift with Ruby 2.2?
If so, can you share with us how? Or are there any plans to have Ruby 2.2 as a default cartridge?
Thanks
OpenShift 3 changed the way cartridges are built; they are now built from Docker images.
Here is how you can build a ruby-2.2 deployment environment:
https://blog.openshift.com/deploying-ruby-applications-on-openshift-3/
Pretty old question, given that OpenShift v2 is close to EOL. But I thought I'd share how I made a Rails 5 app work on v2.
The key points are to:
use the DIY cart
create start/stop/deploy action hooks that install Ruby 2.3 via rvm, etc. (sketched below)
possibly use alternative gems
But it's too long to paste all the scripts here, and it involves a hack or two to make RVM happy. I'd suggest looking at the last commit of this branch to see my action hooks from that commit, and possibly borrow from there.
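For a rough idea of the shape only (these are not my exact scripts; paths and versions are illustrative, and the real thing needs the RVM hacks mentioned above), a DIY start hook could look roughly like:

#!/bin/bash
# .openshift/action_hooks/start -- illustrative sketch
source "$OPENSHIFT_DATA_DIR/.rvm/scripts/rvm"  # assumes an earlier hook installed rvm into the data dir
rvm use 2.3
cd "$OPENSHIFT_REPO_DIR"
nohup bundle exec rails server -b "$OPENSHIFT_DIY_IP" -p "$OPENSHIFT_DIY_PORT" > "$OPENSHIFT_LOG_DIR/server.log" 2>&1 &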
+1 to the suggestion to plan on deploying on v3 if possible.