TPOT model in google ml-engine - google-cloud-ml

I have a model trained with TPOT. When I try loading the model into ml-engine, it says:
No module named tpot.builtins.stacking_estimator
The error makes sense since TPOT is an external package, not included in Cloud ML Engine runtime versions. Is there any way to get around it?

From what I can see, you only need PyPI to add the related dependencies. Checking TPOT's installation requirements, all of them can be installed through pip.
So, follow the documentation for "Adding standard (PyPI) dependencies", include this module and all its related dependencies in your setup.py file, and then follow the related steps to package and upload your training application.
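For illustration, a minimal setup.py along those lines might look like this (the package name and version are placeholders of mine, not taken from the question):

```python
# Hypothetical setup.py for a Cloud ML Engine training/prediction package
# that depends on TPOT. Names and versions are illustrative only.
from setuptools import find_packages, setup

setup(
    name='trainer',            # placeholder package name
    version='0.1',
    packages=find_packages(),
    # TPOT (and transitively its dependencies) is fetched from PyPI when
    # ML Engine installs the package; pin it to the version you trained
    # with so the pickled pipeline (tpot.builtins.*) unpickles cleanly.
    install_requires=['tpot'],
)
```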

Related

Enable Babel plugins in ember-auto-import for transpilation of imported library

I want to move some of my utility functions and classes from my Ember app into a separate NPM library and import it with ember-auto-import. I don't want to transpile the library code before publication but rather publish it in authoring format. This shouldn't be an issue, as ember-auto-import transpiles the code automatically at build time depending on the app's configuration.
But this code uses the experimental decorators feature, as much Ember code does these days. The Babel instance used by ember-auto-import throws an error saying that the decorators-legacy feature is not enabled:
Support for the experimental syntax 'decorators-legacy' isn't
currently enabled
How can I enable it in the configuration of ember-auto-import? In ember-auto-import's documentation I only see an option to disable transpilation per dependency and a custom webpack configuration. I don't have much experience with Webpack. Is Babel controlled through the Webpack configuration?
I just noticed that I get the same error if I reference a dependency on the local file system using the link: protocol. I don't see the error if I use the file: protocol. I'm using yarn. This issue was solved by deleting node_modules and reinstalling the dependencies in the referenced addon.
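For anyone landing here, the custom webpack configuration option mentioned in the question is one plausible escape hatch. The sketch below is an assumption on my part (the autoImport.webpack key is documented, but the exact babel-loader rule and the my-utils package name are mine), not a verified fix:

```javascript
// ember-cli-build.js -- a sketch, not a verified configuration.
'use strict';
const EmberApp = require('ember-cli/lib/broccoli/ember-app');

module.exports = function (defaults) {
  const app = new EmberApp(defaults, {
    autoImport: {
      webpack: {
        module: {
          rules: [
            {
              // Hypothetical: run the untranspiled addon (here called
              // "my-utils") through babel-loader with legacy decorators.
              test: /node_modules[\\/]my-utils[\\/].*\.js$/,
              use: {
                loader: 'babel-loader',
                options: {
                  plugins: [
                    ['@babel/plugin-proposal-decorators', { legacy: true }],
                    // legacy decorators require loose class properties
                    ['@babel/plugin-proposal-class-properties', { loose: true }],
                  ],
                },
              },
            },
          ],
        },
      },
    },
  });
  return app.toTree();
};
```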

Drupal composer install: Your requirements could not be resolved to an installable set of packages

I am using Drupal 8 and I am trying to install the Search API Solr module. Each time I run the command
composer require drupal/search_api_solr
I get this error:
Your requirements could not be resolved to an installable set of
packages.
Here is the screenshot of the error I am getting. Could anyone please help me?
Which Drupal core version do you use?
It could be related to the symfony/event-dispatcher library.
Can you try composer require symfony/event-dispatcher:"4.3.4 as 3.4.99" drupal/search_api_solr ?
The "4.3.4 as 3.4.99" part is Composer's inline alias syntax: it installs version 4.3.4 but lets it satisfy constraints that ask for a 3.4.x release.
More info here: https://www.drupal.org/project/drupal/issues/2876675#comment-13272878
Looks like the module you are trying to install needs some additional packages.

install bq_helper on datalab

I know that the BigQuery module is already installed on Datalab. I just want to use the bq_helper module because I learned it on Kaggle.
I ran !pip install -e git+https://github.com/SohierDane/BigQuery_Helper#egg=bq_helper and it worked,
but I can't import bq_helper. The pic is shown below.
Please help. Thanks!
I used python2 on Datalab.
I am not familiar with the BigQuery Helper library you shared, but in general, in Datalab, it may happen that you need to restart the kernel in order for the libraries to be properly loaded.
I reproduced the scenario you proposed: installing the library with the command !pip install -e git+https://github.com/SohierDane/BigQuery_Helper#egg=bq_helper and then trying to import it in the notebook using:
from bq_helper import BigQueryHelper
bq_assistant = BigQueryHelper("bigquery-public-data", "github_repos")
bq_assistant.project_name
At first, it did not work and I obtained the same error as you; then I clicked on the Reset Session button and the library was loaded properly.
Some other details that may be relevant if this does not work for you are:
I am also running on Python2 (although the GitHub page of the library suggests that it was only tested in Python3.6+).
The Custom metadata parameters in the Datalab GCE instance are: created-with-datalab-version: 20180503 and created-with-sdk-version: 208.0.2.

Couldn't import cv2 in GCP ml-engine (runtime version 1.8)

When using runtime version 1.8, I'm getting this error when I try to import cv2:
/usr/lib/python2.7/dist-packages/cv2.x86_64-linux-gnu.so: undefined symbol: _ZN2cv9Algorithm7getListERSt6vectorINSt7__cxx1112basic_stringIcSt11char_traitsIcESaIcEEESaIS7_EE
Does anyone know if there's any workaround? It looks like glib needs to be installed in the image, but it wasn't.
Cloud ML images already have the python-opencv package installed. If you are facing the issue in your local environment instead of Cloud ML, you most probably have a dependency problem, for example two different programs modifying the same package. Other similar threads that solved the issue are:
Related to PKG_CONFIG_PATH.
Related to differences between pip versions 2.7 and 3.4.
I found this tutorial that may be useful for you Running a Spark Application with OpenCV on Cloud Dataproc.
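If the system package still misbehaves and you want to sidestep the system cv2 entirely, one possible workaround (an assumption on my part, not verified against runtime 1.8) is to ship your own OpenCV wheel via setup.py; the opencv-python-headless distribution avoids the GUI system dependencies:

```python
# Hypothetical setup.py requesting a self-contained OpenCV wheel from PyPI
# instead of relying on the image's system python-opencv package.
from setuptools import find_packages, setup

setup(
    name='trainer',  # placeholder package name
    version='0.1',
    packages=find_packages(),
    install_requires=['opencv-python-headless'],  # no GTK/GUI deps needed
)
```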

Import setup module error while deploying to app engine via google cloud sdk

I am writing after a lot of searching and trial and error with no luck.
I am trying to deploy a service in app engine.
You might be aware that deploying on App Engine is usually practiced as a two-step process:
1. Deploy on local dev app server
2. If step 1 succeeds deploy on cloud
My problems are with step 1 when I include third-party Python libraries such as numpy, sklearn, gcloud, etc.
I am trying to deploy the service on the local dev app server. When I import numpy or any other third-party library in my main.py script, it throws an error saying it is unable to find the module.
I am using the Cloud SDK and have two Python distributions: the default Python 2.7 and Anaconda with Python 2.7. When I change the path to look for modules in the Anaconda distribution, it fails to find the module 'setup' required by the Cloud SDK.
Is there a way to install the Cloud SDK for the Anaconda distribution?
Any help/pointers will be much appreciated!
When using the App Engine Python standard environment, you can install pure Python 3rd-party libs using pip by vendoring them as explained here.
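In practice the vendoring boils down to two pieces (lib is just the conventional folder name): install the packages locally with pip install -t lib <package>, then add an appengine_config.py so the runtime puts that folder on sys.path:

```python
# appengine_config.py -- loaded automatically by the python27 standard
# runtime before your handlers run; it makes the vendored "lib" folder
# (populated with e.g. `pip install -t lib requests`) importable.
from google.appengine.ext import vendor

# Add any libraries installed in the "lib" folder to sys.path.
vendor.add('lib')
```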
There are also a number of libraries included in the python27 runtime which can be requested using the libraries directive in your app.yaml as explained here.
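Requesting a bundled library is a few lines in app.yaml; for example (handlers omitted, and numpy is one of the libraries on the python27 included list):

```yaml
# app.yaml fragment for the python27 standard runtime: request bundled
# third-party libraries instead of vendoring them.
runtime: python27
api_version: 1
threadsafe: true

libraries:
- name: numpy
  version: latest
```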
If there's a lib which is not pure Python (i.e. it uses C extensions) that you want to use in your project, and it's not part of this list, then your only option is a flexible environment VM. If you want to use Anaconda, you should consider customizing the runtime for your flexible VM.