No module named cloud while using google.cloud import bigquery - python-2.7

I have built an App Engine application to load data into a BigQuery table using the Google App Engine Launcher, but when I run it on localhost or in the cloud I get the error "No module named cloud" (from the line from google.cloud import bigquery) in the log file. I have installed the Google Cloud client library, but it still gives me the same error. Please see below the code I am using.
--- main.py file contains:
import argparse
import time
import uuid

from google.cloud import bigquery


def load_data_from_gcs(dataset_name, table_name, source):
    bigquery_client = bigquery.Client()
    dataset = bigquery_client.dataset(dataset_name)
    table = dataset.table(table_name)
    job_name = str(uuid.uuid4())

    job = bigquery_client.load_table_from_storage(
        job_name, table, source)
    job.begin()
    wait_for_job(job)

    print('Loaded {} rows into {}:{}.'.format(
        job.output_rows, dataset_name, table_name))


def wait_for_job(job):
    while True:
        job.reload()
        if job.state == 'DONE':
            if job.error_result:
                raise RuntimeError(job.error_result)
            return
        time.sleep(1)


if __name__ == '__main__':
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter)
    parser.add_argument('Test')
    parser.add_argument('mytable')
    parser.add_argument('gs://week/geninfo.csv')
    args = parser.parse_args()

    load_data_from_gcs(
        args.dataset_name,
        args.table_name,
        args.source)
-- app.yaml file contains the following:
application: mycloudproject
version: 1
runtime: python27
api_version: 1
threadsafe: yes

handlers:
- url: /favicon\.ico
  static_files: favicon.ico
  upload: favicon\.ico

- url: .*
  script: main.app
Please let me know what is missing or whether I am doing something wrong here.

This can be a bit tricky. The google.cloud package uses the new Python namespace-package format (if you look at the source you'll notice that there's no __init__.py in the directory structure).
This behaviour became standard in Python 3.3 with PEP 420.
Fortunately, in Python 2.7 you can fix this easily by avoiding implicit relative imports. Just add this to the top of your file:
from __future__ import absolute_import
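For context, the top of main.py would then start like this (a minimal sketch of the fix, reusing the imports from the question):

from __future__ import absolute_import

import argparse
import time
import uuid

# With absolute imports forced, this resolves to the installed
# google.cloud package rather than a local 'google' module.
from google.cloud import bigquery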
Hope that helps.

Find the directory containing google/cloud/..., and add that directory to the PYTHONPATH so that python can find it. See this post for details on how to add to PYTHONPATH. It outlines two common ways to do it:
Here's how to do it with a bash command:
export PYTHONPATH=$PYTHONPATH:/<path_to_modules>
Or you could append it to the path in your script:
# if the google/ directory is in the directory /path/to/directory/
path_to_look_for_module = '/path/to/directory/'

import sys
if path_to_look_for_module not in sys.path:
    sys.path.append(path_to_look_for_module)
If that doesn't work, here is some code I found in one of my projects for importing Google App Engine modules:
import sys


def fixup_paths(path):
    """Adds GAE SDK path to system path and appends it to the google path
    if that already exists."""
    # Not all Google packages are inside namespace packages, which means
    # there might be another non-namespace package named `google` already on
    # the path and simply appending the App Engine SDK to the path will not
    # work since the other package will get discovered and used first.
    # This emulates namespace packages by first searching if a `google` package
    # exists by importing it, and if so appending to its module search path.
    try:
        import google
        google.__path__.append("{0}/google".format(path))
    except ImportError:
        pass
    sys.path.insert(0, path)

# and then call later in your code:
fixup_paths(path_to_google_sdk)
from google.cloud import bigquery

It looks like you are trying to use the Cloud Datastore client library in Google App Engine's standard environment. As documented in Google's documentation, you should not be doing this. Instead, either use the NDB Client Library or do not use the standard environment.
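For reference, a minimal NDB sketch looks like this (the model and property names below are made up purely for illustration):

from google.appengine.ext import ndb


class Visit(ndb.Model):
    # hypothetical model used only to show the NDB API
    timestamp = ndb.DateTimeProperty(auto_now_add=True)


def record_visit():
    # writes an entity to Cloud Datastore through NDB
    Visit().put()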

Are you sure you've updated to the latest version of the library? The version installed by pip may be out of date. Previously, the module was imported as:
from gcloud import bigquery
If that works, you're running an older version. To install the latest, I'd recommend pulling from master in the GitHub project.
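For example, upgrading the PyPI release usually looks like this (the package name google-cloud-bigquery is assumed here; adjust it to whichever client you actually use):

pip install --upgrade google-cloud-bigquery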

Related

What changes are needed to my django app when deploying to pythonanywhere? error points to nowhere

Deploying my Django website, which runs fine locally, to PythonAnywhere with S3 as storage gives a strange error I can't google a solution for:
"TypeError: a bytes-like object is required, not 'str'"
What am I doing wrong?
I've tried to move my environment variables (AWS keys, SECRET_KEY, etc.) out of settings.env and set them directly in my settings.py, plus every suggestion I could find, but it's still the same :(
here's my /var/www/username_pythonanywhere_com_wsgi.py:
# +++++++++++ DJANGO +++++++++++
# To use your own Django app use code like this:
import os
import sys

from dotenv import load_dotenv

project_folder = os.path.expanduser('~/portfolio_pa/WEB')  # adjust as appropriate
load_dotenv(os.path.join(project_folder, 'settings.env'))

# assuming your Django settings file is at '/home/myusername/mysite/mysite/settings.py'
path = '/home/corebots/portfolio_pa'
if path not in sys.path:
    sys.path.insert(0, path)

os.environ['DJANGO_SETTINGS_MODULE'] = 'WEB.settings'

## Uncomment the lines below depending on your Django version
###### then, for Django >=1.5:
from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()

###### or, for older Django <=1.4
#import django.core.handlers.wsgi
#application = django.core.handlers.wsgi.WSGIHandler()
I'd expect the website to run fine just like it does locally.
The boto library doesn't have good Python 3 support. This particular issue is known in the boto bug tracker: https://github.com/boto/boto/issues/3837
The best way of fixing this is to use boto3, which has decent Python 3 support and is generally the best-supported AWS SDK for Python.
The reason it works on your local machine but not in production is that the PythonAnywhere setup seems to be using a proxy, which triggers this incompatible boto code. See the actual calling code: https://github.com/boto/boto/blob/master/boto/connection.py#L747
Your error traceback confirms this.
Unfortunately, I'm not familiar with django-photologue, but a brief look doesn't suggest that it strongly depends on boto3. Maybe I'm wrong.
I still think the best way is to go with boto3. As a backup strategy, you can fork boto with a fix for this issue and install that instead of the official one from PyPI: https://github.com/boto/boto/pull/3699
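For illustration, moving Django's S3 storage to boto3 typically means using the boto3 backend from django-storages; a minimal sketch of the relevant settings (AWS_STORAGE_BUCKET_NAME etc. are the standard django-storages setting names, the values are placeholders):

# settings.py -- django-storages + boto3 configuration sketch
import os

INSTALLED_APPS += ['storages']

# boto3-based backend instead of the old boto one
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto3.S3Boto3Storage'

AWS_ACCESS_KEY_ID = os.environ['AWS_ACCESS_KEY_ID']
AWS_SECRET_ACCESS_KEY = os.environ['AWS_SECRET_ACCESS_KEY']
AWS_STORAGE_BUCKET_NAME = 'my-bucket-name'  # placeholder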

The worker for a Google Cloud Platform task cannot find the logging library

I have created a simple task based on the Google Cloud Platform "update counter" push task example. All I want to do is log to Stackdriver that it has been invoked.
from google.cloud import logging

logging_client = logging.Client()
log_name = 'service-log'
logger = logging_client.logger(log_name)

import webapp2


class UpdateCounterHandler(webapp2.RequestHandler):
    def post(self):
        amount = int(self.request.get('amount'))
        logger.log_text('Service startup task done.')


app = webapp2.WSGIApplication([
    ('/update_counter', UpdateCounterHandler)
], debug=True)
After deploying this and invoking it, there is an error. In the logs online it says:
from google.cloud import logging
ImportError: No module named cloud
This isn't a local version, but one that I've deployed. It's hard for me to believe that I have to actually install python libraries into the production runtime. (I can't even imagine that I can.)
As the root readme states:
Many samples require extra libraries to be installed. If there is a requirements.txt, you will need to install the dependencies with pip.
Try adding the library as explained here.
When using logging from the Python standard library in App Engine, the logs also end up in Stackdriver. So you could use import logging instead of from google.cloud import logging.
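A minimal sketch of that alternative, adapted from the handler above:

import logging

import webapp2


class UpdateCounterHandler(webapp2.RequestHandler):
    def post(self):
        # Standard-library logging; on App Engine these records
        # also end up in Stackdriver Logging.
        logging.info('Service startup task done.')


app = webapp2.WSGIApplication([
    ('/update_counter', UpdateCounterHandler)
], debug=True)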
If you are specifically interested in using the google.cloud.logging library, then it needs to be installed into a project folder ./lib, as described by Tudormi: here
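That vendoring step would look roughly like this (the package name google-cloud-logging is assumed). First install the library into the app's lib/ folder:

pip install -t lib/ google-cloud-logging

Then register the folder in an appengine_config.py next to app.yaml:

# appengine_config.py
from google.appengine.ext import vendor
vendor.add('lib')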

How to import BigQuery in AppEngine for Python

I am trying to run a BigQuery query from Google AppEngine (deployed) using Python 2.7, but I am seeing this error in StackDriver's Error Reporting:
ImportError: No module named cloud
This is my code (main.py):
from __future__ import absolute_import

import webapp2

from google.cloud import bigquery


class MainPage(webapp2.RequestHandler):
    def get(self):
        # Instantiates a client
        bigquery_client = bigquery.Client()
        # The name for the new dataset
        dataset_name = 'my_new_set'
        # Prepares the new dataset
        dataset = bigquery_client.dataset(dataset_name)
        # Creates the new dataset
        dataset.create()
        # Remove unwanted chars
        #self.response.write(str(container))


app = webapp2.WSGIApplication([
    ('/', MainPage),
], debug=True)
This is my app.yaml:
runtime: python27
api_version: 1
threadsafe: true

handlers:
- url: /.*
  script: main.app
The error message would make me assume that the BigQuery's library is not being imported. However, if this code is being deployed in AppEngine, shouldn't the library already be installed in AppEngine by default?
Trying to solve the problem
Attempt # 1
I found this post that refers to a similar issue. The suggestion was to add this line to the top of the file. I added the line to my file, but the problem still exists:
from __future__ import absolute_import
Source:
No module named cloud while using google.cloud import bigquery
Attempt # 2
I installed BigQuery's client locally on my laptop:
pip install google-cloud-bigquery==0.22.1
I also installed the same client in the "lib" folder so it gets uploaded to AppEngine once the app is deployed:
pip install --target='lib' google-cloud-bigquery==0.22.1
This last step also requires a file named "appengine_config.py" to be created with this content:
# appengine_config.py
from google.appengine.ext import vendor
# Add any libraries installed in the "lib" folder.
vendor.add('lib')
Source: https://cloud.google.com/appengine/docs/standard/python/tools/using-libraries-python-27
However, this attempt did not work either. The error message changed to the following:
File "/base/data/home/apps/p~experimenting-1130/2.400173726395247238/lib/httplib2/__init__.py", line 352: print('%s:' % h, end=' ', file=self._fp) ^ SyntaxError: invalid syntax
at <module> (/base/data/home/apps/p~experimenting-1130/2.400173726395247238/lib/google_auth_httplib2.py:23)
at <module> (/base/data/home/apps/p~experimenting-1130/2.400173726395247238/lib/google/cloud/_helpers.py:31)
at <module> (/base/data/home/apps/p~experimenting-1130/2.400173726395247238/lib/google/cloud/bigquery/_helpers.py:21)
at <module> (/base/data/home/apps/p~experimenting-1130/2.400173726395247238/lib/google/cloud/bigquery/__init__.py:26)
at get (/base/data/home/apps/p~experimenting-1130/2.400173726395247238/main.py:75)
at dispatch (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py:545)
at dispatch (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py:547)
at __call__ (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py:1077)
at default_dispatcher (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py:1253)
at __call__ (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py:1505)
at __call__ (/base/data/home/runtimes/python27/python27_lib/versions/third_party/webapp2-2.3/webapp2.py:1511)
How can I import the BigQuery library correctly in AppEngine (deployed)?
Thanks for your help.
The following solution worked for me without having to use from __future__ import absolute_import. There are three main steps to follow.
1. Copy google-api-python-client and google-cloud into project folder
Even though it sounds counterintuitive, according to the documentation
[...] Python client libraries are not installed in the App Engine Python runtime environment, [so] they must be vendored into your application just like third-party libraries.
So in order to use google.cloud one must copy the library code into the project's source directory. The library code, along with the application code, is uploaded to App Engine.
To copy a library code into your project:
Create a directory (e.g. lib/) to store third-party libraries in your project's root folder
mkdir lib
Copy the google-api-python-client and google-cloud libraries into the folder you just created. I use pip in the following example.
pip install -t lib/ --upgrade google-api-python-client
pip install -t lib/ --upgrade google-cloud
2. Link installed libraries to app
Create a file named appengine_config.py in the same folder as your app.yaml file
Edit appengine_config.py and include the following code
# appengine_config.py
from google.appengine.ext import vendor
# Add any libraries installed in the "lib" folder.
vendor.add('lib')
3. Include added libraries to requirements.txt
Edit your requirements.txt file and include the names of your added libraries
# other requirements
google-api-python-client
google-cloud
You should now be able to use from google.cloud import bigquery with no problem after deploying your app.
For more information see using third-party libraries

Odoo custom module with external Python library

I created an Odoo Module in Python using the Python library ujson.
I installed this library on my development server manually with pip install ujson.
Now I want to install the Module on my live server. Can I somehow tell the Odoo Module to install the ujson library when it is installed? So I just have to add the Module to my addons path and install it via the Odoo Web Interface?
Another reason to have this automated would be if I like to share my custom module, so others don't have to install the library manually on their server.
Any suggestions on how to configure my module that way? Or should I just include the library's directory in my module?
You should wrap the import in a try-except to handle problems on Odoo server start:
try:
    from external_dependency import ClassA
except ImportError:
    pass
And for other users of your module, extend the external_dependencies in your module manifest (v9 and less: __openerp__.py; v10+: __manifest__.py), which will prompt a warning on installation:
"external_dependencies": {
'python': ['external_dependency']
},
Big thanks goes to Ivan and his Blog
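In context, that entry sits alongside the usual manifest keys, roughly like this (the module name and dependency below are placeholders):

# __manifest__.py (or __openerp__.py on v9 and earlier)
{
    'name': 'My Custom Module',  # placeholder
    'version': '1.0',
    'depends': ['base'],
    'external_dependencies': {
        'python': ['ujson'],
    },
}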
Thank you for your help, @Walid Mashal and @CZoellner, you both pointed me in the right direction.
I solved this task now with the following code added to the __init__.py of my module:
import pip

try:
    import ujson
except ImportError:
    print('\n There was no such module named -ujson- installed')
    print('xxxxxxxxxxxxxxxx installing ujson xxxxxxxxxxxxxx')
    pip.main(['install', 'ujson'])
In a Python file you can install it using the following approach (this works for Odoo only). E.g. here I am going to install xlsxwriter:
import os

try:
    import xlsxwriter
except ImportError:
    os.system("pip install xlsxwriter")
    import xlsxwriter
The following is the code used in the Odoo base report module (odoo_root_folder/addons/report/models/report.py) to detect wkhtmltopdf.
from openerp.tools.misc import find_in_path
import subprocess


def _get_wkhtmltopdf_bin():
    return find_in_path('wkhtmltopdf')


try:
    process = subprocess.Popen(
        [_get_wkhtmltopdf_bin(), '--version'],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE)
except (OSError, IOError):
    _logger.info('You need Wkhtmltopdf to print a pdf version of the reports.')
Basically, you need some Python code that checks whether the library is available and installs it if not, and you include that code in one of your .py files; that should do it.
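A generic sketch of that idea (the helper name ensure_package is made up for illustration, and it assumes pip is available on the server):

import importlib
import subprocess
import sys


def ensure_package(module_name, pip_name=None):
    """Import module_name, installing it with pip first if it is missing."""
    try:
        return importlib.import_module(module_name)
    except ImportError:
        subprocess.check_call(
            [sys.executable, '-m', 'pip', 'install', pip_name or module_name])
        return importlib.import_module(module_name)


# e.g. at the top of the module's __init__.py
ujson = ensure_package('ujson')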

unresolved import gcs_oauth2_boto_plugin

I am currently trying to create a bucket in Google Cloud Storage using Python and the documentation provided by Google at this link:
https://cloud.google.com/storage/docs/gspythonlibrary
I have followed the instructions and successfully installed the stand-alone gsutil. However, once I go into Eclipse and import gcs_oauth2_boto_plugin, it is not recognized, even though import boto is.

It was a problem with both my PYTHONPATH and Eclipse. What I ended up doing was:
import sys
sys.path.append("/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages")

try:
    import boto
    from boto import connect_gs
except:
    print 'neither of the modules were imported'
This solved my problem. Updating the python path did not.
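With the path fixed, the bucket-creation flow from the gspythonlibrary documentation looks roughly like this (the project ID and bucket name are placeholders; the storage_uri call and project-ID header follow boto's Google Cloud Storage usage):

import boto
import gcs_oauth2_boto_plugin  # registers the OAuth2 plugin with boto

GOOGLE_STORAGE = 'gs'
project_id = 'my-project-id'    # placeholder
bucket_name = 'my-new-bucket'   # placeholder

# Create the bucket in the given project
header_values = {'x-goog-project-id': project_id}
uri = boto.storage_uri(bucket_name, GOOGLE_STORAGE)
uri.create_bucket(headers=header_values)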