How to install Azure module in IBM Data Science Experience - data-science-experience

I'm trying to import Azure data into DSx. I get an error when I try to import the module. When I use the command "from azure.storage.blob import BlobService" in DSx, it tells me that there's no module with that name. Do I have to do some further setup in DSx to access this module?

Please install the azure package by running the following command in your notebook:
!pip install azure
Then run this to import your library:
from azure.storage.blob import BlobService
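With the import above in place, connecting looks roughly like this (the account name, key, and container below are placeholders, not values from the question):
# Placeholder credentials -- replace with your own storage account details
blob_service = BlobService(account_name='myaccount', account_key='mykey')
for blob in blob_service.list_blobs('mycontainer'):
    print(blob.name)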
Please also refer to this article for different ways of installing libraries:
http://datascience.ibm.com/docs/content/analyze-data/importing-libraries.html
Thanks,
Charles.

Related

How to resolve "cannot import name 'coord_cython' from 'pymatgen.util'" in pymatgen (2022.9.21) with Python 3.8?

I am using the Linux operating system on a remote server. I used pip install to install pymatgen 2022.9.21 but I am getting an error which I have pasted below. I tried the older version of Pymatgen, but still could not resolve it. I would appreciate your suggestions.
Thank you.
ImportError: cannot import name 'coord_cython' from 'pymatgen.util' (/home/ikhatri/.local/lib/python3.8/site-packages/pymatgen/util/__init__.py)

Deploying Python script that use google-api-python-client library as Google Cloud Function

I've created a Python script that uses the google-api-python-client library to pull data from my DFA account.
I would like to deploy this script as a Google Cloud Function and use Google Cloud Scheduler to run it on a daily basis. When I deployed the Cloud Function, I received an error that said:
ModuleNotFoundError: No module named 'googleapiclient'
After I added "googleapiclient" to requirements.txt and deployed again, I got a new error saying it couldn't find the googleapiclient library to add.
May I ask if it's possible to install the googleapiclient library on Google Cloud Platform?
Yes, it's possible to install and use the googleapiclient library in GCP. Use the googleapiclient module when importing the Google API Client in your Python code:
import googleapiclient
Then add google-api-python-client to your requirements.txt:
# Function dependencies, for example:
# package>=version
google-api-python-client
You may check this sample code from GitHub that uses the googleapiclient module, and this requirements.txt.
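As a rough sketch of how the pieces fit together (the API name, version, and entry-point name below are placeholders, not taken from the question), a Cloud Function using the client could look like this:
# main.py -- sketch only; 'dfareporting'/'v4' and pull_report are placeholder names
from googleapiclient.discovery import build

def pull_report(request):
    # build() picks up the function's default service-account credentials
    service = build('dfareporting', 'v4')
    profiles = service.userProfiles().list().execute()
    return str(profiles)
With google-api-python-client listed in requirements.txt as shown above, the import error should go away.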

No module named 'nltk.lm' in Google colaboratory

I'm trying to import the NLTK language modeling module (nltk.lm) in a Google Colaboratory notebook, without success. I've also tried installing everything from nltk, still without success.
What mistake or omission could I be making?
Thanks in advance.
Google Colab has nltk v3.2.5 installed, but nltk.lm (Language Modeling package) was added in v3.4.
In your Google Colab run:
!pip install -U nltk
In the output you will see it downloads a new version, and uninstalls the old one:
...
Downloading nltk-3.6.5-py3-none-any.whl (1.5 MB)
...
Successfully uninstalled nltk-3.2.5
...
You must restart the runtime in order to use newly installed versions.
Click the Restart runtime button shown at the end of the output.
Now it should work!
You can double check the nltk version using this code:
import nltk
print('The nltk version is {}.'.format(nltk.__version__))
You need v3.4 or later to use nltk.lm.
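Once the runtime has restarted, a quick sanity check along these lines (a toy two-sentence corpus, not data from the question) confirms that nltk.lm imports and trains:
from nltk.lm import MLE
from nltk.lm.preprocessing import padded_everygram_pipeline

# Toy corpus: two tokenized "sentences"
text = [['a', 'b', 'c'], ['a', 'c', 'd']]
train, vocab = padded_everygram_pipeline(2, text)  # bigram pipeline
lm = MLE(2)
lm.fit(train, vocab)
print(lm.counts['a'])  # how often 'a' occurred in the training data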

Using Pyomo GLPK in google cloud app engine

I set up a Flask service locally using the Pyomo GLPK solver, and it runs correctly on my local machine.
But when I uploaded it to a GCloud App Engine, with the exact same virtual environment that worked locally, I got the error:
RuntimeError: Attempting to use an unavailable solver.
I've already downloaded the GLPK Windows build from the GLPK website and passed the glpsol.exe path as an argument; that worked locally, but didn't work on my GCloud App Engine.
I ran conda install -c conda-forge glpk with the virtual environment activated, which did not help.
import pandas as pd
from pyomo.opt import SolverStatus, TerminationCondition
from pyomo.environ import *
import sys
...
solver=SolverFactory('glpk', executable='venv\\Library\\bin\\glpsol.exe')
This is the relevant part of my code. I've tried different glpsol.exe paths, with no success so far.
Does anyone know how to deploy a pyomo with glpk solver to a GCloud App Engine environment?
You won't be able to run a Windows executable on App Engine.
There's no Windows OS available through the service.
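For what it's worth, on a Linux runtime where the glpsol binary is available (for example, a container that installs the glpk-utils package, which is an assumption here, not something App Engine standard provides), Pyomo can find the solver on PATH instead of via a hard-coded Windows path:
from pyomo.environ import ConcreteModel, Var, Objective, NonNegativeReals, maximize
from pyomo.opt import SolverFactory

# No executable= argument: Pyomo looks for glpsol on the PATH of the Linux runtime
solver = SolverFactory('glpk')

# Tiny illustrative model, not from the question
model = ConcreteModel()
model.x = Var(within=NonNegativeReals, bounds=(0, 10))
model.obj = Objective(expr=model.x, sense=maximize)
result = solver.solve(model)
print(result.solver.termination_condition)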
I didn't find a solution to this problem, so I decided to use another solver library.

install bq_helper on datalab

I know that the BigQuery module is already installed on Datalab. I just want to use the bq_helper module because I learned it on Kaggle.
I ran !pip install -e git+https://github.com/SohierDane/BigQuery_Helper#egg=bq_helper and it worked,
but I can't import bq_helper.
Please help. Thanks!
I used Python 2 on Datalab.
I am not familiar with the BigQuery Helper library you shared, but in general, in Datalab, it may happen that you need to restart the kernel in order for the libraries to be properly loaded.
I reproduced the scenario you proposed: installing the library with the command !pip install -e git+https://github.com/SohierDane/BigQuery_Helper#egg=bq_helper and then trying to import it in the notebook using:
from bq_helper import BigQueryHelper
bq_assistant = BigQueryHelper("bigquery-public-data", "github_repos")
bq_assistant.project_name
At first, it did not work and I obtained the same error as you; then I clicked on the Reset Session button and the library was loaded properly.
Some other details that may be relevant if this does not work for you are:
I am also running on Python 2 (although the GitHub page of the library suggests that it was only tested on Python 3.6+).
The Custom metadata parameters in the Datalab GCE instance are: created-with-datalab-version: 20180503 and created-with-sdk-version: 208.0.2.
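If the import works after the session reset, a quick call like the one below (list_tables is a method documented in the BigQuery_Helper repository) confirms the helper can see the dataset:
from bq_helper import BigQueryHelper

bq_assistant = BigQueryHelper("bigquery-public-data", "github_repos")
print(bq_assistant.list_tables())  # should list the tables in github_repos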