I am trying to write to my PostgreSQL database from AWS Lambda using the python2.7 runtime. I don't care much about how I do this, so if anyone has a different approach that works and that I can understand, I'd love to hear it.
The method I'm currently trying is to use psycopg2, as this is the only way I know. In order to do this, I need to upload the psycopg2 module to my environment on AWS Lambda. As per instructions, I've created a directory with my source and psycopg2 using pip install psycopg2 -t ..\my-project, zipped my-project, and uploaded it.
This is the error message I get in the AWS Lambda console: Unable to import module 'lambda_function': No module named _psycopg
The code runs on my Windows machine. I think the issue is that when I import psycopg2 on my local Windows machine, the _psycopg module is imported from _psycopg.pyd, and .pyd files are Windows-specific. I may be wrong about this.
I'm really just looking for any way to achieve the desired result described in my first paragraph, but here's a more specific question: how do I tell Windows to pip install and compile psycopg2 without using .pyd files? Is this possible? Do I have something completely wrong?
I know the formatting of this question is a little unorthodox, but I think I've given all the necessary information; let me know if there's anything else I can provide.
I solved the problem by opening an Ubuntu instance in VirtualBox, pip-installing the package there, pulling the relevant folders out, and placing them in my-project before zipping and uploading to AWS Lambda.
See these instructions.
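For reference, the VM-side steps were roughly the following; installing with -t straight into the project directory has the same effect as copying the psycopg2 folders over by hand (paths are illustrative):

pip install psycopg2 -t ~/my-project
cd ~/my-project
zip -r ../my-project.zip .

The important point is that pip runs on Linux here, so it picks up a compiled _psycopg shared object that Lambda's Amazon Linux runtime can load, instead of the Windows .pyd.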
I have an enterprise application built using Python Flask. It consists of several individual modules, which I converted to .whl files and reference from other modules' requirements.txt files.
Is there any GCP service other than GCS where I can store these .whl files so that I can reference them directly inside requirements.txt?
There is an alpha feature in Artifact Registry that creates a private PyPI-style repository for you. You should be able to store your wheel files there and fetch them with pip when you need them (provided you have the permissions for it).
I'm not a Python expert and I haven't tested it, but that's the promise. I'd be happy if you gave me feedback on this feature!
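If you want to try it, the flow would look roughly like this; the region, project, and repository names below are placeholders, and since the feature was alpha at the time of writing, the details may change. First install the keyring backend so pip can authenticate against Artifact Registry:

pip install keyring keyrings.google-artifactregistry-auth

Then add the private repository's index to requirements.txt alongside the public PyPI:

--extra-index-url https://us-central1-python.pkg.dev/my-project/my-python-repo/simple/
my-internal-module==1.0.0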
I'm working with this article Asynchronous Amazon Transcribe Streaming SDK for Python.
I'm trying to create a lambda layer for the required libraries.
I used the following command:
pip3 install amazon-transcribe aiofile -t .
But I get the following error when I use the layer in my lambda function:
Unable to import module 'lambda_function': No module named '_awscrt'
The same thing works fine locally in a virtual environment. I'm not sure what the exact issue is.
I even tried installing awscrt separately but it didn't work.
Any kind of help will be greatly appreciated. Thanks!
Lambda layer .zip files need to follow a specific directory structure. Look at this section of the documentation to see how it should be structured for Python. This might be your problem.
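Concretely, for Python the packages have to sit under a python/ directory at the root of the zip, because Lambda unpacks layers to /opt and puts /opt/python on the import path. A minimal sketch of the build (package names taken from your command):

mkdir python
pip3 install amazon-transcribe aiofile -t python/
zip -r layer.zip python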
I built the layer on Amazon Linux and it worked fine!
The troubleshooting guide in the repo helped:
The caio Linux implementation works normally on modern Linux kernel versions and file systems, so you may be seeing problems specific to your environment. It's not a bug, and it can be resolved in a few ways:
1. Upgrade the kernel
2. Use a compatible file system
3. Use the thread-based or pure-Python implementation
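If option 3 is what you need on Lambda, a minimal sketch would look something like the following. This assumes aiofile's async_open accepts a caio context and that caio exposes a thread_aio_asyncio module, as its README describes; I haven't verified this on Lambda itself.

import asyncio
from aiofile import async_open
from caio import thread_aio_asyncio  # thread-based backend, no kernel AIO

async def main():
    # Pass an explicit thread-based context so caio never touches the
    # native Linux AIO interface that is failing in this environment.
    async with thread_aio_asyncio.AsyncioContext() as context:
        async with async_open("/tmp/hello.txt", "w", context=context) as afp:
            await afp.write("hello from the thread-based backend")

asyncio.run(main())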
I know the BigQuery module is already installed on Datalab. I just want to use the bq_helper module because I learned it on Kaggle.
I ran !pip install -e git+https://github.com/SohierDane/BigQuery_Helper#egg=bq_helper and it worked,
but I can't import bq_helper; the error is shown in the screenshot below.
Please help. Thanks!
I used Python 2 on Datalab.
I am not familiar with the BigQuery Helper library you shared, but in general, in Datalab, it may happen that you need to restart the kernel in order for the libraries to be properly loaded.
I reproduced the scenario you proposed: installing the library with the command !pip install -e git+https://github.com/SohierDane/BigQuery_Helper#egg=bq_helper and then trying to import it in the notebook using:
from bq_helper import BigQueryHelper
bq_assistant = BigQueryHelper("bigquery-public-data", "github_repos")
bq_assistant.project_name
At first, it did not work and I obtained the same error as you; then I clicked on the Reset Session button and the library was loaded properly.
Some other details that may be relevant if this does not work for you are:
I am also running Python 2 (although the GitHub page of the library suggests it was only tested on Python 3.6+).
The Custom metadata parameters in the Datalab GCE instance are: created-with-datalab-version: 20180503 and created-with-sdk-version: 208.0.2.
I just tried deploying my Django application on my Windows machine using Bitnami's Django stack. However, when I try to access my project via localhost/myapp/, I get an error stating that my modules/Python libraries can't be loaded. I checked via pip and I have these modules installed. The error seems to apply to all my modules/Python libraries. How do I solve this? Thanks!
When you install the Bitnami Django Stack, it asks you about changing the file association settings for Python, with the following message:
"This option lets you change the Windows properties to associate the Python files to the new Python that are going to be installed. If you have your own Python you may want to disable this feature."
If you decided to use your own Python instead of the one installed with the Stack, you must ensure that Python is installed on your system and that all the modules and libraries are available; otherwise you won't be able to use any of them.
I'm trying to upload my Django app to my Ubuntu slice. The problem I'm facing right now is that there are a couple of packages I'm using which I installed into site-packages on my machine. Now that I've put them on the server, they sadly aren't working. Any ideas how to make them work?
P.S. I get an error on import.
Python must have a way to find these packages. Did you use the standard installation procedure for them (i.e., setup.py install), or did you copy them into an accessible directory? If you didn't use setup.py install, check your PYTHONPATH environment variable. It should contain the directory where your packages are stored; if it doesn't, you can create it.
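For example, if the packages live in /home/me/libs (a hypothetical path), you could export it in the server's environment:

export PYTHONPATH=/home/me/libs:$PYTHONPATH

or prepend it in code before the failing imports:

import sys
sys.path.insert(0, "/home/me/libs")  # hypothetical directory containing the packages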
This is a Python issue really, not a Django issue.
To get more help, paste the import error you're getting, as well as the directory structure of where you installed the package.