How to use WeasyPrint with AWS Lambda? (Django & Zappa)

I have a simple Django app which has been pushed to AWS Lambda using Zappa.
This process has worked properly, with one exception: cannot load library 'pango-1.0': pango-1.0: cannot open shared object file: No such file or directory. Additionally, ctypes.util.find_library() did not manage to locate a library called 'pango-1.0'.
I'm using WeasyPrint to generate PDF files. WeasyPrint needs Cairo and Pango.
I don't know how to get Pango to work on my AWS Lambda install.
What should I do to make it work?

So, after asking around in multiple places, I found out that I needed static builds of all the libraries required by WeasyPrint, and that I had to include them in my Zappa package.
Luckily, a GitHub user has uploaded a working repo of the static requirements: https://github.com/Prasengupta/weasyprint_for_awslambda
So all I had to do was download it and extract all the folders at the root of my Django app (the folders must be at the same level as the zappa_settings.json file).
I then just had to run zappa update to upload all these files to my AWS Lambda install, and it worked!
My Django app is now full of weird directories, but at least the whole thing works.
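Since the original error comes from ctypes.util.find_library, a quick sanity check on the Lambda side is to ask that same function whether the bundled libraries are visible before importing WeasyPrint. This is a minimal sketch; the library names are the ones WeasyPrint typically loads, and may vary by version:

```python
import ctypes.util

# Shared libraries WeasyPrint loads at import time (names may vary by version).
WEASYPRINT_DEPS = ["pango-1.0", "pangocairo-1.0", "cairo", "gobject-2.0"]

def check_weasyprint_deps(names=WEASYPRINT_DEPS):
    """Map each library name to the path find_library resolves, or None if missing."""
    return {name: ctypes.util.find_library(name) for name in names}

# Any None values here reproduce the "cannot load library" failure mode.
missing = [n for n, path in check_weasyprint_deps().items() if path is None]
```

Running this inside the deployed function (e.g. in a throwaway view) tells you immediately whether the extracted folders ended up somewhere the dynamic loader can see.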


Azure static files missing / wrong MIME

I have a Django website deployed to Azure (F1 free subscription, Linux), and all static files are missing / not rendered — even the unmodified files from admin. It works fine locally. I've googled around and tried uploading without VS Code, etc., but it still doesn't work.
Source code of app - https://github.com/Azure-Samples/python-docs-hello-django
Tutorial - https://learn.microsoft.com/en-us/azure/app-service/quickstart-python?tabs=bash&pivots=python-framework-django
Deployed via the Azure CLI; any pointers gladly taken.
P.S. The templates dir and HTML files are fine / loaded correctly; images from static are also missing.
The solution to my problem was a fresh install + WhiteNoise and a requirements.txt for Azure, following the approach and settings file in the video below:
https://www.youtube.com/watch?v=D6Wyk9q2JM0&list=RDCMUCsMica-v34Irf9KVTh6xx-g&start_radio=1&t=302s
Github project - https://github.com/microsoft/beginners-django
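For reference, the WhiteNoise part of that fix usually comes down to a few settings. This is a hedged sketch of the standard setup from the WhiteNoise docs, not the exact settings file from the video; paths are placeholders for your project:

```python
# settings.py (fragment) — serve static files from the app itself via WhiteNoise
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

MIDDLEWARE = [
    "django.middleware.security.SecurityMiddleware",
    # WhiteNoise goes directly after SecurityMiddleware
    "whitenoise.middleware.WhiteNoiseMiddleware",
    # ... the rest of the default middleware ...
]

STATIC_URL = "static/"
STATIC_ROOT = BASE_DIR / "staticfiles"  # target for `python manage.py collectstatic`

# Compressed files with hashed names, so they can be cached aggressively
STATICFILES_STORAGE = "whitenoise.storage.CompressedManifestStaticFilesStorage"
```

With this in place, collectstatic runs during deployment and Django serves the collected files itself, so the App Service plan never has to know where your static files live.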

Is there any GCP managed service available to store Python whl files and use them in requirements.txt?

I have an enterprise application built with Python Flask. It has several individual modules, which I converted to whl files and reference from other modules' requirements.txt files.
Is there any GCP service available, other than GCS, to store these whl files so that I can refer to them directly inside requirements.txt?
There is an alpha feature in Artifact Registry that creates a private PyPI for you. You should be able to store your wheel files and fetch them with pip when you need them (if you have the permissions).
I'm not a Python expert and I haven't tested it, but the promise is there. I'd be happy to get your feedback on this feature!
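Assuming such a repository exists, pointing pip at it is just an extra index URL in requirements.txt. This is a sketch of the usual shape — REGION, PROJECT, REPO, and the package name are all placeholders for your own values:

```text
# requirements.txt
--extra-index-url https://REGION-python.pkg.dev/PROJECT/REPO/simple/
my-internal-module==1.0.0
```

Authentication is handled separately (e.g. via a keyring backend or credentials in the pip configuration), so nothing secret needs to live in requirements.txt itself.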

GhostScript in Azure

I'm in the process of moving some on-premise apps to Azure and struggling with one aspect: GhostScript. We use GhostScript to convert PDFs to multi-page TIFFs. At present this is deployed in an Azure VM, but it seems like a WebApp and WebJob would be a better fit from a management point of view. In all of my testing I've been unable to get a job to run the GhostScript exe.
Has anyone been able to run GhostScript or any third-party exe in a WebJob?
I have tried packaging the GhostScript exe, lib and dll into a ZIP file and then unzip to Path.GetTempPath() and then using a new System.Diagnostics.Process to run the exe with the required parameters - this didn't work - the process refused to start with an exit code of -1073741819.
Any help or suggestions would be appreciated.
We got it to work here:
Converting PDFs to Multipage Tiff files Using Azure WebJobs. The key was putting the Ghostscript assemblies in the root of the project and setting "Copy always". This is what allows them to be pushed to the Azure server, and to end up in the correct place, when you publish the project.
Also, we needed to download the file to be processed by Ghostscript to the local Azure WebJob temp directory. This is discovered by using the following code:
Environment.GetEnvironmentVariable("WEBJOBS_PATH");
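For comparison, the same pattern sketched in Python rather than the .NET used above: build a standard Ghostscript command line and run it from the WebJob's writable directory. The device and flags are stock Ghostscript options; the executable name and fallback path are assumptions for illustration:

```python
import os
import subprocess

def pdf_to_tiff_cmd(pdf_path, tiff_path, gs_exe="gswin64c.exe"):
    """Build the Ghostscript command line for a PDF -> multi-page TIFF conversion."""
    return [
        gs_exe,
        "-dNOPAUSE", "-dBATCH", "-dSAFER",
        "-sDEVICE=tiffg4",           # CCITT G4 multi-page TIFF output device
        "-r300",                     # 300 dpi
        f"-sOutputFile={tiff_path}",
        pdf_path,
    ]

# The WebJob's writable temp directory comes from the WEBJOBS_PATH env var;
# fall back to the ordinary temp dir when running locally (assumption).
work_dir = os.environ.get("WEBJOBS_PATH") or os.environ.get("TEMP", "/tmp")
cmd = pdf_to_tiff_cmd("input.pdf", os.path.join(work_dir, "output.tif"))
# subprocess.run(cmd, check=True)  # uncomment where Ghostscript is installed
```

The same two lessons from the answer apply regardless of language: ship the Ghostscript binaries with the project so they end up on the server, and do all file I/O under the WEBJOBS_PATH directory.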

How do I make a django project compatible with AWS Beanstalk?

I want to make a Django project compatible with AWS Beanstalk.
I don't want to do this the way the AWS tutorial does, since they use git and set up the whole project their way.
I just want to know if there is a way of converting an already-created Python/Django project to be AWS Beanstalk compatible. I mean, isn't there a standard project layout to download, or a plugin or command-line tool that creates the .ebextensions folder for me? I want to convert my project and upload it through the AWS web GUI; I don't need all the git stuff.
You can do this without going the git route. You just need to zip your source bundle and upload it to the Beanstalk web console. The code structure can be kept the way you want.
Key configurations are:
1. WSGIPath: this should point to the .py file that starts the app (the WSGI app)
2. static: this should point to the path containing the static files
You can add the configurations in the .ebextensions folder, which should be at the root of your app zip. You can read more details here: Customizing and Configuring a Python Container - AWS Elastic Beanstalk
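Concretely, the two settings above fit in a single config file such as .ebextensions/django.config. This is a hedged sketch; mysite/wsgi.py and the static path are placeholders for your own project layout:

```yaml
# .ebextensions/django.config
option_settings:
  aws:elasticbeanstalk:container:python:
    WSGIPath: mysite/wsgi.py
  aws:elasticbeanstalk:container:python:staticfiles:
    "/static/": "static/"
```

Zip the project with this folder at the top level of the archive (not nested inside a parent directory) and upload the zip through the web console.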

Is there a Python3 compatible Django storage backend for Amazon S3?

I'm building a Django app in Python 3.3.1 to be deployed on Heroku. Due to its ephemeral filesystem, Heroku can't serve the app's static files from a local filesystem, so they need to be located elsewhere, and Amazon S3 is where I'd like to put them.
I've found a number of helpful tutorials (Deploying Django on Heroku, among others), all of which make use of the django-storages app and boto to collect the static files and store them on S3. Unfortunately, work on porting boto to Python 3 is still incomplete. There are other S3 storage providers that django-storages can work with (Apache Libcloud or the simple Amazon S3 Python library), but django-storages itself doesn't run on Python 3, either.
I've also seen hacks that add a collectstatic call to the Heroku app's Procfile, which does put the files somewhere that they can be used by the Django app, but it slows down deployment; the files must be collected and uploaded every time the app deploys. Heroku dynos aren't well-suited to serving static files, anyhow, and I'd eventually like to store user data, as well, which will require a non-Heroku data store like S3.
Is there a Python3-compatible storage backend for Django other than those provided in django-storages? Or am I stuck with Python 2.7 for the time being?
django-storages-redux (now just django-storages) is working very nicely for me, in conjunction with boto, which now has Python 3 support for its S3 functionality.
django-storages-p3 looks promising. Give it a try and let me know :D.