Debug Django app running inside Docker image, using PyCharm debugger - django

My app is running inside a Docker image (my development team never installs software on their machines; only the Docker images have the dependencies).
I need to debug something using the PyCharm debugger. How do I connect PyCharm's debugger to the Python interpreter inside the Docker image?

One possible method is to treat your Docker container as a remote host and use remote debugging: https://www.jetbrains.com/pycharm/help/remote-debugging.html
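For the container side of that setup, here is a minimal sketch, assuming you install the pydevd-pycharm package in the image and create a "Python Debug Server" run/debug configuration in PyCharm; the host name and port below are placeholder assumptions:

# Hypothetical hookup from the containerized Django process back to a
# PyCharm "Python Debug Server" listening on the host machine.
# Assumes `pip install pydevd-pycharm` in the image, and that the container
# can reach the host (host.docker.internal works on Docker Desktop; on Linux
# you may need --add-host=host.docker.internal:host-gateway).
import pydevd_pycharm

pydevd_pycharm.settrace(
    'host.docker.internal',  # where PyCharm's debug server listens (assumption)
    port=5678,               # must match the port in the PyCharm configuration
    stdoutToServer=True,
    stderrToServer=True,
)

Once this line is hit inside the container, PyCharm's debugger takes over as if the interpreter were a remote host.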

Related

How can I package my Django project with server software so that it is ready to run on any machine?

I want to share the project with multiple clients so that they can use it on their own local networks without doing a lot of work. Somehow packaging the Django project together with secure server software, so that it can easily be run on any machine, would be nice.
Docker might be your best bet: you will need to create a Docker image, and on any machine where you want to run the project, have the Docker client run this image. After installing the Docker client, running it can be a single command line.
An example:
https://docs.docker.com/samples/django/
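If "secure server software" means something more production-grade than manage.py runserver inside the image, one hedged sketch is to serve the WSGI application with a pure-Python server such as waitress; the project name myproject and the script name serve.py are assumptions:

# Hypothetical entrypoint (serve.py) baked into the Docker image.
# Assumes `pip install waitress` and a Django project named "myproject".
from waitress import serve

from myproject.wsgi import application

# Bind to all interfaces so the containerized server is reachable from outside.
serve(application, host='0.0.0.0', port=8000)

TLS termination would still typically be handled by a reverse proxy in front of this.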

How to run Django locally as a Windows executable app

I'm considering building an app with Django for only one user, who would run the application locally on Windows. I've read about containerizing Django apps, but it's new to me.
My goal is to run the Django server with one click, like a standard Windows app, and connect to it on localhost and interact with it through a web browser. Is that even possible?
It is possible, but this may not be the best solution. If you want to release a Django app that can be installed on your client's computer, you usually need to ensure all the dependencies are shipped with the app.
Containerising your application means it will depend on a Docker runtime (or whatever container system you use). You will have to set up Docker along with your app, or ensure your client has Docker on their machine to run it. If the destination machine runs Windows or macOS, you will need to set up Docker Desktop, which may be more complicated than the standard Docker runtime (Linux only).
But if you decide to ship your app without containerising it, it will only depend on a Python interpreter and some dependencies (Django, dateutil, etc.). In that case, using Python tools like virtualenv, you can prepare a ready-to-run application by creating the venv and installing dependencies at "build time". Then, with a proper setup (MSI for Windows or DMG for macOS), you may be able to distribute the final application so the client can install and run it without any additional step (you do all the hard work yourself).
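As a rough sketch of that "build time" step (the script name, directory layout and requirements.txt are assumptions), a small build script can create the venv and install the dependencies using only the standard library:

# Hypothetical build script (build_env.py): creates a ready-to-ship venv
# and installs the app's dependencies into it at build time.
import os
import subprocess
import venv
from pathlib import Path

env_dir = Path("dist") / "venv"
venv.EnvBuilder(with_pip=True).create(env_dir)  # create the venv with pip available

# Install dependencies with the venv's own interpreter (assumes requirements.txt).
python = env_dir / ("Scripts" if os.name == "nt" else "bin") / "python"
subprocess.check_call([str(python), "-m", "pip", "install", "-r", "requirements.txt"])

The dist/ folder produced this way is what the MSI/DMG installer would then wrap.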
A Django app can be converted into an .exe, but in my experiments it won't work as a local server when you click the .exe, because the runserver command and some Django services are not supported that way.

Can a Docker remote host keep its files synced with your local machine? Django project that needs auto-reloading

I'm considering the purchase of one of the new MacBook M1's. Docker Desktop is apparently unworkable with their Rosetta 2 engine, and all of my development efforts rely on Docker Desktop and a local development environment that auto-reloads when files are changed.
I haven't done much with Docker remote hosts, but I see that this could be a stop-gap solution until Docker rewrites its engine. Google is failing me... can you keep files on your local machine synced up with your Docker remote host?
No, Docker doesn't do this. Instead, Docker packages your application code into an image; that image can be transferred to a repository (with Docker Hub being the most prominent option), and then run on the remote system, without necessarily needing to have the application code or the interpreter directly installed there. Beyond the image system, Docker has no direct ability to transfer or mount files from one system to another (you could do something like create an NFS-backed named volume, but you would need to run the NFS server yourself).
For day-to-day development, using your language's native isolation system will often work better than trying to simulate a local development environment using Docker. For Python, consider a tool like Pipenv (with a Pipfile) to create a virtual environment. Python is reasonably platform-independent, so you shouldn't notice any trouble using Apple silicon vs. Intel processors.
Don't even consider using the Docker remote API. If you don't configure it perfectly, it's trivial to use it to root the host (and there are many instances of this in the wild). Even if it is configured correctly, you can't use it to mount files from your local system (a docker run -v bind-mount option is always interpreted relative to the Docker host it runs on). If you need to work directly on the remote host for whatever reason, use an ordinary ssh connection.

VS Code integration with C++ development environment inside Docker

I would like to run VSCode on my host machine, but (using its features / extensions) fire up tools from within the dev-env living inside my Docker container.
I have set up a Docker image as a development environment for C++. Let's call it dev-env.
It is Linux-based and contains the required libraries, cross-compilation toolchains and various tools we use for building and testing our software (cmake, ninja, cppcheck, clang-tidy etc.).
I have a Git repository on the host machine, which I mount inside the container.
So my usual workflow would be to run docker:
host$ docker run -v path/to/my/codebase/on/host:path/inside/docker -h dev-env --rm -it image_name bash
docker# cd build; cmake ..
etc...
And as such, I can build, test and run my tools inside my unified development environment in the container.
Now, the goal is to take this out of the terminal and into the world of an IDE.
I happen to use VS Code.
On the host machine, I open my codebase folder in VSCode. Since it's mapped inside the container, any changes I make locally will be available inside dev-env as well.
But if I now run anything from VSCode (CMake configure, build, etc.), it will call the tools from my host machine, which will not work, and is not what I want.
With tasks defined in tasks.json, I could probably manage by having them run something like docker exec CONTAINER my_command (see the sketch below).
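A hedged sketch of that idea: rather than repeating the docker exec incantation in every task, tasks.json could call a small wrapper script (the name run_in_container.py and the container name dev-env are assumptions) that forwards any command into the running container:

# Hypothetical wrapper (run_in_container.py) for VSCode tasks: forwards the
# given command line into a running container via `docker exec`.
import subprocess
import sys

CONTAINER = "dev-env"  # assumed name of the running container

# e.g. `python run_in_container.py cmake ..` runs cmake inside the container.
result = subprocess.run(["docker", "exec", CONTAINER, *sys.argv[1:]])
sys.exit(result.returncode)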
It gets more complicated with extensions:
What I would like is to have, e.g., the VSCode CMake Tools extension configured in such a way that when I run CMake Configure (in a VSCode running on my host machine), it will actually run the cmake commands from within the Docker container, using the cmake installed inside Docker, not the one from my host machine.
Temporary solution: forwarding the display through X / VNC
That is, installing VSCode inside the Docker container, running an X/VNC server inside it, exposing the port, and connecting to it from the host machine.
Yes, it is possible; I have it running here. But it has many limitations and problems, of which the most painful is the lag/delay.
This is a bad solution in general, so I would strongly push for avoiding it.
Another solution that I can think of:
A VSCode instance running as a server inside the container.
A VSCode instance on your host connecting to that server instance.
You do all the work inside your host VSCode, but anytime you run a command, it is executed by a server instance, which runs everything inside Docker.
I guess this would require support from VSCode (or maybe an extension).
The VSCode Live Share extension is not made exactly for that, but its functionality might do the job. I have not tested it yet.

PyCharm Django Supervisor manage.py

I'm using PyCharm Professional Edition to maintain a Django web application hosted on a remote Linux OS. Until now I've used Expandrive to mount the remote volume as a local Windows mapped drive, and I can use remote debugging.
However, I would like to develop locally and deploy to the server, which I've got working, but I'm struggling to configure the run/debug configuration in conjunction with the Django settings in Preferences, considering I use Supervisor to control all the processes.
Here's a screenshot of my deployment configuration (connection)
Here's a screenshot of my deployment configuration (mappings)
Here's a screenshot of run/debug configuration
When I enable Django support, point it to my manage.py script and run the configuration, it tries to run using manage.py, whereas I want it to use the Supervisor script. So I get this:
However, the real problem is that I would like to control the remote Supervisor from PyCharm.
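As far as I know there is no built-in Supervisor integration, but as a hedged sketch: supervisord exposes an XML-RPC interface when [inet_http_server] is enabled in supervisord.conf, so a small script run from a PyCharm configuration can drive the remote processes; the host, credentials and process name below are placeholders:

# Hypothetical sketch: control a remote supervisord over its XML-RPC API.
# Assumes [inet_http_server] is enabled in supervisord.conf on the server.
from xmlrpc.client import ServerProxy

# Placeholder URL; match the host, port and credentials of inet_http_server.
server = ServerProxy("http://user:pass@example.com:9001/RPC2")

# Restart the Django process (the process name "django_app" is an assumption).
server.supervisor.stopProcess("django_app")
server.supervisor.startProcess("django_app")
print(server.supervisor.getProcessInfo("django_app")["statename"])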