How can I make PyCharm reload the Django dev server after I make changes in templates?
It reloads on save for every other change, but not for template changes.
The server is started with docker compose up.
We are using Django 3.2.16.
entrypoint.sh:
exec gunicorn app.wsgi:application --config gunicorn.py --reload
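By default gunicorn's --reload only watches the application's Python modules, not template files, which is the usual reason template edits don't trigger a restart. A minimal sketch of what could be added to gunicorn.py so templates are watched as well (the glob pattern is an assumption about where the templates live, not taken from the project):

# gunicorn.py (sketch only; merge with the existing config file)
import glob

# --reload already watches the Python modules; reload_extra_files adds
# individual extra files (here: every template) to the watch list.
reload = True
reload_extra_files = glob.glob("**/templates/**/*.html", recursive=True)

Note that the glob is evaluated once at startup, so templates created after the worker boots are not watched until the next restart.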
settings.py
[...]
TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        "DIRS": [],
        "APP_DIRS": True,
        "OPTIONS": {
            "context_processors": [
                "django.template.context_processors.debug",
                "django.template.context_processors.request",
                "django.contrib.auth.context_processors.auth",
                "django.contrib.messages.context_processors.messages",
                "app.context_processors.get_version",
                "app.context_processors.get_env",
            ]
        },
    }
]
[...]
I have an Angular-flavoured NativeScript project, which must be tested with "vanilla" Jasmine, in a browser (so not on mobile) with ng test.
By default, with "naked" tests, it works. The problem is that if I try to test/import anything that has a ".tns" alternative, in some cases it loads that file and the build fails.
My problem is similar to this thread, but no good solution was described there.
So for instance:
I have two files:
app.component.tns.ts
app.component.ts
and I try to import it for testing in app.component.spec.ts:
import {AppComponent} from "@src/app/app.component";
It loads the .tns file, and the build fails, as it cannot load the mobile-specific libraries.
ERROR in ./src/app/app.component.tns.ts
Module not found: Error: Can't resolve 'nativescript-ui-sidedrawer' in '/home/..../src/app'
resolve 'nativescript-ui-sidedrawer' in '/home/...../src/app'
Parsed request is a module
using description file: /home/...../src/package.json (relative path: ./app)
Field 'browser' doesn't contain a valid alias configuration
resolve as module
...
@ ./src/app/app.component.tns.ts 25:35-72
@ ./src/app/app.module.spec.ts
@ ./src sync \.spec\.ts$
@ ./src/test.ts
Is there any solution to "remove" the .tns files, just as if I were running a plain ng serve?
Update: my tsconfig.spec.json should exclude these files, but that does not work either ...
"exclude": [
"**/*.tns.ts",
"**/*.android.ts",
"**/*.ios.ts"
]
}
It seems the problem was with tsconfig.json, specifically this part:
"compilerOptions": {
...
"paths": {
"#src/*": [
"src/*.android.ts",
"src/*.ios.ts",
"src/*.tns.ts",
"src/*.web.ts",
"src/*.ts"
]
},
Since tsconfig.spec.json extends this file, that broad paths mapping leaked into the test build. I modified tsconfig.spec.json to this:
{
  "compilerOptions": {
    "target": "es5",
    "declaration": false,
    "module": "esnext",
    "moduleResolution": "node",
    "emitDecoratorMetadata": true,
    "experimentalDecorators": true,
    "skipLibCheck": true,
    "typeRoots": [
      "node_modules/@types"
    ],
    "lib": [
      "es2017",
      "dom",
      "es6",
      "es2015.iterable"
    ],
    "baseUrl": ".",
    "resolveJsonModule": true,
    "esModuleInterop": true,
    "paths": {
      "@src/*": [
        "src/*.ts"
      ]
    },
    "outDir": "../out-tsc/spec",
    "types": [
      "jasmine",
      "node"
    ]
  },
  "files": [
    "src/test.ts",
    "src/polyfills.ts"
  ],
  "include": [
    "**/*.spec.ts",
    "**/*.d.ts"
  ],
  "exclude": [
    "**/*.tns.ts",
    "**/*.android.ts",
    "**/*.ios.ts"
  ]
}
Now the tests run, and the correct components are imported.
I have set up a Django server on AWS Elastic Beanstalk. On that server, I have the following services:
Application server
Celery Worker
Celery Beat
I am using Docker to deploy my application, which means I build my Docker image and use that image to run all three services. My Dockerrun.aws.json file is below.
{
  "AWSEBDockerrunVersion": 2,
  "containerDefinitions": [
    {
      "command": [
        "sh",
        "-c",
        "./entry_point.sh && gunicorn Project.wsgi:application -w 2 -b :8000 --timeout 120 --graceful-timeout 120 --worker-class gevent"
      ],
      "environment": [],
      "essential": true,
      "image": "695189796512.dkr.ecr.us-west-2.amazonaws.com/Project-20181107174734",
      "name": "app",
      "memory": 500,
      "portMappings": [
        {
          "containerPort": 8000,
          "hostPort": 80
        }
      ]
    },
    {
      "command": [
        "celery",
        "-A",
        "Project",
        "beat",
        "--loglevel=info",
        "--uid",
        "django"
      ],
      "environment": [],
      "essential": true,
      "image": "695189796512.dkr.ecr.us-west-2.amazonaws.com/Project-20181107174734",
      "memory": 200,
      "name": "celery-beat"
    },
    {
      "command": [
        "celery",
        "-A",
        "Project",
        "worker",
        "--loglevel=info",
        "--uid",
        "django"
      ],
      "environment": [],
      "essential": true,
      "image": "695189796512.dkr.ecr.us-west-2.amazonaws.com/Project-20181107174734",
      "memory": 200,
      "name": "celery-worker"
    }
  ],
  "family": "",
  "volumes": []
}
Problem:
This configuration works fine, but all three services run on every node. When load increases, the environment scales up to multiple nodes, and every service runs on each of them. Multiple Celery workers are fine, because they all consume the same task queue (I am using SQS), but I want only one Celery Beat instance across all my nodes, because multiple Beat instances would add the same tasks to the queue multiple times.
What I want is to run Beat as a single, centralized service. How can I achieve this with my current setup?
Also, is there any problem with running multiple Celery worker instances on my server?
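One workaround that does not depend on Elastic Beanstalk itself is to make the periodic tasks tolerate duplicate scheduling: even if several Beat instances enqueue the same task, a short-lived distributed lock lets only one execution do the work. A minimal sketch using Django's cache as the lock backend (the task name and lock key are hypothetical, and the cache must be a shared backend such as Redis or Memcached, not the per-process local-memory cache):

from celery import shared_task
from django.core.cache import cache

@shared_task
def send_daily_report():
    # cache.add only succeeds if the key does not exist yet, so the first
    # duplicate to arrive wins and the others return immediately.
    if not cache.add("lock:send_daily_report", "1", timeout=600):
        return
    try:
        ...  # actual task body goes here
    finally:
        cache.delete("lock:send_daily_report")

The other common option is to move the celery-beat container into its own single-instance environment, so that autoscaling only ever duplicates the app and worker containers.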
My editor is VS Code, and I am running Flask with the config below:
{
  "name": "Python: Flask",
  "type": "python",
  "request": "launch",
  "module": "flask",
  "env": {
    "FLASK_APP": "application.py",
    "FLASK_ENV": "development",
    "DATABASE_URL": "postgres://localhost/cs50w_project1_development",
    "FLASK_DEBUG": 1,
    "SECRET_KEY": "abcefefe"
  },
  "args": [
    "run",
    "--no-debugger",
    "--no-reload"
  ],
  "jinja": true
},
It all seems fine, except that Flask won't hot-reload when I change the code, e.g. when I add an action.
I have to reload Flask manually by clicking the restart button.
Is there any issue with my current config?
Remove --no-reload from the args in launch.json. It's a very old thread, but I'm posting the answer here for future visitors.
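For context, that flag controls the Werkzeug reloader that Flask's dev server uses in debug mode. A minimal sketch of the equivalent behaviour when starting the app directly from Python (the application.py contents below are an assumption, not your actual app):

# application.py (hypothetical minimal app)
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "hello"

if __name__ == "__main__":
    # use_reloader=True restarts the dev server whenever a source file
    # changes, which is exactly what --no-reload switches off.
    app.run(debug=True, use_reloader=True)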
I am following the instructions at https://docs.docker.com/compose/django/ to get a basic Dockerized Django app going. I am able to run it locally without a problem, but I am having trouble deploying it to AWS using Elastic Beanstalk. After reading here, I figured that I need to translate docker-compose.yml into Dockerrun.aws.json for it to work.
The original docker-compose.yml is:
version: '2'
services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
and here is what I have translated so far:
{
  "AWSEBDockerrunVersion": 2,
  "volumes": [
    {
      "name": "db"
    },
    {
      "name": "web"
    }
  ],
  "containerDefinitions": [
    {
      "name": "db",
      "image": "postgres",
      "essential": true,
      "memory": 256,
      "mountPoints": [
        {
          "sourceVolume": "db",
          "containerPath": "/var/app/current/db"
        }
      ]
    },
    {
      "name": "web",
      "image": "web",
      "essential": true,
      "memory": 256,
      "mountPoints": [
        {
          "sourceVolume": "web",
          "containerPath": "/var/app/current/web"
        }
      ],
      "portMappings": [
        {
          "hostPort": 8000,
          "containerPort": 8000
        }
      ],
      "links": [
        "db"
      ],
      "command": "python manage.py runserver 0.0.0.0:8000"
    }
  ]
}
but it's not working. What am I doing wrong?
I was struggling to get the ins and outs of the Dockerrun format. Check out Container Transform: "Transforms docker-compose, ECS, and Marathon configurations"... it's a life-saver. Here is what it outputs for your example:
{
  "containerDefinitions": [
    {
      "essential": true,
      "image": "postgres",
      "name": "db"
    },
    {
      "command": [
        "python",
        "manage.py",
        "runserver",
        "0.0.0.0:8000"
      ],
      "essential": true,
      "mountPoints": [
        {
          "containerPath": "/code",
          "sourceVolume": "_"
        }
      ],
      "name": "web",
      "portMappings": [
        {
          "containerPort": 8000,
          "hostPort": 8000
        }
      ]
    }
  ],
  "family": "",
  "volumes": [
    {
      "host": {
        "sourcePath": "."
      },
      "name": "_"
    }
  ]
}
Container web is missing required parameter "image".
Container web is missing required parameter "memory".
Container db is missing required parameter "memory".
That is, in this new format you must tell it how much memory to allocate to each container. Also, you need to provide an image; there is no option to build. As mentioned in the comments, you want to build and push to Docker Hub or ECR, then give it that location: e.g. [org name]/[repo]:latest on Docker Hub, or the URL for ECR. But container-transform does the mountPoints and volumes for you; it's amazing.
You have a few issues.
1) 'web' doesn't appear to be an image; you define it as 'build .' in your docker-compose file. Remember, Dockerrun.aws.json will have to pull the image from somewhere (easiest is to use ECS's Repositories, i.e. ECR).
2) I think 'command' is an array, so you'd have:
"command": ["python", "manage.py", "runserver", "0.0.0.0:8000"]
3) Your mountPoints are correct, but the volume definition at the top is wrong.
{
  "name": "web",
  "host": {
    "sourcePath": "/var/app/current/db"
  }
}
I'm not 100% certain, but that path works for me.
If you have the Dockerrun.aws.json file and, next to it, a directory called /db, then that will be the mount location.
I am exporting a Django project from a computer that runs Ubuntu to another that runs Windows 10. I've recreated the environment with pip install -r requirements.txt and everything seems to work fine, since, for example, python manage.py migrate works properly.
The server starts fine with python manage.py runserver; however, when I open the URL http://127.0.0.1:8000/home in my browser I get the following error:
IOError at /home/ [Errno 22] Invalid argument:
u'C:\Users\myusername\Envs\myenv\myproject\:\HomePage.html'
The views are correctly set up, because the project was running fine on Ubuntu. The problem seems to be that HomePage.html, which is supposed to be located at ...\\myproject\\marketingApp\\templates\\HomePage.html, is being looked for at ...\\myproject\\:\\HomePage.html, and I don't know how/where to fix that.
I managed to load http://127.0.0.1:8000/home by adding my project folder path 'C:\\Users\\myuser\\Envs\\myenv\\myproject' directly to DIRS in the TEMPLATES setting:
TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [
            os.path.join(BASE_DIR, 'templates'),
            'C:\\Users\\myuser\\Envs\\myenv\\myproject'
        ],
        'APP_DIRS': True,
        'OPTIONS': {
            'debug': DEBUG,
            'context_processors': [
                'django.template.context_processors.debug',
                'django.template.context_processors.request',
                'django.contrib.auth.context_processors.auth',
                'django.contrib.messages.context_processors.messages',
                "django.core.context_processors.i18n",
                "django.core.context_processors.media",
                "django.core.context_processors.static",
                "django.core.context_processors.tz"
            ],
        },
    },
]
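Hard-coding the Windows path gets the page loading, but it isn't portable between the Ubuntu and Windows machines. A minimal sketch of the same DIRS entry built from BASE_DIR instead (assuming BASE_DIR already points at the project folder, as it does in a default Django settings.py):

import os

TEMPLATES = [
    {
        'BACKEND': 'django.template.backends.django.DjangoTemplates',
        'DIRS': [
            # os.path.join picks the correct separator on both Ubuntu and Windows
            os.path.join(BASE_DIR, 'templates'),
            BASE_DIR,  # stands in for the hard-coded C:\Users\... path
        ],
        'APP_DIRS': True,
    },
]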