I am working in a virtualenv on a Vagrant box, following http://gettingstartedwithdjango.com/en/lessons/introduction-and-launch/, and when I type:
$ sudo su postgres
I get asked to enter a password:
$ [sudo] password for postgres:
Does anyone have any tips for how to find the password? I'm confused that I'm even getting this password request, since previously, this is what happened:
createuser: creation of new role failed: ERROR: role "vagrant" already exists
postgres@precise64:/vagrant/projects/microblog$ createuser -P
could not change directory to "/vagrant/projects/microblog"
(blog-venv)vagrant@precise64:/vagrant/projects/microblog$ createdb microblog
createdb: database creation failed: ERROR: permission denied to create database
(blog-venv)vagrant@precise64:/vagrant/projects/microblog$ sudo su postgres
postgres@precise64:/vagrant/projects/microblog$ dropuser vagrant
could not change directory to "/vagrant/projects/microblog"
postgres@precise64:/vagrant/projects/microblog$ sudo -u postgres psql
postgres is not in the sudoers file. This incident will be reported.
postgres@precise64:/vagrant/projects/microblog/microblog$ sudo su postgres
[sudo] password for postgres:
Ultimately I'm trying to get createdb microblog to create a database for me in Postgres, but I'm running into these strange password prompts.
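For what it's worth, a minimal sketch of a way to sidestep the nested sudo that triggers the password prompt, assuming the vagrant account has passwordless sudo (typical on these Vagrant boxes); only drop and recreate the role if you actually need to, otherwise the last line alone may be enough:
sudo -u postgres dropuser vagrant               # remove the half-created role, if it needs redoing
sudo -u postgres createuser -P vagrant          # recreate it, prompting for a password for the new role
sudo -u postgres createdb -O vagrant microblog  # create the database, owned by the vagrant role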
Related
When I enter su postgres it asks for a password, and I never set one.
I am following a Django book and installed PostgreSQL with sudo apt-get install postgresql postgresql-contrib. I installed it from within a virtual environment; the instructions are vague on where to install it, so I just installed it there. I also installed the psycopg2 PostgreSQL adapter for Python with
pip install psycopg2-binary==2.8.4. I am now at the next step, creating a user for PostgreSQL, but can't get past it. The instructions say to enter su postgres and immediately after createuser -dp user-name, but it asks for a password. I have entered my superuser password, but it shows an authentication failure.
Thanks
In your Linux terminal, run the commands:
sudo -s
su postgres
psql
Then you are connected to Postgres.
To create a user, from psql type:
create user username with encrypted password 'password';
To create a database with a specific owner, type:
create database databasename with encoding 'utf8' owner username;
To exit from psql, type:
\q
Last but not least, to alter the password for the postgres user with psql, run the command:
alter user postgres with encrypted password 'password';
Then restart the PostgreSQL server:
sudo service postgresql restart
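To verify the new account afterwards, you can try connecting as that user. A quick check, assuming password (md5/scram) authentication is enabled for host connections in pg_hba.conf, and using the placeholder names from above:
psql -h localhost -U username -d databasename -W   # -W forces a password prompt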
Context
I am trying to run my Django application and Postgres database in a docker development environment using docker-compose (it's my first time using Docker).
I want to use my application with a custom role and database both named teddycrepineau (as opposed to using the default postgres user and db).
Goal
My goal is to deploy a web app with a React front end and a Django REST API back end, all running in Docker.
System/Version
python: 3.7
django: 2.1
OS: Mac OS High Sierra
What error am I getting
When running docker-compose up with my custom role and db, I am getting the following error: django.db.utils.OperationalError: FATAL: role "teddycrepineau" does not exist. When running the same command with the default postgres role and db, Django starts normally.
My understanding was that running docker-compose up would create the role and db passed as environment variables.
What I have tried so far
I read multiple threads on this site, on GitHub, and in the Docker docs:
Tried to delete my container and rebuild it with the formatting suggested here
Went through this GitHub issue
Tried to move my environment variables from the .env file to the environment section inside my docker-compose.yml file and rebuilt my container
Files
docker-compose.yml
version: '3'
volumes:
  postgres_data: {}
services:
  postgres:
    image: postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
    env_file: .env
    ports:
      - "5432"
  django:
    build:
      context: teddycrepineau-backend
      dockerfile: teddycrepineau-root/Dockerfile
    command: ./teddycrepineau-backend/teddycrepineau-root/start.sh
    env_file: .env
    volumes:
      - .:/teddycrepineau-backend
    ports:
      - "8000:8000"
    depends_on:
      - postgres
Dockerfile
FROM python:3.7
ENV PYTHONUNBUFFERED 1
WORKDIR /teddycrepineau-backend/
ADD ./teddycrepineau-root/requirements.txt /teddycrepineau-backend/
RUN pip install -r requirements.txt
ADD . /teddycrepineau-backend/
RUN chmod +x ./teddycrepineau-root/start.sh
start.sh
#!/usr/bin/env bash
python3 ./teddycrepineau-backend/teddycrepineau-root/manage.py runserver
.env
POSTGRES_PASSWORD=
POSTGRES_USER=teddycrepineau
POSTGRES_DB=teddycrepineau
EDIT
My file structure is as follows:
root
|___ teddycrepineau-backend
|    |___ teddycrepineau-root
|         |___ teddycrepineau
|         |___ Dockerfile
|         |___ manage.py
|         |___ start.sh
|___ teddycrepineau-frontend
|    |___ React-App
|___ .env
|___ docker-compose.yml
When I move my docker-compose.yml file inside my backend folder, it starts as expected with the custom user and db (though I am not able to access my site at 127.0.0.1:8000, but that is mostly a separate issue). When I put docker-compose.yml in my root folder, I get the error django.db.utils.OperationalError: FATAL: role "teddycrepineau" does not exist
This happens because your Postgres database was first initialized without those env vars. The postgres Docker image only uses the envs the first time the data volume is created; after that it won't recreate the DB and users.
The solution is to remove the Postgres volume so that the next time you run docker-compose up you get a fresh database with the envs read. A simple way to do it is docker-compose down -v.
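A minimal sketch of that reset, run from the directory containing docker-compose.yml (note that it wipes the existing Postgres data):
docker-compose down -v        # stop containers and remove named volumes, including postgres_data
docker-compose up -d          # recreate everything; the postgres entrypoint now reads POSTGRES_USER/POSTGRES_DB/POSTGRES_PASSWORD
docker-compose logs postgres  # confirm the role and database were created during this first init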
Change your env order like this.
POSTGRES_DB=teddycrepineau
POSTGRES_USER=teddycrepineau
POSTGRES_PASSWORD=
I found it in this issue. I hope it works.
When you run
sudo docker-compose exec web python manage.py migrate
you will of course receive
django.db.utils.OperationalError: FATAL: role "user" does not exist
First you need to run
sudo docker-compose down -v
sudo docker system prune
Check the containers; they should be deleted:
sudo docker ps -a
Then check the images:
sudo docker image ls
Don't forget to delete the images:
sudo docker image rm 3e57319a7a3a
Go back to the project folder and then check:
python manage.py migrate
If that didn't work, run
python manage.py migrate --run-syncdb
and
sudo docker-compose up -d --build
sudo docker-compose exec web python manage.py collectstatic --no-input
sudo docker-compose exec web python manage.py makemigrations
sudo docker-compose exec web python manage.py migrate auth
sudo docker-compose exec web python manage.py migrate --run-syncdb
I encountered the issue due to a mismatch between the $POSTGRES_DB and $POSTGRES_USER variables. By default, psql will attempt to connect to a database with the same name as the user logging in, so when there is a mismatch between the variables it fails with an error along the lines of:
psql: FATAL: database "root" does not exist
I had to edit the init script that I was writing to include the -d "$POSTGRES_DB" option like so:
#!/bin/bash
set -e
psql -v ON_ERROR_STOP=1 --username "$POSTGRES_USER" -d "$POSTGRES_DB" <<-EOSQL
CREATE USER docker;
CREATE DATABASE docker;
GRANT ALL PRIVILEGES ON DATABASE docker TO docker;
EOSQL
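For context, the official postgres image runs scripts like this from /docker-entrypoint-initdb.d/ on the very first initialization of the data directory. A minimal compose sketch of wiring it in (the host path and file name init-user-db.sh are assumptions, not from the original setup):
services:
  postgres:
    image: postgres
    env_file: .env
    volumes:
      # any *.sh or *.sql mounted here runs only when the data directory is first created
      - ./compose/postgres/init-user-db.sh:/docker-entrypoint-initdb.d/init-user-db.sh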
I wanted to install MySQL manually, so I did as explained in How to Set Up MySQL on Google Compute Engine, except that I didn't need to create a VM instance because I already have one.
That is, since my OS is 'Debian GNU/Linux 9 (stretch)', I executed the following two commands to install MySQL:
$ sudo apt-get update
and
$ sudo apt-get -y install mysql-server
There's no errors in this step and I created a root password by the following command:
$ sudo mysql_secure_installation
But when I tried to connect to MySQL, (entering the root password I created)
$ mysql --user=root --password
I've got
ERROR 1698 (28000): Access denied for user 'root'@'localhost'
What would be a problem here and what should I check in order to connect to MySQL?
There is an error on the tutorial. In order to connect to the MySQL you should run:
$ sudo mysql --user=root --password
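If you would rather log in as root with a password and without sudo, a common follow-up (assuming MySQL 5.7+ where root defaults to the auth_socket plugin) is to switch root to password authentication from a sudo mysql session, for example:
ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY 'your_password';
FLUSH PRIVILEGES;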
Containerizing a Django app built with the pydanny cookiecutter for deployment to an EC2 instance. The docker-compose.yml is pretty straightforward:
version: '2'
volumes:
  postgres_data: {}
  postgres_backup: {}
services:
  postgres:
    build: ./compose/postgres
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - postgres_backup:/backups
    env_file: .env
    ....
Nothing exotic in the Dockerfile; just pointers to backup and restore scripts and commands to make them executable:
FROM postgres:9.4
# add backup scripts
ADD backup.sh /usr/local/bin/backup
ADD restore.sh /usr/local/bin/restore
ADD list-backups.sh /usr/local/bin/list-backups
# make them executable
RUN chmod +x /usr/local/bin/restore
RUN chmod +x /usr/local/bin/list-backups
RUN chmod +x /usr/local/bin/backup
I've tried several variations on my db env variables, the latest of which looks like:
# PostgreSQL
POSTGRES_PASSWORD='postgrespass'
POSTGRES_USER='postgres'
The container builds and initializes without problems with:
docker-compose build postgres
docker-compose up -d
But when I try to make and apply the initial migrations with:
docker-compose run django /usr/local/bin/python manage.py makemigrations
the db is unresponsive ("Postgres is unavailable - sleeping") and docker logs db returns:
DETAIL: Connection matched pg_hba.conf line 95: "host all all all md5"
FATAL: password authentication failed for user "'postgres'"
Obviously I have some permission issues, but I'm not quite sure how to address them. My containers are running on an Ubuntu 16.04 AMI.
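One detail worth noting: the log reports the user as "'postgres'" with the quotes included, which suggests the single quotes in the .env file are being taken literally (docker-compose env_file values are not shell-parsed). An unquoted variant to try, purely as an experiment:
# PostgreSQL
POSTGRES_PASSWORD=postgrespass
POSTGRES_USER=postgres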
You can go to the psql console and change the password for the postgres user by typing the following commands in your terminal:
sudo -u postgres psql
postgres=# \password
Enter new password:
Enter it again:
postgres=#
Or, to reset the password if you have forgotten it:
ALTER USER "user_name" WITH PASSWORD 'new_password';
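A one-liner variant of the same reset, assuming peer authentication lets the local postgres system user in via sudo:
sudo -u postgres psql -c "ALTER USER postgres WITH PASSWORD 'new_password';"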
I've imported a tutorial project into PyCharm 5.1 Beta 2; it works fine when I run it from the command line with docker-compose up: https://docs.docker.com/compose/django/
Trying to set a remote Python interpreter is causing problems.
I've been trying to work out what the service name field in the remote interpreter docker-compose window is expecting: http://i.stack.imgur.com/Vah7P.png
My docker-compose.yml file is:
version: '2'
services:
  db:
    image: postgres
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/code
    ports:
      - "8000:8000"
    depends_on:
      - db
When I try to enter web or db or anything at all that comes to mind, I get an error message: Service definition is expected to be a map
So what am I supposed to enter there?
EDIT1 (new version: PyCharm 2016.1 release)
I have now updated to the latest version and am still having issues: IOError: [Errno 21] Is a directory
Sorry for not tagging all links - have a new user link limit
The only viable way we found to work around this (PyCharm 2016.1) is setting up an SSH remote interpreter.
Add this to the main service Dockerfile:
RUN apt-get install -y openssh-server
RUN mkdir /var/run/sshd
RUN echo 'root:screencast' | chpasswd
RUN sed -i 's/PermitRootLogin without-password/PermitRootLogin yes/' /etc/ssh/sshd_config
# SSH login fix. Otherwise user is kicked off after login
RUN sed 's#session\s*required\s*pam_loginuid.so#session optional pam_loginuid.so#g' -i /etc/pam.d/sshd
ENV NOTVISIBLE "in users profile"
RUN echo "export VISIBLE=now" >> /etc/profile
EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]
Then log into the Docker container like this (the password in the snippet above is 'screencast'):
$ ssh root@192.168.99.100 -p 2000
Note: we are aware the IP and port might change depending on your Docker and Compose configs.
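For that ssh command to reach the container, the compose file also needs to publish the container's port 22. A minimal sketch (the host port 2000 matches the command above; the service name web is an assumption):
services:
  web:
    build: .
    ports:
      - "8000:8000"
      - "2000:22"   # expose sshd for the remote interpreter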
For PyCharm just set up a remote SSH Interpreter and you are done!
https://www.jetbrains.com/help/pycharm/2016.1/configuring-remote-interpreters-via-ssh.html