Cannot drop a column in PostgreSQL - django

I am pretty new to PostgreSQL and only know basic database operations. I have a database called neuro_db with username neuro_user, and I found I cannot delete a column in one of its tables.
After running psql neuro_db on Ubuntu 14.04, I got a prompt starting with neuro_db=#. The \d command shows this list:
List of relations
Schema | Name | Type | Owner
--------+-----------------------------------+----------+------------
public | auth_group | table | neuro_user
public | auth_group_id_seq | sequence | neuro_user
public | auth_group_permissions | table | neuro_user
public | auth_group_permissions_id_seq | sequence | neuro_user
public | auth_permission | table | neuro_user
public | auth_permission_id_seq | sequence | neuro_user
public | auth_user | table | neuro_user
public | auth_user_groups | table | neuro_user
public | auth_user_groups_id_seq | sequence | neuro_user
public | auth_user_id_seq | sequence | neuro_user
public | auth_user_user_permissions | table | neuro_user
public | auth_user_user_permissions_id_seq | sequence | neuro_user
public | django_admin_log | table | neuro_user
public | django_admin_log_id_seq | sequence | neuro_user
public | django_content_type | table | neuro_user
public | django_content_type_id_seq | sequence | neuro_user
public | django_migrations | table | neuro_user
public | django_migrations_id_seq | sequence | neuro_user
public | django_session | table | neuro_user
public | neuro_category | table | neuro_user
public | neuro_category_id_seq | sequence | neuro_user
public | neuro_page | table | neuro_user
public | neuro_page_id_seq | sequence | neuro_user
(23 rows)
I wanted to edit the table "neuro_page", and typing \d neuro_page shows this:
Table "public.neuro_page"
Column | Type | Modifiers
-------------+------------------------+---------------------------------------------------------
id | integer | not null default nextval('neuro_page_id_seq'::regclass)
category_id | integer | not null
title | character varying(128) | not null
text | text | not null
url | character varying(200) | not null
photo | character varying(100) | not null
order | character varying(128) | not null
date | date | not null
views | integer | not null
likes | integer | not null
I wanted to delete the "order" column, so I typed ALTER TABLE neuro_page DROP COLUMN order right after the neuro_db=# prompt, but nothing happened. Typing ALTER TABLE neuro_page and DROP COLUMN order on separate lines also did nothing. The "order" column is still there, and there is no error message!
I have also searched Google for an answer but found nothing. I think I have followed the normal procedure but still cannot drop a single column from a table. There is no foreign key on the "order" column.
I'd be very grateful if anyone can help me with this issue. Thanks!

You need to quote the column name because order is a reserved keyword (as in ORDER BY). In PostgreSQL, identifiers are quoted with double quotes, not backticks (backticks are MySQL syntax). Also make sure you end the statement with a semicolon: without one, psql assumes you are still typing (the prompt changes from neuro_db=# to neuro_db-#) and nothing is executed, which is why you saw no output and no error:
ALTER TABLE neuro_page DROP COLUMN "order";
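Since this table is managed by Django, you could also drop the column through a migration instead of raw SQL, which keeps Django's migration state in sync with the schema. A minimal sketch, assuming the app is named neuro and the model is Page (both inferred from the table name neuro_page; adjust to your actual names): delete the order field from the model, run python manage.py makemigrations, and Django generates a migration roughly like the one below, which python manage.py migrate then applies:
from django.db import migrations

class Migration(migrations.Migration):
    # Hypothetical dependency; makemigrations fills in your app's real latest migration.
    dependencies = [('neuro', '0001_initial')]
    operations = [
        # Emits ALTER TABLE neuro_page DROP COLUMN "order" (quoted correctly for you).
        migrations.RemoveField(model_name='page', name='order'),
    ]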

Related

PowerBI filter Table on 2 different columns whose values are obtained from another table

I am new to Power BI and am trying to implement the following scenario.
I have the following two tables:
Table 1:
| ExtractionId | DatasetId | UserId |
| --- | --- | --- |
| E_ID1 | D_ID1 | sta@example.com |
| E_ID2 | D_ID1 | dany@example.com |
| E_ID3 | D_ID2 | dany@example.com |
Table 2:
| DatasetId | Date | UserId | Status |
| --- | --- | --- | --- |
| D_ID1 | 05/30/2021 | sta@example.com | Completed |
| D_ID1 | 05/30/2021 | dany@example.com | Completed |
| D_ID1 | 05/31/2021 | sta@example.com | Partial |
| D_ID1 | 05/31/2021 | dany@example.com | Completed |
| D_ID2 | 05/30/2021 | sta@example.com | Completed |
| D_ID2 | 05/30/2021 | dany@example.com | Completed |
| D_ID2 | 05/31/2021 | sta@example.com | Partial |
| D_ID2 | 05/31/2021 | dany@example.com | Completed |
I am trying to create a Power BI report where, given an ExtractionId (in a slicer), we need to identify the corresponding DatasetId and UserId from Table 1 and use those fields to filter Table 2, providing a visual of user status over the given date range.
When I implement the above scenario, I can create a many-to-many relationship between the DatasetId columns of Table1 and Table2, but I cannot do the same for the UserId column at the same time, as I get the following error:
You can't create a direct active relationship between Table1 and Table2 because an active set of indirect relationship already exists.
Because of this, given an ExtractionId, I can filter on DatasetId but not UserId, and vice versa. Could you please help me understand what mistake I am making here and how to resolve it?
Thanks in advance
As the error says, you can't add a second active relationship between the same two tables. You can, however, merge the two (or more) columns into a single key column in each table, and then create the relationship on that merged column.
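For illustration, one common way to do that merge is a calculated column in each table that concatenates the two fields. A sketch in DAX; the name ExtractionKey and the "|" separator are just examples:
ExtractionKey = Table1[DatasetId] & "|" & Table1[UserId]
ExtractionKey = Table2[DatasetId] & "|" & Table2[UserId]
Then create a single relationship between Table1[ExtractionKey] and Table2[ExtractionKey]; the ExtractionId slicer will then filter Table2 on both fields at once.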

pg_restore not restoring a certain table?

In a Django project, I have a model Question in the lucy_web app, but the corresponding lucy_web_question table does not exist, as seen from a \dt command in the database shell:
(lucy-web-CVxkrCFK) bash-3.2$ python manage.py dbshell
psql (10.4)
Type "help" for help.
lucy=> \dt
List of relations
Schema | Name | Type | Owner
--------+------------------------------+-------+---------
public | auditlog_logentry | table | lucyapp
public | auth_group | table | lucyapp
public | auth_group_permissions | table | lucyapp
public | auth_permission | table | lucyapp
public | auth_user | table | lucyapp
public | auth_user_groups | table | lucyapp
public | auth_user_user_permissions | table | lucyapp
public | defender_accessattempt | table | lucyapp
public | django_admin_log | table | lucyapp
public | django_content_type | table | lucyapp
public | django_migrations | table | lucyapp
public | django_session | table | lucyapp
public | lucy_web_checkin | table | lucyapp
public | lucy_web_checkintype | table | lucyapp
public | lucy_web_company | table | lucyapp
public | lucy_web_expert | table | lucyapp
public | lucy_web_expertsessiontype | table | lucyapp
public | lucy_web_family | table | lucyapp
public | lucy_web_lucyguide | table | lucyapp
public | lucy_web_notification | table | lucyapp
public | lucy_web_package | table | lucyapp
public | lucy_web_packagesessiontype | table | lucyapp
public | lucy_web_preactivationfamily | table | lucyapp
public | lucy_web_profile | table | lucyapp
public | lucy_web_questionanswer | table | lucyapp
public | lucy_web_questioncategory | table | lucyapp
public | lucy_web_session | table | lucyapp
public | lucy_web_sessioncategory | table | lucyapp
public | lucy_web_sessiontype | table | lucyapp
public | lucy_web_userapn | table | lucyapp
public | oauth2_provider_accesstoken | table | lucyapp
public | oauth2_provider_application | table | lucyapp
public | oauth2_provider_grant | table | lucyapp
public | oauth2_provider_refreshtoken | table | lucyapp
public | otp_static_staticdevice | table | lucyapp
public | otp_static_statictoken | table | lucyapp
public | otp_totp_totpdevice | table | lucyapp
public | two_factor_phonedevice | table | lucyapp
(38 rows)
We also have a staging environment, deployed on Aptible, which does appear to have these tables. Using the Aptible CLI to create a database tunnel, if I psql <connection_url> and \dt, I do see the lucy_web_question table:
db=# \dt
List of relations
Schema | Name | Type | Owner
--------+------------------------------+-------+---------
public | auditlog_logentry | table | aptible
public | auth_group | table | aptible
public | auth_group_permissions | table | aptible
public | auth_permission | table | aptible
public | auth_user | table | aptible
public | auth_user_groups | table | aptible
public | auth_user_user_permissions | table | aptible
public | defender_accessattempt | table | aptible
public | django_admin_log | table | aptible
public | django_content_type | table | aptible
public | django_migrations | table | aptible
public | django_session | table | aptible
public | lucy_web_checkin | table | aptible
public | lucy_web_checkintype | table | aptible
public | lucy_web_company | table | aptible
public | lucy_web_expert | table | aptible
public | lucy_web_expertsessiontype | table | aptible
public | lucy_web_family | table | aptible
public | lucy_web_lucyguide | table | aptible
public | lucy_web_notification | table | aptible
public | lucy_web_package | table | aptible
public | lucy_web_packagesessiontype | table | aptible
public | lucy_web_preactivationfamily | table | aptible
public | lucy_web_profile | table | aptible
public | lucy_web_question | table | aptible
public | lucy_web_questionanswer | table | aptible
public | lucy_web_questioncategory | table | aptible
public | lucy_web_questionprompt | table | aptible
public | lucy_web_session | table | aptible
public | lucy_web_sessioncategory | table | aptible
public | lucy_web_sessiontype | table | aptible
public | lucy_web_userapn | table | aptible
public | oauth2_provider_accesstoken | table | aptible
public | oauth2_provider_application | table | aptible
public | oauth2_provider_grant | table | aptible
public | oauth2_provider_refreshtoken | table | aptible
public | otp_static_staticdevice | table | aptible
public | otp_static_statictoken | table | aptible
public | otp_totp_totpdevice | table | aptible
public | two_factor_phonedevice | table | aptible
(40 rows)
Because the data on these test environments is not important, I'd like to pg_dump the Aptible database and pg_restore it on my local machine.
My local DATABASE_URL is postgres://lucyapp:<my_password>@localhost/lucy, so first I did a pg_dump with --format=custom, specifying a --file as follows:
Kurts-MacBook-Pro-2:lucy2 kurtpeek$ touch staging_db_12_July.dump
Kurts-MacBook-Pro-2:lucy2 kurtpeek$ pg_dump postgresql://aptible:<some_aptible_hash>@localhost.aptible.in:62288/db --format=custom --file=staging_db_12_July.dump
Kurts-MacBook-Pro-2:lucy2 kurtpeek$ ls -lhtr | tail -1
-rw-r--r-- 1 kurtpeek staff 1.5M Jul 12 18:09 staging_db_12_July.dump
This results in a 1.5 MB .dump file, which I tried to restore using pg_restore with the --no-owner option and --role=lucyapp (in order to change the owner from aptible to lucyapp). However, this gives rise to a large number of 'already exists' errors, some of which are shown below:
Kurts-MacBook-Pro-2:lucy2 kurtpeek$ pg_restore staging_db_12_July.dump --dbname=lucy --no-owner --role=lucyapp
pg_restore: [archiver (db)] Error while PROCESSING TOC:
pg_restore: [archiver (db)] Error from TOC entry 3522; 0 0 COMMENT EXTENSION plpgsql
pg_restore: [archiver (db)] could not execute query: ERROR: must be owner of extension plpgsql
Command was: COMMENT ON EXTENSION plpgsql IS 'PL/pgSQL procedural language';
pg_restore: [archiver (db)] Error from TOC entry 2; 3079 16392 EXTENSION hstore
pg_restore: [archiver (db)] could not execute query: ERROR: permission denied to create extension "hstore"
HINT: Must be superuser to create this extension.
Command was: CREATE EXTENSION IF NOT EXISTS hstore WITH SCHEMA public;
pg_restore: [archiver (db)] Error from TOC entry 3523; 0 0 COMMENT EXTENSION hstore
pg_restore: [archiver (db)] could not execute query: ERROR: extension "hstore" does not exist
Command was: COMMENT ON EXTENSION hstore IS 'data type for storing sets of (key, value) pairs';
pg_restore: [archiver (db)] Error from TOC entry 197; 1259 16515 TABLE auditlog_logentry aptible
pg_restore: [archiver (db)] could not execute query: ERROR: relation "auditlog_logentry" already exists
Command was: CREATE TABLE public.auditlog_logentry (
id integer NOT NULL,
object_pk character varying(255) NOT NULL,
object_id bigint,
object_repr text NOT NULL,
action smallint NOT NULL,
changes text NOT NULL,
"timestamp" timestamp with time zone NOT NULL,
actor_id integer,
content_type_id integer NOT NULL,
remote_addr inet,
additional_data jsonb,
CONSTRAINT auditlog_logentry_action_check CHECK ((action >= 0))
);
WARNING: errors ignored on restore: 294
The problem is that if I \dt again in the python manage.py dbshell, I still don't see the lucy_web_question table.
I've come across this solution, Django: Table doesn't exist, for my situation, but in my case the Question model is imported and used as a foreign key in so many places that I thought it would be easier just to restore the database. Why is the lucy_web_question table not being restored, though?
It seems the problem was that the lucyapp user did not have sufficient privileges to create the table. I basically had to ensure that the \dn+ command produced this result:
lucy=# \dn+
List of schemas
Name | Owner | Access privileges | Description
--------+----------+----------------------+------------------------
public | postgres | postgres=UC/postgres+| standard public schema
| | =UC/postgres +|
| | lucyapp=UC/postgres |
(1 row)
where lucyapp has both USAGE (U) and CREATE (C) privileges. Following https://www.postgresql.org/docs/9.0/static/sql-grant.html, this can be achieved with the commands
GRANT USAGE ON SCHEMA public TO lucyapp;
GRANT CREATE ON SCHEMA public TO lucyapp;
I also made lucyapp a superuser prior to running these commands, although that is not recommended for production.
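As a side note, the flood of 'already exists' errors can be avoided by letting pg_restore drop each object before recreating it. A sketch against the same dump (--if-exists requires --clean; this drops and recreates the existing tables, which is fine here since the local data is disposable):
pg_restore --dbname=lucy --no-owner --role=lucyapp --clean --if-exists staging_db_12_July.dump
This still requires the schema privileges above, since the tables are recreated by lucyapp.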

How to add "description" for a column in Postgres DB using the corresponding Django model?

For example, in this table I'd like to be able to add the "description" text at the Django ORM layer and have it reflected at the database level.
test=# \d+ django_model
Table "public.django_model"
Column | Type | Modifiers | Description
--------+---------+-----------+-------------
i | integer | |
j | integer | |
Indexes:
"mi" btree (i) - Tablespace: "testspace"
"mj" btree (j)
Has OIDs: no
I suppose you can't do it. Here's the feature request: https://code.djangoproject.com/ticket/13867. It was closed six years ago as "Won't do".
You can still use the Postgres COMMENT command (a PostgreSQL extension to standard SQL), e.g.:
t=# create table t (i int, t text);
CREATE TABLE
Time: 12.068 ms
t=# comment on column t.i is 'some description';
COMMENT
Time: 2.994 ms
t=# \d+ t
Table "postgres.t"
Column | Type | Collation | Nullable | Default | Storage | Stats target | Description
--------+---------+-----------+----------+---------+----------+--------------+------------------
i | integer | | | | plain | | some description
t | text | | | | extended | |
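If you want that COMMENT applied from the Django side anyway, you can ship it in a migration with RunSQL. A minimal sketch, assuming an app named myapp whose previous migration is 0001_initial (both names hypothetical):
from django.db import migrations

class Migration(migrations.Migration):
    dependencies = [('myapp', '0001_initial')]  # hypothetical app/migration
    operations = [
        migrations.RunSQL(
            # Postgres-only: attaches the text shown in \d+'s Description column.
            sql="COMMENT ON COLUMN django_model.i IS 'some description';",
            # COMMENT ... IS NULL removes the comment on rollback.
            reverse_sql="COMMENT ON COLUMN django_model.i IS NULL;",
        ),
    ]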

Django: Limit QuerySet to user input (checkboxes)

My question is similar to Django Advanced Filtering but I need another approach:
Abstract:
Tables: manufacturer, supplies
Manufacturers have multiple supplies (1 or 0 in "supply" table)
I have an HTML form with many (20+) checkboxes that should narrow the queryset with ANDed conditions (so, standard filtering). The HTML checkbox names match the MySQL field names. My table looks like this:
mysql> explain supply;
+----------------------+------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+----------------------+------------+------+-----+---------+----------------+
| id | int(11) | NO | PRI | NULL | auto_increment |
| manufacturer_id | int(11) | NO | MUL | NULL | |
| supply1 | tinyint(1) | NO | | NULL | |
| supply2 | tinyint(1) | NO | | NULL | |
| supply3 | tinyint(1) | NO | | NULL | |
| [...] | tinyint(1) | NO | | NULL | |
| supply20 | tinyint(1) | NO | | NULL | |
Now in pseudo SQL, I'd like to:
User selected checkboxes supply2 and supply14: SELECT * FROM supply WHERE supply2 = 1 AND supply14 = 1;
User selected checkboxes supply1, supply9 and supply18: SELECT * FROM supply WHERE supply1 = 1 AND supply9 = 1 AND supply18 = 1;
I'm pretty sure I need some QuerySet with kwargs, but I'm unable to construct the view for my needs (still learning Django).
I wonder if the data model here couldn't use some tweaking? You might want a supply table with twenty rows and an intermediate table connecting them (that is, a ManyToManyField(Supply) or something like that). Then you could have a single multi-select field rather than 20 checkboxes (unless you really need them for some other reason).
If you need to add another supply, it's simply adding another row, rather than a schema migration.
supplies = Supply.objects.filter(supply1=1)
And if you want to filter again:
supplies = supplies.filter(supply2=1)
The filter() method returns a QuerySet, so you can chain as many filter() calls as you like.
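Building on that, you can also construct all the conditions at once from the submitted checkboxes using keyword-argument unpacking, since filter(**kwargs) ANDs its conditions together. A sketch, assuming the checkbox names match the model field names and the form posts to this view (the view name and import path are hypothetical; the whitelist guards against arbitrary field lookups):
# views.py (sketch)
from myapp.models import Supply  # hypothetical import path

ALLOWED_FIELDS = {'supply%d' % i for i in range(1, 21)}  # supply1 .. supply20

def filtered_supplies(request):
    # Unchecked boxes are not submitted, so presence in POST means "ticked";
    # keep only names we know are real fields.
    conditions = {name: 1 for name in request.POST if name in ALLOWED_FIELDS}
    # E.g. ticking supply2 and supply14 yields filter(supply2=1, supply14=1),
    # matching SELECT * FROM supply WHERE supply2 = 1 AND supply14 = 1;
    return Supply.objects.filter(**conditions)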

ERROR 1452 (23000): Cannot add or update a child row

I have the following database, and when I try to run the query shown below I get an error saying "Cannot add or update a child row". I do not understand why I am getting this error.
The object_id I am inserting (327) exists in the database. I have tried other object_ids as well, and the same error comes up. What is happening here?
like_objects
+-------------+---------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+-------------+---------+------+-----+---------+----------------+
| object_id | int(11) | NO | MUL | NULL | |
| id | int(11) | NO | PRI | NULL | auto_increment |
| user_id | int(11) | NO | MUL | NULL | |
+-------------+---------+------+-----+---------+----------------+
objects
+--------------+--------------+------+-----+---------+----------------+
| Field | Type | Null | Key | Default | Extra |
+--------------+--------------+------+-----+---------+----------------+
| object_name | longtext | NO | | NULL | |
| object_desc | longtext | NO | | NULL | |
| id | int(11) | NO | PRI | NULL | auto_increment |
| likes | int(11) | NO | | NULL | |
+--------------+--------------+------+-----+---------+----------------+
Here's the query and the error it produces:
insert into like_objects (object_id,user_id) values (327,1)
ERROR 1452 (23000): Cannot add or update a child row: a foreign key constraint fails
(`test_db`.`like_objects`, CONSTRAINT `object_id_refs_id_57f96810` FOREIGN KEY (`object_id`)
REFERENCES `objects` (`id`))
This is very strange because I have used similar 'like' logic for another table and it works absolutely fine.
Important: I obtained the objects table from a mysqldump. The objects table was initially created from phpMyAdmin, while the like_objects table was created by the Django models.
I have figured out what the problem was. As posted in my question, the objects table was created using phpMyAdmin and then dumped into my database, which is part of a Django project.
So the storage engine of the originally created table was MyISAM, but the tables created by Django use InnoDB. As a result, whenever I tried to insert a row into like_objects referencing the objects table, the two storage engines were incompatible (an InnoDB foreign key cannot reference a MyISAM table).
This link helped me find out the storage engine of my table, and I then converted it from MyISAM to InnoDB using the command below; it is working fine now.
ALTER TABLE objects ENGINE = InnoDB;
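For reference, a quick way to check a table's storage engine is to query information_schema (a sketch, assuming the database is named test_db as in the error message):
SELECT TABLE_NAME, ENGINE
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = 'test_db' AND TABLE_NAME = 'objects';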