Django - restart sequence of postgres id field with 0 instead of 1 - django

I want to reset the sequence of a PostgreSQL table to start from 0 in a Django application.
My code is:
views.py
from django.core.management.color import no_style
from django.db import connection

sequence_sql = connection.ops.sequence_reset_sql(no_style(), [ModelName])  # ModelName is your model class
with connection.cursor() as cursor:
    for sql in sequence_sql:
        cursor.execute(sql)
    print("sequence reset")
The sequence is reset successfully, but it restarts at 1.
I want the sequence to start from 0.
How can I achieve that?
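One possible approach (a sketch, not from the original thread): PostgreSQL sequences default to MINVALUE 1, so the minimum has to be lowered before the sequence can restart at 0. Django's sequence_reset_sql resets to max(pk) or 1, so this needs raw SQL; the sequence name below is an assumption following Django's default <app>_<model>_id_seq naming.
from django.db import connection

with connection.cursor() as cursor:
    # Hypothetical sequence name -- look yours up with:
    # SELECT pg_get_serial_sequence('<table>', 'id');
    cursor.execute("ALTER SEQUENCE appname_modelname_id_seq MINVALUE 0 RESTART WITH 0;")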

Related

DJANGO How to execute this sql in django.db with execute - SET PREPARE EXECUTE [duplicate]

import mysql.connector
connection = mysql.connector.connect(user="REMOVED",
password="REMOVED",
host="REMOVED",
database="REMOVED")
cur = connection.cursor()
# Latitude - remove letter A
cur.execute("UPDATE tau._inm_exportados_test_csv SET latitud = REPLACE (latitud, 'a=','');")
print("Latitude change remove letter A - executed!")
# Longitude - remove letter A
cur.execute("UPDATE tau._inm_exportados_test_csv SET longitud = REPLACE (longitud, 'a=','');")
print("Longitude change remove letter A - executed!")
# Latitude - MODIFY COLUMN
cur.execute("ALTER TABLE tau._inm_exportados_test_csv MODIFY COLUMN latitud DECIMAL(10,6);")
print("Latitude - MODIFY COLUMN - executed!")
# Longitude - MODIFY COLUMN
cur.execute("ALTER TABLE tau._inm_exportados_test_csv MODIFY COLUMN longitud DECIMAL(10,6);")
print("Longitude - MODIFY COLUMN - executed!")
# Post Code data type change
cur.execute("ALTER TABLE tau._inm_exportados_test_csv MODIFY COLUMN codigo_postal varchar(255);)")
print("Post Code data type change to varchar(255) - executed!")
connection.commit()
cur.close()
connection.close()
I'm trying to make this simple list of statements work, without success. What makes it more confusing is that the first four statements work, whereas the final one doesn't work even when I comment out the rest! The final statement gets the following response:
mysql.connector.errors.InterfaceError: Use multi=True when executing multiple statements
The datatype for codigo_postal is int(11), unlike latitud and longitud, which are varchar.
I have tried creating new connections, new cursors, new connections AND cursors. I have tried adding multi="True" and combining statements into one operation. I have tried adding multi="True" to each cur.execute() as both the second and third parameter. I have run the statement in Workbench to ensure the statement is valid and it works.
No success with it here though...
You can use commit after you execute DML (Data Manipulation Language) commands. Using multi=True can also be convenient for this job, but you then need to run the generator created by execute (see the docs).
Ordinary method:
cur = connection.cursor()

def alter(state, msg):
    try:
        cur.execute(state)
        connection.commit()
    except Exception as e:
        connection.rollback()
        raise e
    print(msg)

alter("ALTER TABLE address MODIFY COLUMN id int(15);", "done")
alter("ALTER TABLE address MODIFY COLUMN email varchar(35);", "done")
alter("ALTER TABLE address MODIFY COLUMN person_id int(35);", "done")
With multi=True:
cur = connection.cursor()

def alter(state, msg):
    result = cur.execute(state, multi=True)
    result.send(None)  # advance the generator so the statement actually executes
    print(msg, result)

try:
    alter("ALTER TABLE address MODIFY COLUMN id int(45)", "done")
    alter("ALTER TABLE address MODIFY COLUMN email varchar(25)", "done")
    alter("ALTER TABLE address MODIFY COLUMN person_id int(25);", "done")
    connection.commit()
except Exception as e:
    connection.rollback()
    raise e
I had the same problem.
I wanted my code to be clean, with all my commands in a list so I could just run them in sequence.
I found this link and this link, and was finally able to write this code:
import mysql.connector as sql
from mysql.connector import Error

commands = [
    '''
    USE sakila;
    SELECT * FROM actor;
    ''',
    '''
    USE sakila;
    SELECT * FROM actor WHERE actor_id < 10;
    '''
]

connection_config_dict = {
    'user': 'username',
    'password': 'password',
    'host': '127.0.0.1',
}

try:
    connection = sql.connect(**connection_config_dict)
    if connection.is_connected():
        db_Info = connection.get_server_info()
        print("Connected to MySQL Server version ", db_Info, '\n')
        cursor = connection.cursor()
        for command in commands:
            for result in cursor.execute(command, multi=True):
                if result.with_rows:
                    print("Rows produced by statement '{}':".format(
                        result.statement))
                    print(result.fetchall())
                else:
                    print("Number of rows affected by statement '{}': {}".format(
                        result.statement, result.rowcount), '\n')
        # record = cursor.fetchall()  # redundant: the results were already consumed above
except Error as e:
    print("Error while connecting to MySQL", e, '\n')
finally:
    if connection.is_connected():
        cursor.close()
        connection.close()
        print("MySQL connection is closed", '\n')

How to use stored procedures with Django backend?

I have created a stored procedure in SSMS for the query SELECT * FROM TABLE, and now I want to create a Django API and test it. What is the entire procedure?
My stored procedure script:
USE [test]
GO
/****** Object: StoredProcedure [dbo].[spGetAll] ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:      <Author,,Name>
-- Create date: <Create Date,,>
-- Description: <Description,,>
-- =============================================
CREATE PROCEDURE [dbo].[spGetAll]
AS
BEGIN
    -- SET NOCOUNT ON added to prevent extra result sets from
    -- interfering with SELECT statements.
    SET NOCOUNT ON;

    -- Insert statements for procedure here
    SELECT * FROM app_comment
END
GO
In order to call a stored procedure, do the following:
from django.db import connection

def dictfetchall(cursor):
    # Return all rows from a cursor as a list of dicts
    columns = [col[0] for col in cursor.description]
    return [
        dict(zip(columns, row))
        for row in cursor.fetchall()
    ]

with connection.cursor() as cursor:
    cursor.callproc("stored_procedure", [arg1, arg2, arg3])  # placeholders for your procedure and its arguments
    data = dictfetchall(cursor)
For reference, view the docs.
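Applied to the question's procedure, which takes no arguments, the call might look like this (a sketch, assuming a SQL Server backend whose cursor supports callproc):
with connection.cursor() as cursor:
    cursor.callproc("dbo.spGetAll")  # no argument list for this procedure
    data = dictfetchall(cursor)      # reuse the helper defined above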
I could not get alamshafi2263's answer to work; it gave various syntax errors. My procedure takes a parameter but returns no value.
Here is what I used successfully:
from django.db import connection
local_cursor = connection.cursor()
call_definition = f'call public.ap_update_holding_value({transaction_id})'
local_cursor.execute(call_definition)
In the above, transaction_id is the parameter value.
Tested with Django==3.1.13, psycopg2==2.9.1
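A parameterized variant (an assumption, not from the answer above): binding the value instead of interpolating it into the f-string avoids SQL injection if transaction_id ever comes from user input.
local_cursor.execute('call public.ap_update_holding_value(%s)', [transaction_id])  # %s placeholder, bound by the driver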

Django Postgres migration: Fastest way to backfill a column in a table with 100 Million rows

I have a Postgres table Thing that has 100 million rows.
I have a column, populated over time, that stores some keys. The keys were prefixed before storing. Let's call it prefixed_key.
My task is to use the values of this column to populate another column with the same values, but with the prefixes trimmed off. Let's call it simple_key.
I tried the following migration:
from django.db import migrations
import time


def backfill_simple_keys(apps, schema_editor):
    Thing = apps.get_model('thing', 'Thing')
    batch_size = 100000
    number_of_batches_completed = 0
    while Thing.objects.filter(simple_key__isnull=True).exists():
        things = Thing.objects.filter(simple_key__isnull=True)[:batch_size]
        for tng in things:
            prefixed_key = tng.prefixed_key
            if prefixed_key.startswith("prefix_A"):
                simple_key = prefixed_key[len("prefix_A"):]
            elif prefixed_key.startswith("prefix_BBB"):
                simple_key = prefixed_key[len("prefix_BBB"):]
            tng.simple_key = simple_key
        Thing.objects.bulk_update(
            things,
            ['simple_key'],
            batch_size=batch_size
        )
        number_of_batches_completed += 1
        print("Number of batches updated: ", number_of_batches_completed)
        sleep_seconds = 3
        time.sleep(sleep_seconds)


class Migration(migrations.Migration):
    dependencies = [
        ('thing', '0030_add_index_to_simple_key'),
    ]
    operations = [
        migrations.RunPython(
            backfill_simple_keys,
        ),
    ]
Each batch took about 7 minutes to complete, which means it would take days to finish!
It also increased the latency of the DB, which is being used in production.
Since you're going to go through every record in that table anyway, it makes sense to traverse it in one go using a server-side cursor.
Calling
Thing.objects.filter(simple_key__isnull=True)[:batch_size]
is going to be expensive, especially as the index starts to grow.
Also, the call above retrieves ALL fields from that table even though you are only going to use 2-3 fields.
import time

import psycopg2
from psycopg2.extras import RealDictCursor, execute_values

update_query = """UPDATE table SET simple_key = data.key
FROM (VALUES %s) AS data (id, key) WHERE table.id = data.id"""

conn = psycopg2.connect(DSN, cursor_factory=RealDictCursor)
cursor = conn.cursor(name="key_server_side_crs")  # having a name makes it a server-side cursor
update_cursor = conn.cursor()  # regular cursor
cursor.itersize = 5000  # how many records to retrieve at a time
cursor.execute("SELECT id, prefixed_key, simple_key FROM table")

count = 0
batch = []
for row in cursor:
    if not row["simple_key"]:
        simple_key = calculate_simple_key(row["prefixed_key"])
        batch.append((row["id"], simple_key))
        if len(batch) >= 1000:  # how many records to update at once
            execute_values(update_cursor, update_query, batch, page_size=1000)
            batch = []
            time.sleep(0.1)  # allow the DB to "breathe"
    count += 1
    if count % 100000 == 0:  # print progress every 100K rows
        print("processed %d rows" % count)

if batch:  # flush the last, partially filled batch
    execute_values(update_cursor, update_query, batch, page_size=1000)
conn.commit()  # psycopg2 does not autocommit by default
The above is NOT tested, so it's advisable to create a copy of a few million rows of the table and test the script against that first.
You can also test various batch size settings (both for retrieval and update).

Inserting values into a table in MySQL using Python shows no syntax error, but the record is not added to the DB

I'm a newbie to Python and I am trying to insert some data into a MySQL table. It seems the query executed without any issues; however, I don't see any record added to the table.
Any help would be greatly appreciated.
Cheers,
Aditya
import time
import mysql.connector

connection = mysql.connector.connect(user='sandboxbeta2503', password='XXX',
                                     host='myreleasebox.com',
                                     database='iaas')
print("Updating the history in bulk_notification_history")
cursor = connection.cursor()
timestamp = time.strftime("%Y-%m-%d %X")
notification_type = "Notify Inactive users"
usercount = 45
query = ("INSERT INTO iaas.bulk_notification_history"
         "(nty_date,notification_type,user_count)"
         "VALUES (%s,%s,%s)")
data = (time.strftime('%Y-%m-%d %H:%M:%S'), notification_type, usercount)
linker1 = cursor.execute(query, data)
print(linker1)
cursor.close()
connection.close()
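No fix was posted in this thread, but one likely cause (an assumption): mysql.connector disables autocommit by default, so the INSERT is discarded when the connection closes without a commit. A minimal sketch of the change:
linker1 = cursor.execute(query, data)
connection.commit()  # persist the INSERT; mysql.connector does not autocommit by default
cursor.close()
connection.close()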

How to improve sqlite database request and speed

I found out that I open the database at every request. Is there a way to simplify and improve this code to increase the SQLite speed?
name3 = ' '.join(name2)
import sqlite3
id = 0
location = ""
conn = sqlite3.connect("keywords.db")
c = conn.cursor()
c.execute('select * from kmedicals')
records = c.fetchall()
for record in records:
    id = record[0]
    location = record[15]
    if id == name3:
        print name3.capitalize(),':' '\n',location
        break
sys.exit()
Do not use import in the middle of your program.
Open the database once at the start of your program.
Select only the records you actually need.
Select only the columns you actually need.
Do not use fetchall; read only the records you actually need.
Do not fetch into a temporary variable if you can use the cursor directly.
import sqlite3

# at startup
conn = sqlite3.connect("keywords.db")

def search_location(name2):
    name3 = ' '.join(name2)
    c = conn.cursor()
    c.execute('SELECT location FROM kmedicals WHERE id = ?', (name3,))
    for (location,) in c:
        print name3.capitalize(),':' '\n',location
        break
    else:
        pass  # not found
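A hedged usage example (the keyword list is hypothetical; name2 is assumed to be a list of words, as in the question):
search_location(['diabetes'])  # prints the capitalized keyword and its location, if found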