I have written up the following initialization routine for an sqlite3 db table, but something tells me this is a) fragile, and/or b) Just a Bad Idea(tm).
The notion is that if the table is not present or has not been initialized, the try blocks will fault and the data will be created. In initial testing this works, although I am not seeing the defaults printed to the console when the script runs: I get an empty tuple printed. Examining the database using the sqlite shell, I see the data is present.
But that niggling feeling lingers that there is something very wrong with this approach. Thoughts? Opinions? Advice?
import sqlite3 as lite
import sys
def insert_defaults():
    conn = lite.connect('db.sqlite')
    with conn:
        cur = conn.cursor()
        defaults = (
            ('mykey', 'value for key one'),
            ('anotherkey', 'value for key two')
        )
        cur.executemany("INSERT INTO Settings(key,value) VALUES ( ?, ? )", defaults)

def initialize():
    conn = lite.connect('db.sqlite')
    settings = ()
    try:
        conn.row_factory = lite.Row
        cur = conn.cursor()
        cur.execute("SELECT * FROM Settings")
        if cur.rowcount < 1:
            insert_defaults()
            cur.execute("SELECT * FROM Settings")
        settings = cur.fetchall()
    except lite.Error, e:
        print "Error: %s" % e.args[0]
        print "Trying to create missing table"
        try:
            cur.execute("DROP TABLE IF EXISTS Settings")
            cur.execute("CREATE TABLE IF NOT EXISTS Settings (id INTEGER PRIMARY KEY, key TEXT NOT NULL, value TEXT)")
            insert_defaults()
        except lite.Error, e:
            if conn:
                conn.rollback()
            print "Error: %s" % e.args[0]
            sys.exit(1)
    finally:
        if conn:
            conn.close()
    return settings

if __name__ == "__main__":
    print initialize()
Erik
Relying on exceptions to detect that the table does not exist is not very reliable because there could be other errors that have nothing to do with what you want to check.
The easiest way to ensure that the table is created is to just execute CREATE TABLE IF NOT EXISTS ... when your program is started; this will be ignored if the table already exists.
To check for some records existing, using a SELECT is fine.
However, if you have a UNIQUE or PRIMARY KEY constraint on the key column, you could just execute INSERT OR IGNORE INTO Settings....
You should not use a separate connection in insert_defaults(); this will lead to problems if you don't get your transaction commits correct.
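A minimal sketch of that approach, assuming the db.sqlite file and Settings schema from the question but with a UNIQUE constraint added on key so that INSERT OR IGNORE can skip rows that already exist:

import sqlite3 as lite

def initialize():
    conn = lite.connect('db.sqlite')
    with conn:
        cur = conn.cursor()
        # No-op if the table already exists.
        cur.execute("CREATE TABLE IF NOT EXISTS Settings "
                    "(id INTEGER PRIMARY KEY, key TEXT NOT NULL UNIQUE, value TEXT)")
        # Rows whose key already exists are silently skipped.
        cur.executemany("INSERT OR IGNORE INTO Settings(key, value) VALUES (?, ?)",
                        [('mykey', 'value for key one'),
                         ('anotherkey', 'value for key two')])
        cur.execute("SELECT * FROM Settings")
        return cur.fetchall()

Everything runs on a single connection, and the with block commits the transaction for you.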
import mysql.connector
connection = mysql.connector.connect(user="REMOVED",
password="REMOVED",
host="REMOVED",
database="REMOVED")
cur = connection.cursor()
# Latitude - remove letter A
cur.execute("UPDATE tau._inm_exportados_test_csv SET latitud = REPLACE (latitud, 'a=','');")
print("Latitude change remove letter A - executed!")
# Longitude - remove letter A
cur.execute("UPDATE tau._inm_exportados_test_csv SET longitud = REPLACE (longitud, 'a=','');")
print("Longitude change remove letter A - executed!")
# Latitude - MODIFY COLUMN
cur.execute("ALTER TABLE tau._inm_exportados_test_csv MODIFY COLUMN latitud DECIMAL(10,6);")
print("Latitude - MODIFY COLUMN - executed!")
# Longitude - MODIFY COLUMN
cur.execute("ALTER TABLE tau._inm_exportados_test_csv MODIFY COLUMN longitud DECIMAL(10,6);")
print("Longitude - MODIFY COLUMN - executed!")
# Post Code data type change
cur.execute("ALTER TABLE tau._inm_exportados_test_csv MODIFY COLUMN codigo_postal varchar(255);)")
print("Post Code data type change to varchar(255) - executed!")
connection.commit()
cur.close()
connection.close()
I'm trying to make this simple list of statements work, without success. What makes it more confusing is that the first four statements work whereas the final one doesn't, even when I comment out the rest! The final statement gets the following response:
mysql.connector.errors.InterfaceError: Use multi=True when executing multiple statements
The datatype for codigo_postal is int(11), unlike latitud and longitud, which are varchar.
I have tried creating new connections, new cursors, new connections AND cursors. I have tried adding multi="True" and combining statements into one operation. I have tried adding multi="True" to each cur.execute() as both the second and third parameter. I have run the statement in Workbench to ensure the statement is valid and it works.
No success with it here though...
You should commit after you have executed DML (Data Manipulation Language) commands. Using multi=True can also be more convenient for this job, but then you need to run the generator that execute() creates (see the doc).
Ordinary method:
cur = connection.cursor()

def alter(state, msg):
    try:
        cur.execute(state)
        connection.commit()
    except Exception as e:
        connection.rollback()
        raise e
    print(msg)

alter("ALTER TABLE address MODIFY COLUMN id int(15);", "done")
alter("ALTER TABLE address MODIFY COLUMN email varchar(35);", "done")
alter("ALTER TABLE address MODIFY COLUMN person_id int(35);", "done")
With multi=True:
cur = connection.cursor()

def alter(state, msg):
    result = cur.execute(state, multi=True)
    result.send(None)
    print(msg, result)

try:
    alter("ALTER TABLE address MODIFY COLUMN id int(45)", "done")
    alter("ALTER TABLE address MODIFY COLUMN email varchar(25)", "done")
    alter("ALTER TABLE address MODIFY COLUMN person_id int(25);", "done")
    connection.commit()
except Exception as e:
    connection.rollback()
    raise e
I had the same problem.
I wanted my code to be clean and to have all my commands in a list so I could just run them in sequence.
I found this link and this link and was finally able to write this code:
import mysql.connector as sql
from mysql.connector import Error

commands = [
    '''
    USE sakila;
    SELECT * FROM actor;
    ''',
    '''
    USE sakila;
    SELECT * FROM actor WHERE actor_id < 10;
    '''
]

connection_config_dict = {
    'user': 'username',
    'password': 'password',
    'host': '127.0.0.1',
}

try:
    connection = sql.connect(**connection_config_dict)
    if connection.is_connected():
        db_Info = connection.get_server_info()
        print("Connected to MySQL Server version ", db_Info, '\n')
        cursor = connection.cursor()
        for command in commands:
            for result in cursor.execute(command, multi=True):
                if result.with_rows:
                    print("Rows produced by statement '{}':".format(
                        result.statement))
                    print(result.fetchall())
                else:
                    print("Number of rows affected by statement '{}': {}".format(
                        result.statement, result.rowcount), '\n')
except Error as e:
    print("Error while connecting to MySQL", e, '\n')
finally:
    if connection.is_connected():
        cursor.close()
        connection.close()
        print("MySQL connection is closed", '\n')
I'm trying to call a stored procedure using the following code:
conn = ibm_db.connect("database","username","password")
sql = "CALL DB2INST1.KPI_VALIDATE()"
stmt = ibm_db.exec_immediate(conn, sql)
But this procedure does not return any rows; it only returns a code. Now I need to handle errors and know whether the procedure ran successfully or not. Could anyone help me with how to handle this?
Thanks
For test purposes, I've created a table:
db2 "create table so(c1 int not null primary key)"
and my procedure will simply insert a row into this table - this will allow me to easily force an error with a duplicate key:
db2 "create or replace procedure so_proc(in insert_val int)
language sql
insert into so values(insert_val)"
db2 "call so_proc(1)"
Return Status = 0
db2 "call so_proc(1)"
SQL0803N One or more values in the INSERT statement, UPDATE statement, or
foreign key update caused by a DELETE statement are not valid because the
primary key, unique constraint or unique index identified by "1" constrains
table "DB2V115.SO" from having duplicate values for the index key.
SQLSTATE=23505
now with Python:
conn = ibm_db.connect("DATABASE=SAMPLE;HOSTNAME=localhost;PORT=61115;UID=db2v115;PWD=xxxxx;","","")
stmt = ibm_db.exec_immediate(conn, "CALL SO_PROC(2)")
stmt = ibm_db.exec_immediate(conn, "CALL SO_PROC(2)")
Exception Traceback (most recent call last)
<ipython-input-8-c1f4b252e70a> in <module>
----> 1 stmt = ibm_db.exec_immediate(conn, "CALL SO_PROC(2)")
Exception: [IBM][CLI Driver][DB2/LINUXX8664] SQL0803N One or more values in the INSERT statement, UPDATE statement, or foreign key update caused by a DELETE statement are not valid because the primary key, unique constraint or unique index identified by "1" constrains table "DB2V115.SO" from having duplicate values for the index key. SQLSTATE=23505 SQLCODE=-803
So if a procedure hits an exception, you'll get it; you just need to handle it with a try/except block:
try:
    stmt = ibm_db.exec_immediate(conn, "CALL SO_PROC(2)")
except Exception:
    print("Procedure failed with sqlstate {}".format(ibm_db.stmt_error()))
    print("Error {}".format(ibm_db.stmt_errormsg()))
Procedure failed with sqlstate 23505
Error [IBM][CLI Driver][DB2/LINUXX8664] SQL0803N One or more values in the INSERT statement, UPDATE statement, or foreign key update caused by a DELETE statement are not valid because the primary key, unique constraint or unique index identified by "1" constrains table "DB2V115.SO" from having duplicate values for the index key. SQLSTATE=23505 SQLCODE=-803
Or are you actually interested in the CALL return code/status? E.g.:
create or replace procedure so_proc_v2(in insert_val int)
language sql
if not exists (select 1 from so where c1 = insert_val)
then
insert into so values(insert_val);
return 0;
else
return -1;
end if#
test:
db2 "call so_proc_v2(10)"
Return Status = 0
db2 "call so_proc_v2(10)"
Return Status = -1
then this is a bit tricky. With CLI trace enabled (I have ibm_db installed in my local path, so it fetched the CLI package there too):
export LD_LIBRARY_PATH=$HOME/.local/lib/python3.7/site-packages/clidriver/lib/
$HOME/.local/lib/python3.7/site-packages/clidriver/bin/db2trc on -cli -f /tmp/cli.trc
<run_code>
$HOME/.local/lib/python3.7/site-packages/clidriver/bin/db2trc off
$HOME/.local/lib/python3.7/site-packages/clidriver/bin/db2trc fmt -cli /tmp/cli.trc /tmp/cli.fmt
The trace does show the return status:
SQLExecute( hStmt=1:8 )
---> Time elapsed - -7.762688E+006 seconds
( Row=1, iPar=1, fCType=SQL_C_LONG, rgbValue=10 )
( return=-1 )
( COMMIT REQUESTED=1 )
( COMMIT REPLY RECEIVED=1 )
but I don't see anywhere in the python-ibmdb API a way to fetch it... (e.g. ibm_db.callproc doesn't have such an option). Which means that, unless I'm missing something, you would have to raise an issue on GitHub to extend the API.
So, since my research on Stack Overflow didn't get me any further, here is my code (I cannot post the exact code, because this is a problem I have at work) and problem:
import mysql.connector
.
.
.
cnx = mysql.connector.connect(user, password, host, database)
cursor = cnx.cursor()
for-loop:
    if condition:
        cursor.execute("INSERT INTO table (the_columns) VALUES (%s)", (my_values))
        cnx.commit()
I already tried inserting manually and it worked, but somehow my Python code won't do the insert.
The manual insert:
INSERT INTO table (column1,...,column7) VALUES (string1,....,string6, now())
I get no error message; I can only look into the database and see that the new values aren't there.
Did anyone else face this problem? Can anyone suggest what could be the problem?
Might it be because you don't have to put your variable between "(" and ")"?
Did you try putting the value directly inside the SQL instead of the variable containing it?
What do you mean by "manually"?
Anyway, you should put all your variables in a tuple before passing that tuple as the second argument:
insert_stmt = (
    "INSERT INTO employees (emp_no, first_name, last_name, hire_date) VALUES (%s, %s, %s, %s)"
)
data = (2, 'Jane', 'Doe', datetime.date(2012, 3, 23))
cursor.execute(insert_stmt, data)
EDIT: I just checked, that's the main example if you google your problem... wait, did you search for this a little bit? That's the best way to learn, dude: search by yourself before asking for help.
Try turning your format variables into a tuple. If this does not work, try creating the query as a variable separately, printing it out, and running it in your SQL console directly. You might get more meaningful errors.
cnx = mysql.connector.connect(user, password, host, database)
cursor = cnx.cursor()

for-loop:
    if condition:
        cursor.execute("INSERT INTO table (the_columns) VALUES (%s)", (my_values,))
        cnx.commit()
OR
cnx = mysql.connector.connect(user, password, host, database)
cursor = cnx.cursor()

for-loop:
    if condition:
        sql = "INSERT INTO table ({}) VALUES ({})".format(the_columns, my_values)
        print(sql)
        cursor.execute(sql)
        cnx.commit()
I found out that I open the database at every request. Is there a way to simplify and improve this code to increase the sqlite speed?
name3 = ' '.join(name2)
import sqlite3
id = 0
location = ""
conn = sqlite3.connect("keywords.db")
c = conn.cursor()
c.execute('select * from kmedicals')
records = c.fetchall()
for record in records:
    id = record[0]
    location = record[15]
    if id == name3:
        print name3.capitalize(),':' '\n',location
        break
sys.exit()
Do not use import in the middle of your program.
Open the database once at the start of your program.
Select only the records you actually need.
Select only the columns you actually need.
Do not use fetchall; read only the records you actually need.
Do not fetch into a temporary variable if you can use the cursor directly.
import sqlite3
# at startup
conn = sqlite3.connect("keywords.db")
def search_location(name2):
    name3 = ' '.join(name2)
    c = conn.cursor()
    c.execute('SELECT location FROM kmedicals WHERE id = ?', (name3,))
    for (location,) in c:
        print name3.capitalize(),':' '\n',location
        break
    else:
        pass # not found
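Purely as an illustration of how this would be called (assuming name2 arrives as a list of words, e.g. from the command line; the use of sys.argv here is hypothetical):

import sys

# e.g.  python search.py some keyword
search_location(sys.argv[1:])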
from django.db import connection
q = 'some value'
sql1 = 'SELECT * FROM table WHERE field LIKE %%%s%%' % q
sql2 = 'SELECT * FROM table WHERE field LIKE %%'+ q +'%%'
cursor = connection.cursor()
cursor.execute( sql1 ) #why exception: IndexError: tuple index out of range ?
cursor.execute( sql2 ) #works ok
You need to QUOTE your SQL arguments properly.
And by quoting properly I mean using the quoting facility provided by the DBAPI, not adding a ' around your string, which is useless.
Correct code :
q = "%"+q+"%"
cursor.execute( 'SELECT * FROM table WHERE field LIKE %s', (q,) )
Really correct code :
q = "%"+q.replace("%","%%")+"%"
cursor.execute( 'SELECT * FROM table WHERE field LIKE %s', (q,) )
Suppose q = "a'bc"
First, rewrite this as "%a'bc%"
Then use it as a normal string argument. psycopg will rewrite it as '%a\'bc%' as it should.
If q may contain "%" and you want to search for it, then use the second one.
Using direct string manipulation will almost certainly lead to improper SQL that is vulnerable to SQL Injection attacks (see psycopg2's comments on the subject).
What I think you're trying to do is perform a LIKE '%some value%' in Django, right?:
from django.db import connection
q = '%some value%'
cur = connection.cursor()
cur.execute("SELECT * FROM table WHERE field LIKE %(my_like)s", {'my_like': q})
As of psycopg2 2.4.1, the SQL that is executed on the server is:
SELECT * FROM table WHERE field LIKE '%some value%'
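If you want to confirm what actually gets sent, and your Django backend is psycopg2 (an assumption here), psycopg2's cursor.mogrify() returns the query after parameter binding without executing it. A small sketch against psycopg2 directly (the connection string is a placeholder):

import psycopg2

conn = psycopg2.connect("dbname=mydb")  # placeholder connection string
cur = conn.cursor()

q = '%some value%'
# mogrify() binds the parameters and returns the resulting SQL without running it.
print(cur.mogrify("SELECT * FROM table WHERE field LIKE %(my_like)s", {'my_like': q}))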
You need to QUOTE your SQL command properly:
sql1 = "SELECT * FROM table WHERE field LIKE '%%%s%%'" % q
sql2 = "SELECT * FROM table WHERE field LIKE '%"+ q +"%'"
And by quoting properly I mean using single quotes with LIKE expressions.