Python MySQLdb not inserting data - python-2.7

ubuntu version: 12.10
mysql server version: 5.5.29-0
python version: 2.7
I am trying to use MySQLdb to insert data into my localhost MySQL server. I don't get any errors when I run the script, but the data isn't entered into my table. I view the tables with phpMyAdmin.
I tried going back to basics and following a tutorial, but got the same result. The weird thing is that I can create and delete tables but not insert data.
The code from the tutorial even reports that 4 rows were inserted. What is preventing data from being entered into the table when the script reports everything is fine?
cursor = conn.cursor()
cursor.execute("DROP TABLE IF EXISTS animal")
cursor.execute("""
    CREATE TABLE animal
    (
        name CHAR(40),
        category CHAR(40)
    )
""")
cursor.execute("""
    INSERT INTO animal (name, category)
    VALUES
        ('snake', 'reptile'),
        ('frog', 'amphibian'),
        ('tuna', 'fish'),
        ('racoon', 'mammal')
""")
print "%d rows were inserted" % cursor.rowcount

Add:
conn.commit()
at the bottom of your script.
On a side note, have a look at the MySQLdb documentation: http://mysql-python.sourceforge.net/MySQLdb.html
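As a minimal sketch of why the commit matters, here is the same pattern with the standard library's sqlite3 module (chosen so it runs anywhere; MySQLdb behaves the same way under the DB-API): rowcount is updated as soon as the INSERT executes, but a second connection only sees the rows after commit().

```python
import sqlite3, tempfile, os

path = os.path.join(tempfile.mkdtemp(), "demo.db")
conn = sqlite3.connect(path)
cursor = conn.cursor()
cursor.execute("CREATE TABLE animal (name CHAR(40), category CHAR(40))")
conn.commit()

cursor.execute("INSERT INTO animal (name, category) "
               "VALUES ('snake', 'reptile'), ('frog', 'amphibian')")
# rowcount reflects the INSERT immediately...
print("%d rows were inserted" % cursor.rowcount)  # 2 rows were inserted

# ...but a second connection (like phpMyAdmin in the question) sees
# nothing until the writing connection commits.
other = sqlite3.connect(path)
print(other.execute("SELECT COUNT(*) FROM animal").fetchone()[0])  # 0
conn.commit()
print(other.execute("SELECT COUNT(*) FROM animal").fetchone()[0])  # 2
```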

Related

Python2.7 & SQLite3: DELETE & SELECT with DATE(SUBSTR()) not working

Firstly, I have a table in SQLite3 with two fields, car (TEXT NOT NULL) and checkout (TEXT NOT NULL):
car checkout
red %d%d/%m%m/%Y (for example 27/09/2021)
Second, I wrote a script that, when run, deletes all entries whose checkout date is equal to or earlier than the current date.
Third, in the same script, a SELECT checks whether a car is in the list with a checkout past the current date, so it can be excluded from my available cars.
The code snippet that implements the first step is the following:
try:
    con = lite.connect(DB)
    with con:
        paper = []
        cur = con.cursor()
        cur.execute("DELETE FROM CHECK_TABLE WHERE DATE(substr(checkout,7,4)||substr(checkout,4,2)||substr(checkout,1,2))<=DATE(strftime('%Y%m%d',date('now')))")
        con.commit()
        print('Entries with old dates deleted.')
except lite.Error as e:
    print('Error connection: ', e)
The problem is that it is not deleting anything. The strange behaviour is, firstly, that the SQL query works in DB Browser,
Image: Proof DB Browser in Windows 10 - Python2.7 - SQLite3
the second strange behaviour is that no error is raised, and the third is that I tested it two days ago and it worked normally! I really need your thoughts.
The same logic is in the following code snippet, which is the third step that I described above, with the SELECT command.
def ReadDateAndCar(car):
    try:
        con = lite.connect(DB)
        with con:
            paper = []
            cur = con.cursor()
            cur.execute("SELECT DISTINCT car FROM CHECK_TABLE WHERE car='"+car+"' AND DATE(substr(checkout,7,4)||substr(checkout,4,2)||substr(checkout,1,2))<=DATE(strftime('%Y%m%d',date('now')))")
            free_cars = cur.fetchall()
            return free_cars
    except lite.Error as e:
        print('Error connection: ', e)
        return 0
Exactly the same problems: the SQL query works fine in DB Browser, no Python error is raised, and it worked a few days ago. Can someone enlighten me?
Both your queries are wrong, and they don't work in DB Browser either.
What you should do is store the dates in the ISO format YYYY-MM-DD, because this is the only text date format compatible with SQLite's datetime functions like date() and strftime(), and it is directly comparable as text.
If you use any other format, the result of these functions is NULL, and this is what happens in your case.
The expressions substr(checkout,7,4)||substr(checkout,4,2)||substr(checkout,1,2) and strftime('%Y%m%d',date('now')) return dates in the format YYYYMMDD, and if you pass them to date() or strftime() the result is NULL.
Since both sides of the inequality are dates in the format YYYYMMDD, they are directly comparable as strings, and you should not wrap them in date().
The condition should be:
substr(checkout, -4) || substr(checkout, 4, 2) || substr(checkout, 1, 2) <= strftime('%Y%m%d', 'now')
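The NULL behaviour and the string comparison are easy to verify with Python's built-in sqlite3 module (a sketch; the table and sample rows are made up for the demonstration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# date() only understands ISO time strings, so a YYYYMMDD string yields NULL:
print(cur.execute("SELECT date('20210927')").fetchone()[0])  # None

# Plain YYYYMMDD strings, however, compare correctly as text.
cur.execute("CREATE TABLE CHECK_TABLE (car TEXT NOT NULL, checkout TEXT NOT NULL)")
cur.executemany("INSERT INTO CHECK_TABLE VALUES (?, ?)",
                [('red', '27/09/2021'), ('blue', '31/12/2099')])
cur.execute("""
    DELETE FROM CHECK_TABLE
    WHERE substr(checkout, -4) || substr(checkout, 4, 2) || substr(checkout, 1, 2)
          <= strftime('%Y%m%d', 'now')
""")
con.commit()
print(cur.execute("SELECT car FROM CHECK_TABLE").fetchall())  # [('blue',)]
```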

pyodbc and xmp toolkit adding metadata printing on command line but only adding first letter in metadata

Hi I'm having a problem adding metadata to some jpg images.
I am querying a FileMaker database via pyodbc:
sql = cur.execute("""SELECT division, attribute_20, brand_name FROM table WHERE item_number=?""", (prod))
row = cur.fetchone()
When I print(row[0]) I get an output of 'BRN' and the type is 'unicode'.
However, when I try to add this as metadata with xmp.set_property(consts.XMP_NS_DC, u'DivisionName', row[0])
it only inputs the first letter: <dc:DivisionName>B</dc:DivisionName>
I have tried converting this to a string but it's the same problem.
Any help would be really appreciated!
Richard

How to handle unicode data in cx_Oracle and python 2.7?

I am using
Python 2.7
cx_Oracle 6.0.2
I am doing something like this in my code
import cx_Oracle
connection_string = "%s:%s/%s" % ("192.168.8.168", "1521", "xe")
connection = cx_Oracle.connect("system", "oracle", connection_string)
cur = connection.cursor()
print "Connection Version: {}".format(connection.version)
query = "select * from product_information"
cur.execute(query)
result = cur.fetchone()
print result
I got the output like this
Connection Version: 11.2.0.2.0
(1, u'????????????', 'test')
I used the following query to create the table in the Oracle database:
CREATE TABLE product_information
( product_id NUMBER(6)
, product_name NVARCHAR2(100)
, product_description VARCHAR2(1000));
I used the following query to insert data
insert into product_information values(2, 'दुःख', 'teting');
Edit 1
Query: SELECT * from NLS_DATABASE_PARAMETERS WHERE parameter IN ( 'NLS_LANGUAGE', 'NLS_TERRITORY', 'NLS_CHARACTERSET');
Result
NLS_LANGUAGE: AMERICAN, NLS_TERRITORY: AMERICA, NLS_CHARACTERSET: AL32UTF8
I solved the problem.
First, I added NLS_LANG=.AL32UTF8 as an environment variable on the system where Oracle is installed.
Second, I passed the encoding and nencoding parameters to cx_Oracle's connect function like below:
cx_Oracle.connect(username, password, connection_string,
                  encoding="UTF-8", nencoding="UTF-8")
This issue is also discussed at https://github.com/oracle/python-cx_Oracle/issues/157

updating DATETIME causes sqlite error on BB10

I created my table with this query
CREATE TABLE SETTINGS(NAME VARCHAR(1050), VALUE VARCHAR(1550),CREATE_DATE_TIME DATETIME,UPDATE_DATE_TIME DATETIME, PRIMARY KEY(NAME))
Then I inserted data like this
INSERT INTO SETTINGS(NAME, VALUE ,CREATE_DATE_TIME ,UPDATE_DATE_TIME) VALUES('CellIDKey','Android#MoblLe.NAv',DATETIME('NOW'), DATETIME('NOW'))
At this point it works fine. Now if I want to run an update query like this,
UPDATE SETTINGS SET VALUE='Android#AfriG1s.MoblLe.NAv' CREATE_DATE_TIME=DATETIME('NOW') WHERE NAME='CellIDKey'
It shows the following error on console
QSqlError::type= "QSqlError::ConnectionError" , QSqlError::number= -1 , databaseText= "No query" , driverText= "Unable to fetch row"
But if I run this update query like this,
UPDATE SETTINGS SET VALUE='Android#AfriG1s.MoblLe.NAv' WHERE NAME='CellIDKey'
Now it works fine. I don't know what is wrong with the DATETIME('NOW') expression in the update query.
This is not valid SQL:
UPDATE SETTINGS SET VALUE='Android#AfriG1s.MoblLe.NAv' CREATE_DATE_TIME=DATETIME('NOW') WHERE NAME='CellIDKey'
-- ---------------------------------------------------^ Missing comma!
The individual assignments in a SET need to be separated by commas like this:
UPDATE SETTINGS
SET VALUE='Android#AfriG1s.MoblLe.NAv', -- This comma is needed
CREATE_DATE_TIME=DATETIME('NOW')
WHERE NAME='CellIDKey'
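The missing comma can be reproduced with Python's built-in sqlite3 module (a sketch; the original ran through BB10's Qt SQL layer, but the SQLite grammar is the same):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE SETTINGS(NAME VARCHAR(1050), VALUE VARCHAR(1550), "
            "CREATE_DATE_TIME DATETIME, UPDATE_DATE_TIME DATETIME, PRIMARY KEY(NAME))")
con.execute("INSERT INTO SETTINGS(NAME, VALUE, CREATE_DATE_TIME, UPDATE_DATE_TIME) "
            "VALUES('CellIDKey', 'Android#MoblLe.NAv', DATETIME('NOW'), DATETIME('NOW'))")

# Without the comma, the UPDATE is a syntax error:
try:
    con.execute("UPDATE SETTINGS SET VALUE='x' CREATE_DATE_TIME=DATETIME('NOW') "
                "WHERE NAME='CellIDKey'")
except sqlite3.OperationalError as e:
    print("rejected:", e)

# With the comma between the two assignments, it succeeds:
con.execute("UPDATE SETTINGS SET VALUE='Android#AfriG1s.MoblLe.NAv', "
            "CREATE_DATE_TIME=DATETIME('NOW') WHERE NAME='CellIDKey'")
print(con.execute("SELECT VALUE FROM SETTINGS WHERE NAME='CellIDKey'").fetchone()[0])
```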

Django PostgreSQL -- Drop all tables?

Here's how I can do it when MySQL is the backend:
cursor.execute('show tables')
rows = cursor.fetchall()
for row in rows:
    cursor.execute('drop table %s; ' % row[0])
But how can I do it when postgresql is the backend?
cursor.execute("""SELECT table_name FROM information_schema.tables WHERE table_schema='public' AND table_type != 'VIEW' AND table_name NOT LIKE 'pg_ts_%%'""")
rows = cursor.fetchall()
for row in rows:
    try:
        cursor.execute('drop table %s cascade ' % row[0])
        print "dropping %s" % row[0]
    except:
        print "couldn't drop %s" % row[0]
Courtesy of http://www.siafoo.net/snippet/85
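The enumerate-then-drop pattern in the answer above can be sketched with the built-in sqlite3 module (sqlite_master standing in for information_schema.tables; the table names are made up, and Postgres would additionally want CASCADE):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE auth_user (id INTEGER);
    CREATE TABLE app_model (id INTEGER);
""")
cur = con.cursor()

# Query the catalog for table names, then drop each one.
cur.execute("SELECT name FROM sqlite_master WHERE type='table'")
for (name,) in cur.fetchall():
    # Identifiers cannot be bound as parameters, hence the interpolation;
    # it is acceptable here only because the names come from the catalog itself.
    cur.execute('DROP TABLE "%s"' % name)

print(cur.execute("SELECT COUNT(*) FROM sqlite_master WHERE type='table'").fetchone()[0])  # 0
```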
You can use select * from pg_tables; to get a list of tables, although you probably want to filter with schemaname <> 'pg_catalog'...
Based on another one of your recent questions, if you're trying to just drop all your django stuff, but don't have permission to drop the DB, can you just DROP the SCHEMA that Django has everything in?
Also on your drop, use CASCADE.
EDIT: Can you select * from information_schema.tables; ?
EDIT: Your column should be row[2] instead of row[0] and you need to specify which schema to look at with a WHERE schemaname = 'my_django_schema_here' clause.
EDIT: Or just SELECT table_name from pg_tables where schemaname = 'my_django_schema_here'; and row[0]
Documentation says that ./manage.py sqlclear prints the DROP TABLE SQL statements for the given app name(s).
I use this script to clear the tables. I put it in a script called phoenixdb.sh because it burns the DB down and a new one rises from the ashes. I use this to avoid piling up migrations during the early dev portion of the project.
set -e
python manage.py dbshell <<EOF
DROP SCHEMA public CASCADE;
CREATE SCHEMA public;
EOF
python manage.py migrate
This wipes the tables from the db without deleting the db itself. Your Django user will need to own the schema, though, which you can set up with:
alter schema public owner to django-db-user-name;
And you might want to change the owner of the db as well
alter database django-db-name owner to django-db-user-name;
\dt is the equivalent command in Postgres to list tables. Each row will contain values for (Schema, Name, Type, Owner), so you have to use the second (row[1]) value.
Anyway, your solution will break (in MySQL and PostgreSQL) when foreign-key constraints are involved, and even if there aren't any, you might get into trouble with the sequences. So the best way, in my opinion, is to simply drop the whole database and recreate it (which is also the more efficient solution).