I'm having a nagging feeling about sqlite3 parameters that I'd like to run by you.
This is my query and the failure message:
#query
'SELECT id FROM ? WHERE key = ? AND (userid = 0 OR userid = ?) ORDER BY userid DESC LIMIT 1;'
#error message, fails when calling sqlite3_prepare()
error: 'near "?": syntax error'
In my code it looks like:
// Query is a helper class, at creation it does an sqlite3_prepare()
Query q("SELECT id FROM ? WHERE key = ? AND (userid = 0 OR userid = ?) ORDER BY userid DESC LIMIT 1;");
// bind arguments
q.bindString(1, _db_name.c_str() ); // class member, the table name
q.bindString(2, key.c_str()); // function argument (std::string)
q.bindInt (3, currentID); // function argument (int)
q.execute();
I have the feeling that I can't use SQLite parameters for the table name, but I can't find confirmation of this in the SQLite3 C API documentation.
Do you know what's wrong with my query?
Do I have to pre-process my SQL statement to include the table name before preparing the query?
Ooookay, I should have searched SO more thoroughly.
Answers:
- SQLite Parameters - Not allowing tablename as parameter
- Variable table name in sqlite
They are meant for Python, but I guess the same applies to C++.
tl;dr:
You can't pass the table name as a parameter.
If anyone has a link to the SQLite documentation confirming this, I'll gladly accept that answer.
I know this is super old already, but since your query is just a string, you can always append the table name like this in C++:
std::string queryString = "SELECT id FROM " + std::string(_db_name);
or in objective-C:
[#"SELECT id FROM " stringByAppendingString:_db_name];
Related
I have a C++Builder SQL statement with a parameter, like
UnicodeString SQLStatement = "INSERT INTO TABLENAME (DATETIME) VALUES (:dateTime)";
Can I add the parameter without quotes?
Usually I'd use
TADOQuery *query = new TADOQuery(NULL);
query->Parameters->CreateParameter("dateTime", ftString, pdInput, 255, DateTimeToStr(Now()));
which will eventually produce the SQL String
INSERT INTO TABLENAME (DATETIME) VALUES ('2022-01-14 14:33:00.000')
but because this is a legacy project (of course, it always is) and I have to support different database technologies, I need to be able to inject database-specific date/time conversion methods, so that the end result would look like
INSERT INTO TABLENAME (DATETIME) VALUES (to_date('2022-01-14 14:33:00.000', 'dd.mm.yyyy hh24:mi:ss'))
If I try injecting this via my 'usual' method (because I don't think I can inject a second parameter into this one) it'd look like:
TADOQuery *query = new TADOQuery(NULL);
query->Parameters->CreateParameter("dateTime", ftInteger, pdInput, 255, "to_date('" + DateTimeToStr(Now()) + "', 'dd.mm.yyyy hh24:mi:ss')");
but of course the result would look like:
INSERT INTO TABLENAME (DATETIME) VALUES ('to_date('2022-01-14 14:33:00.000', 'dd.mm.yyyy hh24:mi:ss')'))
and therefore be invalid
Or is there another way to do this more cleanly and elegantly? Although I'd settle for 'working'.
I can work around this by preparing two SQL statements and switching between them when another database technology is in use, but I just wanted to check if there is another way.
Why are you defining the parameter's DataType as ftInteger when your input value is clearly NOT an integer? You should be defining the DataType as ftDateTime instead, and then assigning Now() as-is to the parameter's Value. Let the database engine decide how it wants to format the date/time value in the final SQL per its own rules.
query->Parameters->CreateParameter("dateTime", ftDateTime, pdInput, 0, Now());
I have the Python method below, which inserts data into a table. The first column is json_data and the second column is the file name. Both values are passed into this function when it is called from main.
def insert(sf_handler, data, file_name):
    query = """INSERT INTO my_table (DATA,FILE_NAME)
    (select (PARSE_JSON('%s'),'%s'))""" % {json.dumps(data), file_name}
    pd.read_sql(query, sf_handler)
But while executing this I am getting the error below. Can someone help with this?
TypeError: not enough arguments for format string
I got the answer. Just cast the file name to a string and replace the curly braces with parentheses and it will work.
query = """INSERT INTO my_table (DATA,FILE_NAME)
(select (PARSE_JSON('%s'),'%s'))""" % (json.dumps(data),str(file_name))
pd.read_sql(query,sf_handler)
Using the Ignite C++ API, I'm trying to find a way to perform an SqlFieldsQuery to select a specific field, but I would like to do this for a set of keys.
One way to do this, is to do the SqlFieldsQuery like this,
SqlFieldsQuery("select field from Table where _key in (" + keys_string + ")")
where the keys_string is the list of the keys as a comma separated string.
Unfortunately, this takes a very long time compared to just doing cache.GetAll(keys) for the set of keys, keys.
Is there an alternative, faster way of getting a specific field for a set of keys from an ignite cache?
EDIT:
After reading the answers, I tried changing the query to:
auto query = SqlFieldsQuery("select field from Table t join table(_key bigint = ?) i on t._key = i._key")
I then add the arguments from my set of keys like this:
for(const auto& key: keys) query.AddArgument(key);
but when running the query, I get the error:
Failed to bind parameter [idx=2, obj=159957, stmt=prep0: select field from Table t join table(_key bigint = ?) i on t._key = i._key {1: 159956}]
Clearly, this doesn't work because there is only one '?'.
So I then tried to pass a std::vector<int64_t> of the keys, but I got an error which basically says that std::vector<int64_t> does not specialize Ignite's BinaryType. So I did this as defined here. When calling e.g.
writer.WriteInt64Array("data", data.data(), data.size())
I gave the field a arbitrary name "data". This then results in the error:
Failed to run map query remotely.
Unfortunately, the C++ API is neither well documented nor complete, so I'm wondering if I'm missing something or if the API does not allow passing an array as an argument to the SqlFieldsQuery.
A query that uses an IN clause doesn't always use indexes properly. The workaround for this is described here: https://apacheignite.readme.io/docs/sql-performance-and-debugging#sql-performance-and-usability-considerations
Also, if you have the option to do GetAll instead and look up by key directly, then you should use it. It will likely be more effective anyway.
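As a rough sketch of that alternative (the cache's key and value types are assumptions, MyValue is a placeholder class, and the exact GetAll signature can vary between Ignite versions):

// Key-based lookup instead of an IN query.
// Assumes 'cache' is an existing ignite::cache::Cache<int64_t, MyValue>.
std::set<int64_t> keys = { 159956, 159957 };
std::map<int64_t, MyValue> values = cache.GetAll(keys);
for (const auto& entry : values) {
    // entry.first is the key, entry.second holds the field you need
}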
A query with the IN operator will not always use indexes. As a workaround, you can rewrite the query in the following way:
select field from Table t join table(id bigint = ?) i on t.id = i.id
and then invoke it like:
new SqlFieldsQuery(
    "select field from Table t join table(id bigint = ?) i on t.id = i.id")
    .setArgs(new Object[]{ new Integer[] {2, 3, 4} });
Without saving the SHA1 digest string in the table directly, is it possible to transform the column in the select statement?
For example (hope you know what I mean):
#item = Item.where(Digest::SHA1.hexdigest id.to_s:'356a192b7913b04c54574d18c28d46e6395428ab')
No, not the way you want it. The hexdigest method you're using won't be available at the database level. You could use database-specific functions though.
For example:
Item.where("LOWER(name) = ?", entered_name.downcase)
The LOWER() function will be available to the database so it can pass the name column to it.
For your case, I can suggest two solutions:
Obviously, store the encrypted field in the table, and then match against it.
key = '356a192b7913b04c54574d18c28d46e6395428ab'
Item.where(encrypted_id: key)
Iterate over all column values (ID, in your case) and find the one that matches:
all_item_ids = Item.pluck("CAST(id AS TEXT)")
item_id = all_item_ids.find{ |val| Digest::SHA1.hexdigest(val) == key }
Then you could use Item.find(item_id) to get the item or Item.where(id: item_id) to get an ActiveRecord::Relation object.
I need to write a Django raw query to get the sum value and then write it to a CSV file.
I wrote my query:
for time in Tracking_details.objects.raw('SELECT *,sum=SUM(work_time) FROM structure_tracking_details WHERE employee_id='+ employee_id + ' GROUP BY project_structure ') :
writer.writerow([ time.project_structure,time.sum ])
It says:
no such column: sum
How do I write the query correctly?
Replace sum=SUM(work_time) with SUM(work_time) AS sum.
BTW, employee_id='+ employee_id + ' is a very poor way of building queries, and you should not do it. It makes your query prone to SQL injection, as Django doesn't check the query you pass to the raw() function. You can pass parameters to the raw query like this:
Tracking_details.objects.raw('SELECT *, SUM(work_time) AS sum FROM structure_tracking_details WHERE employee_id = %s GROUP BY project_structure', [employee_id])
More details.