I have the following sample code:
queries = []
q1 = select([columns]).where(table.c.id == #).limit(#)
queries.append(q1)
q2 = select([columns]).where(table.c.id == #).limit(#)
queries.append(q2)
final_query = union_all(*queries)
The generated SQL should be this:
(select columns from table where id = # limit #)
UNION ALL
(select columns from table where id = # limit #)
But I'm getting this instead:
select columns from table where id = # limit #
UNION ALL
select columns from table where id = # limit #
I tried using subquery for my queries, as follows:
q1 = subquery(select([columns]).where(table.c.id == #).limit(#))
The generated query then looks like this:
SELECT UNION ALL SELECT UNION ALL
I also tried doing
q1 = select([columns]).where(table.c.id == #).limit(#).subquery()
But, I get the error:
'Select' object has no attribute 'subquery'
Any help getting the desired output, with my subqueries wrapped in parentheses, would be appreciated.
Note: this is not a duplicate of this question, because I'm not using Session.
EDIT
Okay, the following works, but I don't believe it is very efficient, and it adds an extra select * from (my subquery):
q1 = select('*').select_from((select(columns).where(table.c.id == #).limit(#)).alias('q1'))
So, if anyone has any ideas to optimize this, or can confirm this is as good as it gets, I would appreciate it.
The author of SQLAlchemy seems to be aware of this and mentions a workaround for it on the SQLAlchemy 1.1 changelog page. The general idea is to do .alias().select() on each select.
stmt1 = select([table1.c.x]).order_by(table1.c.y).limit(1).alias().select()
stmt2 = select([table2.c.x]).order_by(table2.c.y).limit(2).alias().select()
stmt = union(stmt1, stmt2)
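Applied to the loop in the question, that workaround might look something like the following sketch (columns and table are the stand-ins from the question, and the loop values are hypothetical replacements for the # placeholders):

from sqlalchemy import select, union_all

queries = []
for id_value, row_limit in [(1, 10), (2, 10)]:  # hypothetical values for the '#' placeholders
    q = (
        select([columns])
        .where(table.c.id == id_value)
        .limit(row_limit)
        .alias()    # wrap the limited SELECT in a named subquery
        .select()   # select back out of it, so the parentheses survive inside the UNION
    )
    queries.append(q)

final_query = union_all(*queries)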
Related
If I have created a Dates column in Flask-SQLAlchemy and also stored some dates in it, how can I check whether each and every one of these dates is between two dates that I choose?
There are lots of ways to accomplish this. Here's one example.
The goal is to select all rows that have a date outside your desired range. If the result set is empty, all rows have a valid date. For good measure, we'll include any rows that don't have a date value in our "bad rows" query.
from sqlalchemy import select, or_

with Session.begin() as session:
    my_start_date = '2022-01-01'
    my_end_date = '2022-01-31'
    query = select(MyTable).where(
        or_(
            MyTable.date < my_start_date,
            MyTable.date > my_end_date,
            MyTable.date.is_(None),  # rows with no date value at all
        )
    )
    results = session.execute(query).all()
Now you can take a look at the results and see what's up.
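As a quick follow-up sketch (using the same hypothetical MyTable as above), an empty result set means every row's date falls inside the chosen range:

if not results:
    print('All rows have a date inside the chosen range.')
else:
    for (row,) in results:  # each result row wraps a MyTable instance
        print('Out-of-range or missing date:', row.date)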
You can use between() in your ORM or plain query:

with Session.begin() as session:
    my_start_date = '2022-01-01'
    my_end_date = '2022-01-31'
    q = session.query(table_name).filter(table_name.c.date.between(my_start_date, my_end_date))
    .....
or
select(table_name).where(table_name.c.date.between(my_start_date, my_end_date))
I'm creating a report-table-style calendar where users can create backups by date, and a select filter should filter the table values depending on the user selected (i.e. if they choose user1, then only backups created by user1 will show up).
I'm having trouble allowing the user to see all the backups in the table again (i.e. getting rid of the filtered value). My current query is this:
I would like it so that when P106_BACK_UP_BY_USER = 0, the table shows all the values (in effect dropping the "where" portion of the query).
Thank you for your help!
You can use a CASE WHEN expression in your query's WHERE condition, as follows:
select *
from my_table
where my_table.created_by =
      (select user_name
       from my_table2
       where app_users_id = case :P106_BACK_UP_BY_USER
                              when 0 then app_users_id
                              else :P106_BACK_UP_BY_USER
                            end)
And to get better help, please paste your code as text rather than as an image next time.
This should work too:
...
WHERE b.active_server = s.server_id
AND (:P106_BACK_UP_BY_USER = 0 OR
UPPER(b.created_by) =
(SELECT UPPER(user_name)
FROM eba_bt_app_users
WHERE app_users_id = :P106_BACK_UP_BY_USER
)
);
With model relations like this
Risk <-- RiskGroup --> Group
I am trying to achieve a query resembling this
SELECT risk.id,
(
SELECT array_agg(bg.name) as names
FROM hazards_riskgroup rgroup
JOIN (SELECT * FROM base_group ORDER BY name) as bg
on rgroup.group_id = bg.id
WHERE risk_id = risk.id
GROUP BY risk_id
ORDER BY risk_id
) AS conf
FROM hazards_risk risk
ORDER BY conf NULLS LAST
;
And I have gotten as far as
group_names = (
RiskGroup.objects
.filter(risk_id=OuterRef('pk'))
.annotate(names=ArrayAgg('group__name'))
.order_by()
.order_by("group__name")
.values('names')
)
qs = Risk.objects.annotate(
conf=Subquery(group_names)
).order_by(F('conf').asc(nulls_last=True))
But that will produce something along the lines of this query:
SELECT risk.id,
(SELECT ARRAY_AGG(U2."name") AS "names"
FROM "hazards_riskgroup" U0
INNER JOIN "base_group" U2 ON (U0."group_id" = U2."id")
WHERE U0."risk_id" = (risk.id)
GROUP BY U0."id", U2."name"
ORDER BY U2."name" ASC) AS "conf"
FROM hazards_risk risk
ORDER BY conf NULLS LAST
Notice that the generated GROUP BY becomes GROUP BY U0."id", U2."name", where I want just GROUP BY U2."name".
The best I've been able to gather is that this is related to some default ordering on the models, but as you can tell, I've already accounted for that by inserting .order_by().
So I'm a little lost. I tried annotating with raw SQL instead, but then I can't seem to reference risk.id, which is important for getting the right group names for each risk.
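For what it's worth, one direction you could try (an untested sketch, not from the original post) is to put .values('risk_id') before the annotate so that Django groups only by risk_id, matching the hand-written SQL above, and to let ArrayAgg handle the per-group ordering:

from django.contrib.postgres.aggregates import ArrayAgg
from django.db.models import F, OuterRef, Subquery

group_names = (
    RiskGroup.objects
    .filter(risk_id=OuterRef('pk'))
    .order_by()             # clear any default Meta ordering
    .values('risk_id')      # GROUP BY risk_id only
    .annotate(names=ArrayAgg('group__name', ordering='group__name'))  # 'ordering' needs Django 2.2+; newer versions use order_by
    .values('names')
)

qs = Risk.objects.annotate(
    conf=Subquery(group_names)
).order_by(F('conf').asc(nulls_last=True))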
The SQL subquery is:
SELECT *
FROM ( SELECT *
FROM article
ORDER BY Fid desc
LIMIT 0, 200
) as l
WHERE keyId = 1
AND typeId = 0
I tried this:
rets = Article.objects.order_by("-Fid").values('Fid')
kwargs['keyId'] = 1
kwargs['typeId'] = 0
re = Article.objects.filter(Fid__in=rets).filter(**kwargs).values()
But it's not working. Can anyone explain how I can do this?
In your case I guess you can resort to raw SQL (untested). Note that using raw SQL you have to know the real table and column names (just test the statement directly on the database first, to see if it flies).
For example:
Article.objects.raw("""SELECT * from (
SELECT * FROM yourapp_article
ORDER BY fid DESC
LIMIT 0, 200
) AS q1 WHERE key_id=1 AND type_id=0""")
[update]
wuent wrote:
thanks for your help. But the raw SQL is not what I wanted. I need to keep my program ORM-style. – wuent
If you are used to more powerful/consistent ORMs like SQLAlchemy or even peewee, give up hope. The Django ORM has a very crippled design and AFAIK you can't do this kind of thing using it - the first version of this answer started with a rant about this.
Looking at your query again, I get the impression that you do not need a subquery; try querying the table directly - my guess is the result will be the same.
How about this?
rets = Article.objects.order_by("-Fid").values_list('Fid', flat=True)
kwargs['keyId'] = 1
kwargs['typeId'] = 0
re = Article.objects.filter(Fid__in=rets).filter(**kwargs).values()
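Depending on your Django version and database backend, you may also be able to express the LIMIT by slicing the inner queryset, which Django nests as a subquery (a sketch using the field names from the question; note that MySQL has historically refused LIMIT inside an IN (...) subquery, so this may only work on backends such as PostgreSQL or SQLite):

# Inner query: the 200 largest Fid values, kept as an unevaluated, nested subquery.
latest_200 = Article.objects.order_by('-Fid').values('Fid')[:200]

# Outer query: filter within that window.
re = Article.objects.filter(Fid__in=latest_200, keyId=1, typeId=0).values()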
from django.db import connection
q = 'some value'
sql1 = 'SELECT * FROM table WHERE field LIKE %%%s%%' % q
sql2 = 'SELECT * FROM table WHERE field LIKE %%'+ q +'%%'
cursor = connection.cursor()
cursor.execute( sql1 ) #why exception: IndexError: tuple index out of range ?
cursor.execute( sql2 ) #works ok
You need to QUOTE your SQL arguments properly.
And by quoting properly I mean using the quoting facility provided by the DBAPI, not adding a ' around your string, which is useless.
Correct code:
q = "%"+q+"%"
cursor.execute( 'SELECT * FROM table WHERE field LIKE %s', (q,) )
Really correct code:
q = "%"+q.replace("%","%%")+"%"
cursor.execute( 'SELECT * FROM table WHERE field LIKE %s', (q,) )
Suppose q = "a'bc"
First, rewrite this as "%a'bc%"
Then use it as a normal string argument. psycopg will rewrite it as '%a\'bc%' as it should.
If q may contain "%" and you want to search for it, then use the second one.
Using direct string manipulation will almost certainly lead to improper SQL that is vulnerable to SQL Injection attacks (see psycopg2's comments on the subject).
What I think you're looking to do is perform a LIKE '%some value%' in Django, right?
from django.db import connection
q = '%some value%'
cur = connection.cursor()
cur.execute("SELECT * FROM table WHERE field LIKE %(my_like)s", {'my_like': q})
As of psycopg2 2.4.1, the SQL that is executed on the server is:
SELECT * FROM table WHERE field LIKE '%some value%'
You need to QUOTE your SQL command properly:
sql1 = "SELECT * FROM table WHERE field LIKE '%%%s%%'" % q
sql2 = "SELECT * FROM table WHERE field LIKE '%"+ q +"%'"
And by quoting properly I mean using single quotes with LIKE expressions.