Slick: get table name - slick-3.0

With a table definition like this one:
class Test(_tableTag: Tag) extends Table[TestRow](_tableTag, "test") { ... }
how can I get back the table name ("test") from an instance of Test?
The thing is, I can execute queries like db run TableQuery[Test].result just fine, but to write raw SQL I need the table name.

If you look at Slick's TableQuery ScalaDoc there is a method called baseTableRow which says:
def baseTableRow: E
Get the "raw" table row that represents the table itself, as opposed
to a Path for a variable of the table's type. This method should
generally not be called from user code.
So you go to the ScalaDoc for E <: AbstractTable (the definition of AbstractTable) and find what you need, namely val tableName: String. The trick here is knowing where to look (possible implicit conversions and other stuff...), that is, how to navigate the Scala(Doc) rabbit hole.
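Putting it together, a minimal sketch (Test being the table class from the question):
val table = TableQuery[Test]
val name: String = table.baseTableRow.tableName // yields "test"
You can then splice name into your raw statement; note that with Slick's sql interpolator a table name has to be spliced literally (#$name), since it cannot be bound as a query parameter.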


Doctrine returning strange field names from query

I am using "doctrine/doctrine-orm-module": "^2.1" (a module for Zend Framework 3). I want to create a query which returns rows with field names (trivial, right?). But instead of the exact field names I am getting this generated SQL:
SELECT
u0_.id AS id_0, u0_.username AS username_1, u0_.email AS email_2,
u0_.first_name AS first_name_3, u0_.last_name AS last_name_4,
u0_.password AS password_5, u0_.status AS status_6, u0_.created AS created_7,
u0_.modified AS modified_8
FROM
user_item u0_
ORDER BY
u0_.id DESC
This query is generated by this code:
$entityManager = $this->getEntityManager();
$queryBuilder = $entityManager->createQueryBuilder();
$queryBuilder->select('u')
->from(UserItem::class, 'u')
->orderBy('u.id', 'DESC')
;
$query = $queryBuilder->getQuery();
echo $query->getSql();
print_r($query->getParameters());
die('|||');
What is the "0_" appended to the table name? What are the "_x" suffixes appended to the field names?
How can I get normal field and table names without the appended "_x"?
Just the names? I'm assuming you want both first_name and last_name, as shown in that generated SQL.
I changed the order of the calls below; it makes the code easier to read and understand.
What you want to do is (pseudo code): Select from UserItem all the first & last names
So, write the code that way :)
$queryBuilder
->from(UserItem::class, 'u')
->select(['u.first_name', 'u.last_name'])
->orderBy('u.id', 'DESC'); // Might want to sort by either u.first_name or u.last_name
What's in the QueryBuilder?
->from(UserItem::class, 'u') - The first parameter is the FQCN (Fully Qualified Class Name) of the Entity you wish to use with the QueryBuilder. The second parameter is an alias by which this instance of the QueryBuilder will recognize the FQCN-defined class.
->select(['u.first_name', 'u.last_name']) - Function takes a "mixed" param. Click through to its definition and you'll see the following in the function:
$selects = is_array($select) ? $select : func_get_args();
Which indicates that it will always pass $selects on to the next bit as an array. (Another hint is that $selects is plural.)
->orderBy('u.id', 'DESC') - Creates a rule to order results by. If you click through to this function, you'll see that this one ends like so:
return $this->add('orderBy', $orderBy);
Meaning: more than one order-by rule can be attached, as the sketch below shows.
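For example, a brief sketch (addOrderBy is the appending variant on Doctrine's QueryBuilder; orderBy on its own replaces any previously set ordering):
$queryBuilder->orderBy('u.last_name', 'ASC')
    ->addOrderBy('u.first_name', 'ASC');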
When it comes to the generated SQL:
u0_ is the table alias Doctrine generates when it transforms your DQL into SQL; in the query from your question, FROM user_item u0_ sets u0_ as an alias for user_item.
The _x appended to the property names is simply the position of the columns in the generated select list (have a look at the query, they run in that order).
Lastly, the fact that you were receiving entire entities and not just the names (first_name & last_name) is due to ->select('u'). Because no property (or properties, as shown above) is specified, Doctrine assumes you wish to receive the whole enchilada. Doing ->select('u.first_name') would then get you just the first names, and using an array as above gets you more than one property.
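To see that those aliases never reach your results, a minimal sketch (assuming the entity properties are named first_name and last_name, as in your generated SQL):
$rows = $queryBuilder->getQuery()->getArrayResult();
// e.g. [['first_name' => 'Jane', 'last_name' => 'Doe'], ...]
// The u0_/_x aliases exist only in the generated SQL, never in the hydrated result.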
Hope that helped you out :)

Convert the value of a field in a django RawQueryset to a different django field type

I have a rather complex query that's generating a Django RawQuerySet. This specific query returns some fields that aren't part of the model that the RawQuerySet is based on, so I'm using .annotate(field_name=models.Value('field_name')) to attach it as an attribute to individual records in the RawQuerySet. The most important custom field is actually a uuid, which I use to compose URLs using Django's {% url %} functionality.
Here's the problem: I'm not using standard uuids inside my app, I'm using SmallUUIDs (compressed UUIDs.) These are stored in the database as native uuidfields then converted to shorter strings in python. So I need to somehow convert the uuid returned as part of the RawQuerySet to a SmallUUID for use inside a template to generate a URL.
My code looks somewhat like this:
query = "SELECT othertable.uuid_field as my_uuid FROM myapp_mymodel
JOIN othertable o ON myapp_mymodel.x = othertable.x"
MyModel.objects.annotate(
my_uuid=models.Value('my_uuid'),
).raw(query)
Now there is a logical solution here: models.Value accepts an optional kwarg called output_field, which makes the code look like this:
MyModel.objects.annotate(
    my_uuid=models.Value('my_uuid', output_field=SmallUUIDField()),
).raw(query)
But it doesn't work! That kwarg is completely ignored and the type of the attribute is based on the type returned from the database, not on what's in output_field. In my case I'm getting a uuid output because Postgres is returning a UUID type, but if I were to change the query to SELECT CAST(othertable.uuid_field AS text) AS my_uuid I'd get the attribute in the form of a string. It appears that Django (at least version 1.11.12) doesn't actually care what is in that kwarg in this instance.
So here's what I'm thinking are my potential solutions, in no particular order:
Change the way the query is formatted somehow (either in Django or in the SQL)
Change the resulting RawQuerySet in some way before it's passed to the view
Change something inside the templates to convert the UUID to a smalluuid for use in the URL reverse process.
What's my best next steps here?
A couple of issues with your current approach:
Value() isn't doing what you think it is - your annotation is literally just annotating each row with the value "my_uuid" because that is what you have passed to it. It isn't looking up the field of that name (to do that you need to use F expressions).
Point 1 above doesn't matter anyway because as soon as you use raw() the annotation is ignored - which is why you see no effect from it.
Bottom line is that trying to annotate a RawQuerySet isn't going to be easy. There is a translations argument that it accepts, but I can't think of a way to get that to work with the type of join you are using.
The next best suggestion that I can think of is that you just manually convert the field into a SmallUUID object when you need it - something like this:
from smalluuid import SmallUUID
objects = MyModel.objects.raw(query)
for o in objects:
    # Take the hex string obtained from the database and convert it to a
    # SmallUUID object. If your database has a built-in UUID type you will
    # need to do SmallUUID(small=o.my_uuid) instead.
    my_uuid = SmallUUID(hex=o.my_uuid)
(I'm doing this in a loop just to illustrate - depending on where you need this you can do it in a template tag or view).
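If you need the conversion during URL reversal in a template, a small custom filter also works; a sketch with hypothetical names (assumes a templatetags package in your app and the same SmallUUID constructor as above):
from django import template
from smalluuid import SmallUUID

register = template.Library()

@register.filter
def smalluuid(value):
    # Convert the UUID (or its string form) from the raw query to a SmallUUID.
    return SmallUUID(hex=str(value))
After {% load smalluuid_tags %} (hypothetical module name) you could then write {% url 'thing-detail' o.my_uuid|smalluuid %}.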

ndb verify entity uniqueness in transaction

I've been trying to create entities with a property which should be unique or None, something similar to:
class Thing(ndb.Model):
    something = ndb.StringProperty()
    unique_value = ndb.StringProperty()
Since ndb has no way to specify that a property should be unique it is only natural that I do this manually like this:
def validate_unique(the_thing):
    if the_thing.unique_value and Thing.query(Thing.unique_value == the_thing.unique_value).get():
        raise NotUniqueException
This works like a charm until I want to do this in an ndb transaction which I use for creating/updating entities. Like:
@ndb.transactional
def create(the_thing):
    validate_unique(the_thing)
    the_thing.put()
However, inside a transaction ndb only allows ancestor queries, and the problem is that my model does not have an ancestor/parent. I could do the following to prevent this error from popping up:
@ndb.non_transactional
def validate_unique(the_thing):
    ...
This feels a bit out of place, declaring something to be a transaction and then having one (important) part being done outside of the transaction. I'd like to know if this is the way to go or if there is a (better) alternative.
Also, some explanation as to why ndb only allows ancestor queries inside transactions would be nice.
Since your uniqueness check involves a (global) query, it is subject to the datastore's eventual consistency: it won't work reliably because the query might not detect freshly created entities.
One option would be to switch to an ancestor query (or some other strongly consistent method), if your expected usage allows such a data architecture.
Another option is to use an additional piece of data as a temporary cache, in which you'd store a list of all newly created entities for "a while" (giving them ample time to become visible in the global query). You'd check that list in validate_unique() in addition to the query results. This allows you to make the query outside the transaction and only enter the transaction if uniqueness is still possible; the ultimate check is then the manual check of the cache, inside the transaction (i.e. no query inside the transaction).
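A rough sketch of that idea (hypothetical helper and key names; memcache serves as the temporary cache):
from google.appengine.api import memcache

RECENT_KEY = 'recent_unique_values'  # hypothetical cache key

def validate_unique(the_thing):
    # Outside the transaction: eventually consistent global query plus
    # the list of recently created values.
    if not the_thing.unique_value:
        return
    recent = memcache.get(RECENT_KEY) or []
    if (the_thing.unique_value in recent or
            Thing.query(Thing.unique_value == the_thing.unique_value).get()):
        raise NotUniqueException

def remember_unique_value(value):
    # Call after a successful create(); keeps the value visible "a while".
    recent = memcache.get(RECENT_KEY) or []
    memcache.set(RECENT_KEY, recent + [value], time=600)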
A 3rd option exists (with some extra storage consumption as the price), based on the datastore's enforcement of unique entity IDs for a certain entity model with the same parent (or no parent at all). You could have a model like this:
class Unique(ndb.Model):  # will use the unique values as specified entity IDs!
    something = ndb.BooleanProperty(default=False)
which you'd use like this (the example uses a Unique parent key, which allows re-using the model for multiple properties with unique values; you can drop the parent altogether if you don't need it):
@ndb.transactional(xg=True)  # cross-group: Unique and the_thing live in different entity groups
def create(the_thing):
    if the_thing.unique_value:
        parent_key = get_unique_parent_key()
        exists = Unique.get_by_id(the_thing.unique_value, parent=parent_key)
        if exists:
            raise NotUniqueException
        Unique(id=the_thing.unique_value, parent=parent_key).put()
    the_thing.put()
from google.appengine.api import memcache  # needed for the caching below

def get_unique_parent_key():
    parent_id = 'the_thing_unique_value'
    parent_key = memcache.get(parent_id)
    if not parent_key:
        parent = Unique.get_by_id(parent_id)
        if not parent:
            parent = Unique(id=parent_id)
            parent.put()
        parent_key = parent.key
        memcache.set(parent_id, parent_key)
    return parent_key
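Usage stays the same as the create() above; a quick sketch (NotUniqueException is assumed to be defined elsewhere, as in the question):
thing = Thing(something='foo', unique_value='bar')
create(thing)  # raises NotUniqueException if 'bar' was already claimed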

SQLAlchemy Reflection Using Metaclass with Column Override

I have a set of dynamic database tables (Postgres 9.3 with PostGIS) that I am mapping using a python metaclass:
cls = type(str(tablename), (db.Model,), {'__tablename__':tablename})
where db.Model is the declarative base from flask-sqlalchemy's db object and tablename is a bit of unicode.
The cls is then added to an application-wide dictionary current_app.class_references (using Flask's current_app) to avoid attempts to instantiate the class multiple times.
Each table contains a geometry column, wkb_geometry, stored as Well-Known Binary. I want to map these using geoalchemy2, with the final goal of retrieving GeoJSON.
If I was declaring the table a priori, I would use:
class GeoPoly():
    __tablename__ = 'somename'
    wkb_geometry = db.Column(Geometry("POLYGON"))
    # more columns...
Since I am trying to do this dynamically, I need to be able to override the reflection of cls with the known type.
Attempts:
Define the column explicitly, using the reflection override syntax.
cls = type(str(tablename), (db.Model,), {
    '__tablename__': tablename,
    'wkb_geometry': db.Column(Geometry("POLYGON")),
})
which returns the following on a fresh restart, i.e. the class has not yet been instantiated:
InvalidRequestError: Table 'tablename' is already defined for this MetaData instance. Specify 'extend_existing=True' to redefine options and columns on an existing Table object
Use mixins with the class defined above (sans tablename):
cls = type(str(tablename), (GeoPoly, db.Model), {'__tablename__':tablename})
Again MetaData issues.
Override the column definition attribute after the class is instantiated:
cls = type(str(tablename), (db.Model,), {'__tablename__':tablename})
current_app.class_references[tablename] = cls
cls.wkb_geometry = db.Column(Geometry("POLYGON"))
Which results in:
InvalidRequestError: Implicitly combining column tablename.wkb_geometry with column tablename.wkb_geometry under attribute 'wkb_geometry'. Please configure one or more attributes for these same-named columns explicitly.
Is it possible to use the metadata construction to support dynamic reflection and override a column known to be available on all tables?
I'm not sure if I exactly follow what you're doing, but I've overridden reflected columns in the past inside my own __init__ method on a custom metaclass that inherits from DeclarativeMeta. Any time the new base class is used, it checks for a 'wkb_geometry' column name, and replaces it with (a copy of) the one you created.
import sqlalchemy as sa
from sqlalchemy.ext.declarative import DeclarativeMeta, declarative_base

wkb_geometry = db.Column(Geometry("POLYGON"))

class MyMeta(DeclarativeMeta):
    def __init__(cls, clsname, parents, dct):
        for key, val in dct.iteritems():
            if isinstance(val, sa.Column) and key == 'wkb_geometry':
                dct[key] = wkb_geometry.copy()
        # Let the normal declarative setup run with the patched dict.
        super(MyMeta, cls).__init__(clsname, parents, dct)

MyBase = declarative_base(metaclass=MyMeta)
cls = type(str(tablename), (MyBase,), {'__tablename__': tablename})
This may not exactly work for you, but it's an idea. You probably need to add db.Model to the MyBase tuple, for example.
This is what I use to customize a particular column while relying on autoload for everything else. The code below assumes an existing declarative Base object for a table named my_table. It loads the metadata for all columns but overrides the definition of a column named polygon:
class MyTable(Base):
    __tablename__ = 'my_table'
    __table_args__ = (
        Column('polygon', Geometry("POLYGON")),
        {'autoload': True},
    )
Other arguments to the Table constructor can be provided in the dictionary. Note that the dictionary must appear last in the tuple!
The SQLAlchemy documentation Using a Hybrid Approach with __table__ provides more details and examples.
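Combining that with the dynamic type() construction from the question might look like this (a sketch, assuming Geometry, tablename and current_app as above, and metadata bound to an engine so autoload can work):
cls = type(str(tablename), (Base,), {
    '__tablename__': tablename,
    '__table_args__': (
        Column('wkb_geometry', Geometry("POLYGON")),
        {'autoload': True},  # reflect the remaining columns
    ),
})
current_app.class_references[tablename] = cls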

JPQL 2.0 - query entities based on superclass entity field

I have an Entity (not a MappedSuperclass) Person (with id, name, surname).
I also have an Entity Employee extends Person (with other attributes, unimportant).
The inheritance strategy is single table.
Now I want to create a namedQuery like this:
SELECT emp FROM Employee emp WHERE emp.name = ?1
In the IDE I get:
the state field path emp.name cannot be resolved to a valid type
I think the problem is that the attribute belongs to the superclass entity.
So far, I haven't found any solution other than using the TYPE operator to perform a selective query on Employee instances.
I'd like to perform the query above. Is that possible?
I'm on EclipseLink/JPA 2.0
Your JPQL seems valid. Did you try it at runtime? It could just be an issue with your IDE.
Person has to be @MappedSuperclass.
http://www.objectdb.com/api/java/jpa/MappedSuperclass
Furthermore, you should use named parameters, e.g. :name instead of ?1.
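For reference, a minimal sketch of the query with a named parameter (assumes an EntityManager em and the entities from the question):
List<Employee> employees = em.createQuery(
        "SELECT emp FROM Employee emp WHERE emp.name = :name", Employee.class)
    .setParameter("name", "Alice")
    .getResultList();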