I have a simple app that should return a single record from a Mongo database.
(def movie (m/fetch-one :movie
                        :where {:_id id}))
The id is correct, but I keep getting nil back from this.
Here is what my :_id looks like:
:_id #<ObjectId 5245ca7d44aed3e864a1c830>
I guess my problem is somewhere in here, but I just don't have enough experience with Clojure to find the error.
In this case, the id passed to :where is the string 5245ca7d44aed3e864a1c830.
I think the problem is that your id is a string instead of an ObjectId. To create an ObjectId, use the object-id function. Note that there is also a fetch-by-id fn.
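As a sketch of the fix (assuming congomongo is required as m and id holds the hex string):

```clojure
;; Convert the hex string into an ObjectId before querying.
(def movie
  (m/fetch-one :movie
               :where {:_id (m/object-id id)}))

;; Or use the convenience fn that looks up by primary key:
(def movie (m/fetch-by-id :movie (m/object-id id)))
```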
I'm trying to create an event store where I have a table somewhat like this:
CREATE TABLE domain_events (
  id serial PRIMARY KEY,
  timestamp timestamptz,
  entity_id int,
  type text,
  data jsonb
);
And I have a namespace like
(ns my-domain.domain-events)
(defrecord PurchaseOrderCreated
  [id timestamp purchase-order-id created-by])

(defrecord PurchaseOrderCancelled
  [id timestamp purchase-order-id cancelled-by])
So type is a string holding the fully qualified class name, something like my_domain.domain_events.PurchaseOrderCreated, which comes from getting the type of a record, e.g. (type (->PurchaseOrderCreated ...)). I should note that printing the result of (type the-event) actually produces a string prefixed with class, such as class my_domain.domain_events.PurchaseOrderCreated, so I am trimming that prefix off before storing it in the DB.
I'm trying to figure out how I can retrieve these event rows from the database and rehydrate them to domain events. I feel like I'm close but just haven't been able to get all the pieces.
I've tried to use new to construct a new record but I seem to have a hard time converting the string classname to a record.
(new (resolve (symbol "my_domain.domain_events.PurchaseOrderCreated")) prop1 prop2 ...)
Plus, I'm not sure how easy it's going to be to use new, since my array of properties will need to be in the correct order. It may be better to use map->PurchaseOrderCreated, but I'm still not sure how to dynamically resolve this constructor from the string classname.
Can anybody advise on what the best approach would be here?
The following should work, but I'm not sure if there's a more idiomatic way to do it:
((resolve (symbol "my-domain.domain-events"
                  (str "map->"
                       "PurchaseOrderCreated")))
 {:id 123})
Note that the namespace passed to symbol is the Clojure namespace (with dashes), not the munged class name (with underscores), since the map-> factory fn lives in the namespace. symbol can take a ns:
https://clojuredocs.org/clojure.core/symbol
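Putting that together, a rehydration sketch (the helper names here are hypothetical; it assumes the events namespace is already loaded and that data has been parsed from jsonb into a map):

```clojure
(require '[clojure.string :as str])

;; "my_domain.domain_events.PurchaseOrderCreated"
;; -> #'my-domain.domain-events/map->PurchaseOrderCreated
(defn class->factory [classname]
  (let [idx    (.lastIndexOf ^String classname ".")
        ns-str (str/replace (subs classname 0 idx) "_" "-")
        record (subs classname (inc idx))]
    (resolve (symbol ns-str (str "map->" record)))))

(defn rehydrate [{:keys [type data]}]
  ((class->factory type) data))
```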
Given a query like this:
(def query '[:find ?tx ?date ?v ?op
             :in $ ?e ?a
             :where
             [?e ?a ?v ?tx ?op]
             [?tx :db/txInstant ?date]])
Where the entity and the attribute are user-provided, I can build a feature that gives me an audit log on a per-field basis. This works well.
My problem is enum fields. The ?v that comes back is a :db/id, not the enum value itself, and I'm not really sure how I'm supposed to figure out that the field the user provided was an enum field and that I should treat the returned Long value as a reference to an enum.
I think what I need to do is make the query return the :db/valueType for the attribute in question, and then, if it's a ref type, look up the entity it points to. But I'm not sure if that's the right approach, or even how to do that.
I got this working. I noticed that there is a (d/attribute db attr-key) function available in Datomic that returns the metadata fields for the attribute. So I can use that to check whether :db/valueType is :db.type/ref, and then call (d/ident db entity-id) to resolve the entity id down to its enum value.
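As a sketch (assuming datomic.api is required as d; the :value-type key on the result of d/attribute holds the :db/valueType):

```clojure
(defn resolve-enum [db attr v]
  ;; If the attribute is a ref, v is an entity id; resolve it to its ident.
  (if (= :db.type/ref (:value-type (d/attribute db attr)))
    (d/ident db v)
    v))
```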
I'm starting to develop a Datomic-backed Clojure app, and I'm wondering what's the best way to declare the schema, in order to address the following concerns:
Having a concise, readable representation for the schema
Ensuring the schema is installed and up-to-date prior to running a new version of my app.
Intuitively, my approach would be the following:
Declaring some helper functions to make schema declarations less verbose than with the raw maps
Automatically installing the schema as part of the initialization of the app (I'm not yet knowledgeable enough to know whether that always works).
Is this the best way to go? How do people usually do it?
I use Conformity for this; see the Conformity repository. There is also a very useful blog post from Yeller here which will guide you through using Conformity.
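A minimal sketch of how that looks, assuming a resources/schema.edn file containing a :my-app/schema norm:

```clojure
(require '[io.rkn.conformity :as c])

(def norms (c/read-resource "schema.edn"))

;; Idempotent: each norm is transacted at most once,
;; so this is safe to run on every application start.
(c/ensure-conforms conn norms [:my-app/schema])
```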
Raw maps are verbose, but they have some great advantages over a higher-level API:
Schema is defined in transaction form; what you specify is directly transactable (assuming the word exists)
Your schema is not tied to a particular library or spec version, it will always work.
Your schema is serializable (edn) without calling a spec API.
So you can store and deploy your schema more easily in a distributed environment since it's in data-form and not in code-form.
For those reasons I use raw maps.
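For example, a single attribute defined as a raw map is plain edn that can be transacted directly:

```clojure
[{:db/id                 #db/id [:db.part/db]
  :db/ident              :person/name
  :db/valueType          :db.type/string
  :db/cardinality        :db.cardinality/one
  :db/doc                "A person's name"
  :db.install/_attribute :db.part/db}]
```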
Automatically installing schema.
This I don't do either.
Usually, when you change your schema, any of several things may be happening:
Add new attribute
Change existing attribute type
Create full-text for an attribute
Create new attribute from other values
Others
Any of these may require you to change your existing data in some non-obvious, non-generic way, in a process that may take some time.
I do use some automation for applying a list of schemas and schema changes, but always in a controlled "deployment" stage, when other data-updating work may also occur.
Assuming you have users.schema.edn and roles.schema.edn files:
(require '[datomic-manage.core :as manager])
(manager/create uri)
(manager/migrate uri [:users.schema
:roles.schema])
For #1, datomic-schema might be of help. I haven't used it, but the example looks promising.
My preference (and I'm biased, as the author of the library) lies with datomic-schema - it focuses only on the transformation to normal Datomic schema - from there, you transact the schema as you normally would.
I am looking to use the same data to calculate schema migrations between the live Datomic instance and the definitions, so that the enums, types and cardinality get changed to conform to your definition.
The important part (for me) of datomic-schema is that the exit path is very clean: if you find it doesn't support something down the line (that I can't implement for whatever reason), you can dump your schema as plain edn, save it off, and remove the dependency.
Conformity will be useful beyond that if you want to do some kind of data migration, or more specific migrations (cleaning up the data, or renaming to something else first).
Proposal: use transaction functions to make declaring schema attributes less verbose in EDN, thus preserving the benefits of declaring your schema in EDN, as demonstrated by @Guillermo Winkler's answer.
Example:
;; defining helper function
[{:db/id #db/id [:db.part/user]
  :db/doc "Helper function for defining entity fields schema attributes in a concise way."
  :db/ident :utils/field
  :db/fn #db/fn {:lang :clojure
                 :requires [[datomic.api :as d]]
                 :params [_ ident type doc opts]
                 :code [(cond-> {:db/cardinality :db.cardinality/one
                                 :db/fulltext true
                                 :db/index true
                                 :db.install/_attribute :db.part/db
                                 :db/id (d/tempid :db.part/db)
                                 :db/ident ident
                                 :db/valueType (condp get type
                                                 #{:db.type/string :string} :db.type/string
                                                 #{:db.type/boolean :boolean} :db.type/boolean
                                                 #{:db.type/long :long} :db.type/long
                                                 #{:db.type/bigint :bigint} :db.type/bigint
                                                 #{:db.type/float :float} :db.type/float
                                                 #{:db.type/double :double} :db.type/double
                                                 #{:db.type/bigdec :bigdec} :db.type/bigdec
                                                 #{:db.type/ref :ref} :db.type/ref
                                                 #{:db.type/instant :instant} :db.type/instant
                                                 #{:db.type/uuid :uuid} :db.type/uuid
                                                 #{:db.type/uri :uri} :db.type/uri
                                                 #{:db.type/bytes :bytes} :db.type/bytes
                                                 type)}
                          doc (assoc :db/doc doc)
                          opts (merge opts))]}}]
;; ... then (in a later transaction) using it to define application model attributes
[[:utils/field :person/name :string "A person's name" {:db/index true}]
 [:utils/field :person/age :long "A person's age" nil]]
I would suggest using Tupelo Datomic to get started. I wrote this library to simplify Datomic schema creation and ease understanding, much as you allude to in your question.
As an example, suppose we’re trying to keep track of information for the world’s premiere spy agency. Let’s create a few attributes that will apply to our heroes & villains (see the executable code in the unit test).
(:require [tupelo.datomic :as td]
          [tupelo.schema :as ts])
; Create some new attributes. Required args are the attribute name (an optionally namespaced
; keyword) and the attribute type (full listing at http://docs.datomic.com/schema.html). We wrap
; the new attribute definitions in a transaction and immediately commit them into the DB.
(td/transact *conn* ; required required zero-or-more
; <attr name> <attr value type> <optional specs ...>
(td/new-attribute :person/name :db.type/string :db.unique/value) ; each name is unique
(td/new-attribute :person/secret-id :db.type/long :db.unique/value) ; each secret-id is unique
(td/new-attribute :weapon/type :db.type/ref :db.cardinality/many) ; one may have many weapons
(td/new-attribute :location :db.type/string) ; all default values
(td/new-attribute :favorite-weapon :db.type/keyword )) ; all default values
For the :weapon/type attribute, we want to use an enumerated type since there are only a limited number of choices available to our antagonists:
; Create some "enum" values. These are degenerate entities that serve the same purpose as an
; enumerated value in Java (these entities will never have any attributes). Again, we
; wrap our new enum values in a transaction and commit them into the DB.
(td/transact *conn*
(td/new-enum :weapon/gun)
(td/new-enum :weapon/knife)
(td/new-enum :weapon/guile)
(td/new-enum :weapon/wit))
Let’s create a few antagonists and load them into the DB. Note that we are just using plain Clojure values and literals here, and we don’t have to worry about any Datomic specific conversions.
; Create some antagonists and load them into the db. We can specify some of the attribute-value
; pairs at the time of creation, and add others later. Note that whenever we are adding multiple
; values for an attribute in a single step (e.g. :weapon/type), we must wrap all of the values
; in a set. Note that the set implies there can never be duplicate weapons for any one person.
; As before, we immediately commit the new entities into the DB.
(td/transact *conn*
(td/new-entity { :person/name "James Bond" :location "London" :weapon/type #{ :weapon/gun :weapon/wit } } )
(td/new-entity { :person/name "M" :location "London" :weapon/type #{ :weapon/gun :weapon/guile } } )
(td/new-entity { :person/name "Dr No" :location "Caribbean" :weapon/type :weapon/gun } ))
Enjoy!
Alan
While trying to update tag data on posts in the database, I have a function that looks like:
(defn add-tag-to-post [eid email tags]
  (d/transact conn [{:db/id eid,
                     :author/email email,
                     :post/tag tags}]))
Unfortunately, this does not preserve the existing tags (unless I were to query by time). I would like to simply append to the tag list instead of writing a new one.
Example:
{:title "Straight edges",
:content "fold instead of tearing it. ",
:tags "scissor-less edges", ;;check out these awesome tags
:author "me#website.hax"
:eid 1759}
(add-tag-to-post 1759 "me#website.hax" "art")
;;desired behavior: adds tag "art" to the list of tags
(get-post-by-eid 1759)
;;returns
{:title "Straight edges",
:content "fold instead of tearing it. ",
:tags "art", ;;not cumulative tag addition ;/
:author "me#website.hax"
:eid 1759}
How can this be achieved?
Does it make more sense to simply query over the lifetime of the entity instead?
You'll need to make your :post/tag attribute have :db.cardinality/many - see :db/cardinality in http://docs.datomic.com/schema.html.
By default, attributes have :db.cardinality/one, which automatically retracts the old value when a new one is asserted. :db.cardinality/many cancels out that behaviour.
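A sketch of the schema change (attribute name taken from the question; the value type is assumed to be string):

```clojure
[{:db/id                 #db/id [:db.part/db]
  :db/ident              :post/tag
  :db/valueType          :db.type/string
  :db/cardinality        :db.cardinality/many
  :db.install/_attribute :db.part/db}]

;; With cardinality-many, (add-tag-to-post 1759 "me#website.hax" "art")
;; asserts "art" alongside the existing tags instead of replacing them.
```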
I have the following code:
(defentity users
  (database korma-db)
  (has-many tags))

(defentity tags
  (database korma-db)
  (belongs-to users))
(-> (select* users)
    (with tags)
    (fields :address)
    (where {:id 1})
    (as-sql))
and it generates the following sql:
SELECT "users"."address" FROM "users" WHERE ("users"."id" = ?)
I would expect it to include a join to the tags table, by virtue of having the with macro applied. Clearly this isn't the case, but executing it will produce an empty :tags key in the single returned record.
Am I missing something here?
Did you create the actual referential constraint in the database?
I think I had the same issue once, and I fixed it by creating a foreign key when defining the field.
i.e. in PostgreSQL
CREATE TABLE tags (
  ...
  users_id INTEGER REFERENCES users(id)
);