When trying to persist a list of node entities with a :node/threshold attribute, defined in the schema as:
{:db/id #db/id[:db.part/db]
 :db/ident :node/threshold
 :db/valueType :db.type/long
 :db/cardinality :db.cardinality/one
 :db/fulltext false
 :db/doc "Threshold"
 :db.install/_attribute :db.part/db}
I get the following error:
CompilerException java.util.concurrent.ExecutionException:
java.lang.IllegalArgumentException:
:db.error/wrong-type-for-attribute Value 90 is not a valid :int
for attribute :node/threshold
I use the following code:
(defn store-tree [tree]
  @(d/transact dbconn/conn (into [] (vals tree))))
(store-tree parsed-tree-with-refs)
where tree is a map of node names to nodes.
Curiously enough, I took the EDN for the specific entity with :node/threshold 90 from the REPL and manually transacted it, and it worked without any problems. I used this code:
@(d/transact dbconn/conn [{:db/id (d/tempid :db.part/user)
                           :node/threshold 90
                           :node/location "US"}])
Can someone please help?
Thanks,
Vitaliy.
I am building a Datomic transaction with this function, which I am then mapping over a list of input keywords:
(defn build-enum-transaction [inp]
  (cond
    (.contains (namespace (first inp)) "region")
    [:db/add #db/id[:db.part/region] :db/ident (first inp)]
    (.contains (namespace (first inp)) "sector")
    [:db/add #db/id[:db.part/sector] :db/ident (first inp)]
    (.contains (namespace (first inp)) "specialism")
    [:db/add #db/id[:db.part/specialism] :db/ident (first inp)]))

(defn build-all-enum-transactions [inp]
  (vec (map build-enum-transaction inp)))
The input data to build-all-enum-transactions is:
([:region/EU]
[:region/UK]
[:region/NAFTA]
[:sector/NON-CYCLICALS]
[:sector/FINANCIALS]
[:specialism/INSURANCE]
[:specialism/VAT])
I get the following result:
[[:db/add #db/id[:db.part/region -1000289] :db/ident :region/EU]
[:db/add #db/id[:db.part/region -1000289] :db/ident :region/UK]
[:db/add #db/id[:db.part/region -1000289] :db/ident :region/NAFTA]
[:db/add #db/id[:db.part/sector -1000290] :db/ident :sector/NON-CYCLICALS]
[:db/add #db/id[:db.part/sector -1000290] :db/ident :sector/FINANCIALS]
[:db/add #db/id[:db.part/specialism -1000291] :db/ident :specialism/INSURANCE]
[:db/add #db/id[:db.part/specialism -1000291] :db/ident :specialism/VAT]]
As you can see, the #db/id[:db.part/...] literals should yield a new temporary id for each entry, but they only do so once per 'cond' clause. Why is this? It appears as though the 'cond' is closing over the value and re-using it. Thanks.
You should use d/tempid to create a tempid at runtime. #db/id is a reader literal that expands to a tempid when the program is read, i.e. at compile time, so each occurrence in your source yields exactly one tempid no matter how many times the surrounding function is called.
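For example, a sketch of how your build-enum-transaction could be rewritten (keeping the partition keywords from the question); because d/tempid is called at runtime, every input gets its own fresh tempid:

(require '[datomic.api :as d])

(defn build-enum-transaction [inp]
  ;; inp is a one-element vector such as [:region/EU]
  (let [ident (first inp)
        part  (cond
                (.contains (namespace ident) "region")     :db.part/region
                (.contains (namespace ident) "sector")     :db.part/sector
                (.contains (namespace ident) "specialism") :db.part/specialism)]
    ;; d/tempid runs on every call, unlike the #db/id literal
    [:db/add (d/tempid part) :db/ident ident]))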
I've discovered an inconsistency between the way :db.type/instant values are handled by the in-process (mem) transactor and the out-of-process (dev, sql, ...) transactors.
When using the in-process mem transactor, the underlying type of a :db.type/instant value is preserved across transacting and retrieval. For example, (type-printer uri) below shows that a transacted java.sql.Timestamp is retrieved as a java.sql.Timestamp. With the out-of-process transactors, however, it is retrieved as a java.util.Date.
In most cases this inconsistency does not cause problems, but when dealing with badly behaved Date implementations it can cause all sorts of trouble. In my case, the fact that java.sql.Timestamp does not have a symmetric equals method meant that a bug in my code only appeared when I ran it against an out-of-process transactor.
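A quick REPL illustration of that asymmetry (plain JDK behaviour, nothing Datomic-specific; the millisecond value is arbitrary):

(import '(java.util Date) '(java.sql Timestamp))

(let [millis 1000000000000
      date   (Date. millis)
      ts     (Timestamp. millis)]
  ;; Date.equals compares the underlying time value, so this prints true
  (println (.equals date ts))
  ;; Timestamp.equals(Object) returns false for anything that is not a Timestamp
  (println (.equals ts date)))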
Obviously Datomic does not have any control over external implementations of java.util.Date, so I think the 'Date sanitisation' approach of the out-of-process transactors is correct.
Really, all I want is a sanity check before I raise it as an issue with the Datomic guys, so if someone could take a look and see whether they agree, it would be greatly appreciated.
Thanks,
Matt.
(require '[datomic.api :as d])
(import '(java.sql Timestamp))
(defn type-printer
  "Prints the type of a transacted java.sql.Timestamp when it is retrieved from the database."
  [uri]
  (d/create-database uri)
  (let [ts-att-schema {:db/id (d/tempid :db.part/db)
                       :db/ident :entity/ts
                       :db/valueType :db.type/instant
                       :db/cardinality :db.cardinality/one
                       :db.install/_attribute :db.part/db}
        ts (Timestamp. 1000000000000)
        entity {:db/id (d/tempid :db.part/user)
                :entity/ts ts}
        conn (d/connect uri)]
    ;transact the schema and then the entity with its associated Timestamp value
    @(d/transact conn [ts-att-schema])
    @(d/transact conn [entity])
    (let [type (->> (d/db conn)
                    (d/q '[:find ?ts :where [?e :entity/ts ?ts]])
                    (first)
                    (first)
                    (type))]
      (println "java.sql.Timestamp is retrieved as type" type))))
;Timestamp type is preserved when being transacted and then retrieved
(type-printer "datomic:mem://test-db")
;Timestamp is sanitised in the database and retrieved as a java.util.Date. This case obviously relies on a
;dev transactor running on localhost:4334. I have only verified this for the dev and sql transactors.
(type-printer "datomic:dev://localhost:4334/test-db")
So far my add-post-to-datomic function looks like this:
(defn add-post-to-datomic [title content useremail]
  (d/transact conn [{:db/id (d/tempid :db.part/user)
                     :post/title title
                     :post/content content
                     :author/email useremail}]))
I would really like to add the ability to attach a potentially variable number of tags.
In my awesome-schema.edn I have
{:db/id #db/id[:db.part/db]
 :db/ident :post/tag
 :db/valueType :db.type/string
 :db/cardinality :db.cardinality/many
 :db/doc "tag applied to a post"
 :db.install/_attribute :db.part/db}
So it's okay for a post to have multiple tags in the db, thanks to the cardinality being set to many.
How can I write the above function so that it also works for a variable number of tags?
Look at the Transactions Docs under "Cardinality many transactions". As long as you pass a set of tags, this should work:
(defn add-post-to-datomic [title content user-email tag-set]
  (d/transact conn [{:db/id (d/tempid :db.part/user)
                     :post/title title
                     :post/content content
                     :post/tag tag-set
                     :author/email user-email}]))
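For example, a call could look like this (the title, content, email, and tags are made-up values); each element of the set becomes its own :post/tag datom:

(add-post-to-datomic "Intro to Datomic"
                     "Some content..."
                     "author@example.com"
                     #{"clojure" "datomic" "databases"})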
I'm getting into Datomic and still don't grok it. How do I build a transaction that has references to a variable number of entities?
For example, this creates a transaction with a child entity and a family entity whose child attribute references the new child entity:
(defn insert-child [id child]
  {:db/id id
   :child/first-name (:first-name child)
   :child/middle-name (:middle-name child)
   :child/last-name (:last-name child)
   :child/date-of-birth (:date-of-birth child)})

(defn insert-family [id]
  (let [child-id #db/id[:db.part/user]]
    (vector
     (insert-child child-id
                   {:first-name "Richard"
                    :middle-name "M"
                    :last-name "Stallman"})
     {:db/id id
      :family/child child-id})))
(insert-family #db/id[:db.part/user])
=> [{:db/id #db/id[:db.part/user -1000012],
:child/first-name "Richard",
:child/middle-name "M",
:child/last-name "Stallman",
:child/date-of-birth nil}
{:db/id #db/id[:db.part/user -1000013],
:family/child #db/id[:db.part/user -1000012]}]
Notice I used let for child-id. I'm not sure how to write this such that I can map over insert-child while having a family entity that references each one.
I thought about using iterate over #db/id[:db.part/user] for the number of children, then mapping over both the result of iterate and a vector of children. That seems kind of convoluted, and #db/id[:db.part/user] isn't a function to iterate over to begin with.
Instead of using the reader-literal form #db/id[:db.part/user], which is meant for EDN files and data literals, you should use d/tempid.
You could do something like this (using simplified child entities):
(ns family-tx
  (:require [datomic.api :refer [q db] :as d]))

(def uri "datomic:mem://testfamily")
(d/delete-database uri)
(d/create-database uri)
(def conn (d/connect uri))

(def schema
  [{:db/id (d/tempid :db.part/db)
    :db/ident :first-name
    :db/valueType :db.type/string
    :db/cardinality :db.cardinality/one
    :db.install/_attribute :db.part/db}
   {:db/id (d/tempid :db.part/db)
    :db/ident :last-name
    :db/valueType :db.type/string
    :db/cardinality :db.cardinality/one
    :db.install/_attribute :db.part/db}
   {:db/id (d/tempid :db.part/db)
    :db/ident :family/child
    :db/valueType :db.type/ref
    :db/cardinality :db.cardinality/many
    :db.install/_attribute :db.part/db}])

@(d/transact conn schema)

(defn make-family-tx [kids]
  (let [kids-tx (map #(into {:db/id (d/tempid :db.part/user)} %) kids)
        kids-id (map :db/id kids-tx)]
    (conj kids-tx {:db/id (d/tempid :db.part/user)
                   :family/child kids-id})))

(def kids [{:first-name "Billy" :last-name "Bob"}
           {:first-name "Jim" :last-name "Beau"}
           {:first-name "Junior" :last-name "Bacon"}])

@(d/transact conn (make-family-tx kids))
There are a few strategies for this discussed in the Transactions docs as well (see the "Identifying Entities" section).
I am getting a FileNotFoundException when using a database function that requires a namespace. I only get the error when using the persistent Datomic Free database, not when I'm using the in-memory database.
(ns test.core
  (:use [datomic.api :only [q db] :as d]))

(def uris ["datomic:mem://test"
           "datomic:free://localhost:4334/test"])

(map d/delete-database uris)
(map d/create-database uris)

(def conns (map d/connect uris))

(defn test-entity []
  [{:db/id #db/id[:db.part/db]
    :test/test "hello"}])

(def db-function
  #db/fn {:lang :clojure
          :params [database]
          :requires [[test.core :as c]]
          :code (c/test-entity)})

(map
 #(d/transact % [{:db/id #db/id[:db.part/user]
                  :db/ident :db-function
                  :db/fn db-function}])
 conns)

(map
 #(d/transact % [{:db/id #db/id[:db.part/db]
                  :db/ident :test/test
                  :db/valueType :db.type/string
                  :db/cardinality :db.cardinality/one
                  :db.install/_attribute :db.part/db}])
 conns)

(comment
  (db-function nil)
  (d/transact (first conns) [[:db-function]])
  (d/transact (second conns) [[:db-function]]))
Evaluating the first and second forms in the comment block works fine, but evaluating the third form throws the exception.
Do I need to configure something in Datomic so that it can "see" my project?
When you use an in-memory database, the transactor runs in the same JVM instance as the peer, and hence with the same classpath. But with the free database, the transactor runs in its own JVM instance and is not aware of the namespaces on the peer's classpath.
You can add jars to the transactor's classpath by putting them in the lib/ folder of the transactor installation.
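As an alternative sketch (not part of the original answer): if the database function does not reference any peer namespace, there is nothing to deploy at all. For instance, inlining the body of test-entity and requiring only datomic.api, which the transactor already has, would make the function self-contained; the name self-contained-db-function and the use of :db.part/user here are just illustrative:

(def self-contained-db-function
  #db/fn {:lang :clojure
          :params [database]
          ;; datomic.api is on the transactor's classpath, unlike test.core
          :requires [[datomic.api :as d]]
          ;; tx data equivalent to what test-entity produced, built inline so
          ;; the transactor never has to load peer code
          :code [{:db/id (d/tempid :db.part/user)
                  :test/test "hello"}]})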