I'm learning Clojure and facing a problem.
(def db-config {:dbtype "sqlite" :dbname "demo.db"})

(def db
  (jdbc/get-datasource db-config))

(defn execute [sql]
  (jdbc/execute! db sql
                 {:return-keys true}))
I have a web page where the user can change the database name.
The problem is: how do I load the db with a dynamic db name?
For example, there are 5 database files, "1.db", "2.db" ... "5.db". After the user clicks 5.db on the web page, I should execute SQL against 5.db.
I can do this in Python in a simple way, but I don't know the Clojure way to do it.
Make the db value an argument of the execute function:
(defn ->db [db-name]
  (jdbc/get-datasource {:dbtype "sqlite" :dbname db-name}))

(defn execute [db sql]
  (jdbc/execute! db sql
                 {:return-keys true}))
(execute (->db "foo.db") sql)
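If creating a fresh datasource per request feels wasteful, one common refinement is to memoize ->db so each file name maps to a single cached datasource. This is a sketch, assuming the set of database files is small and fixed:

```clojure
;; Sketch: cache one datasource per db file name. memoize is fine here
;; because a SQLite datasource is a cheap, long-lived configuration object.
(def ->db
  (memoize
   (fn [db-name]
     (jdbc/get-datasource {:dbtype "sqlite" :dbname db-name}))))

;; Every call with the same name now returns the same datasource:
(execute (->db "5.db") sql)
```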
Related
I'm running the dev db in datomic-free
However, I thought that it would write something to the database, but when I do a pull, nothing is there. This is my peer.clj:
(ns dank.peer
  (:require [datomic.api :as d :refer (q)]
            [clojure.pprint :as pp]))

;; Name the database as a uri
(def uri "datomic:mem://dank")

;; Read the schema and seed data as strings
(def schema-tx (read-string (slurp "resources/dank/schema.edn")))
(def data-tx (read-string (slurp "resources/dank/seed-data.edn")))

;; Initialize the db with the above vars
(defn init-db
  []
  (when (d/create-database uri)
    (let [conn (d/connect uri)]
      @(d/transact conn schema-tx)
      @(d/transact conn data-tx))))

(init-db)

(def conn (d/connect uri))
(def db (d/db conn))
(defn create
  "Creates a new note"
  [notes]
  @(d/transact
     conn
     [{:db/id #db/id[:db.part/user]
       :item/author "default"
       :item/notes notes}]))
(defn get-notes
  [author]
  (d/pull-many db '[*]
               (flatten (into []
                              (q '[:find ?e
                                   :in $ ?author
                                   :where
                                   [?e :item/author ?author]]
                                 db
                                 author)))))
(create "some note") ----> shows the whole transaction data structure with before and after.
(get-notes "default") ---> [] it is empty!
Am I missing something here?
mavbozo is correct in his response: when you call your get-notes function, you are using a value of the database from before you transacted your new note.
If you were to call (def db (d/db conn)) again prior to get-notes, you would see the new data in the database.
However, it's generally preferable to avoid treating connections or databases as global variables, since that prevents a function's behavior from being determined strictly by its parameters. As you noticed, the behavior of get-notes depends on the state of the global db var. As a rule of thumb, pass a database value to functions that query or pull from the database, and pass a connection to functions that transact against a database or monitor its state over time (see http://docs.datomic.com/best-practices.html#consistent-db-value-for-unit-of-work for more discussion of individual values of a Datomic database).
In this case, your get-notes function could be something like:
(defn get-notes
  [db author]
  (d/q '[:find (pull ?e [*])
         :in $ ?author
         :where
         [?e :item/author ?author]]
       db
       author))
Which you can then invoke via:
(get-notes (d/db conn) "default")
To query the most current version of the database. This approach also has the advantage that you can call get-notes with any database value (i.e. the most recent, one from a given time in the past, or a history db).
Similarly, I would recommend changing your create function to take a connection instead of using the globally defined conn.
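A sketch of that change to create, keeping the same transaction data as the original but taking the connection as a parameter:

```clojure
(defn create
  "Creates a new note against the given connection."
  [conn notes]
  @(d/transact conn
               [{:db/id #db/id[:db.part/user]
                 :item/author "default"
                 :item/notes notes}]))

;; The caller now decides which connection to use:
;; (create conn "some note")
```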
As far as managing your connection, again it is best to avoid using a global variable whenever possible. Some preferred options might be to put the connection into an application state map or an atom that is passed throughout your program as appropriate. Another good option that is used frequently is to leverage Stuart Sierra's component library (https://github.com/stuartsierra/component) to manage the lifecycle of the connection.
t0, t1, t2 denote increasing times, where t0 precedes t1 and t1 precedes t2.
(d/db conn) returns the current snapshot of the Datomic database at the moment the call is made.
Let's say at time t0 you called (def db (d/db conn)). At t0 the notes are empty, so the db here is a snapshot of the database at time t0.
Then at time t1 you create something using (create "some note"). At t1, the database contains at least one note.
Then at time t2 you call (get-notes "default"). At t2, the database still contains at least one note.
The problem is that the db inside get-notes still refers to the snapshot from time t0, the one you captured with (def db (d/db conn)). At t0 there were no notes yet; that's why get-notes returns empty.
When you query a Datomic database, you must specify which db value you want to query: the db as of now, yesterday's db, the db from 1 hour ago, etc. Of course, it's easy to get the db as of now: just call (d/db conn).
I find it useful to make the db explicit in query functions, e.g. by changing your get-notes declaration to (defn get-notes [db author] ...).
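To make the snapshot behavior concrete, here is a small REPL sketch (assuming the parameterized get-notes that takes the db as an argument, as suggested above):

```clojure
(def db-t0 (d/db conn))             ; snapshot at t0: no notes yet

(create "some note")                ; transact at t1

(get-notes db-t0 "default")         ; still queries the t0 snapshot: empty
(get-notes (d/db conn) "default")   ; fresh snapshot: sees the new note
```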
I'm using Datomic and would like to pull entire entities from any number of points in time based on my query. The Datomic docs have some decent examples about how I can perform queries from two different database instances if I know those instances before the query is performed. However, I'd like my query to determine the number of "as-of" type database instances I need and then use those instances when pulling the entities. Here's what I have so far:
(defn pull-entities-at-change-points [entity-id]
  (->>
    (d/q
      '[:find ?tx (pull ?dbs ?client [*])
        :in $ [?dbs ...] ?client
        :where
        [?client ?attr-id ?value ?tx true]
        [(datomic.api/ident $ ?attr-id) ?attr]
        [(contains? #{:client/attr1 :client/attr2 :client/attr3} ?attr)]
        [(datomic.api/tx->t ?tx) ?t]
        [?tx :db/txInstant ?inst]]
      (d/history (d/db db/conn))
      (map #(d/as-of (d/db db/conn) %) [1009 1018])
      entity-id)
    (sort-by first)))
I'm trying to find all transactions wherein certain attributes on a :client entity changed and then pull the entity as it existed at those points in time. The line (map #(d/as-of (d/db db/conn) %) [1009 1018]) is my attempt to create a sequence of database instances at two specific transactions where I know the client's attributes changed. Ideally, I'd like to do all of this in one query, but I'm not sure if that's possible.
Hopefully this makes sense, but let me know if you need more details.
I would split out the pull calls to be separate API calls instead of using them in the query. I would keep the query itself limited to getting the transactions of interest. One example solution for approaching this would be:
(defn pull-entities-at-change-points
  [db eid]
  (let [hdb (d/history db)
        txs (d/q '[:find [?tx ...]
                   :in $ [?attr ...] ?eid
                   :where
                   [?eid ?attr _ ?tx true]]
                 hdb
                 [:person/firstName :person/friends]
                 eid)
        as-of-dbs (map #(d/as-of db %) txs)
        pull-w-t (fn [as-of-db]
                   [(d/as-of-t as-of-db)
                    (d/pull as-of-db '[*] eid)])]
    (map pull-w-t as-of-dbs)))
This function against a db I built with a toy schema would return results like:
([1010
  {:db/id 17592186045418
   :person/firstName "Gerry"
   :person/friends [{:db/id 17592186045419} {:db/id 17592186045420}]}]
 [1001
  {:db/id 17592186045418
   :person/firstName "Jerry"
   :person/friends [{:db/id 17592186045419} {:db/id 17592186045420}]}])
A few points I'll comment on:
the function above takes a database value instead of getting databases from the ambient/global conn.
we map pull for each of various time t's.
using the pull API as an entry point rather than query is appropriate for cases where we have the entity and other information on hand and just want attributes or to traverse references.
the impetus to get everything done in one big query doesn't really exist in Datomic, since the relevant segments will have been realized in the peer's cache. You're not, for example, saving a round trip by using one query.
the collection binding form is preferred over contains and leverages query caching.
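To illustrate that last point, here is a schematic comparison of the two styles (eliding the entid-to-ident resolution the question's query performs); the collection binding version is the one used in the answer above:

```clojure
;; contains? predicate: the attribute set is baked into the query itself,
;; so each distinct set produces a distinct query to parse and cache.
'[:find [?tx ...]
  :in $ ?eid
  :where
  [?eid ?attr _ ?tx true]
  [(contains? #{:person/firstName :person/friends} ?attr)]]

;; collection binding: the query text stays constant and the attribute
;; list is passed in as data, so the parsed query is cached and reused.
'[:find [?tx ...]
  :in $ [?attr ...] ?eid
  :where
  [?eid ?attr _ ?tx true]]
```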
When I use load-data below from the Clojure REPL (using util.clj from the tutorial https://github.com/swannodette/om/wiki/Intermediate-Tutorial with a modified schema and initial data set) to load data into a new Datomic database, the data does not show up in the Datomic console.
However, I get no error message when performing the 'load-data' action from the repl.
The schema shows up as expected in the Datomic console. Using code unmodified from the tutorial, I can see both the schema and the data.
I must have a problem in the code that sets the initial data. But I don't know where it is since there is no error message.
How can I get error messages and other detail from an init transaction on a Datomic database?
Code:
(defn transact-all [conn f]
  (doseq [txd (read-all f)]
    (d/transact conn txd))
  :done)

(defn load-schema []
  (transact-all (get-conn) (io/resource "data/schema.edn")))

(defn load-data []
  (transact-all (get-conn) (io/resource "data/initial.edn")))

;; Logging provides some comparison with the known working scenario,
;; but so far I can only log entity ids:
(defn read-log []
  (d/q '[:find ?e
         :in ?log ?t1 ?t2
         :where [(tx-ids ?log ?t1 ?t2) [?tx ...]]
         [(tx-data ?log ?tx) [[?e]]]]
       (d/log (get-conn)) #inst "2014-07-14" #inst "2015-07-01"))
In Clojure you can use @ or deref to get a transaction's result, e.g.:
@(d/transact conn txd)
The map it returns is described in the docs for d/transact:
http://docs.datomic.com/clojure/#datomic.api/transact
See in particular:
If the transaction aborts, attempts to get the future's value throw a java.util.concurrent.ExecutionException, wrapping a java.lang.Error containing error information. If the transaction times out, the call to transact itself will throw a RuntimeException. The transaction timeout can be set via the system property datomic.txTimeoutMsec, and defaults to 10000 (10 seconds).
Invalid transactions will also throw an IllegalArgumentException (or some other exception).
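Applied to the question's transact-all, a hedged sketch of surfacing those errors would be to deref each transaction inside a try and print the wrapped cause:

```clojure
(defn transact-all [conn f]
  (doseq [txd (read-all f)]
    (try
      @(d/transact conn txd)   ; deref forces the result (or the error)
      (catch java.util.concurrent.ExecutionException e
        ;; the real error is wrapped; .getCause holds the details
        (println "Transaction failed:" (.getMessage (.getCause e))))))
  :done)
```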
I want the created post as a return value right after executing my db query function. Here is one example from my db functions:
(defn add-post-record [post]
  (sql/with-connection db
    (sql/insert-record :post post)))
and what I need in my route is something like:
(def post (db/add-post-record {:title title
                               :body body
                               :owner user
                               :isdraft isdraft}))
Then I am going to use it like (:id post).
I am very new to Clojure. This may be a very simple problem, but I am stuck.
Thank you.
I cannot test this right now, but reading the documentation of insert-record and with-connection, I think something like:
(defn add-post-record [post]
  (let [generated-keys (sql/with-connection db
                         (sql/insert-record :post post))]
    (merge post generated-keys)))
It is not very clear to me what exactly the map returned by insert-record contains; try it out.
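One quick way to see what that map contains is to print it at the REPL. A sketch, with made-up column values; the key names in the returned map (:id, :generated_key, etc.) are driver-specific:

```clojure
;; REPL sketch: inspect what insert-record actually returns for your driver.
(sql/with-connection db
  (println (sql/insert-record :post {:title "hello"
                                     :body "world"
                                     :owner "me"
                                     :isdraft false})))
```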
(sql/with-connection *db-atom*
  (insert-data value1 value2)
  (sql/with-connection *db-atom*
    (read-tuple-as-map)))
From the above example, does the nested sql/with-connection open a new connection to the DB? Or does it use the one that was created earlier?
I would in general recommend using clojure.java.jdbc instead of clojure.contrib.sql, because the latter is not supposed to work with Clojure versions newer than 1.2.0.
In clojure.java.jdbc, with-connection uses binding to put the connection into the *db* var for any wrapped calls, so the nested call opens a new connection whose binding shadows the first one for its dynamic extent.
from: jdbc.clj
(defn with-connection*
  "Evaluates func in the context of a new connection to a database then
  closes the connection."
  [db-spec func]
  (with-open [^java.sql.Connection con (get-connection db-spec)]
    (binding [*db* (assoc *db* :connection con :level 0 :rollback (atom false))]
      (func))))
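The shadowing behavior of binding can be seen with a toy dynamic var standing in for *db* (a sketch; plain Clojure, no JDBC involved):

```clojure
(def ^:dynamic *db* {:connection :outer})

(binding [*db* {:connection :a}]
  (binding [*db* {:connection :b}]   ; inner binding shadows the outer one
    (:connection *db*)))             ; => :b
```

Once the inner binding's scope exits, the outer value is visible again; meanwhile each with-connection's with-open still closes its own physical connection.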