(defn cypher
[query]
(let [result (-> *cypher* (.execute query))]
(for [row result
column (.entrySet row)]
{(keyword (.getKey column))
(Neo4jVertex. (.getValue column) *g*)})))
repl=> (cypher "start n=node:people('*:*') return n")
{:n #<Neo4jVertex v[1]>}
This query returns two results, yet I'm only ever able to see one using clojure.core/for. How should I be going about this?
The Neo4j docs have this example (which is what I'm trying to emulate):
for ( Map<String, Object> row : result )
{
for ( Entry<String, Object> column : row.entrySet() )
{
rows += column.getKey() + ": " + column.getValue() + "; ";
}
rows += "\n";
}
I think you need clojure.core/doseq (docs) instead.
user=> (doseq [row [1 2 3]
  #_=>         result [4 5 6]]
  #_=>   (println (str {:row row :result result})))
{:row 1, :result 4}
{:row 1, :result 5}
{:row 1, :result 6}
{:row 2, :result 4}
{:row 2, :result 5}
{:row 2, :result 6}
{:row 3, :result 4}
{:row 3, :result 5}
{:row 3, :result 6}
So, adapted to your example, something like the following might work:
; ...
(doseq [row result
        column (.entrySet row)]
  (println (str {(keyword (.getKey column))
                 (Neo4jVertex. (.getValue column) *g*)})))
; ...
Note that doseq returns nil; you'll have to call something with side effects like println in the body of the doseq form.
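A quick REPL illustration of that point (my own example, not from the original answer): the printing happens as a side effect, and the value of the whole doseq form is nil:
user=> (doseq [x [1 2 3]] (println x))
1
2
3
nil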
It looks like clojure.core/for does list comprehension, so something like the following actually returns a list:
user=> (for [row [1 2 3]
#_=> result [4 5 6]]
#_=> {:row row :result result})
({:row 1, :result 4} {:row 1, :result 5} {:row 1, :result 6} {:row 2, :result 4} {:row 2, :result 5} {:row 2, :result 6} {:row 3, :result 4} {:row 3, :result 5} {:row 3, :result 6})
So basically, how do I write a function that, given the input {:A 1 :B 2 :C {:X 5 :Y 5 :Z 5} :D 1} and the key :C,
returns {:A 1 :B 2 :C {:X 0 :Y 0 :Z 0} :D 1}? It's the same map, but with every value in the nested map set to 0, given that we know the key :C holds the nested values.
I'm very new to Clojure and I'm struggling with loops and iteration, so any help would be appreciated.
Thanks.
(defn with-zero-vals-at-key
[m k]
(update m k (fn [m2] (zipmap (keys m2) (repeat 0)))))
(with-zero-vals-at-key {:A 1 :B 2 :C {:X 5 :Y 5 :Z 5} :D 1} :C)
;; => {:A 1, :B 2, :C {:X 0, :Y 0, :Z 0}, :D 1}
;; OR
(defn with-zero-vals
[m]
(zipmap (keys m) (repeat 0)))
(update {:A 1 :B 2 :C {:X 5 :Y 5 :Z 5} :D 1}
:C
with-zero-vals)
;; => {:A 1, :B 2, :C {:X 0, :Y 0, :Z 0}, :D 1}
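The trick in both versions is the zipmap/repeat pair; a quick illustration of what it does on its own:
(zipmap [:X :Y :Z] (repeat 0))
;; => {:X 0, :Y 0, :Z 0}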
Given a vector:
(def vec [{:key 1, :value 10, :other "bla"}, {:key 2, :value 13, :other "bla"}, {:key 1, :value 7, :other "bla"}])
I'd like to iterate over each element and update :value with the sum of all :values to that point, so I would have:
[{:key 1, :value 10, :other "bla"}, {:key 2, :value 23, :other "bla"}, {:key 1, :value 30, :other "bla"}])
I've found the following for printing the result, and I've tried changing the prn call to update-in or assoc-in in the code below (extracted from the link above), but it didn't work quite well.
(reduce (fn [total {:keys [key value]}]
(let [total (+ total value)]
(prn key total)
total))
0 vec)
I'm new to Clojure; how can I make it work?
If you want to get the running totals then the simplest way is to use reductions:
(reductions (fn [acc ele] (+ acc (:value ele)))
0
[{:key 1, :value 10, :other "bla"}, {:key 2, :value 13, :other "bla"}, {:key 1, :value 7, :other "bla"}])
;; => (0 10 23 30)
As you can see, the function you pass to reductions has the same signature as the one you pass to reduce. It is as if a reduce were performed every time a new element is reached. Another way of thinking about it is that every intermediate accumulator is kept, unlike with reduce, where the caller only gets to see the result of the final calculation.
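For example, with plain + over the :values from the question (a quick illustration, not part of the original answer):
;; reduce returns only the final accumulator
(reduce + 0 [10 13 7])
;; => 30

;; reductions keeps every intermediate accumulator
(reductions + 0 [10 13 7])
;; => (0 10 23 30)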
And so this is the code that would directly answer your question:
(->> [{:key 1, :value 10, :other "bla"}, {:key 2, :value 13, :other "bla"}, {:key 1, :value 7, :other "bla"}]
(reductions #(update %2 :value + (:value %1))
{:value 0})
next
vec)
;; => [{:key 1, :value 10, :other "bla"} {:key 2, :value 23, :other "bla"} {:key 1, :value 30, :other "bla"}]
You can accumulate the :values thus:
(reductions + (map :value v))
=> (10 23 30)
(I renamed the vector v to avoid tripping over clojure.core/vec.)
Then you can use mapv over assoc:
(let [value-sums (reductions + (map :value v))]
(mapv #(assoc %1 :value %2) v value-sums))
=> [{:key 1, :value 10, :other "bla"} {:key 2, :value 23, :other "bla"} {:key 1, :value 30, :other "bla"}]
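Putting the two steps together, a minimal sketch of a complete function (the name running-value-totals is mine, not from the answer):
(defn running-value-totals
  "Returns v with each map's :value replaced by the running total so far."
  [v]
  (let [value-sums (reductions + (map :value v))]
    (mapv #(assoc %1 :value %2) v value-sums)))

(running-value-totals [{:key 1, :value 10, :other "bla"}
                       {:key 2, :value 13, :other "bla"}
                       {:key 1, :value 7, :other "bla"}])
=> [{:key 1, :value 10, :other "bla"} {:key 2, :value 23, :other "bla"} {:key 1, :value 30, :other "bla"}]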
If our input were to look something like this:
(({:a 1 :b 100} {:a 2 :b 300} {:a 4 :b 0}) ({:a 0 :b 10} {:a 4 :b 50}))
The range we would like to iterate over would be (0 1 2 3 4).
We would like the output to be:
(({:a 0 :b 0} {:a 1 :b 100} {:a 2 :b 300} {:a 3 :b 0} {:a 4 :b 0})
({:a 0 :b 10} {:a 1 :b 0} {:a 2 :b 0} {:a 3 :b 0} {:a 4 :b 50}))
Basically, what it should do is look at the first list of maps, then the second, and so on, and figure out what the range is for :a. We can easily do that with a min/max function. It then creates a range and applies it to both lists. If an :a is missing from a list, it adds that :a with a :b of 0 (i.e. the addition of {:a 0 :b 0} or {:a 3 :b 0} in the first list). We have a function that can somewhat do it, but we aren't quite there yet. Here it is:
(map
#(doseq [i (vec myRange)]
(if (some (fn [list] (= i list)) (map :a %))
nil
(println (conj % {:a i :b 0}))))
myList)
Obviously, because of Clojure's immutable data structures, this function fails. If our input is something like:
(({:a 1, :b 1} {:a 2, :b 3} {:a 4, :b 5})
({:a 0, :b 3} {:a 4, :b 1}))
our output is:
(nil nil)
but what the printlns show is:
({:a 0, :b 0} {:a 1, :b 1} {:a 2, :b 3} {:a 4, :b 5})
({:a 3, :b 0} {:a 1, :b 1} {:a 2, :b 3} {:a 4, :b 5})
({:a 1, :b 0} {:a 0, :b 3} {:a 4, :b 1})
({:a 2, :b 0} {:a 0, :b 3} {:a 4, :b 1})
({:a 3, :b 0} {:a 0, :b 3} {:a 4, :b 1})
(nil nil)
We want the output to look like:
(({:a 0, :b 0} {:a 1, :b 1} {:a 2, :b 3} {:a 3, :b 0} {:a 4, :b 5})
({:a 0, :b 3} {:a 1, :b 0} {:a 2, :b 0} {:a 3, :b 0} {:a 4, :b 1}))
without the use of a println. Any suggestions?
The idea of working with immutable data in a loop is to pass the result of the latest iteration on to the next one. You could do it with loop/recur (a sketch of that appears further below), but in your case it is common to use the reduce function (which is literally one of the cornerstones of functional programming):
(defn update-coll [range items]
(reduce (fn [items i] (if (some #(= (:a %) i) items)
items
(conj items {:a i :b 0})))
items range))
The function passed to reduce "updates" items for every value i taken from range, passing the updated value on to the next iteration.
Now you just have to map your input data with it:
(def input '(({:a 1 :b 100} {:a 2 :b 300} {:a 4 :b 0})
({:a 0 :b 10} {:a 4 :b 50})))
(map (comp (partial sort-by :a)
(partial update-coll [0 1 2 3 4]))
input)
output:
(({:a 0, :b 0} {:a 1, :b 100} {:a 2, :b 300}
{:a 3, :b 0} {:a 4, :b 0})
({:a 0, :b 10} {:a 1, :b 0} {:a 2, :b 0}
{:a 3, :b 0} {:a 4, :b 50}))
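For comparison, here is a rough sketch of the loop/recur version mentioned above (my own adaptation, not part of the original answer); it threads items through each iteration by hand:
(defn update-coll-loop [r items]
  (loop [r     (seq r)
         items items]
    (if r
      (let [i (first r)]
        ;; same check as in the reduce version: only add a missing
        ;; entry, then carry the (possibly updated) items forward
        (recur (next r)
               (if (some #(= (:a %) i) items)
                 items
                 (conj items {:a i :b 0}))))
      items)))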
Also, you can do it without accumulation, using Clojure's sorted sets:
(defn process-input [input r]
(let [r (map #(hash-map :a % :b 0) r)]
(map (fn [items] (into (apply sorted-set-by
#(compare (:a %1) (:a %2))
items)
r))
input)))
(process-input input [0 1 2 3 4])
output:
(#{{:b 0, :a 0} {:a 1, :b 100} {:a 2, :b 300}
{:b 0, :a 3} {:a 4, :b 0}}
#{{:a 0, :b 10} {:b 0, :a 1} {:b 0, :a 2}
{:b 0, :a 3} {:a 4, :b 50}})
My attempt:
(defn fill-in-missing [lists]
(let [[min max] (apply (juxt min max) (map :a (flatten lists)))]
(for [cur-list lists]
(for [i (range min (inc max))]
(merge {:a i :b 0}
(some #(when (= i (:a %)) %) cur-list))))))
To get the minimum and maximum values of :a, I just collect every :a with map and flatten, then use juxt so I can apply both min and max to them at the same time.
Since we want two levels of nested lists, I went with two nested for comprehensions, and an expression that tries to find the matching map in the input, or else falls back to the default {:a i :b 0}.
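Applied to the sample input from the question, this should produce the desired output:
(fill-in-missing '(({:a 1, :b 1} {:a 2, :b 3} {:a 4, :b 5})
                   ({:a 0, :b 3} {:a 4, :b 1})))
;; => (({:a 0, :b 0} {:a 1, :b 1} {:a 2, :b 3} {:a 3, :b 0} {:a 4, :b 5})
;;     ({:a 0, :b 3} {:a 1, :b 0} {:a 2, :b 0} {:a 3, :b 0} {:a 4, :b 1}))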
I would like to implement a function that can group by multiple columns hierarchically. I can illustrate my requirement with the following tentative implementation for two columns:
(defn group-by-two-columns-hierarchically
  [col1 col2 table]
  (let [data-by-col1 ($group-by col1 table)
        data-further-by-col2 (into {} (for [[k v] data-by-col1]
                                        [k ($group-by col2 v)]))]
    data-further-by-col2))
I'm seeking help on how to generalize this to an arbitrary number of columns.
(I understand that Incanter supports group-by on multiple columns, but it only provides a flat structure, not a hierarchy: a map from a composite key over the columns to the corresponding dataset.)
Thanks for your help!
Note: to make Michał's solution work with an Incanter dataset, only a slight modification is needed: replace group-by with incanter.core/$group-by, as illustrated by the following experiment:
(defn group-by*
"Similar to group-by, but takes a collection of functions and returns
a hierarchically grouped result."
[fs coll]
(if-let [f (first fs)]
(into {} (map (fn [[k vs]]
[k (group-by* (next fs) vs)])
(incanter.core/$group-by f coll)))
coll))
(def table (incanter.core/dataset ["x1" "x2" "x3"]
[[1 2 3]
[1 2 30]
[4 5 6]
[4 5 60]
[7 8 9]
]))
(group-by* [:x1 :x2] table)
=>
{{:x1 1} {{:x2 2}
| x1 | x2 | x3 |
|----+----+----|
| 1 | 2 | 3 |
| 1 | 2 | 30 |
},
{:x1 4} {{:x2 5}
| x1 | x2 | x3 |
|----+----+----|
| 4 | 5 | 6 |
| 4 | 5 | 60 |
},
{:x1 7} {{:x2 8}
| x1 | x2 | x3 |
|----+----+----|
| 7 | 8 | 9 |
}}
(defn group-by*
"Similar to group-by, but takes a collection of functions and returns
a hierarchically grouped result."
[fs coll]
(if-let [f (first fs)]
(into {} (map (fn [[k vs]]
[k (group-by* (next fs) vs)])
(group-by f coll)))
coll))
Example:
user> (group-by* [:foo :bar :quux]
[{:foo 1 :bar 1 :quux 1 :asdf 1}
{:foo 1 :bar 1 :quux 2 :asdf 2}
{:foo 1 :bar 2 :quux 1 :asdf 3}
{:foo 1 :bar 2 :quux 2 :asdf 4}
{:foo 2 :bar 1 :quux 1 :asdf 5}
{:foo 2 :bar 1 :quux 2 :asdf 6}
{:foo 2 :bar 2 :quux 1 :asdf 7}
{:foo 2 :bar 2 :quux 2 :asdf 8}
{:foo 1 :bar 1 :quux 1 :asdf 9}
{:foo 1 :bar 1 :quux 2 :asdf 10}
{:foo 1 :bar 2 :quux 1 :asdf 11}
{:foo 1 :bar 2 :quux 2 :asdf 12}
{:foo 2 :bar 1 :quux 1 :asdf 13}
{:foo 2 :bar 1 :quux 2 :asdf 14}
{:foo 2 :bar 2 :quux 1 :asdf 15}
{:foo 2 :bar 2 :quux 2 :asdf 16}])
{1 {1 {1 [{:asdf 1, :bar 1, :foo 1, :quux 1}
{:asdf 9, :bar 1, :foo 1, :quux 1}],
2 [{:asdf 2, :bar 1, :foo 1, :quux 2}
{:asdf 10, :bar 1, :foo 1, :quux 2}]},
2 {1 [{:asdf 3, :bar 2, :foo 1, :quux 1}
{:asdf 11, :bar 2, :foo 1, :quux 1}],
2 [{:asdf 4, :bar 2, :foo 1, :quux 2}
{:asdf 12, :bar 2, :foo 1, :quux 2}]}},
2 {1 {1 [{:asdf 5, :bar 1, :foo 2, :quux 1}
{:asdf 13, :bar 1, :foo 2, :quux 1}],
2 [{:asdf 6, :bar 1, :foo 2, :quux 2}
{:asdf 14, :bar 1, :foo 2, :quux 2}]},
2 {1 [{:asdf 7, :bar 2, :foo 2, :quux 1}
{:asdf 15, :bar 2, :foo 2, :quux 1}],
2 [{:asdf 8, :bar 2, :foo 2, :quux 2}
{:asdf 16, :bar 2, :foo 2, :quux 2}]}}}
What would be the quickest way to transform this collection:
[[{:a 1} {:a 2} {:a 3}] [{:b 4} {:b 5} {:b 6}] [{:c 7} {:c 8} {:c 9}]]
into this collection?
[{:a 1, :b 4, :c 7} {:a 2, :b 5, :c 8} {:a 3, :b 6, :c 9}]
I've come up with this, but I still feel it could be shorter:
(map (partial apply merge)
(apply map vector collection))
Note that the numbers are randomly picked, just to show that the content of each val is unique...
(def data [[{:a 1} {:a 2} {:a 3}] [{:b 4} {:b 5} {:b 6}] [{:c 7} {:c 8} {:c 9}]])
(apply mapv merge data)
;=> [{:a 1, :c 7, :b 4} {:a 2, :c 8, :b 5} {:a 3, :c 9, :b 6}]
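To see why this works: apply spreads the three inner vectors out as separate arguments to mapv, so at each index merge combines one map from each vector. A quick expansion of the same call:
;; (apply mapv merge data) is equivalent to spelling out the
;; three inner vectors as separate arguments to mapv:
(mapv merge
      [{:a 1} {:a 2} {:a 3}]
      [{:b 4} {:b 5} {:b 6}]
      [{:c 7} {:c 8} {:c 9}])
;; merge then combines one map from each vector at every index,
;; giving the same result as above.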