Why does this bit of Clojure code:
user=> (map (constantly (println "Loop it.")) (range 0 3))
Yield this output:
Loop it.
(nil nil nil)
I'd expect it to print "Loop it" three times as a side effect of evaluating the function three times.
constantly doesn't evaluate its argument multiple times. It's a function, not a macro, so its argument is evaluated exactly once, before constantly even runs. All constantly does is take that (already evaluated) argument and return a function that returns the same value every time it's called, without re-evaluating anything.
If all you want to do is to call (println "Loop it") for every element in the range, you should pass that in as the function to map instead of constantly. Note that you'll actually have to pass it in as a function, not an evaluated expression.
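For example, a minimal sketch (wrapped in dorun because map is lazy and we only care about the side effects here):
(dorun (map (fn [_] (println "Loop it.")) (range 0 3)))
; Loop it.
; Loop it.
; Loop it.
;=> nil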
As sepp2k rightly points out, constantly is a function, so its argument will only be evaluated once.
The idiomatic way to achieve what you are doing here would be to use doseq:
(doseq [i (range 0 3)]
  (println "Loop it."))
Or alternatively dotimes (which is a little more concise and efficient in this particular case as you aren't actually using the sequence produced by range):
(dotimes [i 3]
  (println "Loop it."))
Both of these solutions are non-lazy, which is probably what you want if you are just running some code for the side effects.
You can get behavior close to your intent by using repeatedly and a lambda expression.
For instance:
(repeatedly 3 #(println "Loop it"))
Unless you're at the REPL, this needs to be surrounded by a dorun or similar. repeatedly is lazy.
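For instance, this forces the whole sequence just for its effects:
(dorun (repeatedly 3 #(println "Loop it")))
; Loop it
; Loop it
; Loop it
;=> nil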
I am wondering whether lazy-seq returns a finite list or an infinite list. Here is an example:
(defn integers [n]
  (cons n (lazy-seq (integers (inc n)))))
when I run something like
(first (integers 10))
or
(take 5 (integers 10))
the results are 10 and (10 11 12 13 14), respectively. However, when I run
(integers 10)
the process doesn't print anything and never finishes. Can anyone tell me why, and explain the usage of lazy-seq? Thank you so much!
When you say that you are running
(integers 10)
what you're really doing is something like this:
user> (integers 10)
In other words, you're evaluating that form in a REPL (read-eval-print-loop).
The "read" step will convert from the string "(integers 10)" to the list (integers 10). Pretty straightforward.
The "eval" step will look up integers in the surrounding context, see that it is bound to a function, and evaluate that function with the parameter 10:
(cons 10 (lazy-seq (integers (inc 10))))
Since a lazy-seq isn't realized until it needs to be, simply evaluating this form will result in a clojure.lang.Cons object whose first element is 10 and whose rest element is a clojure.lang.LazySeq that hasn't been realized yet.
You can verify this with a simple def (no infinite hang):
user> (def my-integers (integers 10))
;=> #'user/my-integers
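You can also check that the tail of that Cons is an unrealized LazySeq (assuming nothing has forced it yet):
user> (realized? (rest my-integers))
;=> false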
In the final "print" step, Clojure basically tries to convert the result of the form it just evaluated to a string, then print that string to the console. For a finite sequence, this is easy. It just keeps taking items from the sequence until there aren't any left, converts each item to a string, separates them by spaces, sticks some parentheses on the ends, and voilà:
user> (take 5 (integers 10))
;=> (10 11 12 13 14)
But as you've defined integers, there won't be a point at which there are no items left (well, at least until you get an integer overflow, but that could be remedied by using inc' instead of just inc). So Clojure is able to read and evaluate your input just fine, but it simply cannot print all the items of an infinite result.
When you try to print an unbounded lazy sequence, it will be completely realized, unless you limit *print-length*.
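For example, at the REPL (where *print-length* is thread-bound and can be set!), something like this prints only the first ten elements:
user> (set! *print-length* 10)
;=> 10
user> (integers 10)
;=> (10 11 12 13 14 15 16 17 18 19 ...)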
The lazy-seq macro never constructs a list, finite or infinite. It constructs a clojure.lang.LazySeq object. This is a nominal sequence that wraps a function of no arguments (commonly called a thunk) that evaluates to the actual sequence when called; but it isn't called until it has to be, and that's the purpose of the mechanism: to delay evaluating the actual sequence.
So you can pass endless sequences around as evaluated LazySeq objects, provided you never realise them. Your evaluation at the REPL invokes realisation, an endless process.
It's not returning anything because your integers function creates an infinite loop.
(defn integers [n]
  (do (prn n)
      (cons n (lazy-seq (integers (inc n))))))
Call it with (integers 10) and you'll see it counting forever.
I'm wondering what is considered to be a side-effect in predicates for fns like remove or filter. There seems to be a range of possibilities. Clearly, if the predicate writes to a file, this is a side-effect. But consider a situation like this:
(def *big-var-that-might-be-garbage-collected* ...)
(let [my-ref *big-var-that-might-be-garbage-collected*]
  (defn my-pred
    [x]
    (some-operation-on my-ref x)))
Even if some-operation-on is merely a query that does not change state, the fact that my-pred retains a reference to *big... changes the state of the system in that the big var cannot be garbage collected. Is this also considered to be a side effect?
In my case, I'd like to write to a logging system in a predicate. Is this a side effect?
And why are side-effects in predicates discouraged exactly? Is it because filter and remove and their friends work lazily so that you cannot determine when the predicates are called (and - hence - when the side-effects happen)?
GC is not typically considered when evaluating if a function is pure or not, although many actions that make a function impure can have a GC effect.
Logging is a side effect, as is changing any state in the program or the world. A pure function takes data and returns data, without modifying anything else.
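For illustration, here is a hypothetical pair of predicates (the names are made up):
(defn pure-pred
  "Pure: depends only on its argument and returns data."
  [x]
  (even? x))

(defn logging-pred
  "Impure: printing (or logging) changes the world outside the function."
  [x]
  (println "checking" x) ; side effect
  (even? x))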
I found this link helpful; it covers why side effects are avoided in functional programming:
https://softwareengineering.stackexchange.com/questions/15269/why-are-side-effects-considered-evil-in-functional-programming
The problem is determining when, or even whether, the side-effects will occur on any given call to the function.
If you only care that the same inputs return the same answer, you are fine. Side-effects are dependent on how the function is executed.
For example,
(first (filter odd? (range 20)))
; 1
But if we arrange for odd? to print its argument as it goes:
(first (filter #(do (print %) (odd? %)) (range 20)))
It will print 012345678910111213141516171819 before returning 1!
The reason is that filter, where it can, deals with its sequence argument in chunks of 32 elements.
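You can see the chunking directly (at least on recent Clojure versions, where ranges are chunked):
user=> (chunked-seq? (seq (range 20)))
true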
If we take the limit off the range:
(first (filter #(do (print %) (odd? %)) (range)))
... we get a full-size chunk printed: 012345678910111213141516171819012345678910111213141516171819202122232425262728293031
Just printing the argument is confusing. If the side effects are significant, things could go seriously awry.
What's the (most) idiomatic Clojure representation of no-op? I.e.,
(def r (ref {}))
...
(let [der @r]
  (match [(:a der) (:b der)]
    [nil nil] (do (fill-in-a) (fill-in-b))
    [_ nil]   (fill-in-b)
    [nil _]   (fill-in-a)
    [_ _]     ????))
Python has pass. What should I be using in Clojure?
ETA: I ask mostly because I've run into places (cond, e.g.) where not supplying anything causes an error. I realize that "most" of the time, an equivalent of pass isn't needed, but when it is, I'd like to know what's the most Clojuric.
I see the keyword :default used in cases like this fairly commonly.
It has the nice property of being recognizable in the output and/or logs. When you see a log line like "process completed :default", it's obvious that nothing actually ran. This takes advantage of the fact that keywords are truthy in Clojure, so the default will be counted as a success.
There are no "statements" in Clojure, but there are an infinite number of ways to "do nothing". An empty do block, (do), literally indicates that one is "doing nothing" and evaluates to nil. Also, I agree with the comment that the question itself indicates you are not using Clojure in an idiomatic way, regardless of this specific stylistic question.
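For example:
user=> (do)
nil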
The most analogous thing that I can think of in Clojure to a "statement that does nothing" from imperative programming would be a function that does nothing. There are a couple of built-ins that can help you here: identity is a single-arg function that simply returns its argument, and constantly is a higher-order function that accepts a value, and returns a function that will accept any number of arguments and return that value. Both are useful as placeholders in situations where you need to pass a function but don't want that function to actually do much of anything. A simple example:
(defn twizzle [x]
  (let [f (cond (even? x)       (partial * 4)
                (= 0 (rem x 3)) (partial + 2)
                :else           identity)]
    (f (inc x))))
Rewriting this function to "do nothing" in the default case, while possible, would require an awkward rewrite without the use of identity.
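For instance, assuming the definition above:
user=> (twizzle 4) ; even, so ((partial * 4) 5)
20
user=> (twizzle 3) ; divisible by 3, so ((partial + 2) 4)
6
user=> (twizzle 5) ; neither, so (identity 6)
6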
I'm fairly new to Lisps, but while looking into sequential integer-generating code, I noticed that repeated calls to (gensym) increase the number after the prefix by 3. I'm curious why that is the case.
user=> (gensym)
G__662
user=> (gensym)
G__665
user=> (gensym)
G__668
user=> (gensym)
G__671
user=> (gensym)
G__674
user=> (gensym)
G__677
I've seen and understand the combined use of atom and inc, but I'm new to the gensym function.
There are a number of correct answers here. One is: it doesn't!
user> (take 5 (repeatedly gensym))
(G__2173 G__2174 G__2175 G__2176 G__2177)
Another is: gensym doesn't make any guarantees as to the form of the symbols it generates, so you really shouldn't care whether they're sequential or not (or even if they contain numbers at all). You certainly shouldn't hijack gensym to produce an integer sequence.
Lastly: why does it increase by three in your example? Because each time you evaluate a form in the repl, the compiler has to create some gensyms of its own. Apparently, for the form (gensym), the number it needs to create is two.
It doesn't!
=> (str (gensym) (gensym))
"G__4027G__4028"
Looking at the source of gensym we can see that it uses clojure.lang.RT/nextID.
(defn gensym
  ([] (gensym "G__"))
  ([prefix-string] (. clojure.lang.Symbol (intern (str prefix-string (str (. clojure.lang.RT (nextID))))))))
The nextID function is also used in the LispReader. So when you repeatedly evaluate (gensym), the reader is probably using two IDs.
I clearly have something else going on in my process too: if I wait any amount of time between evaluations, more IDs are consumed and the gap between gensyms grows beyond 3.
https://github.com/clojure/clojure/search?q=nextid
In trying to replicate some websockets examples I've run into some behavior I don't understand and can't seem to find documentation for. Simplified, here's an example I'm running in lein that's supposed to run a function for every element in a shared map once per second:
(def clients (atom {"a" "b" "c" "d"}))
(def ticker-agent (agent nil))

(defn execute [a]
  (println "execute")
  (let [keys (keys @clients)]
    (println "keys= " keys)
    (doseq [x keys] (println x)))
    ;(map (fn [k] (println k)) keys)) ;; replace doseq with this?
  (Thread/sleep 1000)
  (send *agent* execute))

(defn -main [& args]
  (send ticker-agent execute))
If I run this with map I get
execute
keys= (a c)
execute
keys= (a c)
...
First confusing issue: I understand that I'm likely using map incorrectly because I'm not using its return value, but does that mean the inner println is optimized away? Especially given that if I run this in a REPL:
(map #(println %) '(1 2 3))
it works fine?
Second question: if I run this with doseq instead of map, I can run into conditions where the execution agent stops (which I'd append here, but I'm having difficulty isolating/recreating it). Clearly there's something I'm missing, possibly relating to locking on the map's keyset? I was able to do this even after moving the shared map out of an atom. Is there default synchronization on the Clojure map?
map is lazy. This means that it does not calculate any results until they are accessed from the sequence it returns, so it will not run anything if its result is never used.
When you use map from the REPL, the print stage of the REPL accesses the data, which causes any side effects in your mapped function to be invoked. Inside a function, if the return value is not inspected, any side effects in the mapping function will not occur.
You can use doall to force full evaluation of a lazy sequence. You can use dorun if you don't need the result value but want to ensure all side effects are invoked. You can also use mapv, which is not lazy (because vectors are never lazy) and gives you an associative data structure, which is often useful (better random-access performance, optimized for appending rather than prepending).
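For example, any of these would make the printlns happen inside execute (a sketch using the clients atom from the question):
(dorun (map (fn [k] (println k)) (keys @clients))) ; force it, discard the results
(doall (map (fn [k] (println k)) (keys @clients))) ; force it, keep the (nil nil) results
(mapv (fn [k] (println k)) (keys @clients))        ; eager, returns [nil nil]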
Edit: Regarding the second part of your question (moving this here from a comment).
No, there is nothing about doseq that would hang your execution. Try checking the agent-error status of your agent to see if there is some exception; agents stop executing and stop accepting new tasks by default if they hit an error condition. You can also use set-error-mode! and set-error-handler! to customize the agent's error-handling behavior.
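A minimal sketch of those calls, using the ticker-agent from the question:
;; See whether the agent has failed, and why.
(agent-error ticker-agent) ; nil if healthy, otherwise the Throwable

;; Log errors and keep the agent accepting new sends.
(set-error-handler! ticker-agent (fn [a ex] (println "agent failed:" ex)))
(set-error-mode! ticker-agent :continue)

;; If the agent has already failed, restart it.
(restart-agent ticker-agent nil)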