Clojure: lazy magic

Two almost identical programs that generate infinite lazy seqs of random numbers.
The first doesn't crash. The second crashes with an OutOfMemoryError. Why?
;Return infinite lazy sequence of random numbers
(defn inf-rand[] (lazy-seq (cons (rand) (inf-rand))))
;Never returns. Burns the CPU but won't crash and lives forever.
(last (inf-rand))
But the following crashes pretty quickly:
;Return infinite lazy sequence of random numbers
(defn inf-rand[] (lazy-seq (cons (rand) (inf-rand))))
(def r1 (inf-rand))
;Crash with "OutOfMemoryError"
(last r1)

I believe this is an example of "holding onto the head".
By making the reference r1 in the second example, you open up the possibility of later saying something like (first r1), so the members of the lazy seq have to be stored as they are realized.
In the first case, Clojure can determine that nothing will ever be done with earlier members of the infinite sequence, so they can be garbage collected and need not consume memory.
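For instance (a small sketch using the same inf-rand as above), the head has to be kept around precisely because earlier elements can still be asked for later:
(def r1 (inf-rand))
;; walking far into the seq realizes (and caches) every element up to that point
(nth r1 100000)
;; still answerable, but only because r1 retains the head of the chain
(first r1)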
I am still very much a Clojure beginner myself; any comments or corrections to my understanding or terminology are greatly appreciated.

Related

Atomically swap an atom that stores an infinite lazy seq

Either I'm exhausted for the day and not able to think properly, or this is not possible, but I wanted to swap! an atom that refers to an infinite lazy seq with the rest of the seq that is currently in it.
My program hangs for obvious reasons, because compare-and-set! tries to check the previous and new seqs for equality before swapping, and that equality check basically never terminates.
Any clue on how to achieve this?
(def beyond-infinity (atom (repeat 1)))
(defn keep-pulling [] (swap! beyond-infinity #(rest %)))
EDIT
Previously my lazy seq was (range), with which the swap worked, but the REPL hung because it tried to realize the returned seq.
Anyway, with repeat it still fails.
The issue was my REPL trying to realize an infinite sequence.
Also, what I said about compare-and-set! doing an equality check of the seq is wrong: compare-and-set! does a reference (identity) test, not a value comparison!
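To make that concrete, here is a small sketch (using the same beyond-infinity atom as above): swap! with rest works fine, and the hang only comes from the REPL trying to print the infinite result.
(def beyond-infinity (atom (repeat 1)))

(defn keep-pulling []
  (swap! beyond-infinity rest)
  ;; return nil so the REPL never tries to print the infinite seq
  nil)

(keep-pulling)
(take 3 @beyond-infinity) ; => (1 1 1)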

How to forget head(GC'd) for lazy-sequences in Clojure?

Let's say I have a huge lazy seq and I want to iterate over it so I can process the data I get during the iteration.
The thing is, I want to lose the head of the lazy seq (so the part already processed can be GC'd), which would let me work on seqs with millions of items without getting an OutOfMemoryError.
I have 3 examples that I'm not sure about.
Could you provide best practices (examples) for that purpose?
Do these functions lose the head?
Example 1
(defn lose-head-fn
  [lazy-seq-coll]
  (when (seq (take 1 lazy-seq-coll))
    (do
      ;;do some processing...(take 10000 lazy-seq-coll)
      (recur (drop 10000 lazy-seq-coll)))))
Example 2
(defn lose-head-fn
  [lazy-seq-coll]
  (loop [i lazy-seq-coll]
    (when (seq (take 1 i))
      (do
        ;;do some processing...(take 10000 i)
        (recur (drop 10000 i))))))
Example 3
(doseq [i lazy-seq-coll]
  ;;do some processing...
  )
Update: Also there is an explanation in this answer here
copy of my above comments
As far as I know, all of the above would lose head (first two are obvious, since you manually drop the head, while doseq's doc claims that it doesn't retain head).
That means that if the lazy-seq-coll you pass to the function isn't bound somewhere else with def or let and used later, there should be nothing to worry about. So (lose-head-fn (range)) won't eat all your memory, while
(def r (range))
(lose-head-fn r)
probably would.
And the only best practice I could think of is not to def possibly infinite (or just huge) sequences, because all of their realized items would live forever in the var.
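As a small sketch of that advice: build the sequence inline (or pass it straight to the consumer), so no var or outer local pins the head while it is walked.
;; nothing outside the doseq refers to the seq, so realized chunks can be
;; GC'd as the iteration moves forward and memory use stays roughly constant
(doseq [x (range 100000000)]
  (when (zero? (rem x 10000000))
    (println x)))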
In general, you must be careful not to retain a reference either locally or globally for a part of a lazy seq that precedes another which involves excessive computation.
For example:
(let [nums (range)
      first-ten (take 10 nums)]
  (+ (last first-ten) (nth nums 100000000)))
=> 100000009
This takes about 2 seconds on a modern machine. How about this though? The difference is the last line, where the order of arguments to + is swapped:
;; Don't do this!
(let [nums (range)
      first-ten (take 10 nums)]
  (+ (nth nums 100000000) (last first-ten)))
You'll hear your chassis/cpu fans come to life, and if you're running htop or similar, you'll see memory usage grow rather quickly (about 1G in the first several seconds for me).
What's going on?
Much like a linked list, elements in a lazy seq in Clojure reference the portion of the seq that comes next. In the second example above, first-ten is needed for the second argument to +. Thus, even though nth is happy to hold no references to anything (after all, it's just finding an index in a long list), first-ten refers to a portion of the sequence that, as stated above, must hold onto references to the rest of the sequence.
The first example, by contrast, computes (last first-ten), and after this, first-ten is no longer used. Now the only reference to any portion of the lazy sequence is nums. As nth does its work, each portion of the list that it's finished with is no longer needed, and since nothing else refers to the list in this block, as nth walks the list, the memory taken by the sequence that has been examined can be garbage collected.
Consider this:
;; Don't do this!
(let [nums (range)]
  (time (nth nums 1e8))
  (time (nth nums 1e8)))
Why does this have a similar result as the second example above? Because the sequence will be cached (held in memory) on the first realization of it (the first (time (nth nums 1e8))), because nums is being used on the next line. If, instead, we use a different sequence for the second nth, then there is no need to cache the first one, so it can be discarded as it's processed:
(let [nums (range)]
  (time (nth nums 1e8))
  (time (nth (range) 1e8)))
"Elapsed time: 2127.814253 msecs"
"Elapsed time: 2042.608043 msecs"
So as you work with large lazy seqs, consider whether anything is still pointing to the list, and if anything is (global vars being a common one), then it will be held in memory.

Reducing a sequence into a shorter sequence while calling a function on each adjacent element

I've got a sequence of objects [o1 o2 o3 o4 ...] and a function that looks at two of them, does some mystery logic, and returns either one of them or both (as a sequence).
I want to return the result of processing the sequence like this:
call the mystery function on o1 and o2
keep the butlast of what you've got so far
take the last of the result of the previous mystery function, and call the mystery function on it, and o3
keep the butlast of what you've got so far
take the last of the result of the previous mystery function, and call the mystery function on it, and o4
keep the butlast of what you've got so far
take the last of the result of the previous mystery function, and call the mystery function on it, and oN
....
Here's what I've got so far:
; the % here is the input sequence
#(reduce update-algorithm [(first %)] (rest %))
(defn update-algorithm
  [input-vector o2]
  (apply conj (pop input-vector)
         (mystery-function (peek input-vector) o2)))
What's an idiomatic way of writing this? I don't like the way that this looks. I think the apply conj is a little hard to read and so is the [(first %)] (rest %) on the first line.
into would be a better choice than apply conj.
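For example, the reduction step reworked with into might look like this (just a sketch; mystery-function is still the asker's own function):
(defn update-algorithm
  [input-vector o2]
  ;; into conj's each element of the mystery-function result onto the
  ;; popped vector, which is exactly what apply conj did before
  (into (pop input-vector)
        (mystery-function (peek input-vector) o2)))

; the % here is the input sequence, as before
#(reduce update-algorithm [(first %)] (rest %))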
I think [(first %)] (rest %) is just fine though. Probably the shortest way to write this and it makes it completely clear what the seed of the reduction and the sequence being reduced are.
Also, reduce is a perfect match to the task at hand, not only in the sense that it works, but also in the sense that the task is a reduction / fold. Similarly pop and peek do exactly the thing specified in the sense that it is their purpose to "keep the butlast" and "take the last" of what's been accumulated (in a vector). With the into change, the code basically tells the same story the spec does, and in fewer words to boot.
So, nope, no way to improve this, sorry. ;-)

Overflow while using recur in clojure

I have a simple prime number calculator in clojure (an inefficient algorithm, but I'm just trying to understand the behavior of recur for now). The code is:
(defn divisible [x,y] (= 0 (mod x y)))
(defn naive-primes [primes candidates]
  (if (seq candidates)
    (recur (conj primes (first candidates))
           (remove (fn [x] (divisible x (first candidates))) candidates))
    primes))
This works as long as I am not trying to find too many numbers. For example
(print (sort (naive-primes [] (range 2 2000))))
works. For anything requiring more recursion, I get an overflow error.
(print (sort (naive-primes [] (range 2 20000))))
will not work. In general, whether I use recur or call naive-primes again without the attempt at TCO doesn't appear to make any difference. Why am I getting errors for large recursions while using recur?
recur always uses tail recursion, regardless of whether you are recurring to a loop or a function head. The issue is the calls to remove. remove calls first to get the element from the underlying seq and checks to see if that element is valid. If the underlying seq was created by a call to remove, you get another call to first. If you call remove 20000 times on the same seq, calling first requires calling first 20000 times, and none of the calls can be tail recursive. Hence, the stack overflow error.
Changing (remove ...) to (doall (remove ...)) fixes the problem, since it prevents the infinite stacking of remove calls (each one gets fully applied immediately and returns a concrete seq, not a lazy seq). I think this method only ever keeps one candidates list in memory at one time, though I am not positive about this. If so, it isn't too space inefficient, and a bit of testing shows that it isn't actually much slower.
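Here is the asker's function with that single change applied (a sketch of the fix described above):
(defn divisible [x y] (= 0 (mod x y)))

(defn naive-primes [primes candidates]
  (if (seq candidates)
    (recur (conj primes (first candidates))
           ;; doall forces this filtering pass immediately, so lazy remove
           ;; wrappers can no longer pile up and blow the stack
           (doall (remove (fn [x] (divisible x (first candidates))) candidates)))
    primes))

(count (naive-primes [] (range 2 20000))) ; completes without a StackOverflowError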

Clojure: reduce, reductions and infinite lists

Reduce and reductions let you accumulate state over a sequence. Each element in the sequence will modify the accumulated state until the end of the sequence is reached.
What are implications of calling reduce or reductions on an infinite list?
(def c (cycle [0]))
(reduce + c)
This will quickly throw an OutOfMemoryError. By the way, (reduce + (cycle [0])) does not throw an OutOfMemoryError (at least not for the time I waited). It never returns. Not sure why.
Is there any way to call reduce or reductions on an infinite list in a way that makes sense? The problem I see in the above example, is that eventually the evaluated part of the list becomes large enough to overflow the heap. Maybe an infinite list is not the right paradigm. Reducing over a generator, IO stream, or an event stream would make more sense. The value should not be kept after it's evaluated and used to modify the state.
It will never return because reduce takes a sequence and a function and applies the function until the input sequence is empty, only then can it know it has the final value.
Reduce on a truly infinite seq would not make a lot of sense unless it is producing a side effect like logging its progress.
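If you only care about a bounded prefix of the sequence, the simplest workaround is to make the input finite before reducing, for example (a sketch):
;; bound the infinite seq first; the reduction then has a well-defined end
;; and nothing retains the head, so memory use stays constant
(reduce + (take 1000000 (cycle [0 1]))) ; => 500000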
In your first example you are first creating a var referencing an infinite sequence.
(def c (cycle [0]))
Then you are passing the contents of the var c to reduce which starts reading elements to update its state.
(reduce + c)
These elements can't be garbage collected because the var c holds a reference to the first of them, which in turn holds a reference to the second, and so on. Eventually it reads as many as there is space for in the heap, and then you get the OOM.
In your second example you are not keeping a reference to the data you have already used, so the items on the seq returned by cycle are GC'd as fast as they are produced and only the accumulated result is kept. With nonzero values in the cycle that result would eventually overflow a long and throw (Clojure 1.3), or promote itself to a BigInteger and grow toward the size of the heap (Clojure 1.2); with (cycle [0]) the sum just stays 0, which is why it simply never returns.
(reduce + (cycle [0]))
Arthur's answer is good as far as it goes, but it looks like he doesn't address your second question about reductions. reductions returns a lazy sequence of intermediate stages of what reduce would have returned if given a list only N elements long. So it's perfectly sensible to call reductions on an infinite list:
user=> (take 10 (reductions + (range)))
(0 1 3 6 10 15 21 28 36 45)
If you want to keep getting items from a list like an IO stream and keep state between runs, you cannot use doseq (without resorting to defs). Instead, a good approach would be to use loop/recur; this will allow you to avoid consuming too much stack space and will let you keep state. In your case:
(loop [c (cycle [0])]
  (if (evaluate-some-condition (first c))
    (do
      (do-something-with (first c))
      (recur (rest c)))
    nil))
Of course, compared to your case, there is a condition check here to make sure we don't loop indefinitely.
As others have pointed out, it doesn't make sense to run reduce directly on an infinite sequence, since reduce is non-lazy and needs to consume the full sequence.
As an alternative for this kind of situation, here's a helpful function that reduces only the first n items in a sequence, implemented using recur for reasonable efficiency:
(defn counted-reduce
  ([n f s]
   (counted-reduce (dec n) f (first s) (rest s)))
  ([n f initial s]
   (if (<= n 0)
     initial
     (recur (dec n) f (f initial (first s)) (rest s)))))
(counted-reduce 10000000 + (range))
=> 49999995000000