Goal ordering in Clojure's `core.logic`

The following Clojure code uses core.logic to solve the same logic problem with the same goals in two different orders. This choice of ordering causes one to finish quickly and the other to hang.
(use 'clojure.core.logic)
;; Runs quickly. Prints (1 2 3).
(clojure.pprint/pprint
  (run* [q] (fresh [x] (== x [1 2 3]) (membero q x))))
;; Hangs
(clojure.pprint/pprint
  (run* [q] (fresh [x] (membero q x) (== x [1 2 3]))))
Is there a general solution or common practice to avoid this problem?

Here is my understanding:
With core.logic, you want to reduce the search space as early as possible. If you put the membero constraint first, the run starts by searching the membero space and backtracks on the failures produced by the == constraint. But the membero search space is huge, since neither q nor x is unified or even bounded.
But if you put the == constraint first, you directly unify x with [1 2 3], and the search space for membero is then clearly bounded to the elements of x.

If you're going to use membero there is no general solution to this problem. Calling membero with fresh vars will cause it to generate all (read: infinitely many) possible lists of which q is a member. Of course lists longer than three elements don't apply, but since you've used run* it will blindly keep trying lists longer than that even though every one of them fails.
It's possible to write a better version of membero in newer versions of core.logic using the constraint infrastructure, but the details of how one might do this are likely to change over the coming months. Until there's a solid public api for defining constraints you're stuck with the kind of subtle ordering and non-termination issues that trouble Prolog.
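One pragmatic mitigation, following directly from the point about run* above (a sketch, not a general fix): if you only need a bounded number of answers, ask for them with run n, which stops as soon as that many answers have been found, even under the unfortunate goal ordering.
;; Sketch: core.logic's interleaving search does reach these three answers,
;; so bounding the number requested terminates; asking for a fourth answer
;; (or using run*) would still diverge.
(use 'clojure.core.logic)

(run 3 [q]
  (fresh [x]
    (membero q x)
    (== x [1 2 3])))
;; => (1 2 3)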

Related

Is it possible to insert a Clojure statement in the middle of a series of core.logic calls, "à la Prolog"?

Prolog mixes logic (fallible) goals, and (infallible) processes like write/2 or assert/1. Is it possible to do so with Clojure's core.logic?
For example:
(pldb/with-db myFacts
  (l/run* [x y]
    (person x)
    (println "we found a person!")
    (likes y x)))
It looks like this can't work because (println ...) is a Clojure function embedded in a core.logic run (which is supposed to contain only conditions/constraints I'm guessing).
More interestingly, I would like to be able to assert new facts in the midst of checking logical conditions like in Prolog:
liked(X, Y) :-
    person(X),
    assertz(likable(X)),
    likes(Y, X).
Which would approximatively look like this in core.logic I'm guessing:
(pldb/with-db myFacts
  (l/run* [x y]
    (person x)
    (def newFacts (-> newFacts (pldb/db-fact likeable x)))
    (likes y x)))
Those examples are silly but, in general, the question is about inserting procedural commands in the midst of constraint checking in core.logic like it is done in Prolog.
One concrete application of that, for instance, would be the capability to collect arcs in the recursive traversal of a graph by using assertion. But, unless I am mistaken, that's only possible if we have the ability to mix assertions with logical conditions.
Thank you very much for your help and guidance in this.
UPDATE:
I think I might have figured it out: including a Clojure statement is fine, but it must return true; unlike in Prolog, those functions don't automatically "succeed". So all we need to add is:
(l/== (println "hello") true)
Does it sound correct?
FYI, this post was really helpful: https://ask.clojure.org/index.php/9546/can-core-logic-ask-questions
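For what it's worth, here is a minimal sketch of one way to do this (not taken from the linked post, and assuming person, likes and myFacts are defined as in the question). Note that println returns nil, so unifying its result with true would actually make the goal fail; the sketch instead uses project so ordinary Clojure code can see the current value of x, and then returns succeed so the goal always passes. The found atom is an illustrative stand-in for Prolog's assertz.
(require '[clojure.core.logic :as l]
         '[clojure.core.logic.pldb :as pldb])

;; Collects "asserted" values as a plain Clojure side effect.
(def found (atom []))

(pldb/with-db myFacts
  (l/run* [x y]
    (person x)
    ;; project binds x to its current (walked) value for the body below.
    (l/project [x]
      (do (println "we found a person:" x)
          (swap! found conj x)
          l/succeed))
    (likes y x)))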

Is there a more idiomatic way to get N random elements of a collection in Clojure?

I’m currently doing this: (repeatedly n #(rand-nth (seq coll))) but I suspect there might be a more idiomatic way, for 2 reasons:
I’ve found that there’s frequently a more concise and expressive alternative to using short anonymous functions, e.g. partial
the docstring for repeatedly says “presumably with side effects”, implying that it’s not intended to be used to produce values
I suppose I could figure out a way to use reduce but that seems like it would be tricky and less efficient, as it would have to process the entire collection, since reduce is not lazy.
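For reference, a partial-based spelling of the original expression might look like the sketch below (sample-with-replacement is just an illustrative name); it behaves the same, sampling with replacement, so the difference is purely stylistic.
;; Equivalent to (repeatedly n #(rand-nth (seq coll))), minus the anonymous
;; function literal. Samples with replacement.
(defn sample-with-replacement [n coll]
  (repeatedly n (partial rand-nth (vec coll))))

(sample-with-replacement 3 [:a :b :c :d])
;; => e.g. (:b :d :b)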
An easy solution, though not optimal for big collections, could be:
(take n (shuffle coll))
It has the "advantage" of not repeating elements. You could also implement a lazy-shuffle, but it will involve more code.
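A minimal sketch of what such a lazy shuffle might look like (illustrative only; it does O(n) work per element taken, but only for the elements you actually consume):
;; Lazily picks one random element at a time, without replacement, so
;; (take n (lazy-shuffle coll)) never shuffles the whole collection eagerly.
(defn lazy-shuffle [coll]
  (lazy-seq
    (let [v (vec coll)]
      (when (seq v)
        (let [i (rand-int (count v))]
          (cons (nth v i)
                (lazy-shuffle (concat (subvec v 0 i)
                                      (subvec v (inc i))))))))))

(take 3 (lazy-shuffle (range 100)))
;; => e.g. (41 7 93)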
I know it's not exactly what you're asking - but if you're doing a lot of sampling and statistical work, you might be interested in Incanter ([incanter "1.5.2"]).
Incanter provides the function sample, which takes options for sample size and replacement.
(require '[incanter.stats :refer [sample]])
(sample [1 2 3 4 5 6 7] :size 5 :replacement false)
; => (1 5 6 2 7)

How to solve math equations using core.logic

I tried typing in a query in core.logic:
(run* [q] (== 0 (+ (* q q) (* 4 q) 4)))
And the prompt says,
error: lvar cannot be cast to a number
In the event that I haven't completely misconceived what logic programming is about, are there ways that this problem can be solved using core.logic?
You should read The Reasoned Schemer for ideas. Basically the way to do math in a logic program is to create list-based encodings of numbers, which the logic engine can grow as needed to try things out. I don't have the book handy, but it encodes integers as little-endian lists of bits: the empty list () represents 0, (1) is 1, (0 1) is 2, and a representation never ends in a 0 bit (so (0) is illegal).
Anyway, that's a lot of work; David Nolen has also recently introduced something about finite domains into core.logic. I don't know how those work, but I think they simplify the problem a lot for you by letting you specify what kinds of numbers to consider as a solution to your problem.
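As a rough sketch of the finite-domain route (assuming a core.logic version that ships clojure.core.logic.fd, and using a quadratic with non-negative roots, x^2 + 6 = 5x, since I'm not certain the fd domains handle negative integers):
;; Finite domains turn "solve the equation" into "search a bounded set of
;; integers satisfying the constraint". Roots of x^2 - 5x + 6 are 2 and 3.
(require '[clojure.core.logic :as l]
         '[clojure.core.logic.fd :as fd])

(l/run* [q]
  (fd/in q (fd/interval 0 10))
  (fd/eq (= (+ (* q q) 6) (* 5 q))))
;; => (2 3)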
So far as I can tell, core.logic can't do the algebra to solve this equation. It can do basic math, though the inputs to that math need to be actual values rather than fresh logic variables, because Clojure's math functions can't operate on LVars:
user> (run* [q]
        (fresh [x]
          (== x 1)
          (project [x] (== q (+ (* x x) 4)))))
(5)
It works when x has a concrete value and fails when it does not:
user> (run* [q]
        (fresh [x]
          (== x q)
          (project [x] (== q (+ (* x x) 4)))))
ClassCastException clojure.core.logic.LVar cannot be cast to java.lang.Number
core.logic in its current form is not designed as a numerical equation solver; it is more appropriate for solving logical and relational expressions.
You basically have two practical routes for solving mathematical equations:
Analytical solvers - solutions can be found quite easily for simple cases, e.g. quadratic equations like the one above, but the methods get increasingly complex quite quickly and become impossible or infeasible for many equations. This is a huge open research topic.
Numerical solvers - these techniques are much more general and can be used on pretty much any kind of equation. However the results are not exact and the algorithms can fail to find the right solution(s) if the equation has "nasty" features (discontinuities, odd gradients, complex sets of local minima etc.)
Equation solvers require special intelligence to understand the "rules" of mathematical equations, e.g. how to factor polynomial expressions (for analytic solutions) or how to estimate a derivative (for numerical solutions).
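To make the numerical route concrete, here is a minimal Newton-Raphson sketch in plain Clojure (newton-raphson is an illustrative name, and the derivative is supplied by hand):
;; Iterate x <- x - f(x)/f'(x) until |f(x)| is tiny or we give up.
(defn newton-raphson [f f' x0]
  (loop [x (double x0) n 0]
    (let [fx (f x)]
      (if (or (< (Math/abs fx) 1e-12) (>= n 1000))
        x
        (recur (- x (/ fx (f' x))) (inc n))))))

;; q^2 + 4q + 4 = 0 has the repeated root q = -2, so Newton's method only
;; converges linearly here, but it still lands next to the root.
(newton-raphson (fn [q] (+ (* q q) (* 4 q) 4))   ;; f
                (fn [q] (+ (* 2 q) 4))           ;; f'
                0.0)
;; => approximately -2.0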
Some links that may be interesting:
Quadratic equation solver in Incanter
http://programming.wonderhowto.com/how-to/solve-equations-for-any-variable-with-clojure-1-1-380687/
Example of a simple numeric solver in Clojure (Newton-Raphson method)

Using lazy-seq without blowing the stack: is it possible to combine laziness with tail recursion?

To learn Clojure, I'm solving the problems at 4clojure. I'm currently cutting my teeth on question 164, where you are to enumerate (part of) the language a DFA accepts. An interesting condition is that the language may be infinite, so the solution has to be lazy (the test cases just (take 2000 ...) strings from the solution).
I have a solution that works on my machine, but when I submit it on the website, it blows the stack (if I increase the number of accepted strings to be determined from 2000 to 20000, I also blow the stack locally, so it's a deficiency of my solution).
My solution[1] is:
(fn [dfa]
  (let [start-state (dfa :start)
        accept-states (dfa :accepts)
        transitions (dfa :transitions)]
    (letfn [(accept-state? [state]
              (contains? accept-states state))
            (follow-transitions-from [state prefix]
              (lazy-seq
                (mapcat (fn [pair]
                          (enumerate-language (val pair) (str prefix (key pair))))
                        (transitions state))))
            (enumerate-language [state prefix]
              (if (accept-state? state)
                (cons prefix (follow-transitions-from state prefix))
                (follow-transitions-from state prefix)))]
      (enumerate-language start-state ""))))
it accepts the DFA
'{:states #{q0 q1 q2 q3}
  :alphabet #{a b c}
  :start q0
  :accepts #{q1 q2 q3}
  :transitions {q0 {a q1}
                q1 {b q2}
                q2 {c q3}}}
and returns the language that DFA accepts (#{a ab abc}). However, when determining the first 2000 accepted strings of DFA
(take 2000 (f '{:states #{q0 q1}
                :alphabet #{0 1}
                :start q0
                :accepts #{q0}
                :transitions {q0 {0 q0, 1 q1}
                              q1 {0 q1, 1 q0}}}))
it blows the stack. Obviously I should restructure the solution to be tail recursive, but I don't see how that is possible. In particular, I don't see how it is even possible to combine laziness with tail-recursiveness (via either recur or trampoline). The lazy-seq function creates a closure, so using recur inside lazy-seq would use the closure as the recursion point. When using lazy-seq inside recur, the lazy-seq is always evaluated, because recur issues a function call that needs to evaluate its arguments.
When using trampoline, I don't see how I can iteratively construct a list whose elements can be lazily evaluated. As I have used it and seen it used, trampoline can only return a value when it finally finishes (i.e. one of the trampolining functions does not return a function).
Other solutions are considered out of scope
I consider a different kind of solution to this 4Clojure problem out of scope of this question. I'm currently working on a solution using iterate, where each step only calculates the strings the 'next step' (following transitions from the current state) accepts, so it doesn't recurse at all. You then only keep track of the current states and the strings that got you into each state (which are the prefixes for the next states). What's proving difficult in that case is detecting when a DFA that accepts a finite language will no longer return any results. I haven't yet devised a proper stop-criterion for the take-while surrounding the iterate, but I'm pretty sure I'll manage to get that solution to work. For this question, I'm interested in the fundamental question: can laziness and tail-recursiveness be combined, or is that fundamentally impossible?
[1] Note that there are some restrictions on the site, like not being able to use def and defn, which may explain some peculiarities of my code.
When using lazy-seq, just make a regular function call instead of using recur. The laziness avoids the recursive stack consumption that recur would otherwise be needed to prevent.
For example, a simplified version of repeat:
(defn repeat [x]
  (lazy-seq (cons x (repeat x))))
The problem is that you are building something that looks like:
(mapcat f (mapcat f (mapcat f ...)))
Which is fine in principle, but the elements on the far right of this list don't get realized for a long time, and by the time you do realize them, they have a huge stack of lazy sequences that need to be forced in order to get a single element.
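A contrived illustration of that layering (the count is arbitrary; the exact threshold depends on your JVM stack size):
;; Each map layer is cheap to create, but realizing the first element of the
;; outermost layer forces every layer beneath it in one deep call chain.
(def layered
  (reduce (fn [s _] (map inc s)) (range 5) (range 100000)))

(first layered)
;; => throws StackOverflowError on a default JVM stack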
If you don't mind a spoiler, you can see my solution at https://gist.github.com/3124087. I'm doing two things differently than you are, and both are important:
Traversing the tree breadth-first. You don't want to get "stuck" in a loop from q0 to q0 if that's a non-accepting state. It looks like that's not a problem for the particular test case you're failing because of the order the transitions are passed to you, but the next test case after this does have that characteristic.
Using doall to force a sequence that I'm building lazily. Because I know many concats will build a very large stack, and I also know that the sequence will never be infinite, I force the whole thing as I build it, to prevent the layering of lazy sequences that causes the stack overflow.
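Here's a rough sketch of the breadth-first idea (this is not the linked gist, and it's closer to the iterate-based approach the question sets aside; names are illustrative):
;; Keep a frontier of [state prefix] pairs, expand one whole generation per
;; step (forced with doall), and lazily emit the accepted prefixes of each
;; generation, so no deep chain of nested lazy seqs builds up.
(defn enumerate-language [{:keys [start accepts transitions]}]
  (let [step (fn [frontier]
               (doall
                 (for [[state prefix] frontier
                       [ch next-state] (transitions state)]
                   [next-state (str prefix ch)])))]
    (->> (iterate step [[start ""]])
         (take-while seq)   ;; stops once the frontier empties out
         (mapcat (fn [frontier]
                   (for [[state prefix] frontier
                         :when (contains? accepts state)]
                     prefix))))))
;; Caveat: a DFA whose language is finite but whose graph still has reachable
;; cycles keeps the frontier non-empty forever, which is exactly the
;; stop-criterion problem described in the question.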
Edit: In general you cannot combine lazy sequences with tail recursion. You can have one function that uses both of them, perhaps recurring when there's more work to be done before adding a single element, and lazy-recurring when there is a new element, but most of the time they have opposite goals and attempting to combine them incautiously will lead only to pain, and no particular improvements.

Computer algebra for Clojure

Short version:
I am interested in some Clojure code which will allow me to specify the transformations of x (e.g. permutations, rotations) under which the value of a function f(x) is invariant, so that I can efficiently generate a sequence of x's that satisfy r = f(x). Is there some development in computer algebra for Clojure?
For (a trivial) example
(defn #^{:domain #{3 4 7}
         :range #{0 1 2}
         :invariance-group :full}
  f [x] (- x x))
I could call (preimage f #{0}) and it would efficiently return #{3 4 7}. Naturally, it would also be able to annotate the codomain correctly. Any suggestions?
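For reference, a brute-force preimage that ignores the invariance metadata entirely and simply filters the declared :domain might look like this sketch (it takes the var, since defn puts the metadata on the var); the efficient version asked about would exploit :invariance-group instead of enumerating:
;; Keep the domain points whose image lands in the requested set.
(defn preimage [fvar ys]
  (let [f @fvar]
    (set (filter #(contains? ys (f %)) (:domain (meta fvar))))))

(preimage #'f #{0})
;; => #{3 4 7}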
Longer version:
I have a specific problem that makes me interested in finding out about development of computer algebra for Clojure. Can anyone point me to such a project? My specific problem involves finding all the combinations of words that satisfy F(x) = r, where F is a ranking function and r a positive integer. In my particular case f can be computed as a sum
F(x) = f(x[0]) + f(x[1]) + ... + f(x[N-1])
Furthermore I have a set of disjoint sets S = {s_i}, such that f(a)=f(b) for a,b in s, s in S. So a strategy to generate all x such that F(x) = r should rely on this factorization of F and the invariance of f under each s_i. In words, I compute all permutations of sites containing elements of S that sum to r and compose them with all combinations of the elements in each s_i. This is done quite sloppily in the following:
(use 'clojure.contrib.combinatorics)
(use 'clojure.contrib.seq-utils)

(defn expand-counter [c]
  (flatten (for [m c] (let [x (m 0) y (m 1)] (repeat y x)))))

(defn partition-by-rank-sum [A N f r]
  (let [M (group-by f A)
        image-A (set (keys M))
        ;; integer-partition computes restricted integer partitions,
        ;; returning a multiset as key-value pairs
        rank-partitions (integer-partition r (disj image-A 0))]
    (apply concat
           (for [part rank-partitions]
             (let [k (- N (reduce + (vals part)))
                   rank-map (if (pos? k) (assoc part 0 k) part)
                   all-buckets (lex-permutations (expand-counter rank-map))]
               (apply concat
                      (for [bucket all-buckets]
                        (let [val-bucket (map M bucket)
                              filled-buckets (apply cartesian-product val-bucket)]
                          (map vec filled-buckets)))))))))
This gets the job done but misses the underlying picture. For example, if the associative operation were a product instead of a sum I would have to rewrite portions.
The system below does not yet support combinatorics, though it would not be a huge effort to add them: loads of good code already exists, and this could be a good platform to graft it onto, since the basics are pretty sound. I hope a short plug is not inappropriate here; this is the only serious Clojure CAS I know of, but hey, what a system...
=======
It may be of interest to readers of this thread that Gerry Sussman's scmutils system is being ported to Clojure.
This is a very advanced CAS, offering things like automatic differentiation, literal functions, etc, much in the style of Maple.
It is used at MIT for advanced programs on dynamics and differential geometry, and a fair bit of electrical engineering stuff. It is also the system used in Sussman & Wisdom's "sequel" (LOL) to SICP, SICM (Structure and Interpretation of Classical Mechanics).
Although originally a Scheme program, this is not a direct translation, but a ground-up rewrite to take advantage of the best features of Clojure. It's been named sicmutils, in honour of both the original and the book.
This superb effort is the work of Colin Smith and you can find it at https://github.com/littleredcomputer/sicmutils .
I believe that this could form the basis of an amazing computer algebra system for Clojure, competitive with anything else available. It is quite a huge beast, as you can imagine, and tons of stuff remains to be ported, but the basics are pretty much there: the system will differentiate, and handles literals and literal functions pretty well. It is a work in progress. The system also uses the "generic" approach advocated by Sussman, whereby operations can be applied to functions, creating a great abstraction that simplifies notation no end.
Here's a taster:
> (def unity (+ (square sin) (square cos)))
> (unity 2.0) ==> 1.0
> (unity 'x) ==> 1 ;; yes we can deal with symbols
> (def zero (D unity)) ;; Let's differentiate
> (zero 2.0) ==> 0
SicmUtils introduces two new vector types “up” and “down” (called “structures”), they work pretty much as you would expect vectors to, but have some special mathematical (covariant, contravariant) properties, and also some programming properties, in that they are executable!
> (def fnvec (up sin cos tan)) => fnvec
> (fnvec 1) ==> (up 0.8414709848078965 0.5403023058681398 1.5574077246549023)
> ;; differentiated
> ((D fnvec) 1) ==> (up 0.5403023058681398 -0.8414709848078965 3.425518820814759)
> ;; derivative with symbolic argument
> ((D fnvec) 'θ) ==> (up (cos θ) (* -1 (sin θ)) (/ 1 (expt (cos θ) 2)))
Partial differentiation is fully supported
> (defn ff [x y] (* (expt x 3)(expt y 5)))
> ((D ff) 'x 'y) ==> (down (* 3 (expt x 2) (expt y 5)) (* 5 (expt x 3) (expt y 4)))
> ;; i.e. vector of results wrt to both variables
The system also supports TeX output, polynomial factorization, and a host of other goodies. Lots of stuff, however, that could be easily implemented has not been done purely out of lack of human resources. Graphic output and a "notepad/worksheet" interface (using Clojure's Gorilla) are also being worked on.
I hope this has gone some way towards whetting your appetite enough to visit the site and give it a whirl. You don't even need Clojure, you could run it off the provided jar file.
There's Clojuratica, an interface between Clojure and Mathematica:
http://clojuratica.weebly.com/
See also this mailing list post by Clojuratica's author.
While not a CAS, Incanter also has several very nice features and might be a good reference/foundation to build your own ideas on.
Regarding "For example, if the associative operation were a product instead of a sum I would have to rewrite portions.": if you structure your code accordingly, couldn't you accomplish this by using higher-order functions and passing in the associative operation? Think map-reduce.
I am unaware of any computer algebra systems written in Clojure. However, for my rather simple mathematical needs I have found it often useful to use Maxima, which is written in lisp. It is possible to interact with Maxima using s-expressions or a higher-level representations, which can be really convenient. Maxima also has some rudimentary combinatorics functions which may be what you are looking for.
If you are hellbent on using Clojure, in the short term perhaps throwing your data back and forth between Maxima and Clojure would help you achieve your goals.
In the longer term, I would be interested in seeing what you come up with!