I have defined the following in one ns -
There is a services atom, and another function to add to that atom -
(ns ex.first)

(def services (atom []))

(defn add-service
  [fns]
  (swap! services conj fns))
In my code in another ns I do this -
(ns ex.second ..)
(add-service [fn1 fn2 fn3])
1) I am assuming that when I run my REPL, which compiles the code, it should add the vector to the atom. However, when I eval @services it returns [].
2) The above works if I eval (add-service [fn1 fn2 fn3]) in the REPL.
3) I have also tried converting the add-service fn to a macro. However, I still find @services to be empty.
So I would appreciate it if someone can help with these -
1) Why doesn't the add-service call add a vector to the atom on code compilation when -
add-service is defined as a fn.
add-service is defined as a macro.
2) How do I make it work :)
It depends on whether you have set :aot to :all in project.clj or not.
If :aot is :all, then the function call will execute as soon as the REPL is started; otherwise you will need to load the ex.second namespace (e.g. using require or use). Loading causes the ns to be compiled, the corresponding class to be loaded into the JVM, and the function call to be executed.
Also, the function call doesn't happen at compile time; it happens when the compiled class (representing the namespace) is loaded by the JVM.
UPDATED (based on comment):
If you make it a macro, you still need to consider the AOT setting.
If AOT is set to compile the namespace, lein will create a JVM, load your code into it, and invoke the Clojure compiler, which reads the code, executes the macro, and compiles it. At that point this JVM (the one used for compiling your code) will have the services atom filled because of the macro execution, but it exists only for compilation. lein will then create another JVM for the run command and load the compiled classes into it, and this second JVM won't have services filled in, because the compiled classes don't contain any code that fills it.
If AOT is not set, the macro will work, because the read, macroexpand, compile process happens in the lein run JVM, but only when you cause the ns to be loaded.
As far as "without loading the ns" is concerned, what you can do is put ex.second in the :aot of project.clj.
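To see the moving parts in one place, here is a minimal, self-contained sketch of the registry pattern from the question, collapsed into a single file (keywords stand in for the question's fn1/fn2/fn3). In a real project the (add-service ...) call sits at the top level of ex.second and only runs when that namespace is actually loaded:

```clojure
;; ex.first, collapsed into one file for illustration
(def services (atom []))

(defn add-service [fns]
  (swap! services conj fns))

;; in ex.second this call is a top-level form; it executes when the
;; namespace is loaded (via require/use, or at startup with AOT),
;; not merely because the file exists on the classpath:
(add-service [:fn1 :fn2 :fn3])

@services
;; => [[:fn1 :fn2 :fn3]]
```

So the fix is to make sure something causes ex.second to be loaded in the JVM that reads the atom, e.g. a (:require [ex.second]) in the ns that consumes @services.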
Related
I want to create an uberjar of a leiningen app. My config is:
:uberjar {:omit-source true
:aot :all
:uberjar-name "myapp.jar"
:source-paths ["env/prod/clj" ]
:resource-paths ["env/prod/resources"]}
But upon doing lein uberjar, I find that the files in the project are being compiled, but compilation has been stuck on the file that contains most of the code for ten minutes and counting. This file doesn't contain more than 140 lines.
TL;DR: never def side-effects
As stated in the comments:
... I just figured that this line: (defonce server (http/start-server server-handler {:port 8982})) is causing the hang.
Don't put stuff like that at top-level.
defonce only means it will not be re-def-ed once it's there (so in this case
it would prevent a "port already in use" error on reloading).
Ways out of that dilemma
Write a function that starts this server. Then call that from your main. For
development you can run that function from the REPL, or you can sprinkle some
reload/restart logic into your user ns.
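A sketch of that shape (http/start-server and server-handler are from the question; the atom and the :running keyword are stand-ins so the example is self-contained):

```clojure
;; hold the running server so it can be stopped/restarted from the REPL
(defonce server (atom nil))

(defn start-server! []
  ;; stand-in for (http/start-server server-handler {:port 8982})
  (reset! server :running))

(defn stop-server! []
  ;; a real server handle would need (.close @server) or similar here
  (reset! server nil))

(defn -main [& args]
  (start-server!))

;; from the REPL during development:
;; (start-server!) ... make changes ... (stop-server!) (start-server!)
```

Nothing blocking happens at load time; the server only starts when -main (or start-server!) is actually called.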
Another option could be using delay: it will only execute the code once it gets dereferenced.
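A sketch of the delay variant (the counter stands in for the blocking http/start-server call so the example is self-contained and observable):

```clojure
(def start-count (atom 0))

(defonce server
  (delay
    ;; stand-in for (http/start-server server-handler {:port 8982})
    (swap! start-count inc)
    :server-handle))

;; at load/compile time the body has not run:
@start-count ;; => 0

;; the first deref executes the body; later derefs reuse the cached value:
@server      ;; => :server-handle
@server
@start-count ;; => 1
```

This keeps the defonce top-level form cheap to compile while still giving you a single shared server value at runtime.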
The more "binding of resources" you have to deal with, the more a systematic
approach will give your application better structure. E.g. take a look at:
weavejester/integrant
stuartsierra/component
tolitius/mount
So why is putting blocking things or side-effects in a def problematic?
The way the Clojure compiler works is by actually "running" the code. So compiling basically is:
enable generation of byte code and write it out as .class files
load the namespace and "run" it
This means, that at compile time, the top level side-effects are executed.
Therefore blocking operations in a def will block compilation (which is quite
obvious), or your CI server will fail to compile because it cannot connect to
the database etc.
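A contrived one-liner makes the point: the body of a def runs when the namespace is loaded, and loading is exactly what AOT compilation does:

```clojure
(def evaluated-at-load
  (do (println "this runs during loading / AOT compilation")
      42))
;; compiling or requiring this namespace prints the line above;
;; evaluated-at-load is then bound to 42
```

If that do block were a blocking server start instead of a println, lein uberjar would hang right there.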
A great explanation of how the code generation in Clojure works:
What Are All These Class Files Even About? And Other Stories - Gary Fredericks
I'm using tools.namespace to provide smart reloading of namespaces on the REPL. However, when calling refresh or refresh-all, it throws an error.
user=> (require '[clojure.tools.namespace.repl :as tn])
user=> (tn/refresh)
:reloading (ep31.common ep31.routes ep31.config ep31.application user ep31.common-test ep31.example-test)
:error-while-loading user
java.lang.Exception: No namespace: ep31.config, compiling:(user.clj:1:1)
And it seems to end up in this weird state where (require 'ep31.config) works without an error, but afterwards the namespace isn't actually defined.
I kind of figured this out, this seems to be a combination of circumstances
there were AOT compiled classes left in target/classes from doing lein uberjar previously
tools.namespace doesn't function correctly when loaded namespaces are AOT compiled
target/classes is by default on the classpath
So long story short: if you did a jar/uberjar build before, remove target/ (lein clean does this) and things should start working again.
The question I haven't been able to solve yet is why target/classes is on the classpath to begin with. I'm suspecting it's being added by Leiningen, but haven't found yet where or why it's happening.
I learned this the hard way; the documentation for :target-path says (https://github.com/technomancy/leiningen/blob/master/sample.project.clj#L309-L313):
;; All generated files will be placed in :target-path. In order to avoid
;; cross-profile contamination (for instance, uberjar classes interfering
;; with development), it's recommended to include %s in your custom
;; :target-path, which will splice in names of the currently active profiles.
:target-path "target/%s/"
I guess there has to be legacy reasons that :target-path "target/%s/" isn't the default.
How to create a REPL for Clojure for which the code is reloadable?
I can create a new project, and get a REPL up and running:
lein new app stack
cd stack
lein repl
(-main)
Doing the above should get you "Hello, World!".
I would like to stay in the REPL, change the code to println "Howdy partner!", and then just (-main) again. Either auto-reloading or (perhaps even better) simple manual reloading (for instance with a command like (r)) would make the environment complete.
It seems with lein I'm already getting into the right namespace (any namespace other than user, from which you'd otherwise have to (in-ns 'some-ns), is the right namespace!). The only unanswered part is code reloading - either manual or auto.
As it happens I previously asked how to do this with boot.
For manual reloading you can use the same trick as posted in the boot answer, which is to have a function that does the reloading for you:
(defn r [] (require 'stack.core :reload))
Once the above function is part of the stack.core namespace, you can call it from the REPL. Pretty simple - the namespace stack.core has a function in it which reloads itself.
Make code changes from the editor, reload with (r), then run again...
There's also the lein-autoreload plugin for automatic reloading.
I'm working on a library that finds dependencies within the source tree during application startup and I'm trying to write an integration test to ensure it works. I've got fixture files in my test namespaces, and the test starts and succeeds just fine.
To be sure that the tests don't affect future runs, I added an "after" handler (in midje) that uses remove-ns to remove the test fixture namespaces.
On the next load, the tests fail because the namespaces are missing.
It seems as if remove-ns not only removes the namespace, it makes it impossible to use require to load it into the same running VM afterwards. I note that there is a "use with caution" note on remove-ns with no explanation.
I've verified that manually running require is indeed unable to reload a namespace that has been removed:
user=> (test.util.fixtures.A/f)
{:item 1}
user=> (remove-ns 'test.util.fixtures.A)
#<Namespace test.util.fixtures.A>
user=> (test.util.fixtures.A/f)
ClassNotFoundException test.util.fixtures.A
user=> (require 'test.util.fixtures.A)
nil
user=> (test.util.fixtures.A/f)
ClassNotFoundException test.util.fixtures.A
Anyone understand why this is happening?
I traced through the source, and it turns out that require calls load-libs, which in turn calls load-lib, which checks a global ref (the line is loaded (contains? @*loaded-libs* lib)).
Reading further, it seems that once something is loaded, you can pass the :reload option to the library loader. I remembered seeing :reload, so the solution was to put :reload in the require:
user=> (require 'test.util.fixtures.A :reload)
nil
user=> (test.util.fixtures.A/f)
{:item 1}
While my lein new app project runs merrily inside Light Table, lein uberjar won't work. Curiously, it behaves exactly like a classic Pascal compiler: it can't resolve references ahead of definitions. Another curiosity: yesterday it worked. I am not aware of fooling with anything sensitive.
Google says these symptoms are quite commonplace; I tried whatever helped other people in the same (?) plight, but to no avail. By the way, they usually blame it on software bugs: "get the latest version of Leiningen and Clojure". I've got 2.5.0 and 1.6.
The project (main file) is here: https://github.com/Tyrn/pcc/blob/master/src/pcc/core.clj
As it is, parsed-args can't be resolved inside build-album; if I move the -main function to the top of the file, 'No such var' happens to cli-options inside -main. No amount of fiddling with explicit use of namespaces makes any difference.
Again, inside the Light Table everything runs fine.
Using def inside of a function is not idiomatic, especially if there is no reason to have a global variable. Just pass the value as a function parameter:
(let [parsed-args (parse-opts ...)]
  ...
  (build-album parsed-args))
If you really need global state, you can use e.g. a promise (alternatively, an atom):
(defonce parsed-args (promise))
...
(deliver parsed-args (parse-opts ...))
However, Clojure files are read from top to bottom, and yes, functions not having access to vars introduced later in the file is by design. You can use declare to tell the compiler what to expect:
(declare ^:dynamic *parsed-args*)
(defn build-album ...)
(def ^:dynamic *parsed-args* ...)
...
(binding [*parsed-args* (parse-opts ...)]
  (build-album))
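A minimal runnable illustration of declare (the names here are made up for the example):

```clojure
(declare greeting)            ; forward declaration: the var exists but is unbound

(defn greet []                ; compiles fine even though greeting is defined below
  (str "Hello, " greeting))

(def greeting "world")        ; the var is resolved when greet is called, not compiled

(greet)
;; => "Hello, world"
```

declare fixes the compile-order error, but as noted above, passing the value as a parameter is usually the cleaner design.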
TL;DR: If not necessary, avoid global state; if necessary, minimize it.