Is it feasible to add only this one Leiningen namespace as a dependency in a project (standard project and not lein template)?
I found that this namespace originated in lein-newnew (now deprecated), which means that at one time this was possible.
I know that I could add the whole of Leiningen as a dependency and refer only to the namespaces I need, but that doesn't seem optimal - all of Leiningen would be packed into the uberjar when I need just a few functions from one namespace.
As far as I know, there is no way to just import a namespace from a project (ex. from clojars, maven etc.). It would probably be quite a bit of trouble (think about dependencies, eventual configuration, the namespace may not even be part of the public api...).
So I would either:
depend on the whole other project; fortunately, in your case you can depend on leiningen-core
copy-paste the code into your project (assuming the respective licences allow it of course). This has the added benefit that you will be able to modify it to suit your needs.
You can actually see an example of this "in the wild", in the boot-new repository.
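If you take the first route, the dependency is an ordinary entry in project.clj. A minimal sketch (my-app and the version numbers here are placeholders - check Clojars for current versions):

```clojure
;; project.clj - sketch of the first option; the leiningen-core version
;; shown is illustrative, not a recommendation
(defproject my-app "0.1.0-SNAPSHOT"
  :dependencies [[org.clojure/clojure "1.8.0"]
                 ;; pulls in all of leiningen-core, but you can require
                 ;; just the namespaces you actually need from it
                 [leiningen-core "2.7.1"]])
```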
Related
I have just started with Clojure and have a very basic question about core.clj.
What is the convention for it? What code should go there? The public API?
It is generated by leiningen when creating a project.
I have looked into the source code of some libraries and this file is also present there (one per package?).
Thanks
There's no defined meaning for it, but in an application it's often where you'll find the entry point, the -main function. For libraries, foo.core is often the namespace users will require to get your main functionality.
You don't have to do it that way, but it's a semi-predictable place to have the 'central bit' of whatever it is you're writing - even if your actual logic and algorithmic code is somewhere else.
Leiningen generates foo.core because it needs to pick a name, and core is generic enough that it probably won't be wrong. It's a style decision, but I typically opt to rename core.clj to a name that is actually meaningful for my project.
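For example, in an application project the core.clj that Leiningen generates typically grows a -main entry point along these lines (a minimal sketch; foo is just the placeholder name from lein new foo):

```clojure
(ns foo.core)

(defn -main
  "Entry point; `lein run` calls this when :main is set to foo.core in
   project.clj. Add (:gen-class) to the ns form if you AOT-compile it
   into an uberjar."
  [& args]
  (println "Hello, World!"))
```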
It's just a template file emitted by Leiningen. If I run lein new foo there is no specific or standard meaning to the namespace foo.core. It is entirely legal for the namespace foo to be the main API container for a project. It's just a free file for you to start working in.
That said, if you have a project foo, it is expected although not enforced that all your code for that one project exist in the foo.* namespace. Leiningen will allow you to build a project with the files src/foo.clj and src/bar.clj and it is entirely reasonable for foo to require bar, however when packaging your code for distribution this is probably a bad idea.
I have traditionally used the same folder structure for production and test code as demonstrated below:
src/myproject/core.clj
test/myproject/core_test.clj
For test files I have added _test in the filename.
I recently noticed that several projects follow this structure (this is also what Leiningen generates by default):
src/myproject/core.clj
test/myproject/test/core.clj
Is there a convention regarding this or some clear advantage of using one over the other?
I believe this is just convention - I don't think there is any technical advantage either way.
I personally prefer the first version for entirely non-technical reasons:
It seems redundant to have two "test" directories in the path.
It can cause confusion to have the test .clj files with the same names as the main .clj files
Sometimes you want to create tests that don't perfectly line up with specific namespaces, e.g. full_system_test.clj for end-to-end testing
It's easier to pattern-match on all the *_test.clj files
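As a concrete illustration of the first layout, a file at test/myproject/core_test.clj following the _test convention might look like the sketch below (myproject and the add function are hypothetical placeholders):

```clojure
(ns myproject.core-test
  (:require [clojure.test :refer :all]
            [myproject.core :as core]))

;; Note: the filename is core_test.clj (underscore) while the namespace
;; is myproject.core-test (hyphen).
(deftest add-test
  (is (= 4 (core/add 2 2))))
```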
Also worth noting that the Maven standard directory layout convention is also used in quite a few Clojure projects (this can be handy if you build polyglot projects that also contain Java source code):
src/main/clojure/myproject/core.clj
src/test/clojure/myproject/core_test.clj
src/main/resources/....
src/test/resources/....
Using TDD, I'm considering creating a (throw-away) empty project as a test harness/container for each new class I create, so that it exists in a little private bubble.
When I have a dependency and need to get something else from the wider project then I have to do some work to add it into my clean project file and I'm forced into thinking about this dependency. Assuming my class has a single responsibility then I ought not to have to do this very much.
Another benefit is an almost instant compile / test / edit cycle.
Once I'm happy with the class, I can then add it to the main project/solution.
Has anyone done anything similar before or is this crazy?
I have not done this in general - created an empty project just to test a new class - although it could happen if I don't want to modify the current projects in my editor.
The advantages could be:
you're sure not to modify the main project, or commit to it by accident
the dependencies are known with certainty to be none
The drawbacks could be:
it costs some time ...
as soon as you want to add one dependency on your main project, you instantly get all the classes in that project ... not what you want
thinking about dependencies is usual; we normally don't need an empty project to do so
some tools check your project dependencies to verify they follow a set of rules; it could be better to use one of those (as they can be used not only when starting a class, but also later on)
the private-bubble concept can also be expressed through import statements
current development environments on current machines already give you very fast operations ... if not, you could do something about it (tell us more ...)
when you're done, you would need to copy your main class and your test class into your regular project. This can cost you time, especially as the package might not be appropriate (the simplest possible one works in your early case because your project is empty, but later it has to fit your regular project).
Overall, I'm afraid this would not be a timesaver... :-(
I have been to a presentation on Endeavour. One of the concepts it depended on heavily was decoupling, as you suggest:
each service in a separate solution with its own testing harness
Endeavour is, in a nutshell, a powerful development environment / plugin for VS which helps achieve these things. Among a lot of other stuff, it also hooks into / creates a nightly build from SourceSafe to determine which DLLs build, and places those in a shared folder.
When you create code which depends on another service, you don't reference the VS project but the compiled DLL in the shared folder.
By doing this a few of the drawbacks suggested by KLE are resolved:
Projects depending on your code reference the DLL instead of your project (build time win)
When your project fails to build it will not break integration; they depend upon a DLL which is still available from last working build
All classes in the other project visible - no longer an issue
Middle ground:
You REALLY have to think about dependencies, more than in 'simple' setups.
Still costs time
But of course there is also a downside:
it's not easy to detect circular dependencies
I am currently in the process of thinking about how to achieve the benefits of this setup without the full-blown install of Endeavour, because it's a pretty massive product which does a great deal (not all of which you will need).
Our team is struggling with issues around the idea of a library repository. I have used Maven, and I am familiar with Ivy. We have little doubt that for third-party jars, a common repository that is integrated into the build system has tremendous advantage.
The struggle is around how to handle artifacts that are strictly internal. We have many artifacts that use a handful of different build tools, including Maven, but essentially they are all part of one product (one team is responsible for all of it). Unfortunately, we are not currently producing a single artifact per project, but we're headed in that direction. Developers do and will check out all the projects.
We see two options:
1) Treat all artifacts, even internal ones, as you would any third-party jar. Each jar gets built and published to the repository, and other artifact projects refer to the repository for all dependencies.
2) Each project refers to other "sibling" projects directly. There is a "master project" that triggers the builds of all the other projects in an appropriate dependency order. In the IDE (Eclipse), each project refers to its dependent projects (source) directly. The build tools look into the sibling project for the referenced .jar.
It's very clear that the open-source world is moving towards the repository model. But it seems to us that their needs may be different. Most such projects are very independent and we strongly suspect users are seldom making changes across projects. There are frequent upgrades that are now easier for clients to track and be aware of.
However, it does add a burden in that you have to separately publish changes. In our case, we just want to commit to source control (something we do 20-50 times a day).
I'm aware that Maven might solve all these problems, but the team is NOT going to convert everything to Maven. Other than Maven, what do you recommend (and why)?
It's not necessary to choose only one of your options. I successfully use both in combination. If a project consists of multiple modules, they are all built together, and then delivered to the repository. However, the upload only happens for "official" or "release" builds. The ongoing development builds are done at the developers' machines. You don't have to use Maven for this. Ivy would work or even a manual process. The "artifact repository" could be one of the excellent products available or just a filesystem mount point.
It's all about figuring out the appropriate component boundaries.
When you say "developers do and will check out all projects", that's fine. You never know when you might want to make a change; there's no harm in having a working copy ready. But do you really want to make every developer build every artifact locally, even if they do not need to change it? Really, is every developer changing every single artifact?
Or maybe you just don't have a very big product, or a very big team. There's nothing wrong with developing in a single project with many sub-projects (which may themselves have sub-modules). Everybody works on everything together. A single "build all" from the top does the job. Simple, works, fast (enough). So what would you use a shared repository for in this case? Probably nothing.
Maybe you need to wait until your project/team is bigger before you see any benefit from splitting things up.
But I guarantee you this: you have some fundamental components, which are depended on (directly or indirectly) by many projects but do not themselves depend on any internal artifacts. Ideally, these components wouldn't change very much in a typical development cycle. My advice to you is twofold:
set up an internal repository since you already agree that you will benefit from doing so, if only for third-party jars.
consider separating the most fundamental project into a separate build, and delivering its artifact(s) to the repository, then having the rest of the system reference it as if it were a third-party artifact.
If a split like this works, you'll find that you're rebuilding that fundamental piece only when needed, and the build cycle for the rest of the project (for your developers) is faster by a corresponding amount: win-win. Also, you'll be able to use a different delivery schedule (if desired) for the fundamental component(s), making the changes to it more deliberate and controlled (which is as it should be for such a component). This helps facilitate growth.
If a project produces multiple jar files (and many do) I would carefully consider (case by case) whether or not each jar will ever be reused by other projects.
If yes, that jar should go into the repository as a library, because it facilitates reuse by allowing you to specify it as a dependency.
If no, it would be a so-called "multi-project" in which the whole project is built of the parts. The individual jars probably do not need to be stored individually in the repo. Just the final completed artifact.
Gradle is definitely a candidate for you: It can use Ant tasks and call Ant scripts, understands Maven pom files, and handles multi-projects well. Maven can do much of this as well, and if you have lots of patience, Ant+Ivy can.
Simple question. I'm new to Clojure.
How can I use one file from my project in another file? Basically, how can I include, import, or require another file? Not from libraries, but from my own code.
Thanks,
Alex
Normally you'll want to use the same method that you use with library code, which is to use / require your namespaces (through an ns form at the top of the file and sometimes the use / require functions at the REPL). For this to work, you have to make sure they are on the classpath. A short guide to that:
Follow the usual Clojure project structure: a src/ directory containing all your source files, where the file src/foo/bar/baz.clj defines a namespace called foo.bar.baz. Note that you must maintain the correspondence between the directory structure and the namespace name; things won't work otherwise. Also note the underscore/hyphen rule: where a namespace name contains a - (hyphen), the corresponding filename must use a _ (underscore), and the other way around - so the namespace foo-bar.baz lives in the file foo_bar/baz.clj. Finally, the directory hierarchy will be slightly more complicated with Maven projects, but don't worry about this for now (unless you're already a proficient user of Maven, in which case this won't be a problem for you).
Also see this answer of mine to an earlier SO question about Java classpath handling with Clojure for a more detailed step-by-step explanation of the filesystem hierarchy / classpath hierarchy correspondence.
If your code from the foo.bar namespace needs to use code from the foo.quux.baz namespace, do something like (ns foo.bar (:require [foo.quux.baz :as baz])) in foo/bar.clj and call functions from baz as baz/some-function. Or you can put (:use foo.quux.baz) in the ns form instead to call them directly (without the namespace qualifier, e.g. some-function). That's exactly the same thing as what you'd do for library code.
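To make that concrete, here is a sketch of the two files (foo.quux.baz and some-function are the placeholder names from above, and wrapper is a hypothetical caller):

```clojure
;; src/foo/quux/baz.clj
(ns foo.quux.baz)

(defn some-function [x]
  (inc x))

;; src/foo/bar.clj
(ns foo.bar
  (:require [foo.quux.baz :as baz]))

(defn wrapper
  "Hypothetical function calling the sibling namespace through its alias."
  [x]
  (baz/some-function x))
```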
When working with your project's code from the REPL, make sure you include the src/ directory (the directory itself, not any of the files therein) on the classpath. You should probably consider using some tool to automate the REPL setup (including classpath management) for you; Leiningen is very popular with Clojurians and there are plugins for using Maven with Clojure too.
Warning: Your JVM-launching command might (in fact, probably will) recognise an environment variable called $CLASSPATH. As for its relationship to your Clojure projects, well, basically there should be none. More likely than not, your projects will require a different classpath each, with some possibly using versions of certain jars incompatible with those required by others (notably if you're using Clojure 1.1 -- latest stable release -- for some projects, while experimenting with 1.2 snapshots with others). Thus the correct way of managing the classpath is to prepare a minimal version for each project and pass that to the JVM launching command. As mentioned previously, you should invest some time in learning to use a good tool (like the above mentioned Leiningen) to set up the classpath for you as soon as possible so you don't need to care about this yourself.
(As a side note, you might have to add more than just the src/ directory and your jars to the classpath in some scenarios, e.g. if you plan on calling compile to produce .class files, you'll have to put the target directory on the classpath too. That's beyond the scope of this question, though.)
BTW, I've started this answer with the word "normally", because you could also use things like load & in-ns to split a single namespace into multiple files. Most of the time this won't be what you really want to do, though; just use a well thought out namespace layout instead.