Running Leiningen for Storm projects offline - clojure

I am trying to compile some projects on the Twitter Storm platform using Leiningen. The servers I am working on do not have access to the internet. Is it possible to work offline by building a local repository with all the dependencies downloaded in advance?

The short answer is yes: put the dependencies where lein expects to find them, and it will use them instead of going out to download them.
The answer to this question pretty much points to how to do it:
Run lein deps on a machine that is connected to the Internet
Copy $HOME/.m2/repository to your server
That should take care of it. I haven't tried this myself, though, so there may be some problem with the method that I haven't foreseen.
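Concretely, the transfer might look something like this (a minimal sketch; rsync and the user@server destination are placeholders for whatever copy mechanism you have):

# On a machine with internet access, from the project directory:
lein deps                  # populates $HOME/.m2/repository
# Copy the populated repository to the offline server:
rsync -a $HOME/.m2/repository/ user@server:~/.m2/repository/

Once everything is in place you can also add :offline? true to project.clj (or a profile) to stop Leiningen from checking the network for dependencies.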

Starting my first Django project on an Ubuntu VPS

I have been thinking long and hard about making this post, and after hours of Google searches I couldn't come up with any good sources, so I thought I'd ask here.
I am relatively new to coding; I started early this year and have begun a software programming degree, so I am super keen to learn. I managed to make a fully working project on shared hosting, but it didn't allow me to use the latest packages and modules, which is why I upgraded to a VPS. Still, editing my project through cPanel on shared hosting was a lot less scary than what I'm attempting now.
I've recently purchased a VPS to host my first Django project, which I'm building for my father. The project is basically a gallery that lets him upload blog posts and images. I had a standard shared hosting plan, which was fine, but I couldn't use the latest Python and Django on it.
So what I want to ask is: what is the common practice for starting a project on Ubuntu? In my head it was building it in VS Code, transferring it to the Ubuntu server, linking it to my domain, and BAM.
I've found writing code directly in the Ubuntu terminal very tedious and difficult (using cd to move in and out of folders, copy and paste not working, etc.), so is writing it on my local PC using VS Code acceptable?
How are static files stored when my father uploads his images and blog posts? Are they stored on the VPS, or do I need to link something like AWS, which is what I really wanted to avoid by getting a VPS?
I would even appreciate just a step-by-step list of a common procedure for a project like the one I have described.
I appreciate anybody and everybody who is willing to give up some time to help me here.
Many thanks.
Try this guide. It is a beginner-friendly approach to setting up Django:
https://www.digitalocean.com/community/tutorials/how-to-set-up-django-with-postgres-nginx-and-gunicorn-on-ubuntu-22-04
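On the question of where uploads live: by default Django writes uploaded files to the local disk under whatever MEDIA_ROOT you configure, so a single VPS is enough and AWS is optional. A minimal sketch, with a made-up path and a hypothetical model:

# settings.py -- user uploads are written to the VPS's own disk
MEDIA_URL = "/media/"
MEDIA_ROOT = "/var/www/myproject/media"  # hypothetical path on the VPS

# models.py -- files from this field land under MEDIA_ROOT/gallery/
from django.db import models

class GalleryImage(models.Model):
    image = models.ImageField(upload_to="gallery/")

In production you would then point nginx (as in the guide above) at MEDIA_ROOT so it serves /media/ directly.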

Can ember-cli watch and build automatically without running the server?

The title is pretty much my question. I'm serving the dist directory differently and would still like the benefit of auto-builds, but I don't need to run the server. I looked in the docs and the CLI help but didn't see anything specific. I know the CLI help doesn't contain everything, because it doesn't list ember build, which is available.
If I understand correctly, you want the ember build command to watch for changes in the file tree and rebuild on a change?
They implemented ember build --watch a while back, which triggers a rebuild when a file changes. I tested it just now and it worked on 0.2.7; I'm not sure which version introduced it, though. Let me know if this is not the answer you are looking for.
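For example, something like this (a sketch; dist is already the default output directory, the flag just makes it explicit):

# rebuild on every file change, without starting the dev server
ember build --watch --output-path dist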

Working with multiple Leiningen projects

I'm building a web app that uses http-kit and ClojureScript. At some point I want to separate the front end and back end into their own lein projects. The scenario is:
For the front end, in development mode, a lein-ring server serves the directory and the app displays mock data.
The back end serves whatever is in resources/public of the front-end lein project.
I'm thinking of doing a nested lein project but am not sure how to handle it. Any suggestion or pointer is very much appreciated.
The way your question is posted, it is not very clear why you need two projects. It seems that you may be confusing different projects with different profiles, e.g. different classpaths.
In order to work with multiple lein projects at once, similar to the way you would work with a master pom and child poms in Maven, there is lein-sub.
But in your case, it seems that what you want is different lein profiles and classpaths. Take a look at the lein profiles documentation; it should give you the "in dev do this, in test do this, in production do this" behavior you are after.
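A minimal project.clj sketch of that idea; the versions, handler name, and mock-resources path are assumptions for illustration, not something prescribed by the docs:

(defproject myapp "0.1.0-SNAPSHOT"
  :dependencies [[org.clojure/clojure "1.8.0"]
                 [http-kit "2.1.19"]]
  :profiles {:dev  {:plugins        [[lein-ring "0.9.7"]]
                    ;; hypothetical handler that serves the mock data
                    :ring           {:handler myapp.dev/mock-handler}
                    ;; extra classpath entry used only in development
                    :resource-paths ["mock-resources"]}
             :prod {:resource-paths ["resources"]}})

With something like that, you would run lein ring server during development (the :dev profile is active by default) and lein with-profile prod uberjar for a production build.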

How do you configure proprietary dependencies for Leiningen?

We're working on a project that has some Clojure-Java interop. At this point we have a single class with a variety of dependencies that we put into a user library in Eclipse for development, but of course that doesn't help when using Leiningen (2.x). Most of our dependencies are proprietary, so they aren't in a public repository somewhere.
What is the easiest/right way to do this?
I've seen leiningen - how to add dependencies for local jars?, but it appears to be out of date.
Update: So I made a local maven repository for my jar following these instructions and the lein deployment docs on github, and edited my project.clj file like this:
:dependencies [[...]
               [usc "0.1.0"]]
:repositories {"usc" "file://maven_repository"}
where maven_repository is under the project directory (hence not using file:///). When I ran lein deps, I got this message:
Retrieving usc/usc/0.1.0/usc-0.1.0.pom from usc
Could not transfer artifact usc:usc:pom:0.1.0 from/to usc (file://maven_repository): no supported algorithms found
This could be due to a typo in :dependencies or network issues.
Could not resolve dependencies
What is meant by "no supported algorithms found" and how do I fix it?
Update 2: I found the last bit of the answer here.
Add them as dependencies to your Leiningen project. You can make up the names and versions.
Then run lein deps; the error message when it fails to find them will give you the exact command to run to install the jar into your local repo. Should you later decide to use a shared repo, you can use this same process to put your dependencies there.
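For reference, installing a jar into the local ~/.m2 by hand looks roughly like this (coordinates taken from the question above; the jar path is an assumption):

mvn install:install-file -Dfile=lib/usc.jar \
    -DgroupId=usc -DartifactId=usc \
    -Dversion=0.1.0 -Dpackaging=jar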
Arthur's answer is good, but I figured I'd flesh it out a bit more since it leaves some details out.
Always keep repeatability in mind. If you don't make it so that anyone who needs access to the artifacts can get them in a standard way, you're asking for support hell.
The documentation on deployment is a good place to find out everything you need to know about deploying your artifacts. Since you're in a polyglot environment, you probably can't have lein take care of deploying all your artifacts, but at least you can get your Clojure-specific jars up into S3, or even onto a file share if you like. The rest of your artifacts will have to use Maven or Ant directly to upload to the Maven repo on the file server or S3. At my current company we are using technomancy's excellent s3-wagon-private to great effect for hosting our closed-source artifacts, and Clojars for hosting anything that we can open-source.
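To give a flavor of the S3 route, the project.clj side is roughly this (the bucket name and version are placeholders; check the s3-wagon-private README for the current credential setup):

;; project.clj -- resolve/deploy artifacts via a private S3 bucket
:plugins [[s3-wagon-private "1.1.2"]]
:repositories {"private" {:url        "s3p://my-private-bucket/releases/"
                          :username   :env   ;; credentials come from the environment
                          :passphrase :env}}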
What Arthur is referring to is doing a lein install. All that does is install a copy of the current project into your local .m2 directory so that other projects on your box can reference it. Unless you have configured your install of Maven to use a shared directory for your .m2 folder (maybe not a bad idea in your environment?), this means that anyone else who checks out your project will not be able to build it. If you want to go this route, you need to set the localRepository node in your $M2_HOME/conf/settings.xml to the shared location that the rest of your team has access to. See the docs for more information.
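That settings.xml change is a one-liner; the path here is a made-up example of a shared mount:

<!-- $M2_HOME/conf/settings.xml -->
<settings>
  <localRepository>/mnt/shared/m2-repository</localRepository>
</settings>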
YMMV, but I've found it best to use Maven rather than Leiningen when you are working with polyglot Clojure/Java projects.
That's mainly because the Java-based tools (Eclipse etc.) understand Maven projects but don't really understand Leiningen projects. It's slowly getting better with the excellent Counterclockwise Clojure plugin, but the integration still isn't quite good enough for an efficient IDE-based workflow.
On the repository side of things, I'd suggest setting up a private shared Maven repository. You're going to need it sooner or later if you plan to manage a complex set of dependencies within your team, so you might as well bite the bullet and get it done now.

Django Compressor on a multi-server deployment

I've been fortunate enough to discover django_compressor and have implemented it within our stack, which deploys to many servers (currently six, but growing as we deploy smaller virtual machines).
Now this is all fine and dandy if you're using django_compressor at its finest: compressing raw CSS/JS code.
However, say I now want to introduce some type of pre-compiler. Let's say for this example it is LESS (CSS). The thought process is fairly simple:
Install node, npm, and the less package onto the server.
Add less to your precompilers!
COMPRESS_PRECOMPILERS = ( ('text/less', 'lessc {infile} {outfile}'), )
Now you deploy, and your server compiles the LESS files. Everything is fantastic!
Now add 8 more servers to that, and you have to install node, npm, and less on each of them?
This is where something doesn't seem right, and I feel like I'm missing something. I believe the Django community has run into this problem before.
My thoughts thus far have been:
Use a post-commit hook to compile the CSS on the developer's machine. This means that, via django_compressor, we link to the compiled static file in the HTML, and our repository contains both the compiled and non-compiled versions. My only concern is that this forgoes half of the benefits of django_compressor and may be tedious for developers.
Suck it up and make node, npm, and less part of the server stack.
Update
I did some additional looking around, and it seems that using the COMPRESS_OFFLINE flag (or just --force) with the management command will produce an offline manifest file that does what I need (only tested locally). So setting this up with a pre-deploy hook looks to be the answer.
Of course, still open to other ideas :-)
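For reference, the pieces involved are roughly these (only tested locally; the pre-deploy hook itself is whatever your deploy tooling provides):

# settings.py
COMPRESS_ENABLED = True
COMPRESS_OFFLINE = True  # serve from the pre-built manifest instead of compressing per request

# pre-deploy step, run on one machine that has node/npm/lessc installed:
#   python manage.py compress --force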
Coupled with the tips in the comments about COMPRESS_OFFLINE, you could look at django-staticfiles' storage backends. You can host the static files on Amazon S3, for instance; hosting them all on one static-hosting server and using that from all your servers could also be a nice solution. You wouldn't need to do anything with the static (and compressed) files on the individual servers.
An alternative solution for the multiple servers: I've made a custom Fabric (docs.fabfile.org) script that installs/configures stuff on our servers. I've only recently started using CoffeeScript and LESS, but those two are definitely ending up in my fabfile. That solves the installation problem for me.
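A tiny sketch of what such a task might look like (Fabric 1.x style; the package names are assumptions and vary by distro):

# fabfile.py -- install the LESS toolchain on a server
from fabric.api import sudo

def install_less():
    sudo("apt-get install -y nodejs npm")  # assumes a Debian/Ubuntu host
    sudo("npm install -g less")            # provides the lessc binary

You would then run it against all servers with something like fab -H server1,server2 install_less.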
(Alternatives to a fabfile are things like a custom Debian package with standard dependencies, or Chef or Puppet or something similar.)
You can use Puppet for the task.