Saleor comes with React by default. Can we have a 'basic' Saleor without frontend frameworks? - django

I wanted to use Saleor for a "single-type-product" e-commerce website. I installed Saleor yesterday, but it has a huge requirements file, including React.js and a lot of other frontend packages that I don't want. Is it possible to install Saleor without anything else? Or at least with as few extra requirements as possible?

Yes, you can simply install the backend and expose the GraphQL API endpoint to consume data from your front-end. Or go regular Django-style and use templated views. Simply clone the repo at https://github.com/mirumee/saleor.git, install the requirements, migrate, and run the Django server.
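In practice that boils down to something like the following; this is only a sketch, since the exact dependency and migration commands can differ between Saleor versions (newer releases manage dependencies with Poetry rather than a plain requirements file):

git clone https://github.com/mirumee/saleor.git
cd saleor
python3 -m venv venv && . venv/bin/activate
pip install -r requirements.txt    # assumes the checkout ships a requirements file; newer versions use Poetry
python manage.py migrate
python manage.py runserver         # the GraphQL API is then served at /graphql/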

Yes, you can install only the backend part of Saleor.
To do so, run git clone https://github.com/mirumee/saleor.git.
After that, create a docker-compose.yml file if you would like to run it with Docker Compose; otherwise you are good to go.
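If you do go the Docker Compose route, a minimal docker-compose.yml could look roughly like the sketch below. The service layout (PostgreSQL, Redis and an API container built from the repo's own Dockerfile) and the environment variables are assumptions, so adjust them to whatever your Saleor version actually expects:

cat > docker-compose.yml <<'EOF'
version: "3"
services:
  db:
    image: postgres:13
    environment:
      POSTGRES_USER: saleor
      POSTGRES_PASSWORD: saleor
  redis:
    image: redis:6
  api:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: postgres://saleor:saleor@db/saleor
    depends_on:
      - db
      - redis
EOF
docker-compose up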

Related

Deployment of Django in local network while continuing development

The question is in the title.
I need to deploy a Django application on a local network (I still don't know how to do that, but I suppose it's quite easy), but I also still need to develop it. My question is: how do I allow users to use the application while I'm still developing it?
Is it a solution to keep two versions of the application, one deployed and one in development? That way, I can replace the deployed application with the newly developed one when I finish coding it.
Another question concerns the database: can I still modify the database if I only add new models without touching the existing ones?
Thank you in advance,
This is a good blog entry that covers deploying Django using Heroku. I'll give you a quick rundown of what makes all the different technologies important:
Git
Git, or any other version control system, is certainly not required. Why it lends itself to deploying Django projects is that you're usually distributing your application by source, i.e. you're not compiling it or packaging it as an egg. Usually you'll organize your Git repository such that updating your application on the server only requires you to do a checkout of the latest sources--nothing else.
virtualenv and pip
This, again, is not a strict requirement, but I'd strongly suggest you take the time to familiarize yourself with virtualenv and pip if you haven't already done so, since it's going to make deploying your Python applications across different runtime environments, local or remote, a breeze.
Basically, your project will need to have at least Django and Gunicorn available on the Python path, possibly even a database driver. What that means is that every time you try to deploy your application somewhere you'll have to install Python and do the easy_install dance all over.
virtualenv will replicate a Python installation, which in turn means that the new Python instance will, by default, have its very own Python path configuration relative to the installation. pip is like easy_install on steroids, since it supports checking out Python dependencies directly from code repositories and supports a requirements file format with which you can install and configure all of your dependencies in one fell swoop.
With virtualenv and pip, all you'd need is a simple text file with all your dependencies that can be parsed with pip, and an installed Python distribution on the machine. From there you just do git clone repo /app/path; easy_install virtualenv; virtualenv /app/path; . /app/path/bin/activate; pip install -r /app/path/requirements.txt. Voila, Gunicorn, Django and all other dependencies are then installed and available immediately. When you run the Gunicorn Django script with the Python instance in /app/path/bin, the script will immediately have access to the Gunicorn sources, and it will be able to locate your Django project, which will have access to Django and the other dependencies as well.
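For reference, that requirements file is just a plain list of packages, one per line; the package versions below are purely illustrative:

cat > /app/path/requirements.txt <<'EOF'
Django==3.2
gunicorn==20.1.0
psycopg2-binary==2.9.3   # database driver, if your project needs one
EOF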
Gunicorn
This is the actual Python application that will manage your Django instance and provide an HTTP interface that exposes it to HTTP clients. It'll start several worker processes, which will all be distinct Python virtual machines loaded with the sources of your application and its dependencies. The main Gunicorn process will in turn take charge of managing which worker processes handle which requests for maximum throughput.
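For example, a minimal invocation might look like this; the project module, worker count and port are illustrative (and older Gunicorn releases shipped a dedicated gunicorn_django command instead):

gunicorn --workers 3 --bind 127.0.0.1:8080 myproject.wsgi:application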
The basic principle of wiring Nginx and Gunicorn
The most important thing to observe is that Nginx and Gunicorn are separate processes that you manage independently.
The Nginx Web server will be publicly exposed, i.e. it will be directly accessible over the internet. For requests to static media, such as actual images, CSS stylesheets, JavaScript sources and PDF files accessible via the filesystem, Nginx will take charge of returning them in the response body to HTTP clients if you configure it to look for files on the path where you configured your project to collect static media.
Any other request should be proxied to your Gunicorn instance. It will be configured to listen for HTTP requests on a certain port on the loopback interface, so you'll be using Nginx as a reverse proxy to http://127.0.0.1:8080 for requests to your Django instance.
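To make that concrete, here is a sketch of the relevant Nginx configuration, written as a shell heredoc for illustration; the server name, file paths and upstream port are assumptions:

cat > /etc/nginx/sites-available/myproject <<'EOF'
server {
    listen 80;
    server_name example.com;

    # Serve collected static media straight from disk
    location /static/ {
        alias /app/path/static/;
    }

    # Everything else is proxied to the Gunicorn instance on the loopback interface
    location / {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
EOF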
This is the basic rundown for deploying your Django projects into production, and it should satisfy the needs of 95% of Django projects out there. While I did reference Nginx and Gunicorn, it's the usual approach when it comes to setting up any Web server to act as a reverse proxy to a Python WSGI server.

How to make a Django server portable?

My web server depends on nginx, Django, and a lot of Python dependencies. I'm wondering if there is a way to create a portable image/script that I can run on a new server to quickly get it up and running.
Is Docker relevant to this?
You should always use git to manage your code. With git you can get your Django project onto the other server quickly. But just that.
You also have to migrate your database. Every DB engine has dump options for doing this.
Do not forget to move your static assets. Probably you have all of them in one directory.
What about your nginx and database installation and configuration? This is where Docker is relevant.
With all of this, you should migrate successfully.
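To give an idea of what the Docker part could look like, here is a minimal sketch for the application container only; the base image, file names and WSGI module are illustrative, and nginx and the database would normally run as separate containers:

cat > Dockerfile <<'EOF'
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt   # assumes gunicorn is listed here
COPY . .
CMD ["gunicorn", "--bind", "0.0.0.0:8000", "myproject.wsgi:application"]
EOF
docker build -t myproject .
docker run -p 8000:8000 myproject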

Producing a WAR file from a django project with SQLite

I have a Django project (a Django app plus some other modules that are used from the Django one) that uses SQLite. This project is for a university course, and now I am asked to supply it in such a way that it may be installed on some server in our faculty. I'm not the one who's going to install it, and I will not be contacted in case of failure, so I am looking for the easiest, simplest way to package the project for installation.
I have come across django-jython which supposedly allows one to create WAR files from django projects. However, in the Database Backends section, it says:
SQLite3
Experimental. By now, use it only if you are working on improving it. Or if you are really adventurous.
My overall goal is to deliver this project and I would appreciate any helpful advice. In particular:
Is there another way to pack a django project into a WAR file that supports SQLite?
Is it safe to use SQLite with django-jython in spite of this warning? If so, then how?
Is there any other simple way to pack a django project so that it'll be a piece of cake to install?
If the above answers are "no", then what does it take to change the configuration of the project to use MySQL instead?
You should look into Fabric for easy deployment. I haven't used it myself, but I've heard good things.
I've also had good success quickly and easily setting up servers using Gunicorn with Nginx as a reverse-proxy.
As others have said, using virtualenv with pip can quickly get all your dependencies installed via a requirements.txt file.
Some of these blog posts may help:
Tools of the modern Python hacker - virtualenv, pip, fabric
Basic Django Deployment - virtualenv, pip, fabric, rsync
Easy Django Deployment - very quick nginx and gunicorn setup
Edit:
As I reread your post I saw your last bullet point/question. Django is designed to be loosely coupled, meaning that (in most cases) there shouldn't be a reason for an app to depend on SQLite vs MySQL. If you don't need to keep the data already in the db, changing to MySQL is as easy as starting a MySQL server on your machine and changing the settings.py of your Django project. This SO question may help
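Concretely, the switch is roughly the following; the database name, user and password are placeholders, and the DATABASES snippet shown in the comments is the piece that changes in settings.py:

pip install mysqlclient            # or another MySQL driver for Python
mysql -u root -p -e "CREATE DATABASE myproject CHARACTER SET utf8mb4;"
# Then point settings.py at it, roughly:
#   DATABASES = {
#       'default': {
#           'ENGINE': 'django.db.backends.mysql',
#           'NAME': 'myproject',
#           'USER': 'myuser',
#           'PASSWORD': 'secret',
#           'HOST': '127.0.0.1',
#           'PORT': '3306',
#       }
#   }
python manage.py migrate           # or syncdb on older Django versions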

Pydev + Django workflow. Local(test) + remote synchronization. Using git with django

I'm new to Django and my very first project is my blog. I wonder how Django developers who use PyDev normally synchronize with their remote hosting server when updating their sites?
I also would like to know, how do you combine usage of git with a django project? Should I just make a repository for the entire project?
At my company we've got an entire git repository for each project, including the Django sources that are put on the PYTHONPATH for each project, making Django versions project-dependent. The folder structure is something like:
/.git
/projectname/app1
/projectname/app2
/projectname/manage.py
/django-lib/django/...
As django-lib is not a Python module, we include both / and /django-lib in the PYTHONPATH. If your project is becoming large, you might want to consider using git submodules for your apps.
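As a sketch, with the repository checked out at an assumed /srv/repo, that path setup is just:

export PYTHONPATH=/srv/repo:/srv/repo/django-lib:$PYTHONPATH   # repo root for the project, django-lib for Django itself
python /srv/repo/projectname/manage.py runserver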
We've also set up several servers to support the developers. There's a testing server running a central testing database and a setup including Apache with WSGI to make testing on a real server possible, which is sometimes a bit different than the local manage.py the developers use before committing their changes.
The testing server is updated with the master branch of our git repository. We've made several scripts to allow all developers to do this without letting them log in to the server via SSH, but that is just during pre-release. After release, that server will become our staging server, and we'll remove all the scripts from it to make it just like our production server.
Every developer has set up their local project to make sure that it communicates with the central testing database, which contains some test data. I myself push my changes from the command line, but you could also use EGit for this.
When we've got a release, we put it in a separate branch, called 'release' (obviously), and the production server will pull only from that branch. This is done via SSH, but I don't really know what your server setup looks like, so I guess that last step is entirely up to you.
I hope that this has helped you a bit. I won't say that this is the best workflow possible, but it works for us and you should figure out what works for you.
Most experienced Django developers use pip (or distribute) and virtualenv to deal with all the Python packages you might need for your Django projects (including Django itself).
Personally, all I keep in my project's git repository is a bunch of segregated requirements lists generated by pip:
. ~/Dev/environs/$PROJECT_NAME/bin/activate
pip freeze > ./docs/requirements/main.list
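Recreating the environment elsewhere is then the reverse operation (the paths mirror the ones above and are otherwise arbitrary):

virtualenv ~/Dev/environs/$PROJECT_NAME
. ~/Dev/environs/$PROJECT_NAME/bin/activate
pip install -r ./docs/requirements/main.list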
I'm fairly sure most django developers would be familiar with Fabric, which I use for :
streamlining local interaction with git,
pushing to our central repository,
pulling from our production or test server,
touching the wsgi file on the relevant server to trigger a reload,
and pretty much any other kind of task you might find yourself using an SSH terminal session for.
For those cases where I need to make changes to someone else's Django application in order to make it work or suit our purposes, I:
fork it on github,
clone from my forked repo
make the changes
push it up to my own repo
and provide merge requests to the original repo owner
This way, I have a repo that my pip requirements lists can keep pulling from until the original application owner gets their own repo updated.
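In terms of commands, that workflow is roughly the following; the repository names are placeholders, and the requirements-list line at the end shows how pip can install straight from the fork:

git clone git@github.com:me/some-django-app.git
cd some-django-app
git remote add upstream git@github.com:original-owner/some-django-app.git
# ...make and commit the changes...
git push origin master
# and in the pip requirements list, point at the fork until upstream catches up:
#   -e git+https://github.com/me/some-django-app.git#egg=some-django-app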

How do I install an open source django-application?

I want to install an existing django app, djangopeople.net. The code is at http://github.com/simonw/djangopeople.net.
I installed django and I understand how to create a new django project.
But how do you deploy an existing app? I know how this works in Rails or Symfony, but I don't really get the django concept here.
Where do I put the files? Which scripts do I run?
Thanks for the steps.
Why is this any different from deploying your own applications? Just put them somewhere in your PYTHONPATH and set up mod_wsgi or whatever to serve them.
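A rough sketch of what that could look like for this particular repo; the presence of a requirements file, the settings it expects and the right sync/migrate command are assumptions I have not verified:

git clone https://github.com/simonw/djangopeople.net.git
cd djangopeople.net
virtualenv env && . env/bin/activate
pip install -r requirements.txt    # if the project ships one; otherwise install Django and its dependencies by hand
# adjust the project's settings (database, secret key, etc.), then:
python manage.py syncdb            # or migrate, depending on the Django version
python manage.py runserver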