Creation of an e-commerce platform [closed] - opencart

It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form.
Closed 10 years ago.
I am a web developer. I got a project to build a 'build your own site' concept platform, for example BuildaBazaar or Bigcommerce.
I tried Magento and AxisCart, but they were not user friendly or suited to my customer's needs. OpenCart is the one that meets all my customer's requirements; it is easily modifiable.
As I am building this for multiple customers, can you guide me on how to configure OpenCart to handle multiple customers, each with an admin panel that has all the admin panel options for that individual customer?

This is very easy. OpenCart supports multi-store setups: multiple stores on different domains served from one installation. You can even limit a user to a particular store, i.e. create users for that store's admin panel only. However, it is a bit tricky and can get messy if not done properly. Can you explain exactly what you wish to do and I will produce a step-by-step guide?
The other thing: although OpenCart is blazingly fast, scaling needs to be considered. We use OpenCart multi-stores for some of our projects, on Rackspace cloud servers with 4-6 OpenCart instances per server, load balanced, with a CDN for storage of static files and, of course, DNS load balancing. The memory I would recommend for each store is 2 GB, with a separate database cloud instance: 1 master DB at 16 GB and 4-5 load-balanced slaves at 4-8 GB each for read traffic.
This depends on traffic and how it grows. The current setup we have can handle up to 4,000 orders per day easily; we manage around 3,500 on average, so our application setup is always ready for high demand. We could essentially host OpenCart stores for users, as you want to do, without affecting our network.
Create a user registration form
Create and update the nginx/htaccess config with the subdomain info
Copy that info to an installer script based on that subdomain
Run the OpenCart install and configuration based on that subdomain and data
Send the user an email about CNAME records so they can have store.com rather than substore3434.yourecomservice.com
It would be something like this, with a lot of validation, cURL usage, and cron jobs; a rough sketch of the flow is below. I personally would use a cPanel/WHM Apache server (I know people are going to downvote me on this), but using nginx would be a nightmare, as cPanel automates installs using Fantastico... you can move to nginx later if your project is a success.
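For illustration only, here is a minimal Python sketch of that provisioning flow (every path, domain, and endpoint below is a hypothetical placeholder, not anything OpenCart ships with):

```python
import re
import subprocess
import requests  # assumed available; used to trigger the installer script

NGINX_TEMPLATE = """
server {{
    listen 80;
    server_name {sub}.yourecomservice.com;
    root /var/www/stores/{sub};
    index index.php;
}}
"""

def provision_store(subdomain, admin_email):
    # Validate the requested subdomain before touching any config
    if not re.fullmatch(r"[a-z0-9-]{3,30}", subdomain):
        raise ValueError("invalid subdomain")

    # Write a per-store nginx server block and reload nginx
    with open(f"/etc/nginx/conf.d/{subdomain}.conf", "w") as f:
        f.write(NGINX_TEMPLATE.format(sub=subdomain))
    subprocess.run(["nginx", "-s", "reload"], check=True)

    # Hand off to a hypothetical installer endpoint that copies the
    # OpenCart files and writes config.php for this store
    requests.post(
        "https://installer.yourecomservice.com/install",
        data={"subdomain": subdomain, "email": admin_email},
        timeout=60,
    ).raise_for_status()
```

In practice each step also needs the validation, cron-driven retries, and CNAME email mentioned above.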


What is the average cost of hosting a django app? [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed last year.
Due to current RBI guidelines on recurring payments for standing instructions, I am unable to use Heroku, which is great for small apps. Therefore I have to choose another platform. I have narrowed my choice down to two platforms: AWS and DigitalOcean.
Overview of my Django website:
The website which I made for my client is not that big. A user registers, chooses a plan, and then books an instructor to teach him/her driving. After logging in, a user has to accept an agreement and also has an update-plan page. That's it on the user side. I use Celery, which uses Redis, to send emails such as OTPs, registration confirmations, password changes, and contracts and updated contracts (the contract emails are sent both to the client and to the user, as per the client's demand). As you can see, I have to use Celery and Redis because the website has a lot of email work to do. The database I am using is PostgreSQL.
Now, coming to traffic: we cannot predict the number of visitors on the site, but we accept a maximum of 10 registrations per month.
Therefore I want to know the monthly cost of running this Django website on AWS and on DigitalOcean. Not an exact cost, but at least an average estimate would be helpful.
Note that the Redis server is necessary; without it the website would really slow down. And the database is PostgreSQL.
Thank you.
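For reference, the Celery email work described above usually boils down to small tasks like the following (a minimal sketch; the task name, addresses, and fields are invented for illustration):

```python
# tasks.py -- a minimal Celery email task, queued through the Redis broker
from celery import shared_task
from django.core.mail import send_mail

@shared_task
def send_otp_email(to_address, otp_code):
    # Runs on a Celery worker, so the web request is never blocked on SMTP
    send_mail(
        subject="Your OTP",
        message=f"Your one-time password is {otp_code}",
        from_email="noreply@example.com",
        recipient_list=[to_address],
    )
```

The view simply calls `send_otp_email.delay(user.email, otp)`; the worker and Redis do the rest, which is why the Redis server is part of the fixed monthly cost.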
AWS is a vast ocean, with lots of options to solve any problem. With that being said, you can host your application for anywhere from $0 to hundreds of dollars. If your account is new, you can host your application on the Free Tier and not pay anything.
If you're looking for a cost-efficient solution, AWS Lightsail is another option. Lightsail offers resources at a fixed monthly cost and is a good way to start with AWS.
EC2 instances are also an option, but I would suggest hosting on Fargate (less maintenance) and using AWS ElastiCache for your Redis requirements. Managed services will cost you more, but they are a reliable and highly scalable solution compared to self-hosted ones.
Depending on which services you go with, calculate your cost via this calculator: https://calculator.aws/

Server Architecture/Automatic Deployment - How to set up a SAAS web application? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I wrote a web application, and I would like to sell it as software-as-a-service. I'm currently using Amazon Web Services to host the production environment.
I host the application either internally on clients' servers/intranets, or by having clients sign up for Amazon and purchase their own EC2/RDS instances.
This setup process takes forever, as there are several manual steps involved. I have to set up the entire stack: I set up an EC2 and RDS instance, set up Route 53, pull the source from git, run migrations, set up cron, and finally let nginx know that the site is up.
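As a rough illustration, two of those steps (launching the instance and pointing DNS at it) can be scripted with boto3; every ID, AMI, and domain below is a hypothetical placeholder:

```python
import boto3

def launch_customer_stack(customer):
    ec2 = boto3.client("ec2")
    # Launch one app server for this customer (AMI and type are placeholders)
    instance = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",
        InstanceType="t2.small",
        MinCount=1,
        MaxCount=1,
    )["Instances"][0]

    # In practice you would wait for the instance to boot and read its
    # public IP back; a fixed documentation address stands in here.
    ip = instance.get("PublicIpAddress", "203.0.113.10")

    # Point customer.primary-domain.com at the new instance via Route 53
    boto3.client("route53").change_resource_record_sets(
        HostedZoneId="Z0000000000000",
        ChangeBatch={"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": f"{customer}.primary-domain.com",
                "Type": "A",
                "TTL": 300,
                "ResourceRecords": [{"Value": ip}],
            },
        }]},
    )
```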
The Issue
Ideally, I'd like users to be able to go to the website, sign up, and get access to their own instance of this application. I'd like to automate this process. They should then be able to access the application via a URL structured like so:
customername.primary-domain.com
You can see this happening in many applications utilizing 'the cloud'. For example, Office 365/SharePoint will spin up a customer's own version of that application.
I'd like this to be shared hosting, in that there are several clients on an EC2 instance.
Questions
How can I accomplish this in a reasonable, affordable way?
What software options are there to solve this?
What are some different options other than the 3 I have listed here?
For the VM option, what is good to use? Docker, Vagrant? What are other options?
What are these companies using to do this? How is Salesforce, Microsoft, or any of you automating this?
Thoughts on how to accomplish this
I know very little about this topic. I did some brainstorming about what I already know (basically nothing involving any scale) and came up with the following possible approaches:
Application Level Switching
A user would sign up, and all the records they create would have a foreign key to their profile; a sketch follows below. This seems like the easiest approach, but I don't believe it would scale very well, and it is a terrifying security risk besides. I'd like to keep clients' data as isolated as possible.
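A minimal sketch of that approach in Django (the model and middleware names are invented for illustration) shows both its simplicity and the risk: every single query must remember the tenant filter.

```python
from django.db import models

class Tenant(models.Model):
    subdomain = models.CharField(max_length=63, unique=True)

class Record(models.Model):
    # Every row is owned by exactly one tenant
    tenant = models.ForeignKey(Tenant, on_delete=models.CASCADE)
    data = models.TextField()

class TenantMiddleware:
    """Resolve the tenant from the request's subdomain."""
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        sub = request.get_host().split(".")[0]
        request.tenant = Tenant.objects.filter(subdomain=sub).first()
        return self.get_response(request)

# Every view must then filter: Record.objects.filter(tenant=request.tenant)
# Forgetting that filter anywhere leaks another client's data.
```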
Web Server Approach
When a user signs up, an external process would create a folder on the web server, pull the application source code, and run database migrations; the script would also create the corresponding records for nginx (see the sketch below). This seems the easiest to manage, support, and scale; however, I dislike that there is a shared database server.
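That external process might look roughly like this (a sketch; paths, repo URL, and the migrate command assume a Django-style app):

```python
import subprocess

def provision(customer):
    root = f"/var/www/{customer}"
    # Pull a fresh copy of the application for this customer
    subprocess.run(["git", "clone", "git@example.com:app.git", root], check=True)
    # Run migrations against this customer's (shared-server) database
    subprocess.run(["python", "manage.py", "migrate"], cwd=root, check=True)
    # Pick up the nginx vhost generated for this customer and reload
    subprocess.run(["nginx", "-s", "reload"], check=True)
```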
Virtualization
Use Docker or Vagrant (which is best? are there other options?) to run multiple production-ready installations on the EC2 instance. This would allow me to ship with MySQL installed in the container, removing the need for a global MySQL install, or for RDS entirely. This feels like the most attractive option, but also the trickiest to set up: it seems very technical, and potentially a nightmare to support.
edit:
You could decouple the database from the application and run the DB in a different container from the application. This is probably the best way to do it, but it seems like an even more advanced setup (see the sketch below). Is there something that already does this?
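A hedged sketch with the Docker SDK for Python of that decoupled layout, one DB container plus one app container per customer (image names and credentials are placeholders):

```python
import docker

def spin_up_tenant(name):
    client = docker.from_env()

    # Dedicated MySQL container holding only this customer's data
    client.containers.run(
        "mysql:5.7",
        name=f"db-{name}",
        environment={"MYSQL_ROOT_PASSWORD": "change-me",
                     "MYSQL_DATABASE": name},
        detach=True,
    )

    # Application container wired to its own database only
    client.containers.run(
        "myapp:latest",  # hypothetical application image
        name=f"app-{name}",
        environment={"DB_HOST": f"db-{name}"},
        links={f"db-{name}": "db"},  # legacy links; a per-tenant network also works
        detach=True,
    )
```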
Summary
Cloud wizards, please help me understand your magic.

Search Engine Necessary? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
In my application, I have a bunch of service providers in my database offering various services. I need a user to be able to search through these service providers by either name, location, or both. I also need a user to be able to filter the providers by different criteria, based on multiple attributes.
I am trying to decide if I could implement this simply with database queries or if a more robust solution (i.e. a search engine) would better suit my needs.
Please let me know the pros and cons of either solution, and which you think would be best to go with.
I am writing my application in Django 1.7, using a PostGIS database, and would use django-haystack with elasticsearch if a search engine is the way to go here.
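For reference, if the search engine turns out to be the way to go, the django-haystack side is mostly declarative (a sketch; the `ServiceProvider` model and its fields are assumed from the description above):

```python
# search_indexes.py -- haystack index for the provider model (sketch)
from haystack import indexes
from myapp.models import ServiceProvider  # hypothetical app/model

class ServiceProviderIndex(indexes.SearchIndex, indexes.Indexable):
    text = indexes.CharField(document=True, use_template=True)
    name = indexes.CharField(model_attr="name")
    location = indexes.LocationField(model_attr="location")  # PostGIS point

    def get_model(self):
        return ServiceProvider
```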
It seems that you are working on a search-intensive application. My opinion in this regard is as follows:
1) If you run search-intensive queries directly against the database, the overhead will be very high, as each criteria-based query has to be built with separate parameters and fired at the database engine from Django every time. The consequence is that you become highly dependent on the availability of the database server. Things get worse if the database server is located somewhere remote, as the overhead of network connectivity is added on top.
2) You should try to implement a server-side caching system like Redis, an in-memory NoSQL database (sometimes also called a data structure server), which will beat all the problems I discussed in my previous point.
3) To power up your search, read about Apache Solr, a Lucene-based search library that will take your search to the next level.
4) Last but not least, look at case studies of big players like Facebook and Twitter regarding how they manage their infrastructure. You will get an even better idea.
Any doubts or suggestions? Kindly comment. Cheers :-)
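To make points 1) and 2) concrete, the query-only approach with a Redis-backed cache in front might look like this (a sketch; the model and its fields are assumed):

```python
from django.core.cache import cache  # e.g. backed by django-redis
from django.db.models import Q
from myapp.models import ServiceProvider  # hypothetical model

def search_providers(name=None, city=None):
    key = f"search:{name}:{city}"
    results = cache.get(key)
    if results is None:
        qs = ServiceProvider.objects.all()
        if name:
            qs = qs.filter(name__icontains=name)
        if city:
            qs = qs.filter(Q(city__iexact=city) | Q(address__icontains=city))
        results = list(qs[:50])
        cache.set(key, results, timeout=300)  # 5-minute cache
    return results
```

Whether this is enough, or a search engine is warranted, mostly depends on how fuzzy the name matching needs to be and how many filter combinations there are.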

Microsoft Sync Framework 2.1 Scope Create Remove Issues [closed]

It's difficult to tell what is being asked here. This question is ambiguous, vague, incomplete, overly broad, or rhetorical and cannot be reasonably answered in its current form.
Closed 10 years ago.
This is my first application using Microsoft Sync Framework 2.1, so assume I don't know anything about it.
The question is: I need to synchronize tables, and the number of tables to synchronize increases or decreases as the database changes.
Likewise, the sync direction of the tables is also variable; sometimes it is upload, download, or bidirectional. Even the rules vary.
As we have a large number of clients/distributors, the set of tables to synchronize for UserA may be different from UserB's, and even the direction may differ.
From what I have found out, we need to create a new scope for every change and for every user's tables; is that right?
So, for example, with 100 tables, 10 users, and 3 directions, the number of scopes could be above 3,000 (100 × 10 × 3).
How does the number of scopes affect DB performance?
I also don't know how to remove scopes for tables that are deleted in the DB, or that I choose not to synchronize, and likewise for removed users.
I found out there is something called deprovisioning, but I don't know how to use it.
Moreover, I need to apply filters to the tables as well; in that case, do I need to create a new scope again or not? I don't know how to create filters, as the samples I downloaded do not include any filter examples.
Any help/sample/link is highly appreciated.
A scope is a collection of tables that are synched together in a single sync session. How many tables to include is up to you.
Have a look at this link for some guidance: Sync Framework Scope and SQL Azure Data Sync Dataset Considerations
I suggest you go through the documentation and the tutorials/walkthroughs first. The documentation actually gets installed with the framework.
If you have trouble finding them, here are the corresponding links:
How to: Use Synchronization Scopes
How to: Provision and Deprovision Synchronization Scopes and Templates (SQL Server)
How to: Filter Data for Database Synchronization (SQL Server)
If you want to understand further what provisioning actually does, have a look at this: Sync Framework Provisioning
You might want to specify which databases you are actually synching.

Are there any less costly alternatives to Amazon's Relational Database Services (RDS)? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don't allow questions seeking recommendations for books, tools, software libraries, and more.
Closed 6 years ago.
I have the following requirement. I have a database containing the contact and address details of at least 2,000 members of my school alumni organization. We want to store all that information in a relational model so that:
This data can be created and edited on demand.
This data is always backed up and should be simple to restore in case the master copy becomes unusable.
All sensitive personal information residing in this database is guaranteed to be available only to authorized users.
This database won't be online in the first 6 months. It will become online only after a website is built on top of it.
I am not a DBA and I don't want to spend time doing things like backups. I thought Amazon's RDS, with its automatic backup facility, was the perfect solution for our needs. The only problem is that, being a voluntary organization, we cannot spare the monthly $100 to $150 fee this service demands.
So my question is: are there any less costly alternatives to Amazon's RDS?
In your case of just contact and address data, I would choose Amazon SimpleDB. I know SimpleDB might not be suitable for a large number of tables with relationships and all, but for your kind of data I think SimpleDB is sufficient. And it costs much, much less than Amazon RDS.
I also wanted to use RDS, but the smallest DB size costs $80 per month.
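For a sense of how little code this takes, storing and querying a member record via boto's old SimpleDB API looks like this (a sketch; the domain, item, and attribute names are made up):

```python
import boto

# Credentials are read from the environment or ~/.boto
conn = boto.connect_sdb()
domain = conn.create_domain("alumni")  # idempotent if it already exists

# Each member is an item with free-form attributes
domain.put_attributes("member-0001", {
    "name": "Jane Doe",
    "email": "jane@example.com",
    "city": "Pune",
})

# Query with SimpleDB's SQL-like select syntax
rows = conn.select(domain, 'select * from `alumni` where city = "Pune"')
```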
Without a bit more info I may be way off base here, but 2,000 names, addresses, etc. is not a large DB, and I would have thought that using Amazon's RDS was a bit of overkill, to say the least.
Depending on how (and by whom) you want it viewed, edited, etc., there are a number of free or almost-free alternatives.
One method may be to set up or use a hosting package that has something like phpMyAdmin linked to a MySQL DB. Doing this, it is possible to access and edit the DB without having a website front end. Not pretty (like a website front end would be), but practical. A good host should also do backups for you.
Another is to look at Google Docs. OK, not really a database, more a spreadsheet, very much along the lines of Excel. You can share Google Docs with invited people and even set up a small website via Google Docs. This is a free method, but may not be that practical depending on your needs.
Have you taken a look at Microsoft SQL Azure? You can use it free for something like 90 days, and then if you only need a 1 GB DB it would be only about $10 a month.
You mention backups, so I thought I would talk about that as well. The way SQL Azure works is that it automatically creates 2 additional copies of your database on different machines in the data center. If one of the machines or DBs becomes unavailable, it automatically fails over to one of the other copies.
If you need anything above that, you can also use the COPY command to back up the database.
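That COPY command is plain T-SQL issued against the server's master database; a sketch of driving it from Python with pyodbc (all connection details are placeholders):

```python
import pyodbc

# Connect to the logical server's master database (placeholder credentials)
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=master;"
    "UID=admin_user;PWD=secret"
)
conn.autocommit = True  # CREATE DATABASE cannot run inside a transaction

# Server-side asynchronous copy; progress shows up in sys.dm_database_copies
conn.execute("CREATE DATABASE alumni_backup AS COPY OF alumni")
```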
You can check
http://www.enciva.com/postgresql9-hosting.htm
and
http://www.acugis.com/postgresql-hosting.htm
They work for Postgres and MySQL.
For a frankly tiny db of that size I'd seriously look at http://www.sqlite.org/
It's in-process, easy to constantly .dump off to S3, and you can use update hooks to keep checkpoints after updates.
Backups and restores are almost as simple as Windows batch files and wget calls; a sketch is below.
Good encryption is available using http://sqlcipher.net/
Standard OS filesystem permissions and user-level ACLs control security.
Running a file-backed DB makes sense given the fragility of a normal EC2-backed RDBMS to EBS gremlins.
There are some omissions from SQL-92 (no real showstoppers), but given the project's cost sensitivity and the RPO and RTO of an alumni database, I reckon it's a good bet.
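The dump-and-ship routine really is only a few lines (a sketch; the bucket name is a placeholder and boto3 is assumed for the S3 upload):

```python
import sqlite3
import boto3

def backup_to_s3(db_path="alumni.db", bucket="my-backup-bucket"):
    # Write a plain-SQL dump, equivalent to the sqlite3 shell's .dump command
    conn = sqlite3.connect(db_path)
    with open("alumni.sql", "w") as f:
        for line in conn.iterdump():
            f.write(line + "\n")
    conn.close()

    # Ship the dump off to S3; restoring is just replaying the SQL file
    boto3.client("s3").upload_file("alumni.sql", bucket, "alumni.sql")
```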