Our website has about 200K images stored in the Sitecore database, and it now runs more slowly than before. Will storing this many images in the database slow down the whole site?
If so, how can I improve our image storage?
Thanks very much. Our Sitecore version is 6.2.
Have you considered setting up a CDN for your static assets? That would reduce load on your site and should speed it up.
Otherwise you might look at optimising the databases. Have a look at the Sitecore Optimisation Guide http://sdn.sitecore.net/upload/sitecore6/64/cms_tuning_guide_sc60-64-a4.pdf
In general, it depends on whether front-end or back-end is slow.
If you experience issues even when the site is not under a heavy request load, then you should probably upgrade the hardware.
If it's caused by high website load - there are two rather simple options:
1) Use a dedicated image server for Sitecore http://pentialized.dk/2011/01/02/dedicated-image-server-in-sitecore-part-2/
2) Integrate the Media Library with a CDN. CloudFront is really simple and powerful; see the example here: http://herskind.co.uk/blog/2012/04/using-cloudfront-for-sitecore-media-content
My OpenCart website is too slow; how can I solve this issue? When I open the admin side it works well and fast, but when I open the front-end site it takes a long time to load. I removed my image folder to check whether that was the problem, but the issue remained. If anyone has a solution, please tell me how to resolve it.
The simplest option is to use a cache module that implements speed-optimization features such as image compression, caching, file minification and CDN integration.
Here are two popular extensions:
Carbon Cache - OpenCart speed optimization for Google PageSpeed
NitroPack
Check for errors
You need to check whether there are any issues in your log file (system/storage/logs/error.log) and, if there are, correct them. Errors can cause a noticeable drop in speed.
Flatten the category structure
A heavy category structure with lots of subcategories can also affect your speed, so try to keep your categories flat (three subcategories deep at most).
Too many modifications are bad
Modifications such as OCMOD can also create speed issues, since they can inject poorly written code directly into the core, so watch out for those.
Find a better host
And last but not least: your hosting provider. Check whether your host is actually providing you with a fast server. We often suggest DigitalOcean, since they have done a good job of providing high-quality servers at a reasonable price. If you want $10 when signing up with DO (and are comfortable with us also getting $10), use our DigitalOcean referral link, or just visit their website, digitalocean.com.
We need our Sitecore web application to process 60-80 web requests per second. We are using Sitecore 7.0. We have tried a 1 web server + 1 database server deployment, but it only processes 20-25 requests per second; the web server queues up all the other requests in memory, and as we increase the load, memory fills up. (We have applied all of Sitecore's recommended performance enhancements.) We need 4x the performance to reach the goal :).
Will it be possible to achieve this goal by upgrading the existing server, or do we have to add more web servers to the production environment?
Note: We are using Lucene indexing as well.
Here are some things you can consider without changing the overall architecture of your deployment:
CDN to offload media and static asset requests
This leaves your content delivery server available to handle important content queries and display logic.
Example: www.cloudflare.com
Configure and use Sitecore's built-in caching
This is from the guide:
Investigation and configuration of the Sitecore Caches is broken down into multiple tasks, so that each task is more focused and simplified. The focus is on configuration and tuning of the Sitecore Database Caches (prefetch, data, and item caches). For configuration of the output rendering caching properties, the customer should be made aware of both the Sitecore Cache Configuration Reference and the Sitecore Presentation Component Reference, which describe how to properly enable these caches and set the properties that expire them.
Check out the Sitecore Tuning Guide
Find Slow Queries or Controls
It sounds like your application follows Sitecore best practices, but I leave this note in for anyone that might find this answer. Use Sitecore's built-in Debug mode to identify the slowest running controls and sublayouts. Additionally, if you have Analytics set up there is a "Slow Pages" report that might give you some information on where your application is slowing down.
Those things being said, if you're prepared to provision additional servers and set up a load-balanced environment then read on.
Separate Content Delivery and Content Management
To me the first logical step before load-balancing content delivery servers is to separate the content management from the equation. This is pretty easy and the Scaling Guide walks you through getting the HistoryEngine set up to keep those Lucene indexes up to date.
Set up Load Balancer with 2 or more Content Delivery servers
Once you've done the first step, this can be as easy as cloning your content delivery server and adding it to your load balancer "pool". There are a couple of things to consider here: Does your web application allow users to log in? If so, you'll need to worry about sticky sessions or machine keys. Does your web application use file media instead of blob media? I haven't had to deal with this, but I understand it's another consideration.
Scale your SQL solution
I've seen applications with up to four load balanced content delivery servers and the SQL Server did not have a problem - I think this will be unique to each case depending on a lot of factors: horsepower and tuning of SQL Server, content model of your application, complexity of your queries, caching configuration on content delivery servers, etc. Again, the Scaling Guide covers SQL Mirroring and Failover, so that is going to be your first stop on getting that going.
Finally, I would say contact Sitecore. These guys have probably seen more of what's gone right and what's gone wrong with installations and could get you on the right path. Good luck!
This answer written from a Sitecore developer perspective:
Bottom line: You need to figure out exactly where your performance bottleneck is. That is going to take some digging, but will be very worthwhile. You should definitely be able to serve 60-80 requests/s without any trouble... but of course that makes a lot of assumptions about the nature of your site and the requests.
For my site, I found Sitecore's caching implementation to be sub-par... I created some very simple and aggressive application-specific caches in my app and this made all the difference in the world. For instance, we have 900+ "Partner" items where our sites' advertisements live... and simply putting all these objects in an array in the Application object sped up page requests significantly. Finding an object in a Hashtable indexed by its Item.Name or ID is going to be a lot faster than Sitecore.Context.Database.GetItem("/itempath") or a SelectItems() call (at least, that's my experience). If your architecture and data set will allow this strategy, we've had good experience with it.
Another thing to watch out for is XSLT renderings. Personally, I avoid them completely in favor of ASP.NET UserControls. The XSLT rendering is just slow. As much as 10x slower than a native UserControl rendering the same HTML. So if you have a few of these... replace with some custom code and you'll see a world of difference.
I would like to prepare my website for a possible influx in traffic. This is my first time using Django as a framework, so I'm unsure of the modifications that should be made to assure that I'm ready and won't go down. What are some of the common things one can do to prepare a Django website for production-level traffic?
I'm also wondering what to expect in terms of traffic numbers. I'm currently hosted at Webfaction with 600GB/month of traffic. Will this quickly run out? Are there statistics on how big 'slashdotted' events are?
Use memcache and caching middleware (a minimal settings sketch follows after these points).
Be sure to offload serving of static files.
Use a CDN for statics. This doesn't directly affect Django, but it will reduce your network traffic.
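For illustration, here is a minimal settings sketch of the first point, assuming a local memcached daemon, the python-memcached backend, and the older MIDDLEWARE_CLASSES-style configuration (newer Django uses the MIDDLEWARE list); the timeout and key prefix are placeholder values:

```python
# settings.py (sketch)
CACHES = {
    "default": {
        # Assumes the python-memcached binding and a memcached daemon on localhost.
        "BACKEND": "django.core.cache.backends.memcached.MemcachedCache",
        "LOCATION": "127.0.0.1:11211",
    }
}

# Site-wide cache middleware: UpdateCacheMiddleware must be listed first
# and FetchFromCacheMiddleware last.
MIDDLEWARE_CLASSES = (
    "django.middleware.cache.UpdateCacheMiddleware",
    "django.middleware.common.CommonMiddleware",
    "django.middleware.cache.FetchFromCacheMiddleware",
)

CACHE_MIDDLEWARE_SECONDS = 300          # placeholder: cache pages for 5 minutes
CACHE_MIDDLEWARE_KEY_PREFIX = "mysite"  # placeholder prefix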
For anything beyond that, read up on what others are using:
Scaling Django Web Apps By Mike Malone
Instagram Architecture
DISQUS Architecture
Since you are at Webfaction you have an easy answer for handling your statics:
Create a Static-only application. (Not the Static CGI/PHP app)
Add it under your current website.
Put all of your statics under it (or symlink to them, which is what I do).
This will serve all statics through their nginx frontend -- blindingly fast.
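On the Django side nothing special is needed beyond pointing your media URLs at that static-only app; a sketch, where the /static prefix is hypothetical and should match however you mounted the app:

```python
# settings.py (sketch) -- the static-only app is mounted under the same
# website, so Django only generates URLs pointing at it; the nginx frontend
# serves the files and those requests never reach your Django instance.
MEDIA_URL = "/static/media/"   # hypothetical mount point of the static-only app
STATIC_URL = "/static/"        # if you are on Django 1.3+ with staticfiles
```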
Regarding your bandwidth allocation:
You don't say what type of content you are offering. If it is anything even slightly vanilla you are unlikely to approach 600GB/mo. I have one customer who offers adult-oriented videos teaching tantric sex techniques and their video bandwidth (for both free & member-only videos) is about 400-450GB/mo. The HTML portion of the site (with tons of images) runs about 50-60GB/mo.
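As a rough illustration of why 600GB/month is hard to exhaust with an ordinary site, here is a back-of-envelope calculation; the average page weight is an assumed figure, not something from the question:

```python
# Back-of-envelope: how many page views fit into a 600 GB/month allowance.
monthly_allowance_gb = 600
avg_page_weight_kb = 500   # assumed: HTML + CSS + JS + images per page view

page_views_per_month = monthly_allowance_gb * 1024 * 1024 / avg_page_weight_kb
print(int(page_views_per_month))   # roughly 1.26 million page views per month
```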
I have been visiting some sites hosted on GAE and I found them to be very slow.
Pretty much all of them take longer than usual to load.
Time in seconds (measured with YSlow):
9.9 giftag.com
3.1 hotskills.net
1.9 jeeyo.net
1.5 appspot.com
Is it that App Engine Cloud is too slow, Bigtable is too slow ... or what?
You're using the YSlow plugin to measure this, and YSlow tells you why the site is slow (the cunning name is the clue). For example, in the case of giftag.com, YSlow reports that:
This page has 9 external Javascript scripts. Try combining them into one.
This page has 3 external stylesheets. Try combining them into one.
This page has 13 external background images. Try combining them with CSS sprites.
So it gets an 'E' grade for that. That's going to kill the perceived load performance of the site.
None of this has anything to do with App Engine.
YSlow has nothing to do with the speed of the web app on the server side, since it is a purely client-side measurement (CSS, JavaScript, browser rendering, image loading, etc.). On the other hand, I have heard that an application may be slow on App Engine if it doesn't get many hits or much traffic: App Engine then doesn't keep the Python runtime environment cached, so requests pay a cold-start penalty, which can make a significant difference in the performance of low-traffic applications.
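If cold starts on a low-traffic app are the concern, one mitigation on the old Python 2.7 standard runtime is to enable warmup requests (inbound_services: warmup in app.yaml), so a new instance loads your code before it receives live traffic. A minimal sketch using webapp2:

```python
import webapp2

class WarmupHandler(webapp2.RequestHandler):
    """Handles /_ah/warmup so a freshly started instance preloads its code."""
    def get(self):
        # Import or build anything expensive here (models, template
        # environments, cached lookups) so the first real request is fast.
        self.response.write("warmed up")

app = webapp2.WSGIApplication([
    ("/_ah/warmup", WarmupHandler),
])
```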
Analysis: Google App Engine alluring, will be hard to escape
GAE's data access is on the order of seconds, compared to a database, which is measured in milliseconds. The difference is that BigTable scales to millions of concurrent accesses thanks to its inherent isolation level of read uncommitted and its relaxed consistency.
No RDBMS can compete with that and still give consistency guarantees. To be honest, you don't really want it to, because for some applications you want strong guarantees over scalability.
We're hosting a Django service for some clients with really, really poor and intermittent connectivity: satellite and GPRS links in parts of Africa that haven't benefited from the recent fiber cables making landfall.
I've consolidated the JavaScript files and used minified versions, tried to clean up the stylesheets, and so on.
Like a good Django implementer, I'm letting Apache serve all the static content such as CSS, JS and other static media. I've enabled the Apache modules deflate (for gzip) and expires to try to minimize retransmission of the JavaScript packages (mainly jQuery's huge cost). I've also enabled Django's gzip middleware (but that doesn't seem to do much in combination with Apache's deflate).
Main question: what else is there to do to optimize bandwidth utilization?
Are there Django optimizations, in headers or elsewhere, to make sure that data the browser has already seen doesn't travel over the network again?
The Django caching framework seems to be tailored towards server optimization (minimizing database hits); how does that translate to actual bandwidth utilization?
What other tweaks are there on Apache to make sure the browser won't try to fetch data it already has?
Some of your optimizations are important for wringing better performance out of your server, but don't confuse them with optimizing bandwidth utilization. In other words gzip/deflate are relevant but Apache serving static content is not (even though it is important).
Now, for your problem you need to look at three things: how much data is being sent, how many connections are required to get the data, and how good are the connections.
You mostly have the first area covered by using deflate/gzip, expires, minimization of javascript etc. so I can only add one or two things you might not know about. First, you should upgrade to Django 1.1, if you haven't already, because it has better support for ETags/Expires headers for your Django views. You probably already have those headers working properly for static data from Apache but if you're using older Django they (probably) aren't being set properly on your dynamic views.
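To illustrate the dynamic-view side, Django's conditional view processing decorators let a view answer 304 Not Modified when the browser's copy is still current. A sketch, with a hypothetical Report model and updated_at field standing in for your own data:

```python
from django.shortcuts import render_to_response
from django.views.decorators.http import condition

from myapp.models import Report   # hypothetical model with an updated_at field

def report_last_modified(request, report_id):
    # Django compares this timestamp against the If-Modified-Since header
    # and short-circuits with 304 Not Modified when nothing has changed.
    return Report.objects.get(pk=report_id).updated_at

@condition(last_modified_func=report_last_modified)
def report_detail(request, report_id):
    report = Report.objects.get(pk=report_id)
    return render_to_response("report_detail.html", {"report": report})
```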
For the next area, the number of connections, you need to consolidate your JavaScript and CSS files into as few files as possible to reduce the number of connections. Consolidating your image files into a single "sprite" image can also be very helpful. There are a few Django projects that handle this aspect: django-compress, django-media-bundler (which is the only one that will create image sprites), and you can also see this SO answer.
For the last area, how good the connections are, you should look at a global CDN as suggested by Alex, or at the very least host your site with an ISP closer to your users. This could be tough for Africa, which in my experience can't even get decent connectivity into European ISPs (at least southern Africa... northern Africa might be better).
You could delegate jQuery to a CDN which may have better connectivity with Africa, e.g. Google's (and it's a free service!). Beyond that, I recommend anything ever written (or spoken on video, and there's a lot of that!) by Steve Souders: while his talks, books and essays are invaluable to every web developer, I think they're particularly precious to those serving a low-bandwidth audience. For example, one of his tips in his latest books and talks is that a substantial fraction of the world's browsers do not get compression benefits from deflate or gzip; it's not so much about the browsers themselves as about proxies and firewalls doing things wrong, so "manual compression" is still important in those cases!
This is definitely not an area I've had a lot of experience in, but looking into Django's ConditionalGetMiddleware might prove useful. I think it might help you solve the first of your bullet points.
EDIT: This might be a good place to start: http://docs.djangoproject.com/en/dev/topics/conditional-view-processing/
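For reference, enabling it is just a settings change; a minimal sketch assuming the older MIDDLEWARE_CLASSES-style settings file (newer Django uses the MIDDLEWARE list), with only the relevant entries shown:

```python
# settings.py (sketch)
# ConditionalGetMiddleware answers requests carrying If-None-Match /
# If-Modified-Since headers with 304 Not Modified when the response has
# not changed, so unchanged pages are not resent over the slow link.
MIDDLEWARE_CLASSES = (
    "django.middleware.http.ConditionalGetMiddleware",
    "django.middleware.gzip.GZipMiddleware",
    "django.middleware.common.CommonMiddleware",
    # ... the rest of your middleware ...
)
# Check the middleware-ordering notes in the Django docs for exactly where
# it should sit relative to GZipMiddleware and CommonMiddleware in your version.
```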