WSO2 API Manager Store WebApp is slow - wso2

I am deploying the WSO2 API Manager based on the published docker-compose available at:
https://github.com/wso2/docker-apim
I've been able, with a little assistance, to get everything working as expected, apart from one thing that is troubling me: I have no problem accessing the Carbon, Publisher or Store web applications, but for some reason the Store is extremely slow to respond, borderline unusable, often timing out or taking minutes to load a simple view. This makes working with the entire solution cumbersome, and it will never be acceptable to the potential users.
Apart from setting up the IP addresses for the components, nothing has been changed from the published compose setup. I've tried accessing it from various clients, down to using a web browser on the host itself, and it's all the same. The Publisher and Carbon work flawlessly with no performance issues. Does anyone have any idea? I'm a bit at my wits' end.
Cheers

Related

Railo Express for a portable web app on a USB stick

Here's my scenario. I am writing a web app for a client that needs to be portable, i.e. they need to plug it into different PCs (Windows) and have it simply work. Life would have been easier if they could just put it up on a domain, but that's not an option in this case, because internet access might not always be available. So, I am trying out Railo Express with Jetty (http://www.getrailo.org/index.cfm/download/), which has everything I need. I actually managed to install (well, copy and configure, really) the package on a USB stick, created a new site in the "/webapps" folder and wired that up, then downloaded the drivers for SQLite and got that connected and working just fine.
This is not going to be a very intense web app at all, nor does it need many users connected to it (max 2-3 at a time). I use Bootstrap, and other than a dashboard with a couple of graphs, all the pages are basically forms that read from and write to the SQLite db.
So, while everything seems to work, do you think this is a viable solution? It seems to work fine, but will I run into any issues, perhaps performance or compatibility issues with the different PCs the client might be using? And is there a better way of doing this?
EDIT:
Thanks for replying, guys. Here's some more info to hopefully clear things up. I should have been more specific as to why I want a portable web app. The app is for a car wash business, to log the business going through. There is basically one computer at the counter where things will be accessed from (and where the USB stick will be attached), and possibly one iPod at the entrance where cars going in will be logged by the attendant (it will connect to the local computer via wireless). The reason for portability? They want to take the stick home with them and review stats, so it's either a full installation on the computer and a backup on the stick (extra work), or just everything on the stick. The reason for not simply going online and making things easier for everyone: tricky internet reception, which would mean downtime for the app.
From your description it looks like a simple and not very intensive application. Based on my experience with Railo Express, I think you have the power needed to run this.
What I would do is install the application on the computer at the counter, since that is the main hub (you mention the iPod connecting wirelessly). Use the stick as a backup, and before they take it home, make sure the stick is updated with the latest data. You might also consider designing the app so that there is a separation between writing data and consuming it (e.g. people at home running reports).
Will the app on the stick run at home? Most likely it will, and if you run into problems they will not be hard to fix.

Sitecore performance enhancements

We need our Sitecore web application to process 60-80 web requests per second. We are using Sitecore 7.0. We have tried a 1 web server + 1 database server deployment, but it only processes 20-25 requests per second; the web server queues up all the other requests in memory, and as we increase the load, memory fills up. (We applied all the recommended Sitecore performance enhancements.) We need 4x the performance to reach the goal :).
Will it be possible to achieve this goal by upgrading the existing servers, or do we have to add more web servers to the production environment?
Note: We are using Lucene indexing as well.
Here are some things you can consider without changing the overall architecture of your deployment:
CDN to offload media and static asset requests
This leaves your content delivery server available to handle important content queries and display logic.
Example www.cloudflare.com
Configure and use Sitecore's built-in caching
This is from the guide:
Investigation and configuration of the Sitecore Caches is broken down into multiple tasks. This way each task is more focused and simplified. The focus is on configuration and tuning of the Sitecore Database Caches (prefetch, data, and item caches). For configuration of the output rendering caching properties, the customer should be made aware of both the Sitecore Cache Configuration Reference and the Sitecore Presentation Component Reference as to how to properly enable these caches and the properties that expire them.
Check out the Sitecore Tuning Guide
Find Slow Queries or Controls
It sounds like your application follows Sitecore best practices, but I'll leave this note in for anyone else who finds this answer. Use Sitecore's built-in Debug mode to identify the slowest-running controls and sublayouts. Additionally, if you have Analytics set up, there is a "Slow Pages" report that might give you some information on where your application is slowing down.
Those things being said, if you're prepared to provision additional servers and set up a load-balanced environment then read on.
Separate Content Delivery and Content Management
To me, the first logical step before load-balancing content delivery servers is to take content management out of the equation. This is pretty easy, and the Scaling Guide walks you through getting the HistoryEngine set up to keep those Lucene indexes up to date.
Set up Load Balancer with 2 or more Content Delivery servers
Once you've done the first step, this can be as easy as cloning your content delivery server and adding it to your load balancer "pool". There are a couple of things to consider here, such as: does your web application allow users to log in? If so, you'll need to worry about sticky sessions or machine keys. Does your web application use file media instead of blob media? I haven't had to deal with this, but I understand that's another consideration.
Scale your SQL solution
I've seen applications with up to four load-balanced content delivery servers where SQL Server did not have a problem. I think this will be unique to each case, depending on a lot of factors: the horsepower and tuning of SQL Server, the content model of your application, the complexity of your queries, the caching configuration on the content delivery servers, etc. Again, the Scaling Guide covers SQL mirroring and failover, so that is going to be your first stop in getting that going.
Finally, I would say contact Sitecore. These guys have probably seen more of what's gone right and what's gone wrong with installations and could get you on the right path. Good luck!
This answer is written from a Sitecore developer's perspective:
Bottom line: you need to figure out exactly where your performance bottleneck is. That is going to take some digging, but it will be very worthwhile. You should definitely be able to serve 60-80 requests/s without any trouble... but of course that makes a lot of assumptions about the nature of your site and the requests.
For my site, I found Sitecore's caching implementation to be sub-par... I created some very simple and aggressive application-specific caches in my app and this made all the difference in the world. For instance, we have 900+ "Partner" items where our sites' advertisements live... and simply putting all these objects in an array in the Application object sped up page requests significantly. Finding an object in a Hashtable indexed by its Item.Name or ID is going to be a lot faster than Sitecore.Context.Database.GetItem("/itempath") or a SelectItems() call (at least, that's my experience). If your architecture and data set will allow this strategy, we've had good experience with it.
Another thing to watch out for is XSLT renderings. Personally, I avoid them completely in favor of ASP.NET UserControls. XSLT rendering is just slow, as much as 10x slower than a native UserControl producing the same HTML. So if you have a few of these, replace them with some custom code and you'll see a world of difference.

Django hosting on ep.io

Is there someone who has experience hosting Django applications on ep.io?
What are the pros/cons of it?
I'm currently using ep.io. I'm still in development with my app, but I have an app deployed and running.
When you use a service like this, you go into it knowing that it isn't going to be the perfect solution for every case. Knowing the pros and cons beforehand will help set your expectations so that you aren't disappointed later on.
ep.io is still very young, I believe still in beta, and isn't available to the general public. To be totally fair to them, it is still a work in progress and some of these pros and cons may change as they roll out new features. I will try to come back and update this post as new versions become available and my experience with the service continues.
So far I am really pleased with what they have, they took the most annoying part of developing an application and made it better. If you have a simple blog app, it should be a breeze to deploy it, and probably not cost that much to host.
Pros:
Server management: You don't have to worry about your server setup at all; it handles everything for you. With a VPS, you would need to keep the server up to date with security patches and all that fun stuff; with this, you don't worry about anything, they take care of all of that for you.
Deployment: It makes deploying an app and having it up and running really quick. Deploying a new version of an app is a piece of cake: I just need to run one, maybe two, commands and it handles everything for me.
Pricing: you are only charged for what you use, so if you have a very low traffic website, it might not cost you anything at all.
Scaling: They handle scaling and load balancing for you out of the box, no need for you to worry about that. You still need to write your application so that it can scale efficiently, but if you do, they will handle the rest.
Background tasks: They have support for cron jobs as well as background workers using Celery (a minimal sketch follows this list).
Customer support: I had a few questions, sent them an email, and had an answer really fast. They have been great, so much better than I would have expected. If you run your own VPS, you really don't have anyone to talk to, so this is a major plus.
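For anyone who hasn't used Celery, here is a minimal sketch of the kind of background task the workers run. The module name, task name and broker URL are purely illustrative (ep.io wires the broker up for you); it just shows the shape of a worker task, using Redis, which ep.io provides, as the broker.

```python
# Minimal Celery sketch: a background task picked up by worker processes.
# Names and the broker URL are illustrative, not ep.io-specific settings.
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def rebuild_dashboard_stats(account_id):
    # Heavy work (report crunching, image resizing, etc.) runs here,
    # outside the request/response cycle of the Django app.
    return account_id
```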
Cons:
DB access: You don't have direct access to the database. You can get to the psql shell, but you can't connect an external GUI client, which makes some things a little more difficult or slow. But you can still use the Django admin or fixtures to do a lot of things.
Limited services available: It currently only supports PostgreSQL and Redis, so if you want to use MySQL, memcached, MongoDB, etc., you are out of luck.
Low-level C libs: You can't install just any dependency you want. Similar to Google App Engine, they have some of the common C libraries installed already, and if you want something different that isn't already installed, you will need to contact them to get it added. http://www.ep.io/docs/runtime/#python-libraries
Email: You can't send or receive email, which means you will need to depend on a third party for that. That is probably good practice anyway, but it just means more money.
File system: You have a more limited file system available to you, and because of the distributed nature of the system you need to be very careful when working with files. You can't (unless I missed it) connect to your account via (S)FTP to upload files; you need to connect via the ep.io command-line tool and either do an rsync or a push of a repo to get files up there.
Update: for more info see my blog post on my experiences with ep.io : http://kencochrane.net/blog/2011/04/my-experiences-with-epio/
Update: Epio closed down on May 31st 2012

Best approach(es) or technolog(y/ies) for this specific problem?

I have a web-based interface for handling invoices, customer records and other transaction records, which currently interacts with a database of all the aforementioned stored on the same machine. As you can imagine, this is quite a simple set-up consisting of a web app (PHP) and a database (MySQL). However, the ideal scenario is to keep the records on the machine they are currently on (easy) and move the web app to another server within the same network (again, easy), but in addition to provide facilities on a public-facing website for customers to manage their accounts and so forth. The problem is this: the public-facing web server is located in a completely separate location, as it is a dedicated server provided by a well-known ISP.
What would be the best way to make the records accessible from this other server whilst ensuring that all communications are secure? Speed is not a huge factor, although any outages on either side should be handled gracefully. Initially my thoughts went towards web services (XML-RPC/SOAP/Hessian), but these options seem to present difficulties (security being the main one, along with overcomplexity).
The web app must remain PHP-based. The public-facing site is likely to be PHP-based as well, although Python (likely using Django) is another option. The introduction of any other technologies (Java etc.) is not a problem, although it is preferred that they be Linux-friendly (so .NET would not be the best fit here).
Apologies if this question is somewhat verbose and vague. I am testing the water somewhat in regards to this kind of problem. Any advice or suggestions gratefully received.
I've done something similar. You can expose a web service to the internet that will do the database access, but every request to the service must carry a strong hashed and salted password (which will be stored securely on the ISP's server in the DMZ).
Either this or some sort of public/private key encryption scheme.
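To make the shared-secret idea concrete, here is a minimal sketch of one way to implement it: each request from the public-facing site to the internal records service carries a timestamp and an HMAC computed with a shared secret, and the service rejects anything that fails verification or is too old. The header names and functions are made up for illustration, and it is shown in Python for brevity; the same approach ports to the PHP apps described above via hash_hmac() and hash_equals(). Either way, the transport itself should still be TLS.

```python
# Sketch of HMAC request signing between the public web server and the
# internal records service. SHARED_SECRET and the header names are
# illustrative placeholders, not part of any existing system.
import hashlib
import hmac
import time

SHARED_SECRET = b"replace-with-a-long-random-value"

def sign_request(body: bytes) -> dict:
    """Headers the public-facing site attaches to each API request."""
    timestamp = str(int(time.time()))
    signature = hmac.new(SHARED_SECRET, timestamp.encode() + body,
                         hashlib.sha256).hexdigest()
    return {"X-Timestamp": timestamp, "X-Signature": signature}

def verify_request(body: bytes, headers: dict, max_age: int = 300) -> bool:
    """Check the signature and reject stale requests to limit replays."""
    timestamp = headers.get("X-Timestamp", "0")
    if abs(time.time() - int(timestamp)) > max_age:
        return False
    expected = hmac.new(SHARED_SECRET, timestamp.encode() + body,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, headers.get("X-Signature", ""))
```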
OK, this might seem a bit silly, but what if you just used MySQL replication?
Instead of using all sorts of fancy web services, just have a master SQL server on one machine, then have it replicate to another server that holds the slave SQL server as well as the web app.

Identifying ASP.NET web service references

At my day job we have load-balanced web servers which talk to load-balanced app servers via web services (and, lately, WCF). At any given time, we have 4-6 different teams that have the ability to add new web sites or services or consume existing services. We probably have about 20-30 different web applications and corresponding services.
Unfortunately, given that we have no centralized control over this due to competing priorities, org structures, project timelines, financial buckets, etc., it is quite a mess. We have a variety of services that are reused, but a bunch that are specific to a single front-end.
Ideally we would have better control over this situation, and we are trying to get control over it, but that is taking a while. One thing we would like to do is find out more about all of the inter-relationships between the web sites and the app servers.
I have used Reflector to find dependencies among assemblies, but would like to be able to see the traffic patterns between services.
What are the options for trying to map out web service relationships? For the most part, we are mainly talking about internal services (web to app, app to app, batch to app, etc.). Off the top of my head, I can think of two ways to approach it:
Analyze assemblies for any web references. The drawback here is that not everything is a web reference, and I'm not sure how WCF connections are listed. However, this would at least be a start for finding 80% of the connections. Does anyone know of any tools that can do that analysis? Like I said, I've used Reflector for assembly references but can't find anything for web references.
Possibly tap into IIS and passively monitor the traffic coming in and out, and somehow figure out what is being called and from where. We are looking at enterprise tools that could help, but it would be a while before they are implemented (and they cost a lot). Is there anything out there that could help out quickly and cheaply? One tool in particular (AmberPoint) can tap into IIS on the servers, monitor inbound and outbound traffic, add a little special sauce and begin to build a map of the traffic. Very nice, but it costs a bundle.
I know, I know: how the heck did you get into this mess in the first place? Beats me; I'm just trying to help us get control of it and get out of it.
Thanks,
Matt
The easiest way is to look through the logs, but if they don't include the referrer then you may also want to monitor what goes out from your web servers to the app servers. You can use tools like Wireshark or Microsoft Network Monitor to see this traffic.
The other "solution", and I use the term loosely, is to bind a specific web server to a single app server and then run through a bunch of pages to see what they hit on the app server. You could probably do this in a test environment to lessen the effect on the users of the site.
You need a service registry (UDDI?). If you had a means of cataloguing these services and their consumers, it would make this job of dependency discovery a lot easier. That is not an easy solution, though; it takes time and documentation to get a catalog in place.
I think the quickest solution would be to query your IIS logs and find source URLs which originate from your own servers. You would at least be able to track down which servers your consumers are coming from.
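As a rough sketch of that log-mining approach, the script below walks IIS W3C-format logs and counts which client IPs call which .asmx/.svc endpoints; filtering the resulting client addresses against your own server list gives a first-cut dependency map. The log path and the endpoint extensions are assumptions you would adjust for your environment.

```python
# Rough sketch: tally (client IP, service URL) pairs from IIS W3C logs.
# Column positions come from the "#Fields:" header line, so this adapts
# to whatever fields each site is configured to log.
import collections
import glob

LOG_GLOB = r"C:\inetpub\logs\LogFiles\W3SVC1\*.log"  # adjust per server

calls = collections.Counter()
fields = []

for path in glob.glob(LOG_GLOB):
    with open(path, encoding="ascii", errors="replace") as log:
        for line in log:
            if line.startswith("#Fields:"):
                fields = line.split()[1:]
                continue
            if line.startswith("#") or not fields:
                continue
            row = dict(zip(fields, line.split()))
            uri = row.get("cs-uri-stem", "").lower()
            if uri.endswith(".asmx") or uri.endswith(".svc"):
                calls[(row.get("c-ip"), uri)] += 1

for (client, service), count in calls.most_common(20):
    print(f"{client} -> {service}: {count}")
```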
Also, if you already have some kind of authentication mechanism in place, you could trace who is using a particular service based on login.
You are right about AmberPoint. There are other tools that catalog service traffic and provide reports showing what is happening to your services. Systinet, SOA Software and Actional also have products similar to AmberPoint, but AmberPoint has a freeware version, I believe.