AWS RDS Read Replica - amazon-web-services

I have a .NET Core application and its MySQL RDS database configured in the London AWS region, used by UK users.
I now have users connecting from Australia who are experiencing slow performance when using the UK app.
I would like to improve performance for these users and am thinking of creating a read-only replica of the database, as 90% of DB activity is reads.
How will the application know whether to use the Sydney read-only replica or the London read/write instance? Does it somehow detect that the user is in Sydney and direct them to the closest one? Or do I also need to spin up a version of my app over there for that to work?

I have worked extensively on such use cases. You can use Route 53 to direct users to specific endpoints. I am assuming that your users are not interacting directly with the database; they will be hitting an application endpoint. For your scenario you will have to host your application in Australia as well, along with the read replica. Inside the Australian application, read queries go to the read replica, while write queries go to the UK database. The application deployed on the UK server uses the same UK database for both. All of this is achieved with property files containing the database URLs, and you deploy to each location with a different property file (a sketch of the application side follows the property files below). The property files would contain the following:
For UK:
readQueries=uk.mysql.master
writeQueries=uk.mysql.master
For Australia:
readQueries=aus.mysql.read
writeQueries=uk.mysql.master
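For illustration, here is a minimal sketch of how the application side might consume such configuration in a .NET Core app. The connection-string names ("ReadDb", "WriteDb"), the DbRouter class, and the use of the MySqlConnector package are my own assumptions, not something from your setup:

    // Illustrative sketch only: routes reads to the replica and writes to
    // the master, based on two connection strings supplied per deployment.
    using Microsoft.Extensions.Configuration;
    using MySqlConnector;

    public class DbRouter
    {
        private readonly string _readConn;
        private readonly string _writeConn;

        public DbRouter(IConfiguration config)
        {
            // Each deployment ships its own appsettings.json, just like
            // the property files above.
            _readConn  = config.GetConnectionString("ReadDb");
            _writeConn = config.GetConnectionString("WriteDb");
        }

        // Use for the ~90% of activity that only reads.
        public MySqlConnection OpenForRead() => Open(_readConn);

        // Use for anything that modifies data.
        public MySqlConnection OpenForWrite() => Open(_writeConn);

        private static MySqlConnection Open(string connectionString)
        {
            var connection = new MySqlConnection(connectionString);
            connection.Open();
            return connection;
        }
    }

In the Australian deployment, "ReadDb" would point at aus.mysql.read and "WriteDb" at uk.mysql.master; in the UK deployment, both would point at uk.mysql.master.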
I would recommend changing the title of this question to something that describes the exact problem; the present title reads more like a tag.

Related

Zip images on web server and return the url

I am currently looking for a way to improve the traffic flow of an app.
Currently the user uploads their data via the app, using Google Cloud Platform as the storage provider. Other users can then download this data.
This works well so far, but since download traffic at GCP is relatively expensive, I had the idea of outsourcing it to a cheap web server.
The idea is that the user requests the file(s) at GCP. There it is checked whether the file(s) are already on the web server. If not, the file(s) are uploaded to the server.
At the server the files are zipped and the link is sent back to GCP, where it is emailed to the user.
TL;DR: My question is, how can I zip a specific selection of files on a web server without Node.js etc. and send the link of the generated file back to GCP?
I'm open to other ideas as well.
This is a particular case, covered by the Google Cloud CDN (Content Delivery Network) service.
As you can read here, there is already a way to connect the CDN to a Storage bucket, and it will do exactly what you planned to do with your own web server. The only difference is that it's already production ready: it handles cache misses, cache hits, and so on.
You can compare the prices: here you can find CDN prices, and here you can find Storage prices. The important difference is that Storage is priced per TB of egress, while Cloud CDN is priced per 10 TB of egress, and the price is still lower.
Of course, you can still stick to your idea. I would implement it as a REST API with a single endpoint that serves the file if it is present on the web server (see the sketch below). If it is not present, it will:
perform a redirect to the direct link for the file hosted in Storage;
start to fetch the file from Storage and put it in the cache.
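As a hedged illustration of that single endpoint, here is a minimal ASP.NET Core sketch (web SDK with implicit usings). The bucket URL, cache directory, and route are placeholders, and real code would need error handling and access control:

    // Sketch of a single-endpoint file cache in front of Cloud Storage.
    var app = WebApplication.CreateBuilder(args).Build();

    const string cacheDir  = "/var/cache/files";                          // placeholder
    const string bucketUrl = "https://storage.googleapis.com/my-bucket";  // placeholder
    var http = new HttpClient();

    app.MapGet("/files/{name}", (string name) =>
    {
        name = Path.GetFileName(name);        // avoid path traversal
        var localPath = Path.Combine(cacheDir, name);

        if (File.Exists(localPath))
            return Results.File(localPath);   // cache hit: serve from this server

        // Cache miss: start pulling the file into the cache in the background...
        _ = Task.Run(async () =>
        {
            Directory.CreateDirectory(cacheDir);
            var bytes = await http.GetByteArrayAsync($"{bucketUrl}/{name}");
            await File.WriteAllBytesAsync(localPath, bytes);
        });

        // ...and redirect this request to the direct Storage link.
        return Results.Redirect($"{bucketUrl}/{name}");
    });

    app.Run();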
You would still need to handle the cache: what happens when somebody changes a file? That is something related to the way you are working with those files, so it strictly depends on your app's functional domain; in any case, Cloud CDN would solve it without any further development.

Can I open a website through an Amazon Web Service?

Is it possible to open a website, like facebook.com for example, on an Amazon web service?
My objective is to automate a certain task in a game without having to be online on my computer. The point is to spend less time on that game but not be left behind on the progress. (I'm building a bot to automate the daily tasks there; I just need to know whether I can leave everything running on Amazon.)
Another project I want to do is to automate access to my email account and perform certain tasks depending on the emails I receive.
You get the point. I tried searching on Google, but I only find results about creating or hosting your own website there, not about accessing existing websites and using automation on them.
It sounds like what you want is a virtual private server: basically a computer in the cloud that you control and that is always on.
AWS has a service called Lightsail for this kind of purpose. Under the hood Lightsail just uses EC2, but it takes away a lot of the options and configuration to provide a simpler 'click and go' kind of service.
Once you have a server you can schedule regular tasks. Depending on the complexity of your needs, you could look at using cron as a scheduler and curl for your HTTP requests.
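For example, a crontab entry along these lines would hit an HTTP endpoint once a day; the URL and log path are placeholders:

    # Run every day at 07:00; URL and log path are placeholders.
    0 7 * * * curl -s https://example.com/daily-task >> /home/user/daily-task.log 2>&1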
For the specifics of any project you have I would suggest opening a new question with details of what you are trying to do, the reading you have done, and examples of any code you have tried.

How to synchronize the local DynamoDB and the Amazon DynamoDB web service

Hello, and thanks for viewing my question!
I am running Amazon DynamoDB locally, and all databases are saved locally. With the local DynamoDB I have to inspect everything with a lot of code, but I find the web-service interface much better: there I can perform operations and see the tables directly and clearly.
So may I ask how I can connect them, so that I can practice the coding and check the status easily?
Looking forward to your reply, and thank you so much!
Sincerely
You cannot connect them as they are completely separate databases. However, you can put a simple user interface on top of your local DynamoDB database.
I use the SQLite Browser: http://sqlitebrowser.org/. Once you have it installed, open the .db file located in the folder where you are running DynamoDBLocal.jar. You should be able to see all your tables and the data within them. You won't be able to see DynamoDB specific things like your provisioned capacity, but I think this will give you enough of what you're looking for.
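If the goal is also to practice coding against the local instance, a minimal sketch like the following points the AWS SDK for .NET at DynamoDB Local. The port and the dummy credentials are assumptions: DynamoDB Local defaults to port 8000 and does not validate credential values.

    // Sketch: point the AWS SDK for .NET at DynamoDB Local.
    // Requires the AWSSDK.DynamoDBv2 NuGet package.
    using Amazon.DynamoDBv2;
    using Amazon.Runtime;

    var client = new AmazonDynamoDBClient(
        new BasicAWSCredentials("dummy", "dummy"),
        new AmazonDynamoDBConfig { ServiceURL = "http://localhost:8000" });

    // List the local tables to confirm the connection works.
    var tables = await client.ListTablesAsync();
    Console.WriteLine(string.Join(", ", tables.TableNames));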
Does this help?

Opencart: Master product list, multiple slave sites possible?

One of my B2B partners asked me a question today. I enjoyed the whooshing sound it made as it went over my head, since I've never even looked at a line of code from OpenCart. I suggested to him that I could post it here, where the experts live.
Check it out:
I am looking at OpenCart as a solution for a development project we have, and I wanted to know whether it is possible to set up OpenCart in a master/slave configuration. All sites will be on the same server with different IPs and different domain names, so I would be using the same core database for the master site and each slave/product site.
What I am looking for is a way to manage all the products from one login, so we don't have multiple shops that need to be managed. Is there a way to group products, by brand for example, so they are only shown on the site they are meant for? I just wanted to check with the community to see if anyone is familiar with this setup.
It has been supported since OC 1.4.9, I believe, and it's called multistore.
You have one eshop, one administration, one database, one set of products, categories, etc.
You manage the data in the master administration and set which data are visible at which slave store. Each slave store has to be an alias domain/subdomain of the main master domain.
Result: it is possible and not so hard to achieve. You can then have different settings (localization, theme, product settings, order settings, ...) per slave store.
Further, it supports multiple domains, languages, and currencies. With regard to the answer shadyyx has already posted, I will expand on the technical side.
Web servers, load-balanced via DNS:
opencart webserver1 - PHP, nginx, APC, memcache; multiple stores on this server. The files and content are cloned to the two other servers, which sync every hour and take the hits for high traffic.
opencart webserver2 - slave, clone of webserver1
opencart webserver3 - slave, clone of webserver1
cdn1 - Rackspace Cloud Files - images, CSS, JavaScript
cdn2 - Rackspace Cloud Files - XML, HTML, videos
Databases, load-balanced:
MySQL master server 1
MySQL slave server 2
MySQL slave server 3
So you can see, OpenCart is a good solution to go with, handling multiple stores on the same server or on different servers working together (a rough sketch of the replication settings follows below).
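As a rough sketch of the classic MySQL master/slave replication settings behind such a database setup; hostnames, user, and password are placeholders, and the syntax shown is the older MySQL 5.x style:

    # master my.cnf (placeholder values throughout)
    [mysqld]
    server-id = 1
    log-bin   = mysql-bin

    # slave my.cnf
    [mysqld]
    server-id = 2
    read_only = 1

    -- run once on each slave (MySQL 5.x syntax):
    CHANGE MASTER TO
      MASTER_HOST     = 'db-master.example.com',
      MASTER_USER     = 'repl',
      MASTER_PASSWORD = 'replica-password';
    START SLAVE;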

Bare minimum for a Sitecore content delivery set-up

We currently have a single installation multi-site setup, hosted in Europe, and are looking to move content delivery for a single site to China. This is partly for SEO purposes and partly to improve content delivery performance there. Content management performance isn't an issue.
Given that we'll be having to transfer data between two separate hosting companies we'd like to limit both how much gets sent, and if possible not send any data we wouldn't be happy to publish.
We have Sitecore analytics enabled, so this might be a complicating factor.
I've read the scaling guide, which suggests we'll need a minimum of both web and core databases in the new CD environment. They do suggest that if there is no extranet security configured it is possible to do without the core database in a pure CD environment.
Does anyone have any experience with this? What are the benefits/pitfalls? What is the bare minimum installation we can get away with?
Edit: Sitecore.NET 6.4.1 (rev. 111003)
Like divamatrix said, knowing the version number is essential.
But even though the older versions can run without the Core, I would stick to an installation that includes the Core so you will have less trouble upgrading in the future.
What you need on the Content Delivery side is:
Web database
Core database
Analytics database
Then on the Content Management side you need your usual:
Master database
Web database
Core database
Analytics database
Then set up SQL replication between the Core databases.
Analytics can be configured to run reports using data from CD and store them on CM.
You also need to set up Web Deployment for file replication between the instances.
Besides all this you need some extra configuration, as explained in the Scaling Guide; a sketch of the CD connection strings follows below.
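As a hedged sketch, the CD instance's App_Config/ConnectionStrings.config would then contain roughly the following. Server names, user, and password are placeholders, and the exact location of the analytics connection string varies by Sitecore version:

    <!-- Sketch of App_Config/ConnectionStrings.config on the CD instance.
         Values are placeholders; note there is no "master" entry on the CD side. -->
    <connectionStrings>
      <add name="core"
           connectionString="user id=sc;password=secret;Data Source=cd-sql;Database=Sitecore.Core" />
      <add name="web"
           connectionString="user id=sc;password=secret;Data Source=cd-sql;Database=Sitecore.Web" />
      <add name="analytics"
           connectionString="user id=sc;password=secret;Data Source=cd-sql;Database=Sitecore.Analytics" />
    </connectionStrings>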
If you are not using Sitecore 6.4 or higher, I would recommend upgrading first. Once you have this set up properly, it will work like a charm!
To answer your question, older versions of Sitecore worked without the Core database. You didn't say which version of Sitecore you're using, but if it's anything current, the answer is going to be that you need a web database and a core database. Also, having analytics enabled is definitely a consideration you need to look at. You should probably set up your analytics database local to your CD hosting, as this database can see a lot of traffic depending on the traffic to your site. You can either publish to a local web database and then replicate, or just let publishing handle the transfer of data between your CM and CD environments.