Email Server on Amazon EC2 [closed] - amazon-web-services

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers.
Closed 5 years ago.
We have an email server running Postfix on an AWS m1.medium instance. We push out roughly 150,000 emails a week (30,000 emails a day). We do not want to use Amazon SES for business reasons. It usually takes more than 2 hours to complete each day's send, and we want to reduce this. What suggestions do you have in terms of increasing the AWS instance type/class? There are a number of instance classes, and we cannot figure out which class/type would be ideal for our situation. Any suggestions?

For your use case, instance size probably does not matter. 30,000 emails over a two-hour period is not a lot in terms of CPU, disk, or network requirements.
You will most likely see improvements from better overlapping of email send requests. This can be accomplished through software design improvements, or simply by splitting your sends across multiple EC2 instances.
Of course I am making a lot of assumptions here as you did not provide any statistics on what you are sending, etc.
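To make the overlap idea concrete, here is a minimal sketch in Python, assuming a local Postfix relay on localhost:25 and an iterable of (sender, recipient, raw message) jobs; the host, port, and job format are placeholders for whatever your pipeline actually produces.

```python
# Minimal sketch of overlapping sends with a small worker pool.
# Assumptions: a local Postfix relay on localhost:25 and an iterable of
# (sender, recipient, raw_message) tuples -- adjust to your own pipeline.
import smtplib
from concurrent.futures import ThreadPoolExecutor

def send_one(job):
    sender, recipient, raw_message = job
    # One connection per message keeps the example simple; a real sender
    # would reuse connections and batch recipients per destination domain.
    with smtplib.SMTP("localhost", 25, timeout=30) as smtp:
        smtp.sendmail(sender, recipient, raw_message)

def send_all(jobs, workers=20):
    # Submitting jobs to a thread pool overlaps the network round-trips,
    # which are usually the bottleneck rather than CPU or disk.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(send_one, jobs))
```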

Since you didn't provide any specifics, I presume that this is some kind of customer-relationship mailing (a newsletter, etc.), which sends lots of similar, or even identical, emails (bulk emailing).
The problem you're inevitably going to run into is your mail getting classified and treated as unsolicited. The symptom you describe as
It usually takes more than 2 hours for each day's send and we want to reduce this.
sounds a lot like greylisting and/or tarpitting to me. If this is actually the issue, then apart from making your bulk email look less like spam and making your mail delivery system behave less like a dumb mass mailer, there is little you can do about it.
See also this Q&A: https://webmasters.stackexchange.com/a/19170
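If you want to confirm whether greylisting or tarpitting is actually what is slowing the queue down, a rough sketch like the following can summarise temporary deferrals from the Postfix log; the log path and the exact wording of deferral lines are assumptions and vary between setups.

```python
# Rough sketch: count temporary (4xx) deferrals in a Postfix log to see
# whether remote servers are greylisting/tarpitting you.
# The log path and the deferral line format vary by setup.
import re
from collections import Counter

DEFER_RE = re.compile(r"status=deferred \((.*)\)")

def deferral_reasons(log_path="/var/log/mail.log"):
    reasons = Counter()
    with open(log_path, errors="replace") as log:
        for line in log:
            match = DEFER_RE.search(line)
            if match:
                reasons[match.group(1)[:80]] += 1
    return reasons

if __name__ == "__main__":
    for reason, count in deferral_reasons().most_common(10):
        print(count, reason)
```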

Related

Passing messages from AWS to company site [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 7 years ago.
I am looking for a way to pass log events from an AWS application to my company site.
The thing is, the AWS application is 100% firewalled from everything except a single IP address, because it is an encryption-related service.
I just don't know which service I should use to do this. There are so many services that I really have no idea which one fits.
I think I'd just use a simple messaging service; does that make sense? The thing is, there are plenty of events (let's say 1M per day), so I don't want big extra costs for this.
Sorry for the generic question, but I think it's quite concrete: "What is the optimal way to pass event messages from AWS when the volume is approximately 1M per day, each 256 bytes on average?"
I'd like to connect to an AWS service rather than to any of the EC2 hosts...
On both sides I have Tomcat with the AWS SDK.
I just want to avoid rewriting. Maybe I should do it with S3? The files are immutable, but I could upload files every hour. I don't need real-time events; I just need to have the log files on site for user-experience analysis and so that customers can access them. But having the log in 1M chunks would require further assembling, etc. I am really confused, sorry.
Kinesis is good for streaming event data. S3 is good if you already have files that you want stored.
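As a rough illustration of the Kinesis option, here is a minimal boto3 sketch that batches small log events into a stream; the stream name, region, and event fields are placeholders rather than anything from the question.

```python
# Sketch of pushing small log events to a Kinesis stream with boto3.
# "my-log-stream" and the region are placeholders; batching with
# put_records (at most 500 records per call) keeps per-request overhead
# low at ~1M events/day.
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def put_events(events, stream_name="my-log-stream"):
    records = [
        {"Data": json.dumps(event).encode("utf-8"),
         "PartitionKey": str(event.get("user_id", "default"))}
        for event in events
    ]
    # put_records accepts at most 500 records per call.
    for start in range(0, len(records), 500):
        kinesis.put_records(StreamName=stream_name,
                            Records=records[start:start + 500])
```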

Which one is better to use between Parse, Firebase and AWS Cognito? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more.
Closed 8 years ago.
I am willing to use a synchronisation service for my application, but I want to choose the best one; I want to know which one is better among all these. My application will run on Android, iOS, Windows, and the Web.
I am leaning towards Firebase because I have tested it: it gives me fast results and it also allows me to work offline. Is it better, or should I go with Parse or AWS Cognito?
I also have the option of Google Cloud. Does Google Cloud provide a service like Firebase? And are realtime updates possible with Parse, as with Firebase?
Codeek has a good point that this question is opinion-based, so take my answer with a grain of salt.
I have experience with both Parse and Firebase, but not with Cognito.
In my experience, Parse is better when working with large relationship-based databases (i.e. databases where multiple classes of objects point to each other and interact). In this system it is easy to store a lot of data very succinctly, but working with this data is done via snapshots. This means that you take a snapshot of the data, edit it, and then refresh the server with the updated snapshot. This is perfect for things like my delivery application, where only one user is updating the orders on our server at any one time.
Firebase implements a model-observer scheme, so it is much better for applications that are highly interactive. For instance, I have used Firebase for creating a real-time game of hot potato. The advantage here is that changes to the data on the server are automatically pushed out to all devices that have registered as listeners (functionality not available in Parse, in my experience). This keeps all users on the same page all the time. The downside is that the database is structured in a hierarchical manner and doesn't have defined "objects"; rather, it is structured via key/value pairs, where parent keys cannot have an associated value. To illustrate this, a sample structure for storing a game in my database went something like this:
-Games
--1
---Users
----1 = "example@gmail.com"
----2 = "example2@gmail.com"
---PotatoHolder = 1
---TimeRemaining = 30
---Loser = -1
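To make the model-observer behaviour concrete, here is a minimal sketch using the Firebase Admin SDK for Python against the sample Games structure above; the credential file and database URL are placeholders.

```python
# Minimal sketch of Firebase's model-observer behaviour: every change under
# Games/1 is pushed to this listener. The credential path and database URL
# are placeholders.
import firebase_admin
from firebase_admin import credentials, db

cred = credentials.Certificate("service-account.json")
firebase_admin.initialize_app(cred, {
    "databaseURL": "https://example-project.firebaseio.com",
})

def on_change(event):
    # event.path is the sub-path that changed (e.g. "/PotatoHolder"),
    # event.data is the new value at that path.
    print("changed:", event.path, "->", event.data)

# listen() keeps a background connection open and fires on_change for the
# initial value and every subsequent update.
db.reference("Games/1").listen(on_change)
```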
Cognito I am not familiar with, so I'll allow someone else to explain how that database system is designed.
In summary, Codeek is correct that this is an opinion-based question, but for two of your options a good rule of thumb from my experience is that Parse is fantastic for large relationship databases in conjunction with single-user applications (i.e. single-player or turn-based games), while Firebase is better suited to hierarchical data systems in conjunction with real-time multiplayer applications.
I hope this helps! If you could post a little more about what kind of application you are trying to build then perhaps I, or someone else, could provide a little more guidance.
Expanded answer: Although this question has been marked as off-topic, to answer Nidhi's follow-up question about whether there is a way to use Parse in a model-observer scheme: not easily. Using a timer is the simplest option. The other option is to use push notifications, which would require getting permission from your users. You can set up Cloud Code on Parse to automatically send push notifications to all relevant users and then intercept them within your client so that they are "silent". In other words, when they arrive, you can have your client respond by updating your game without showing a ribbon or notification like normal push notifications do. I have not done this myself, as I prefer using Firebase for that kind of application, but I believe that it is possible.
Source: PFQueryTableView Auto Refresh When New Data Updated or Refresh Every Minute Using Parse
Keith's answer is similar to Nidhi's reference to refreshing PFObjects via a timer; Handsomeguy's comment refers to the possibility of "silent" push notifications.

Can the way a site is coded affect how much we spend on hosting? [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 9 years ago.
Our website is an eCommerce store trading in ethically sourced loose diamonds. We do not get much traffic and yet our Amazon bill is huge ($300/month for 1,500 unique visits). Is this normal?
I do know that we pull data twice a day from another database source, and that the files are large. Does it make sense to use regular hosting just for this process and keep the Amazon one just for our site?
Most of the cost is for Amazon Elastic Compute Cloud. About 20% is for RDS service.
I am wondering if:
(a) our developers have done something which leads to this kind of usage OR
(b) Amazon is just really expensive
Is there a paid-for service which we can use to ensure our site is optimised for its hosting, in terms of cost, usage and speed?
It should probably cost you around $30-50 a month; $300 seems higher than necessary.
For 1,500 visitors, you can most likely get away with using an m1.small instance.
I'd say check out the AWS Trusted Advisor service, which will tell you about your utilization and where you can optimize your usage; however, you can only get the full set of checks with AWS Business support ($100/month). Considering you're way over what is expected, it might be worth looking into.
Trusted Advisor will inform you about quite a few things:
cost optimization
security
fault tolerance
performance
I've generally found it to be one of the most useful additions to my AWS infrastructure.
Additionally, if you were to sign up for Business support, not only do you get Trusted Advisor, but you can also ask questions directly of the support staff via chat, email, or phone. That would also be quite useful for helping you pinpoint your problem areas.
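For what it's worth, once you have Business support you can also pull Trusted Advisor results programmatically through the AWS Support API. A minimal boto3 sketch (check the check names and categories against the current API docs) might look like this:

```python
# Sketch: list Trusted Advisor cost-optimization checks and their statuses
# via the AWS Support API (only available on Business/Enterprise support
# plans; the Support API endpoint lives in us-east-1).
import boto3

support = boto3.client("support", region_name="us-east-1")

def cost_optimization_checks():
    checks = support.describe_trusted_advisor_checks(language="en")["checks"]
    for check in checks:
        if check["category"] != "cost_optimizing":
            continue
        result = support.describe_trusted_advisor_check_result(
            checkId=check["id"], language="en")["result"]
        print(check["name"], "->", result["status"])

if __name__ == "__main__":
    cost_optimization_checks()
```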

Private Microblogging/Twitter-like Service [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more.
Closed 4 years ago.
Are there any cloud based private Twitter-like services out there?
I am working for a client who needs a service like this implemented, but we don't have the time or budget to create one from scratch.
I am looking for something with a REST API where I can create an account from the master server, set an account to follow another account, post updates for accounts, and then get a feed of posts (sorted by date) from the accounts that another account is following (like a Facebook wall or Twitter feed). It would be great if it could automatically scale out to hundreds of thousands of users, with perhaps 50,000 daily posts being made.
I had thought about implementing this myself, but it seems like there are some tricky areas when it comes to having an account follow a few thousand other accounts, or be followed by tens of thousands of accounts, and generating the feed in near real time as posts come in.
I have found some services such as http://www.ning.com/ and http://www.socialengine.com/, but I'm not sure they can do what I need, and they seem to be very focused on having a website. This is for a mobile app, so that is not required.
There are a few open source projects out there, but they would all require setting up/maintaining hosting (not a huge problem) and I'm not certain how scalable they are (the client requires it scale up to at least 100k users).
I'm sorry for the late reply. I hope it will be useful to others looking at this.
I had pretty much the exact same need as you, and ended up creating a full-featured solution after finding no other resources. The service is called Collabinate (http://www.collabinate.com). It provides a RESTful API that focuses on simplicity and ease of use, and currently leaves the UI completely up to you. It uses a graph database and algorithms in the backend, and scales quite well for your situation.
Maybe a private team inbox could fit your solution too:
https://www.flowdock.com/
There is no following feature in it, but if this is an internal company need, you can create chat rooms for departments and teams in general; maybe the chat rooms can serve as the following feature for you.
Looks like there isn't a good solution here.
I have found Jaiku, which looks incredibly complex and doesn't seem to run on the latest App Engine SDK.
There is also Diaspora, which could be modified and run on your own server to do what is needed.
In the end, I have decided to just implement this myself on Google App Engine; it seems the best way to do what is needed. Using the fan-out pattern seems to be the best approach, and the Fantasm library seems to provide an easy-to-use way to do this, so I am going to try that.
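For anyone curious what fan-out-on-write looks like in practice, here is a toy sketch independent of App Engine and Fantasm; plain in-memory dicts stand in for whatever datastore you actually use.

```python
# Toy sketch of the fan-out-on-write pattern: when a user posts, the entry
# is copied into every follower's feed, so reading a feed is a single lookup.
from collections import defaultdict
import time

followers = defaultdict(set)   # author -> set of follower ids
feeds = defaultdict(list)      # user -> list of (timestamp, author, text)

def follow(follower, author):
    followers[author].add(follower)

def post(author, text):
    entry = (time.time(), author, text)
    feeds[author].append(entry)          # the author sees their own post
    for follower in followers[author]:   # fan out to every follower's feed
        feeds[follower].append(entry)

def feed(user, limit=20):
    # Newest first; already materialised, so no joins at read time.
    return sorted(feeds[user], reverse=True)[:limit]

follow("alice", "bob")
post("bob", "hello world")
print(feed("alice"))
```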

PAF database vs a service such as Postcode Anywhere [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 8 years ago.
Would it be more cost effective for a small business (around 25 concurrent users) to buy a PAF database and code it up ourselves or use a Postcode service such as Postcode Anywhere?
The Royal Mail site is really confusing! http://www.royalmail.com/marketing-services/address-management-unit/address-data-products/postcode-address-file-paf/prices
We operate 24 hours a day and at any one time, we have between 1 and 25 users doing postcode searches. We are currently using a PAYG service and it is really pricey so we want to buy a PAF database and create our own. I don't understand the pricing on the link above (basically we're looking at something in the region of £2 to £49,500?!)
Also, what do you actually get with a PAF database? What kind of files do they send you, is there an API, and do you pay a one-off fee or an ongoing fee? Do you have to agree to delete the data once you stop paying Royal Mail?
Thanks
For the time it would take to code it up yourselves, it would be more time- and cost-efficient to go with someone like Postcode Anywhere. They'll also provide a guaranteed first-class service, along with ongoing service updates.
We use them on a smaller scale (after moving from QAS, which was crap in comparison).
Have you investigated pricing with any providers yet - if so, what's it coming out at?
I can't add any more to the answer by Alan, which describes how the files are provided and how it needs to be done.
You get a bunch of flat files and need to use the PAF programmers' guide to help you build a system yourself:
http://www.royalmail.com/marketing-services/address-management-unit/address-data-products/programmers-guide
See also
www.royalmail.com/sites/default/files/docs/pdf/01_tell_me_the_basics.pdf
www.royalmail.com/pafnews
You're probably better off buying a package: www.poweredbypaf.com/
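If you do go the raw-PAF route, the work is essentially parsing fixed-width flat files into a lookup structure. A generic sketch follows; the field offsets below are invented purely for illustration, since the real record layout is defined in Royal Mail's programmers' guide.

```python
# Generic sketch of turning a fixed-width flat file into a postcode lookup.
# The real PAF record layout comes from Royal Mail's programmers' guide;
# the offsets below are placeholders for illustration only.
from collections import defaultdict

FIELDS = {              # (start, end) byte offsets -- placeholders only
    "postcode": (0, 8),
    "thoroughfare": (8, 48),
    "post_town": (48, 78),
}

def load_addresses(path):
    index = defaultdict(list)
    with open(path, encoding="latin-1") as flat_file:
        for line in flat_file:
            record = {name: line[start:end].strip()
                      for name, (start, end) in FIELDS.items()}
            index[record["postcode"].replace(" ", "").upper()].append(record)
    return index

addresses = load_addresses("paf_extract.txt")
print(addresses.get("SW1A1AA", []))
```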
You don't need to purchase the Royal Mail Postcode Address File (PAF). There are lots of APIs available.
getAddress.io is the only one I've found that's free:
https://getAddress.io
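For completeness, a hosted lookup typically boils down to a single HTTP call. The sketch below targets getAddress.io; the endpoint path and "api-key" parameter are assumptions based on the provider's public documentation, so check their docs before relying on it.

```python
# Sketch of a postcode lookup against a hosted API such as getAddress.io.
# The endpoint path and "api-key" parameter are assumptions -- verify them
# against the provider's current documentation.
import requests

def lookup(postcode, api_key):
    url = f"https://api.getaddress.io/find/{postcode}"
    response = requests.get(url, params={"api-key": api_key}, timeout=10)
    response.raise_for_status()
    return response.json().get("addresses", [])

for address in lookup("SW1A 1AA", api_key="YOUR_KEY"):
    print(address)
```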