AWS serverless send SMS to user [closed] - amazon-web-services

I am trying to build a small web application with AWS, but because the services are new to me I am having a hard time. My boss asked for a page where the user enters his phone number before participating in a research study. After the participant has entered his phone number, he will receive 4 text messages (SMS) during the day with a task to perform. The messages will be sent 4 times a day (8:00, 12:00, 16:00, and 20:00) for 10 days, not including Friday and Saturday. Could you help me understand how to approach this project and whether it is possible to do it with AWS?
Thanks in advance,
Orly

One possibility is creating a Simple Notification Service (SNS) topic for each new user. Next, you would subscribe the user's phone number to the topic. You can then publish to this topic the message (under 140 characters) that you want sent.
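If it helps, a minimal sketch of that flow with the AWS SDK for Java V2 might look like the following (the topic name and phone number are placeholders):

import software.amazon.awssdk.services.sns.SnsClient;
import software.amazon.awssdk.services.sns.model.CreateTopicRequest;
import software.amazon.awssdk.services.sns.model.CreateTopicResponse;
import software.amazon.awssdk.services.sns.model.PublishRequest;
import software.amazon.awssdk.services.sns.model.SubscribeRequest;

public class SmsSketch {
    public static void main(String[] args) {
        try (SnsClient sns = SnsClient.create()) {
            // Create a topic for the new participant (name is a placeholder).
            CreateTopicResponse topic = sns.createTopic(
                    CreateTopicRequest.builder().name("participant-12345").build());

            // Subscribe the participant's phone number (placeholder, E.164 format).
            sns.subscribe(SubscribeRequest.builder()
                    .topicArn(topic.topicArn())
                    .protocol("sms")
                    .endpoint("+15555550123")
                    .build());

            // Publish the task reminder; SNS delivers it as a text message.
            sns.publish(PublishRequest.builder()
                    .topicArn(topic.topicArn())
                    .message("Reminder: please complete today's study task.")
                    .build());
        }
    }
}

Alternatively, SNS can publish directly to a phone number without a topic, but the topic-per-user approach above makes it easy to unsubscribe a participant later.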

This use case is very possible with AWS, and there are different ways to implement the functionality. To address this requirement:
"The messages will be sent 4 times a day (8:00, 12:00, 16:00, and 20:00) for 10 days, not including Friday and Saturday. "
You can build serverless workflows using AWS Step Functions that cover this use case. Step Functions can invoke many different AWS services to meet your business needs; to send a text message, the workflow can invoke Amazon Simple Notification Service (SNS).
In fact, a workflow can send messages over different channels. Here is a Java-based use case that walks you through building a workflow that sends messages over multiple channels (including voice, email, and text):
Using AWS Step Functions and the AWS SDK for Java to build workflows that sends notifications over multiple channels
As you can see, this workflow reads data from an AWS database to get a result set, which sounds applicable to your use case.
You can also invoke this workflow from another Lambda function, and you can schedule a Lambda function to be invoked a set number of times a day using a cron expression, as discussed in this document:
Creating scheduled events to invoke Lambda functions
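As a sketch of what that schedule could look like (assuming the study times are expressed in UTC; the rule name is a placeholder), an EventBridge rule created with the AWS SDK for Java V2 could fire at 8:00, 12:00, 16:00, and 20:00 on Sunday through Thursday:

import software.amazon.awssdk.services.eventbridge.EventBridgeClient;
import software.amazon.awssdk.services.eventbridge.model.PutRuleRequest;

public class ScheduleSketch {
    public static void main(String[] args) {
        try (EventBridgeClient events = EventBridgeClient.create()) {
            // Fire at 08:00, 12:00, 16:00 and 20:00 UTC, Sunday through Thursday
            // (i.e. skipping Friday and Saturday). Adjust for your time zone.
            events.putRule(PutRuleRequest.builder()
                    .name("study-sms-schedule") // placeholder rule name
                    .scheduleExpression("cron(0 8,12,16,20 ? * SUN-THU *)")
                    .build());
            // You would then add the Lambda function as a target of this rule
            // (PutTargets) and grant EventBridge permission to invoke it.
        }
    }
}

Stopping after the 10 study days would need its own logic, for example a check inside the Lambda function or disabling the rule once the study window ends.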
Finally, you can build a solution that lets users enter data into a web form and have that data persisted in RDS or DynamoDB. We have an AWS development article for that too. For example, assume you want to store user data in DynamoDB:
Creating the DynamoDB web application item tracker
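For example, a minimal sketch (table and attribute names are placeholders) that persists a participant's phone number with the AWS SDK for Java V2 could look like this:

import java.util.Map;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;
import software.amazon.awssdk.services.dynamodb.model.AttributeValue;
import software.amazon.awssdk.services.dynamodb.model.PutItemRequest;

public class StoreParticipantSketch {
    public static void main(String[] args) {
        try (DynamoDbClient dynamo = DynamoDbClient.create()) {
            // Persist the phone number submitted through the web form.
            dynamo.putItem(PutItemRequest.builder()
                    .tableName("Participants") // placeholder table name
                    .item(Map.of(
                            "phoneNumber", AttributeValue.builder().s("+15555550123").build(),
                            "enrolledOn", AttributeValue.builder().s("2023-01-15").build()))
                    .build());
        }
    }
}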
As you can see, by combining the functionality covered in these various AWS tutorials, you can build the solution that you describe.
These articles are implemented with the AWS SDK for Java V2; if you are working in another programming language, you will need to port the logic to that language.

Related

Advice on cloud related architecture choices for a production mobile app calling external API [closed]

Question:
Can the below architecture be used in production?
A mobile app serviced by Firebase (Firestore + Cloud Storage) and Cloud Functions (for HTTP calls to the external API), plus an Achievements API deployed on App Engine standard connected to Cloud SQL.
Context:
I wrote two apps that have to be used as digital support for offline gaming events.
Expected usage pattern:
The traffic starts increasing at the beginning of the week when users have to do some online tasks, then a big spike happens on the weekend when the offline gathering takes place. We expect to have thousands of users; in the most optimistic case, let's say we will reach 8000 users.
Flutter Mobile App
a. Authentication/Profile – for this I chose Firebase as it is free and scalable (this option also provides monitoring, alarms, push notifications, etc.)
b. Event related data that in most cases will not change (event timetable, exhibitors, infos... nothing intensive here) – Using the Firebase backend with Cloud Firestore db looks like the obvious choice.
c. Images can be stored on Cloud Storage or even packaged with the app
d. Integration with the Achievements API – this implies sending REST requests to the Java service below, using an API key for auth. A scalable and safe (for storing the API key) option for this seems to be Cloud Functions. Of course, if I opt for a dedicated back end deployed on Compute Engine, App Engine, or somewhere else, that service can handle the REST calls as well.
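(For reference, a proxy function along those lines could be sketched roughly as follows with the Cloud Functions Java runtime; the environment variable name and the Achievements API URL are placeholders. The function simply forwards the request and attaches the API key server-side so it never ships with the app.)

import com.google.cloud.functions.HttpFunction;
import com.google.cloud.functions.HttpRequest;
import com.google.cloud.functions.HttpResponse;
import java.net.URI;
import java.net.http.HttpClient;

public class AchievementsProxy implements HttpFunction {
    private static final HttpClient client = HttpClient.newHttpClient();

    @Override
    public void service(HttpRequest request, HttpResponse response) throws Exception {
        // API key is read from the function's environment, not from the mobile client.
        String apiKey = System.getenv("ACHIEVEMENTS_API_KEY"); // placeholder variable name

        java.net.http.HttpRequest upstream = java.net.http.HttpRequest.newBuilder()
                .uri(URI.create("https://achievements-api.example.com/v1/achievements")) // placeholder URL
                .header("x-api-key", apiKey)
                .GET()
                .build();

        java.net.http.HttpResponse<String> result =
                client.send(upstream, java.net.http.HttpResponse.BodyHandlers.ofString());

        // Relay the upstream status and body back to the mobile app.
        response.setStatusCode(result.statusCode());
        response.getWriter().write(result.body());
    }
}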
Spring Boot Achievements API
A service with some complex queries, but no process is extremely intensive or time consuming. This must be a stand-alone service available for future integrations. Given the choice above of using Firestore, I was thinking this could be deployed on the standard App Engine environment with a Cloud SQL connection.

Auto-renewable subscription questions (for Swift and PHP) [closed]

I have a few questions about auto-renewable subscription. I apologize in advance if these questions are out there but I figured asking these questions here was my best place to start.
Does anyone know the best tutorial on auto-renewable subscriptions? The ones I have found all had problems somewhere. The way I want it set up is that once the user purchases, it runs a PHP script, updates the database, and then redirects the user to the membership section.
What is the best way to check if the user is still paying for their membership and hasn't cancelled it? If cancelled, I will run a PHP script to update the database.
If possible, how can the user cancel the auto-renewable subscription from the app? Say the user deletes their account; then, on the Swift/backend side, I also want to cancel the subscription.
These questions are pretty broad and subjective. There are a lot of ways this could be set up depending on your requirements.
1) The links below may help. You'll need to build an API to which the client can send the purchase receipt. Your server will handle receipt validation, update your database, etc., then return a successful response that will be your trigger to transition to the membership section.
2) With the receipt saved on your server, periodically poll Apple's /verifyReceipt endpoint to get the most up-to-date subscription status for the user (see the sketch after this list). You can combine this with Apple's server-to-server notifications, which can be another trigger for you to refresh the receipt.
3) They can't cancel their subscription from within your app; there are no developer APIs to manage subscriptions. They can only cancel from the Apple subscription management page. If you've implemented #2 correctly you'll know about these cancellations shortly after they occur. Remember that when a user cancels they should still be able to access their subscription until the end of the billing period they have paid for, unless the cancellation was due to a refund.
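For point 2, a rough server-side sketch of polling Apple's /verifyReceipt endpoint (shown here in Java for illustration; the same POST is easy to make from PHP) might look like this:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ReceiptChecker {
    // Polls Apple's verifyReceipt endpoint with a receipt stored on your server.
    public static String verify(String base64Receipt, String sharedSecret) throws Exception {
        String body = String.format(
                "{\"receipt-data\":\"%s\",\"password\":\"%s\",\"exclude-old-transactions\":true}",
                base64Receipt, sharedSecret);

        HttpRequest request = HttpRequest.newBuilder()
                // Use https://sandbox.itunes.apple.com/verifyReceipt while testing.
                .uri(URI.create("https://buy.itunes.apple.com/verifyReceipt"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());

        // The JSON response carries a "status" field (0 = valid) plus the latest
        // subscription info; parse it with your JSON library of choice.
        return response.body();
    }
}

The shared secret comes from App Store Connect, and the latest receipt info in the response tells you whether the subscription is still active or has lapsed.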
Some helpful links to get you started:
Overview on handling auto-renewable subscriptions: iOS Subscriptions are Hard
What to build in your server: How to Build a Great iOS In-app Purchase Subscription Server
(Alternatively, since you're on a deadline you can use a hosted solution like RevenueCat that handles all of this and more right out of the box)

How can I easily visualize my data sent over MQTT with AWS IoT Core [closed]

I'm evaluating AWS IoT Services, and sending cpu_usage and available_memory for a Gateway.
I would like to visualize the data on a graph.
Using the AWS IoT SDK, I can easily send data over MQTT to AWS IoT Core.
But then, I have no idea how to visualize the data.
I have seen this tutorial from AWS that uses Kinesis Firehose/Analytics and QuickSight to visualize data, but it just seems like too much (and too expensive) for my use case.
I have also seen this tutorial that uses ELK to visualize the data, but really, I have no money to spend on a dedicated ELK instance just to visualize a little data.
I tried to send data to CloudWatch using a rule with the query:
SELECT free_memory FROM topic_1
But I can't see this metric.
I also asked for AWS IoT Graph preview access hoping it would resolve my issue, but right now I am a bit lost.
Is there an easy, free-tier-compatible way to visualize data from IoT devices?
It is possible to have IoT feed into CloudWatch, and you can then configure a nice dashboard for your data.
I have an ESP8266 running Mongoose OS which is feeding in temperature data from a DHT11. The JSON being put onto MQTT looks like:
{
"total_ram": 51600,
"free_ram": 39584,
"temp": 16,
"humidity": 64
}
I then have a rule with the query:
SELECT * FROM '/devices/esp8266_079110/events'
Then two actions, one for temperature and one for humidity. Each one is a CloudWatch metric action: you give it a namespace, a metric name, a unit, and the value to record, which can be pulled from the payload with a substitution template such as ${temp}.
Your data then goes into CloudWatch. You can hit the Dashboards section in the AWS Console, create a new one, and interactively add line graphs, latest values, etc.
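If it is useful, here is a rough sketch with the AWS SDK for Java V2 of creating an equivalent rule with a CloudWatch metric action (the rule name, namespace, and role ARN are placeholders); the console rule-action form does the same thing:

import software.amazon.awssdk.services.iot.IotClient;
import software.amazon.awssdk.services.iot.model.Action;
import software.amazon.awssdk.services.iot.model.CloudwatchMetricAction;
import software.amazon.awssdk.services.iot.model.CreateTopicRuleRequest;
import software.amazon.awssdk.services.iot.model.TopicRulePayload;

public class IotToCloudWatchSketch {
    public static void main(String[] args) {
        try (IotClient iot = IotClient.create()) {
            // One action per metric; this one records the "temp" field.
            CloudwatchMetricAction tempMetric = CloudwatchMetricAction.builder()
                    .metricNamespace("HomeSensors")          // placeholder namespace
                    .metricName("Temperature")
                    .metricUnit("None")
                    .metricValue("${temp}")                  // substitution from the MQTT payload
                    .roleArn("arn:aws:iam::123456789012:role/iot-cloudwatch-role") // placeholder
                    .build();

            iot.createTopicRule(CreateTopicRuleRequest.builder()
                    .ruleName("esp8266_to_cloudwatch")       // placeholder rule name
                    .topicRulePayload(TopicRulePayload.builder()
                            .sql("SELECT * FROM '/devices/esp8266_079110/events'")
                            .actions(Action.builder().cloudwatchMetric(tempMetric).build())
                            .build())
                    .build());
        }
    }
}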
It took a while to configure this. I had some validation issues when first creating the rule actions, but I was able to resolve them after enabling logging in the IoT settings; then AWS told me exactly what I had screwed up.
Despite how much time I spent on this, there are only two systems involved: IoT and CloudWatch. I'm very happy with how simple this is.

Which one is better to use between Parse, Firebase and AWS Cognito? [closed]

I am willing to use a synchronisation service for my application, but I want to choose the best one. I want to know which one is better among all of these. My application will run on Android, iOS, Windows, and the Web.
I am leaning towards Firebase because I tested it: it is giving me fast results and it also allows me to work offline. Is it better, or should I go with Parse or AWS Cognito?
I also have the option of Google Cloud. Does Google Cloud provide a service like Firebase? And are realtime updates possible with Parse like they are with Firebase?
Codeek has a good point that this question is opinion based, so take my answer with a grain of salt.
I have experience with both Parse and Firebase, but not with Cognito.
In my experience, Parse is better when working with large relationship-based databases. (I.E. databases where multiple classes of objects are pointing to each other and interact.) In this system, it is easy to store a lot of data very succinctly, but working with this data is done via snapshots. This means that you can take a snapshot of the data, edit it, and then refresh the server with the updated snapshot. This is perfect for things like my delivery application where only one user is updating the orders on our server at any one time.
Firebase implements a model-observer scheme, and so it is much better for applications that are highly interactive. For instance, I have used Firebase for creating a real-time game of hot potato. The advantage here is that changes to the data on the server are automatically pushed out to all devices that have registered as listeners (functionality not available on Parse from my experience). This keeps all users on the same page all the time. The downside is that the database is structured in a hierarchal manner and doesn't have defined "objects". Rather, it is structured via key/value pairs where parent keys cannot have an associated value. To illustrate this, a sample structure for storing a game on my database went something like this:
-Games
--1
---Users
----1 = "example#gmail.com"
----2 = "example2#gmail.com"
---PotatoHolder = 1
---TimeRemaining = 30
---Loser = -1
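As a rough illustration of that model-observer scheme (a sketch using the Firebase Realtime Database Android SDK; the path and fields follow the sample structure above):

import com.google.firebase.database.DataSnapshot;
import com.google.firebase.database.DatabaseError;
import com.google.firebase.database.DatabaseReference;
import com.google.firebase.database.FirebaseDatabase;
import com.google.firebase.database.ValueEventListener;

public class GameListenerSketch {
    public void watchGame(String gameId) {
        DatabaseReference gameRef = FirebaseDatabase.getInstance()
                .getReference("Games").child(gameId);

        // Every client registered as a listener is pushed the new state
        // whenever any field under Games/<gameId> changes on the server.
        gameRef.addValueEventListener(new ValueEventListener() {
            @Override
            public void onDataChange(DataSnapshot snapshot) {
                Long potatoHolder = snapshot.child("PotatoHolder").getValue(Long.class);
                Long timeRemaining = snapshot.child("TimeRemaining").getValue(Long.class);
                // Update the game UI here with the new values.
            }

            @Override
            public void onCancelled(DatabaseError error) {
                // The listener was cancelled (e.g. permission denied); handle or log it.
            }
        });
    }
}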
Cognito I am not familiar with, so I'll allow someone else to explain how that database system is designed.
In summary, codeek is correct that this is an opinion-based question, but for two of your options a good rule of thumb from my experience is that Parse is fantastic for large relationship databases in conjunction with single-user applications (i.e. single-player or turn based games). Firebase is more suited to hierarchal data systems in conjunction with real-time multiplayer applications.
I hope this helps! If you could post a little more about what kind of application you are trying to build then perhaps I, or someone else, could provide a little more guidance.
Expanded Answer: Although this question has been marked as off topic, to answer Nidhi's follow-up question about whether there is a way to use Parse as a model-observer scheme: Not easily. Using a timer is the simplest option. The other option is to use push notifications, which would require getting permission from your user. You can set up Cloud Code on Parse to automatically send push notifications to all relevant users and then intercept them within your client so that they are "silent". In other words, when they arrive, you can have your client respond by updating your game without showing a ribbon or notification like normal push notifications do. I have not done this myself, as I prefer using Firebase for that kind of application, but I believe that it is possible.
Source: PFQueryTableView Auto Refresh When New Data Updated or Refresh Every Minute Using Parse
Keith's answer is similar to Nidhi's reference to refreshing PFObjects via a Timer; Handsomeguy's comment refers to the possibility of "silent" push notifications.

Amazon SES (Simple Email Service) for bulk e-mail, NOT for transactional e-mails? [closed]

Amazon SES (Simple Email Service) describes itself as a "highly scalable and cost-effective bulk and transactional email-sending service".
From everything that I can gather, and by perusing the AWS SDK as well as the SES guides and API, it looks great for transactional emails (i.e. application emails sent in a one-off fashion), but I cannot find anything about bulk emailing.
Based on the price-point, Amazon clearly wants/needs customers to send very large quantities of mail.
Is the expectation that you (as someone implementing Amazon SES) make individual calls per email send?
i.e. If you are sending a marketing email to 200,000 recipients, do you really make 200K requests to the SendEmail or SendRawEmail via curl (or whatever) or using the AWS sdk?
This seems impractical.
The docs now clearly state that you can add up to 50 recipients per message. So you can divide up your sender list in batches; for 200k recipients you would have to make 4k API calls. Not terribly convenient for bulk mails; I would guess Amazon is not orienting their service for this particular use.
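For what it's worth, here is a rough sketch of that batching approach with the AWS SDK for Java V2 (the sender address, recipient list, and subject are placeholders); each SendEmail call carries up to 50 destinations:

import java.util.List;
import software.amazon.awssdk.services.ses.SesClient;
import software.amazon.awssdk.services.ses.model.Body;
import software.amazon.awssdk.services.ses.model.Content;
import software.amazon.awssdk.services.ses.model.Destination;
import software.amazon.awssdk.services.ses.model.Message;
import software.amazon.awssdk.services.ses.model.SendEmailRequest;

public class BulkSendSketch {
    public static void sendInBatches(SesClient ses, List<String> recipients) {
        int batchSize = 50; // documented per-message recipient limit
        for (int i = 0; i < recipients.size(); i += batchSize) {
            List<String> batch = recipients.subList(i, Math.min(i + batchSize, recipients.size()));
            ses.sendEmail(SendEmailRequest.builder()
                    .source("newsletter@example.com") // placeholder, must be a verified sender
                    // BCC so recipients don't see each other's addresses.
                    .destination(Destination.builder().bccAddresses(batch).build())
                    .message(Message.builder()
                            .subject(Content.builder().data("Monthly update").build())
                            .body(Body.builder()
                                    .text(Content.builder().data("Hello, ...").build())
                                    .build())
                            .build())
                    .build());
        }
    }
}

If each message is personalized per recipient, you would instead loop one SendEmail call per address, as the FAQ quoted further down suggests.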
If you take a look at the API reference, it certainly looks like you can send to more than one address at a time per request.
SendEmail requires an argument of 'Destination' of type 'Destination'.
Destination has three properties: ToAddresses, CCAddresses, BCCAddresses - all are of type "string list".
If you look at the example requests in the Developer Guide, you'll see it specified the destination addresses as an argument similar to:
&Destination.ToAddresses.member.1=allan%40example.com
I'm going to go out on a limb and guess for a 'string list' they're expecting multiple addresses in a format similar to:
&Destination.ToAddresses.member.1=allan%40example.com
&Destination.ToAddresses.member.2=other%40example.com
&Destination.ToAddresses.member.3=asdfq%40example.com
...
&Destination.ToAddresses.member.1000=final%40example.com
I actually stumbled across your question looking for answers to some of my own questions about SES - as of yet the docs are complete enough to use, but not always terribly helpful - you often have to make some fun inferences to get answers - just a fair warning for you!
Cheers!
Edit: One other thing that might be possible, pulled from the quote you posted in your self-answer:
either by modifying the software to directly call Amazon SES, or reconfiguring it to deliver email through an Amazon SES SMTP relay as described above.
If you set up your own SMTP server, and just have it relay/forward through SES, that might handle your queuing/etc. You can just shoot out a few thousand e-mails and your SMTP server will handle queuing/etc before it hits Amazon.
Thanks NuclearDog,
Upon further review, I think the answer to the question is to call the api repeatedly, x times (below from the SES FAQ).
Lets say we are sending out 200K mailings. First, I would be very interested to know the realistic limit for how many "ToAddresses" we can tack on to one mailing. Once we know that, we could maybe batch sends into groups of 100 or so "ToAddresses" at a time.
Second, as with most bulk mailings, the content is slightly different per recipient, even if it is just a "Hello ," intro. Given that the mailing body, while similar, will have personalization per email, I believe the expectation is simply to call the API over and over. I was thinking perhaps there would be some way to queue up multiple emails with one call and then do a send, but this is likely not realistic given the nature of the API.
SES is probably intended to be a bit more scalable in this fashion using one of the Amazon AWS database products.
For now, I think I would have to implement a queue or messaging system to call the API X times in an efficient manner, so that all the API calls 1) don't take all day, and 2) don't tax our systems too much.
Q: Can I use Amazon SES to send bulk email?
Yes. Simply call the SendEmail or SendRawEmail APIs repeatedly for each email you would like to send. Software running on Amazon EC2, Amazon Elastic MapReduce, or your own servers can compose and deliver bulk emails via Amazon SES in whatever way best suits your business. If you already have your own bulk mailing software, it's easy to update it to deliver through Amazon SES – either by modifying the software to directly call Amazon SES, or reconfiguring it to deliver email through an Amazon SES SMTP relay as described above.
You can use their Simple Queue Service (SQS) to queue up the individual sends when doing bulk email.