How to use realm with AWS serverless lambda? - amazon-web-services

So I'm currently using AWS serverless to make a createObject endpoint, which takes the values from the HTTP body and uses the key-values to store an item in DynamoDB. I am doing this via AWS Lambda. We recently started using Realm (Realm.io) and we see that they have a JavaScript library for Node. Is it possible to invoke a Realm function to store the same object, via AWS Lambda?

As I understand it, the Node SDK for Realm is only for the Professional and Enterprise editions of the Realm Object Server (ROS). The ROS can only be deployed on your own Linux instance.
Please see this for more details: https://realm.io/docs/realm-object-server/pe-ee/

Exactly as Clifton Labrum says, the Professional and Enterprise editions of the Realm Mobile Platform have a Node.js SDK to be used on the server to listen for changes in Realms.
For every change in a Realm, you will get an event, which you can then process as you like. For instance, you can store the objects in DynamoDB. You can also just leave the objects in Realms.

Related

Selling Partner API using command line interface or tool

We're new to the Amazon Selling Partner API (SP-API). We need to invoke certain SP-APIs for an integration workflow. For internal reasons, using Amazon SDKs is a secondary option. With our conventional approach we're able to interact with most APIs; in this case, the AWS request signing and signature generation is where we're stuck.
As per Amazon, using an SDK handles it all internally. Is it possible to use a command line utility like the AWS CLI to interact with SP-APIs? Not sure if this is feasible. Found this - amazon-sp-api - but not sure if it is stable / reliable.
I believe there should be ways to interact with SP-API from the command line. If not, at least there should be a tool that is able to produce an AWS request signature (given the request info, keys, etc.).
Kindly share your experience and expertise. We're new to AWS, so if I'm confusing AWS with SP-API (especially for request signing - I believe both use the same mechanism) please point it out.
The link you shared to amz.tools does not look like a command line interface. It is just an SDK generated in NodeJS. There is no way to connect to the API via the command line. You can use Postman if you want to avoid SDKs.
And yes, AWS is not the same thing as SP-API.
You can search GitHub for SDKs generated in other languages; some seem to have a lot of use.
We generated our own SDK in C# because the others didn't fit our criteria.

Can I write API code that runs on serverless (AWS Lambda) and the same code also runs on EC2?

I am looking for a language / framework or a method by which I can build API / web application code such that it can run on serverless compute like AWS Lambda and the same code runs on a dedicated compute system like Lightsail or EC2.
First I thought of using Docker to do this, but the AWS Lambda entry point is a specific function signature which is very different from Spring controllers. Is there a solution available currently?
So basically, when I run it on Lambda it will have cold start issues; later, when the app is ready or gets popular, I would like to move it to an EC2 instance for better performance and higher traffic load.
I want to start out this way so that later it is easy to port and resolve the performance issues.
I'd say no, this is not easily possible.
When you are building an API that you want to run on Lambdas, you will most likely be using an API Gateway, which takes care of routing to the different Lambda functions (best practice). The moment you are working on an API like this, migrating to EC2 would be a nightmare, as you would need to rebuild the whole application into more of a monolith that could run on EC2.
I would honestly commit to either running it on EC2/containers or running it on Lambda. If cold start is your main issue with Lambdas, you might want to look into Lambda SnapStart for Java, or use another language like TypeScript/Python.
After searching with the right keywords on Google I finally found what I was looking for: check out this blog post and code library shared by AWS, which converts the Lambda event and response to and from the HTTP request/response format the framework expects.
Running APIs Written in Java on AWS Lambda: https://aws.amazon.com/blogs/opensource/java-apis-aws-lambda/
Repo Code: https://github.com/awslabs/aws-serverless-java-container
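For reference, here is a minimal sketch of the handler pattern that library documents: a Lambda entry point that wraps a Spring Boot application and proxies API Gateway events to it. Application is a placeholder for your own @SpringBootApplication class, and the exact class and package names may differ between versions of the library.
import com.amazonaws.serverless.exceptions.ContainerInitializationException;
import com.amazonaws.serverless.proxy.model.AwsProxyRequest;
import com.amazonaws.serverless.proxy.model.AwsProxyResponse;
import com.amazonaws.serverless.proxy.spring.SpringBootLambdaContainerHandler;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

public class LambdaHandler implements RequestHandler<AwsProxyRequest, AwsProxyResponse> {

    private static final SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;

    static {
        try {
            // Boot the Spring application once per Lambda container, not per invocation.
            // "Application" is a placeholder for your own Spring Boot application class.
            handler = SpringBootLambdaContainerHandler.getAwsProxyHandler(Application.class);
        } catch (ContainerInitializationException e) {
            throw new RuntimeException("Could not initialize Spring Boot application", e);
        }
    }

    @Override
    public AwsProxyResponse handleRequest(AwsProxyRequest request, Context context) {
        // Translate the API Gateway proxy event into an HTTP request for the Spring app
        // and convert the result back into a proxy response.
        return handler.proxy(request, context);
    }
}
The same Spring Boot application can still be packaged as a regular jar and run on EC2 or Lightsail; only this thin handler class is Lambda-specific, which is what makes porting later feasible.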
Thanks Ricardo for your response - I will definitely check out Lambda SnapStart and try it as well. I have not tested this out completely, but it looks promising.

Spring-Cloud-AWS vs AWS-SDK-Java 2

Ours is a Spring-Boot based application. For integration with AWS SNS and SQS, we have a couple of options:
Use Spring-Cloud-AWS
Use AWS-SDK-Java 2
I wanted to know if there is any advantage in using one or the other.
When I ask the AWS guys, they tell me that the AWS SDK gets updated regularly and that integration with SNS and SQS is not difficult, hence there is no need to integrate with Spring-Cloud-AWS.
I tried searching the Gitter channel for Spring Cloud and could not find any relevant information. The documentation does state that I can update the AWS-SDK version, but it does not state any compelling reason for not using the AWS SDK directly.
If anyone has some insights, please share.
From the AWS Spring Team:
"From now on, our efforts focus on 3.0 - based on AWS SDK 2.0."
So, if you need AWS SDK 2.0, you probably want to go directly with the SDK.
https://spring.io/blog/2021/03/17/spring-cloud-aws-2-3-is-now-available
For more on what's new on AWS Java SDK 2.0:
https://aws.amazon.com/blogs/developer/aws-sdk-for-java-2-0-developer-preview/
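To give a sense of what going directly with the SDK looks like, here is a minimal sketch of sending an SQS message with the AWS SDK for Java 2.x; the region and queue URL are placeholders:
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;

public class SqsDirectExample {
    public static void main(String[] args) {
        // Create the client; credentials come from the default provider chain.
        try (SqsClient sqs = SqsClient.builder()
                .region(Region.EU_WEST_1)
                .build()) {

            // Send a single message to the queue (placeholder URL).
            sqs.sendMessage(SendMessageRequest.builder()
                    .queueUrl("https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue")
                    .messageBody("hello from the plain SDK")
                    .build());
        }
    }
}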
The main advantage over the AWS Java SDK is the Spring-style convenience and the head start we get by using the Spring project. As per the project documentation (https://cloud.spring.io/spring-cloud-aws/reference/html/##using-amazon-web-services):
Using the SDK, application developers still have to integrate the SDK into their application with a considerable amount of infrastructure related code. Spring Cloud AWS provides application developers already integrated Spring-based modules to consume services and avoid infrastructure related code as much as possible.
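As an illustration of that convenience, here is a hedged sketch of an SQS listener with Spring Cloud AWS; the queue name is a placeholder, and note the annotation package differs by version (org.springframework.cloud.aws.* in 2.x, io.awspring.cloud.sqs.annotation.* in 3.x, which is the release based on AWS SDK 2.0):
import org.springframework.cloud.aws.messaging.listener.annotation.SqsListener;
import org.springframework.stereotype.Component;

@Component
public class OrderQueueListener {

    // Spring Cloud AWS handles the polling, message conversion and acknowledgement;
    // "order-queue" is a placeholder queue name.
    @SqsListener("order-queue")
    public void onMessage(String payload) {
        System.out.println("Received: " + payload);
    }
}
With the plain SDK you would write the receive/delete polling loop yourself; with Spring Cloud AWS that infrastructure code disappears behind the annotation, which is the trade-off the documentation quote describes.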

Serverless compute on Windows in AWS

I've got a piece of code that I need to make available over the 'Net. It's a perfect fit for an AWS Lambda with an HTTP API on top: a stateless, side-effect-free, rather CPU-intensive function, blob in, blob out. It's written in C#/.NET, but it's not pure .NET; it makes use of the UWP API and therefore requires Windows Server 2016.
AWS Lambdas only run on Linux hosts, even the C# ones. Is there any way to deploy this piece in the Amazon cloud in a serverless manner - maybe something other than a Lambda? I know I can go with an EC2 VM, but this is the very kind of thing serverless architecture was invented for.
Lambda is the only option for serverless computing on AWS and Lambda functions run only on Linux machines.
If you need to run serverless functions on a Windows machine, try Azure Functions. That's the Lambda equivalent in the Microsoft cloud. I'm not sure if it runs on a Windows Server 2016 machine and couldn't find any reference to the platform, but I would expect that, as a brand new service, they are using their own edge tech.
To confirm if the platform is what you need, try this function:
using System.Linq;
using System.Management;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Get the OS friendly name via WMI
    // http://stackoverflow.com/questions/577634/how-to-get-the-friendly-os-version-name
    var caption = (from x in new ManagementObjectSearcher("SELECT Caption FROM Win32_OperatingSystem").Get().Cast<ManagementObject>()
                   select x.GetPropertyValue("Caption")).FirstOrDefault();
    string name = caption != null ? caption.ToString() : "Unknown";

    // Return the OS name as the function response
    return req.CreateResponse(HttpStatusCode.OK, name);
}
I think you can achieve this via a combination of the CodeDeploy service and AWS CodePipeline.
Refer to this article:
http://docs.aws.amazon.com/codedeploy/latest/userguide/getting-started-windows.html
to learn how to deploy code via CodeDeploy. Later see this article:
http://docs.aws.amazon.com/codepipeline/latest/userguide/getting-started-4.html
to learn how you can configure AWS CodePipeline to call CodeDeploy and then execute your batch job on the created Windows machine (note: you will probably want to use S3 instead of GitHub, which is possible with CodePipeline).
I would consider bootstrapping the whole configuration via a script using the AWS CLI; this way you can easily clean up your resources, like this:
aws codepipeline delete-pipeline --name "MyJob"
Of course you can configure the pipeline via the AWS web console and leave the pipeline configured to run your code on a regular basis.

Emulating Amazon SQS during development

I'm quite interested in beginning some development using Amazon SQS, and perhaps SimpleDB too. My question is this: are there any open source solutions that mimic the functionality, just for the purposes of development? I've already encountered the Eucalyptus project (http://open.eucalyptus.com) for creating an EC2-esque cloud.
I've not had any success with Google; I suspect it's because the cost of entry is so inexpensive. But still, does anyone know of anything like this?
For SQS I wrote ElasticMQ, which you can run either embedded (it's written in Scala, so runs on the JVM) or stand-alone. It has both persistent and in-memory modes, the first being good for dev, second for testing.
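Since it runs on the JVM, it can also be started in-process from test code and targeted with a normal SQS client pointed at the local endpoint. A rough sketch, assuming the elasticmq-rest-sqs module and the AWS SDK for Java 2.x; the port and credentials are arbitrary:
import java.net.URI;

import org.elasticmq.rest.sqs.SQSRestServer;
import org.elasticmq.rest.sqs.SQSRestServerBuilder;

import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.CreateQueueRequest;

public class EmbeddedSqsExample {
    public static void main(String[] args) {
        // Start an in-memory, SQS-compatible server on localhost:9324.
        SQSRestServer server = SQSRestServerBuilder
                .withPort(9324)
                .withInterface("localhost")
                .start();

        // Point a regular SQS client at the embedded server; credentials are dummies.
        try (SqsClient sqs = SqsClient.builder()
                .endpointOverride(URI.create("http://localhost:9324"))
                .region(Region.EU_WEST_1)
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("x", "x")))
                .build()) {

            sqs.createQueue(CreateQueueRequest.builder().queueName("test-queue").build());
        }

        server.stopAndWait();
    }
}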
If you need a test double for more than just SQS, you can try LocalStack.
To simulate SQS, it internally uses ElasticMQ mentioned by adamw.
You can start LocalStack via Docker, for example, and it will start the following services:
API Gateway at http://localhost:4567
Kinesis at http://localhost:4568
DynamoDB at http://localhost:4569
DynamoDB Streams at http://localhost:4570
Elasticsearch at http://localhost:4571
S3 at http://localhost:4572
Firehose at http://localhost:4573
Lambda at http://localhost:4574
SNS at http://localhost:4575
SQS at http://localhost:4576
Redshift at http://localhost:4577
ES (Elasticsearch Service) at http://localhost:4578
SES at http://localhost:4579
Route53 at http://localhost:4580
CloudFormation at http://localhost:4581
CloudWatch at http://localhost:4582
SSM at http://localhost:4583
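Once LocalStack is running, the only change your application code needs is an endpoint override pointing the AWS SDK at the local port. A minimal sketch with the AWS SDK for Java 2.x against the SQS port listed above (newer LocalStack versions typically expose all services on a single edge port, 4566, so adjust the URL accordingly):
import java.net.URI;

import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.CreateQueueRequest;

public class LocalStackSqsExample {
    public static void main(String[] args) {
        // Dummy credentials are fine; LocalStack does not validate them.
        try (SqsClient sqs = SqsClient.builder()
                .endpointOverride(URI.create("http://localhost:4576"))
                .region(Region.US_EAST_1)
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("test", "test")))
                .build()) {

            String queueUrl = sqs.createQueue(
                    CreateQueueRequest.builder().queueName("dev-queue").build()).queueUrl();
            System.out.println("Created queue: " + queueUrl);
        }
    }
}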
Some of the Amazon SDKs have "mock" mode, which is:
The mock service is an alternate way to use the sample code. The service doesn't call AWS, but instead returns a set response that you can modify to suit your needs (the XML response files are in the Mock directory). The mock service makes it easy for you to test how your application handles different responses.
For SQS, it appears the Perl and PHP SDKs have mock mode. I know that the .NET SDK for Amazon RDS also has the mock mode.
The Java SDK doesn't contain mock implementations:
The client mock implementations have been removed. Instead, developers are encouraged to use more flexible and full featured mock libraries, such as EasyMock, jMock
If the SDK you will be using doesn't have the mock mode available, you could probably create your own similar type of thing which returns the preconfigured responses instead of actually hitting up the service.
See here for more info
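As a concrete example of the "preconfigured responses" approach, here is a hedged sketch using Mockito (one option alongside the EasyMock/jMock libraries mentioned in the quote) to stub an SQS client from the AWS SDK for Java 2.x so your code never hits the real service:
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;
import software.amazon.awssdk.services.sqs.model.SendMessageResponse;

public class SqsMockExample {
    public static void main(String[] args) {
        // Stub the client so it returns a canned response instead of calling AWS.
        SqsClient sqs = mock(SqsClient.class);
        when(sqs.sendMessage(any(SendMessageRequest.class)))
                .thenReturn(SendMessageResponse.builder().messageId("fake-message-id").build());

        // Code under test sees a normal-looking response.
        SendMessageResponse response = sqs.sendMessage(
                SendMessageRequest.builder().queueUrl("ignored").messageBody("hi").build());
        System.out.println(response.messageId());
    }
}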
GoAws - https://github.com/p4tin/goaws - was just released as beta. (disclaimer - I am the developer).
If you are in .NET or Mono you can try Stratosphere. It has local implementations that mimic SimpleDB, SQS and S3. For the SimpleDB mock implementation it uses SQLite; for SQS and S3 it stores messages/objects in the file system.
If you need to simulate SNS as well as SQS, you can check out Yopa.