I am going to use Amazon S3 Infrequent Access. I have played around and found that it is possible to access this service in two ways:
Making requests using the REST API. This way looks pretty simple and clear.
Using Amazon API Gateway. I am not a big expert in this service, and one difference I have found is that the payload size is limited to 10 MB.
What other advantages/disadvantages does using Amazon S3 Infrequent Access via Amazon API Gateway have?
If you are going to use Amazon S3, then you should call S3 directly. This can be done via the REST API or via AWS SDKs for most popular programming languages.
You can also use the AWS Command-Line Interface (CLI), which makes it possible to write scripts that can call the AWS API.
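For example, here is a minimal boto3 sketch (the bucket and key names are placeholders, and it assumes credentials are already configured) that writes an object directly to S3 with the Infrequent Access storage class and reads it back:

```python
import boto3

# Assumes credentials are already configured (e.g. via `aws configure`).
# Bucket and key names below are placeholders.
s3 = boto3.client("s3")

# Upload an object directly to S3, storing it in the Infrequent Access storage class.
s3.put_object(
    Bucket="my-example-bucket",
    Key="reports/2024-01.csv",
    Body=b"col1,col2\n1,2\n",
    StorageClass="STANDARD_IA",
)

# Reading it back works exactly like the Standard storage class.
obj = s3.get_object(Bucket="my-example-bucket", Key="reports/2024-01.csv")
print(obj["Body"].read())
```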
Amazon API Gateway allows you to create your own APIs that can call an AWS Lambda function or your own application. It should not be used to call Amazon S3 unless you are trying to Create an API as an Amazon S3 Proxy, which is a rare situation.
SageMaker provides a full machine learning development environment on AWS. It works with the Amazon SageMaker Python SDK, which allows Jupyter Notebooks to interact with the functionality. This also provides the path to using the Amazon SageMaker Feature Store.
Is there any REST API available for SageMaker? Say one wanted to create their own custom UI but still use SageMaker features; is this possible?
Can it be done using the Amazon API Gateway?
Amazon API Gateway currently does not provide first-class integration for SageMaker, but you can use these services via the AWS SDK. If you wish, you can embed the AWS SDK calls in a service, host it on AWS (e.g. running on EC2 or as Lambda functions), and use API Gateway to expose your REST API.
Actually, SageMaker is not fundamentally different from any other AWS service in this respect.
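As a rough sketch of that pattern (not an official integration; the endpoint name is a placeholder and the Lambda role is assumed to have sagemaker:InvokeEndpoint permission), a Lambda handler behind API Gateway could forward requests to a SageMaker endpoint via boto3:

```python
import json
import boto3

# Placeholder endpoint name; the Lambda execution role is assumed to allow sagemaker:InvokeEndpoint.
runtime = boto3.client("sagemaker-runtime")

def handler(event, context):
    # With an API Gateway proxy integration, the request body arrives as a string.
    payload = event.get("body") or "{}"
    response = runtime.invoke_endpoint(
        EndpointName="my-model-endpoint",
        ContentType="application/json",
        Body=payload,
    )
    prediction = response["Body"].read().decode("utf-8")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"prediction": prediction}),
    }
```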
I think you're better off wrapping the functionality you need in an API you own to avoid the timeouts associated with REST. Did you check out https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sagemaker.html as well?
I have a private S3 bucket with lots of small files. I'd like to expose the contents of the bucket (read-only access) using AWS API Gateway as a proxy. Both the S3 bucket and the AWS API Gateway belong to the same AWS account and are in the same VPC and Availability Zone.
AWS API Gateway comes in two types: HTTP API and REST API. The configuration options of the REST API are more advanced; additionally, the REST API supports many more AWS service integrations than the HTTP API. In fact, the use case I described above is fully covered in one of the documentation tabs for the REST API. However, the REST API has one huge disadvantage - it's about 70% more expensive than the HTTP API. The higher price comes with more configuration options, but for now I need only one - integration with the S3 service - which is why I believe this type is not well suited for my use case. I started searching for whether the HTTP API can be integrated with S3, and so far I haven't found any way to achieve it.
I tried creating/editing service-linked roles associated with the HTTP API Gateway instance, but those roles can't be edited (they're read-only). For now, I don't have any idea where to search next, or whether my goal is even achievable using the HTTP API.
I am a fan of AWS's HTTP APIs.
I work daily with an API that serves a very similar purpose. The way I have done it is by using AWS Lambda functions integrated with the API's paths.
What works for me is this:
Define your API paths, and integrate them with AWS Lambda functions.
Have your integrated Lambda function return a signed URL for any objects you want to provide access to through API calls (see the sketch below).
There are several different ways to pass the name of the object(s) you want to the Lambda function servicing the API call.
This is the short answer. I plan to give a longer answer at a later time. But this has worked for me.
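For illustration, a minimal sketch of such a handler (the bucket name and the way the object key is passed are assumptions; adapt them to your own path/query design):

```python
import os
import boto3

# Assumed configuration: bucket name from an environment variable, key from a query string parameter.
s3 = boto3.client("s3")
BUCKET = os.environ.get("BUCKET_NAME", "my-private-bucket")

def handler(event, context):
    # With an HTTP API (payload format 2.0) the query string parameters arrive here.
    key = (event.get("queryStringParameters") or {}).get("key")
    if not key:
        return {"statusCode": 400, "body": "missing 'key' parameter"}

    # Generate a time-limited, read-only presigned URL for the object.
    url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": BUCKET, "Key": key},
        ExpiresIn=300,  # seconds
    )
    # Redirect the caller straight to the presigned URL.
    return {"statusCode": 302, "headers": {"Location": url}}
```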
Say you have a bunch of on-premises applications (your potential publishers). Can we have them (sitting in some private network) publish messages onto an AWS event bus (EventBridge), so that the target subscribers can take it forward from there? Does hybrid AWS help here?
If so, can you share some links on achieving this?
That should be very straightforward - you can use the PutEvents action of any AWS SDK for EventBridge.
You'll need an IAM user with which to call the API via access key/secret key.
From a networking perspective, you'll need to be able to hit the AWS API endpoints.
Here's the method to use in Python.
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/events.html#EventBridge.Client.put_events
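For example, a minimal sketch of calling it with boto3 (the event bus name, source, detail, and region are placeholders):

```python
import json
import boto3

# Assumes an IAM user/role with events:PutEvents permission and credentials configured on-premises.
events = boto3.client("events", region_name="us-east-1")  # placeholder region

response = events.put_events(
    Entries=[
        {
            "EventBusName": "my-event-bus",      # placeholder bus name
            "Source": "com.example.onprem.app",  # placeholder source
            "DetailType": "OrderCreated",
            "Detail": json.dumps({"orderId": "1234", "amount": 42.5}),
        }
    ]
)
print(response["FailedEntryCount"], response["Entries"])
```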
Other than the AWS Console, there are multiple ways to access AWS services:
AWS CLI (awscli / AWSPowerShell)
AWS SDK
REST API
Out of these three methods, which is the most secure one?
Consider this, by the way - if you are working with the AWS CLI, you need to store the credentials by using the command aws configure.
I know that without passing credentials (access key and secret key), the SDK, CLI, and API can't access AWS services. But I hope there is still some other way to access/manage the services.
In the end all of these ways call the AWS APIs, so from that perspective they are equally secure.
There are differences in the features of the APIs that are used, though. While the AWS CLI supports MFA authentication, only some SDKs do (e.g. boto3 does, aws-sdk-js doesn't yet), and if you access the APIs directly you would have to implement that yourself.
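For example, a rough sketch of MFA-backed temporary credentials with boto3 (the MFA device ARN and one-time code are placeholders):

```python
import boto3

# Placeholder MFA device ARN and one-time code; long-term credentials come from the default profile.
sts = boto3.client("sts")

temp = sts.get_session_token(
    SerialNumber="arn:aws:iam::123456789012:mfa/my-user",
    TokenCode="123456",
    DurationSeconds=3600,
)["Credentials"]

# Use the temporary, MFA-backed credentials for subsequent calls.
s3 = boto3.client(
    "s3",
    aws_access_key_id=temp["AccessKeyId"],
    aws_secret_access_key=temp["SecretAccessKey"],
    aws_session_token=temp["SessionToken"],
)
print([b["Name"] for b in s3.list_buckets()["Buckets"]])
```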
All of the methods mentioned have a similar degree of security. How you store and use the credentials affects the actual security strength.
I'm looking into the Amazon API Gateway. Is there a reason to use it as an S3 Proxy rather than making calls to the S3 API directly?