How to access secrets in a static site hosted in an S3 bucket - amazon-web-services

I'm new and since I could not find relevant information in my searches I decided to ask for your advice.
I created an SPA (React) that receives a token, validates it, and renders some content if the token is valid. That SPA is hosted in S3.
Now I want to add some (sensitive) API keys. Adding them to the code, either manually or during the build of the bundle, would be a bad idea, no?
I thought about storing them in AWS, for example in Secrets Manager, and using the SDK (JS) to retrieve them. But here is my doubt: I don't want to hardcode the AWS credentials for the SDK in the code, nor use something like Cognito, since authentication is already handled by this app through the token it receives. What would be the best way to achieve this? I would appreciate any advice, and pointers to resources.
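For context, this is roughly the Secrets Manager call I had in mind with the JS SDK (v3). The region and secret name are just placeholders, and the credentials line is exactly the part I don't know how to handle safely from a public static site:

    import { SecretsManagerClient, GetSecretValueCommand } from "@aws-sdk/client-secrets-manager";

    // The client still has to be given credentials somehow -- hardcoding them
    // in a public bundle is what I want to avoid.
    const client = new SecretsManagerClient({
      region: "us-east-1",   // placeholder region
      // credentials: ???    // this is the open question
    });

    const secret = await client.send(
      new GetSecretValueCommand({ SecretId: "my-api-keys" })  // placeholder secret name
    );
    console.log(secret.SecretString);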
Feel free to make as many suggestions as you want. Thanks.

Related

AWS Cognito signup page that isn't public facing

I'm currently working on a way to hand off creation of users in a User Pool to my product team so that I don't need to handle user creation and password resets anymore. The key here is that the tool I give them needs to be simple and non-technical, and must not require them to go into AWS, deal with permissions, or know how to use Cognito to create the users there. This also must not be a public-facing signup (i.e. people who look at the page must never see the signup form). This is for my team's developer documentation, which integration partners cannot see until they meet with us.
Looking at all the possibilities and the AWS API documentation has been making my head spin, though. I'm not sure what the best way to create this tool is - the Cognito SDK? The AWS AdminCreateUser API? Or is there a way to set this up with the built-in signup page UI provided by Cognito but host the signup page elsewhere (somewhere that people who look at our documentation will never see a signup page)?
Please let me know what your approach would be if given this problem. I'm a pretty green jr. developer and don't have much experience with AWS.
If you really don't want to use the built-in Cognito UI to create users, you would need to come up with an alternative custom solution. Mind that you will need to implement every feature you expect from such a user administration tool, including a login for the administrators of the tool itself.
With the AWS Cognito APIs you can do everything the native UI can do (and even more, such as setting user attributes, which is not available in the Cognito console).
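As a rough sketch (not a complete tool), creating a user through the API with the JavaScript SDK v3 could look like this; the region, pool ID, and attributes are placeholders:

    import { CognitoIdentityProviderClient, AdminCreateUserCommand } from "@aws-sdk/client-cognito-identity-provider";

    const client = new CognitoIdentityProviderClient({ region: "us-east-1" }); // placeholder region

    // AdminCreateUser creates the user and emails an invitation with a temporary
    // password, so nobody ever has to see a public signup form.
    await client.send(new AdminCreateUserCommand({
      UserPoolId: "us-east-1_EXAMPLE",            // placeholder user pool id
      Username: "partner@example.com",            // placeholder user
      UserAttributes: [
        { Name: "email", Value: "partner@example.com" },
        { Name: "email_verified", Value: "true" },
      ],
      DesiredDeliveryMediums: ["EMAIL"],
    }));

Your admin tool would essentially wrap a simple form around a call like this.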
A quick Google search led me to this project: https://github.com/jzoric/cognito-user-manager-ui which may be a good starting point if you decide to go this route.
Alternatively, you may want to explore other SaaS solutions (Auth0 or Okta), which may provide a better native UI out of the box.

How to securely use Amazon S3 in a messaging application

So I'm building a messaging app in Cordova and I was wondering what the best approach is to secure the image files so no one else can view them. I suppose I could just generate random filenames and store them in the database, but that feels like pseudo-security. I also know that you can use createPresignedRequest(), but I believe that's for temporary files. Maybe I'm missing something, but I can't figure out a good way to do this. I'm also using the PHP SDK. Not too important for the scenario, but I figured I'd mention it.
I also know that you can use createPresignedRequest(), but I believe that's for temporary files.
Pre-signed links are temporary, but that doesn't mean the object in S3 is; the link expires, the object stays.
You can either use pre-signed URLs or Amazon Cognito in combination with AWS IAM roles to grant certain users access to the files.
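To illustrate the pre-signed URL option (sketched here with the JavaScript SDK v3; the PHP SDK's createPresignedRequest() works the same way, and the bucket and key are placeholders):

    import { S3Client, GetObjectCommand } from "@aws-sdk/client-s3";
    import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

    const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

    // Generate a URL that lets the recipient download one specific image for a
    // limited time; the object itself stays private in the bucket.
    const url = await getSignedUrl(
      s3,
      new GetObjectCommand({ Bucket: "my-chat-images", Key: "conversations/123/photo.jpg" }), // placeholders
      { expiresIn: 300 } // link valid for 5 minutes
    );

Your server would generate such a URL only for the users who are allowed to see that message's attachment.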
How it would work with Cognito is described on the following page: https://docs.aws.amazon.com/cognito/latest/developerguide/iam-roles.html

What is the recommended way to handle large file uploads to s3?

I'm using the AWS SDK for Ruby to upload large files from users to S3.
The server is a Sinatra app with a POST /images endpoint accepting multipart/form-data. I'm experiencing a noticeable delay with user uploads. This is to be expected, because it's making a request to S3 synchronously. I wanted to move this to a background job using something like Sidekiq, but I'm not sure I like that solution.
I read online that some people are promoting direct uploads to s3 on the client side. Some even called this a "best practice." I'm hesitant to do this for several reasons:
My client side code would be heavily tied down to my cloud provider. I love AWS (great experiences), but I like to remain somewhat cloud-agnostic. I don't want my mobile and web apps to have to know the details of my AWS setup. If I choose to move away from s3 at a later date (unlikely but plausible), I would want this to be a seamless transition. Obviously, this works ok for a web app, because I can always redeploy quickly. However, I have to worry about mobile. Users may not update, and everything will become a lot more complicated if some users are uploading to s3 and some are uploading to another service.
Business logic regarding determining which bucket and region to use would need to either exist on the client side or I'd need to expose an endpoint for determining which bucket and region to use for each user. Then, I'd have to make a request to my server to figure out the parameters before I can begin uploading to s3. I want to be able to change buckets or re-route users to alternative regions and so I'm not a fan of this tight coupling or the additional request.
Security is a huge concern. When files are uploaded and processed through my server, I can utilize AWS IAM to properly ensure that these files are only coming from my server. I believe that I would have to grant an "all-write" privilege to users, which is problematic. If I use AWS IAM credentials in JavaScript, I do not see how you can ensure that users do not get unlimited write access to my bucket. All client-side JavaScript can be read by a user. In addition, I'm unaware of how to process validations. On my server, I can scan the files and determine whether or not to upload to S3. If I upload directly from the client, I would have to move this processing into Lambda functions. I'm OK with that, but there is a chance the object could be retrieved by users before the processing has occurred. Then, I'd have to build some sort of locking system to prevent access before processing.
So, the bottom line is I have no idea where to go from here. I've hacked around some solutions, but I'm not thrilled with any of them. I'd love to learn how other startups and enterprises are tackling this kind of problem. What would you recommend? How would you counter my argument? Forgive me if I'm missing something; I'm still relatively new to AWS.
If you're worried about changing the service you post files to, I would suggest using an API; that way you can change the backend storage for your service. The mobile or web client would call the service and then your API would place the file where it needed to go. You have more control over the API, and you could just create a signed S3 URL to send to the client and let them still do the uploading.
An API, as in point 1, solves this problem too; the client doesn't have to do all the work.
Use the Security Token Service (STS) and Temporary Security Credentials.
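As a rough sketch of the signed-URL idea from point 1 (shown here with the JavaScript SDK v3 on the API side; the Ruby SDK offers an equivalent presigner, and the bucket and key are placeholders):

    import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
    import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

    const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

    // The API decides the bucket, region, and key, then hands the client a
    // short-lived URL it can PUT the file to directly -- no AWS credentials
    // ever reach the client, and no file bytes pass through your server.
    const uploadUrl = await getSignedUrl(
      s3,
      new PutObjectCommand({ Bucket: "user-images", Key: "uploads/42/avatar.png" }), // placeholders
      { expiresIn: 900 } // URL valid for 15 minutes
    );

The client then performs a plain HTTP PUT of the file body to uploadUrl.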
I agree with strongjz, you should use an API to upload your files from the server side.
Cloudinary provides an API for uploading images and videos to the cloud.
From my experience using Cloudinary, it seems like the right solution for you.
All your images, videos and required metadata are stored and managed by Cloudinary in Amazon S3 buckets owned by Cloudinary.
The default maximum file size limit for videos is 40MB. This can be customized for paid plans.
For example in Ruby:
    Cloudinary::Uploader.upload("sample_spreadsheet.xls", :resource_type => :raw)

How to Connect Rails Client to IdentityServer SSO provider

At work we have a system set up running a Thinktecture IdentityServer SSO provider which currently provides authentication for several .NET and ColdFusion sites. I am currently working on a new site we are supporting in Ruby on Rails and am having difficulty figuring out how to connect it to the SSO. (I'm pretty new to Rails, but a long-time developer in CF and .NET.)
I've looked at the omniauth-oauth2 and oauth2 gems but it seems there are important parts missing from the documentation and explanations I can find. There is a ton of info if I wanted to authenticate using Twitter, Facebook or something similar, but I can't find anything that just addresses the client side for any generic OAuth2 provider.
I'm just looking for someone to point me in the right direction to find information on how I can do this. I don't care if it's specific to IdentityServer or just generic regardless of the provider. Thanks for the help.
Update: Just so you know, I would prefer to use OAuth2 for this connection, but I am not opposed to using any of the other ways that IdentityServer provides, including ADFS, WSFed or Simple HTTP. I can't use OpenID, though, because these accounts are specific to our system and can't be used for other systems.
You really need an OpenID Connect library.
http://openid.net/developers/libraries/
It turns out this is pretty easy, overall. The difficulty is that there is no straight answer to the question. How you connect to IdentityServer entirely depends upon how IdentityServer is set up.
I'm not going to post my exact code, as this will not help anyone who doesn't have IdentityServer set up exactly the same way we do, and as I don't have access to the IdentityServer, I can't say exactly how that is. I will explain the overall solution, though.
The only gem needed for this is JWT
Get key codes from IdentityServer admin (client id, secret key, sign key)
Build login URL according to configuration of IdentityServer
Redirect user to login path generated in the last step
Receive token back from IdentityServer
Decode and verify using the JWT.decode function
From there you just have a JSON string with your data.
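For a generic illustration of the decode-and-verify step (not the exact code, and shown with Node's jsonwebtoken package rather than the Ruby JWT gem; every value below is a placeholder that depends on how IdentityServer is configured):

    import jwt from "jsonwebtoken";

    // Placeholders: the sign key comes from the IdentityServer admin (step 1),
    // and the token is what IdentityServer sends back (step 4).
    declare const signKey: string;
    declare const tokenFromIdentityServer: string;

    // Step 5: verify the signature and the standard claims; this throws if anything is off.
    const claims = jwt.verify(tokenFromIdentityServer, signKey, {
      audience: "my-rails-client",        // placeholder client id
      issuer: "https://sso.example.com",  // placeholder IdentityServer URL
    });

    // `claims` is the JSON payload with the user data (step 6).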

Hiding AWS secret from application

I'm a Java backend engineer working on a feature where the frontend (SPA and Android) must send (large) files to S3. Since I have to handle a lot of requests, and to avoid overloading the network, I would rather not build a 'proxy' service where the frontend sends me the file and I forward it to S3, but I have some concerns about the best way to keep my apps secure.
I looked for solutions, but I could not find one that handles exactly what I want.
Amazon S3 upload with not showing secret key in frontend
This post almost has my answer, but I don't have enough reputation to comment.
S3 upload directly in JavaScript
I read some documentation on AWS but I still have some questions and some requirements.
The solution must permit the client (an authenticated user) to send a file to S3 directly
It may make a GET call first to fetch a token or something like that (without sending a lot of data)
It has to be secure (no secret key exposed to the frontend)
Which solution would be good for me?
The backend could generate a signing key and send it to the frontend, which then makes the request to AWS (http://docs.aws.amazon.com/general/latest/gr/signature-v4-examples.html)
I could use STS to generate temporary credentials for each upload (a rough sketch of this is below).
Do you think these approaches will work? Which one do you think is better? What are the trade-offs? Is there another way to deal with this problem?
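For illustration, this is roughly what I have in mind for the STS option: a backend endpoint (called by the frontend with its own token) that returns short-lived credentials scoped to a single object. The role ARN, bucket, and key are placeholders:

    import { STSClient, AssumeRoleCommand } from "@aws-sdk/client-sts";

    const sts = new STSClient({ region: "us-east-1" }); // placeholder region

    // Short-lived credentials restricted to writing one specific key.
    const { Credentials } = await sts.send(new AssumeRoleCommand({
      RoleArn: "arn:aws:iam::123456789012:role/upload-role", // placeholder role
      RoleSessionName: "upload-user-42",                     // placeholder session name
      DurationSeconds: 900,
      // Optional inline policy that narrows the role to a single object:
      Policy: JSON.stringify({
        Version: "2012-10-17",
        Statement: [{
          Effect: "Allow",
          Action: "s3:PutObject",
          Resource: "arn:aws:s3:::my-uploads/uploads/42/file.bin", // placeholder bucket/key
        }],
      }),
    }));

    // Credentials.AccessKeyId, .SecretAccessKey and .SessionToken go back to the
    // client, which uses them with the S3 SDK for this one upload only.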
The best thing to do here is to use the Cognito service to generate anonymous credentials in the app that allow an upload to S3. For Android you can then use the SDK to do multipart uploads from the device to S3, which will speed up the process as well.
I couldn't find an exact Android example, but this is one for iOS and the terminology should transfer the same, just with the other SDK: iOSTransferManager.
You can also call Cognito directly from JavaScript if you have a web-based app: Cognito in JS example
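As a rough sketch of the browser side with the JavaScript SDK v3 (assuming an identity pool that allows unauthenticated identities; the pool ID, bucket, and key are placeholders):

    import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
    import { fromCognitoIdentityPool } from "@aws-sdk/credential-providers";

    // Cognito hands out temporary, scoped credentials -- no secret key ships with the app.
    const s3 = new S3Client({
      region: "us-east-1", // placeholder region
      credentials: fromCognitoIdentityPool({
        clientConfig: { region: "us-east-1" },
        identityPoolId: "us-east-1:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx", // placeholder pool id
      }),
    });

    const fileBlob = new Blob(["example data"]); // stand-in for a real File from an <input type="file">
    await s3.send(new PutObjectCommand({
      Bucket: "my-upload-bucket",   // placeholder bucket
      Key: "uploads/some-file.bin", // placeholder key
      Body: fileBlob,
    }));

The IAM role attached to the identity pool controls exactly which keys the app is allowed to write.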
Hope that helps!
- Chris