AWS Amplify environment variables for pull requests? - amazon-web-services

Good evening,
I am new to AWS Amplify.
We have a setup - I didn't set it up - where pull requests to our private Github repository trigger a 'preview' in AWS Amplify.
My understanding is that every preview gets its own preview URL, which is provided by AWS.
Now we have an app where I need to configure a redirect URI. Since the URL cannot be hard-coded, I would like to inject it as an environment variable at build time.
How can I get access to the URL value?
The only docs I found were these: https://docs.aws.amazon.com/amplify/latest/userguide/environment-variables.html#amplify-console-environment-variables.
EDIT: Opened a question on Github as well: https://github.com/aws-amplify/amplify-console/issues/1310.

You can use the built-in AWS_PULL_REQUEST_ID variable.
Together with AWS_APP_ID, it lets you construct the URL for the PR environment like:
https://pr-${AWS_PULL_REQUEST_ID}.${AWS_APP_ID}.amplifyapp.com
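As a sketch of how a build step could consume those variables: the variable names are the documented Amplify ones, but `previewUrl` and the fallback to AWS_BRANCH for non-PR builds are my own illustration, not an Amplify API.

```javascript
// Derive the preview URL from Amplify's documented build-time variables.
// previewUrl() is an illustrative helper; env is injectable so the sketch
// can be exercised outside an actual Amplify build.
function previewUrl(env = process.env) {
  const { AWS_PULL_REQUEST_ID: pr, AWS_APP_ID: appId, AWS_BRANCH: branch } = env;
  // Pull-request previews live at pr-<id>.<appId>.amplifyapp.com;
  // regular branch deployments live at <branch>.<appId>.amplifyapp.com.
  return pr
    ? `https://pr-${pr}.${appId}.amplifyapp.com`
    : `https://${branch}.${appId}.amplifyapp.com`;
}

module.exports = { previewUrl };
```

You could call something like this from your build commands and write the result into whatever file or variable your app reads its redirect URI from.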

Related

Local GitLab OAuth 2.0 and Django

Hello. We run our own GitLab installation on our server, and I installed a local Read the Docs instance as described in the links below. To connect our GitLab accounts with Read the Docs, I was asked to apply the following settings from the GitLab section of the Read the Docs documentation.
https://readthedocs.org/
https://dev.readthedocs.io/en/latest/install.html
But interestingly, even though I configured the settings for our own server, gitlab.local, Django goes to gitlab.com by default when I connect to GitLab via devthedocs.org. It should connect to gitlab.local on my server instead. How can I fix this problem?
On page 34 of this document here
https://readthedocs.org/projects/django-allauth/downloads/pdf/latest/
"The GitLab provider works by default with https://gitlab.com. It allows you to connect to your private GitLab server and use GitLab as an OAuth2 authentication provider as described in GitLab docs at http://doc.gitlab.com/ce/integration/oauth_provider.html"
I need your support in this matter.
Thank you very much.
Configure the applications on GitHub, Bitbucket, and GitLab. For each of these, the callback URI is http://devthedocs.org/accounts/<provider>/login/callback/, where <provider> is one of github, gitlab, or bitbucket_oauth2. Once set up, you will be given a "Client ID" (also called an "Application ID" or just "Key") and a "Secret".
Take the "Client ID" and "Secret" for each service and enter them in your local Django admin at http://devthedocs.org/admin/socialaccount/socialapp/. Make sure to apply each one to the "Site". Also note the passage you quoted: django-allauth's GitLab provider defaults to https://gitlab.com, so to reach your own instance you need to point the gitlab provider at your server (the GITLAB_URL key under SOCIALACCOUNT_PROVIDERS in your Django settings).

Which way should I choose for my Nuxt.js website project on an AWS S3 bucket?

I'm really confused about whether to use SSR or SPA for my website. I'm planning to publish it statically to an S3 bucket.
So my first question is: do I need Node.js for SSR on an S3 bucket, or not?
The second question: as far as I know, Nuxt.js uses SSR, but I also see an 'spa' option. I don't understand — I can build an SPA with plain Vue.js, so why would I need Nuxt.js for an SPA?
The last question: when I choose SPA in the options, I see two more options:
- Server (Node.js hosting)
- Static (Static/Jamstack hosting)
and I notice these two options are also available in SSR mode. So can I publish my Nuxt project with SSR but as static files?
I really need your help, people. Thanks.
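For context, the prompts quoted above map to two options in nuxt.config.js (Nuxt 2.13+ option names; the combination shown is one possibility, chosen to illustrate the "SSR but static" case, not a recommendation):

```javascript
// nuxt.config.js sketch (Nuxt 2.13+ option names; illustrative values).
// ssr: true   -> pages are rendered on a server at build/run time
// ssr: false  -> classic SPA, rendered entirely in the browser
// target: "server" -> deploy to Node.js hosting (Node.js required at runtime)
// target: "static" -> `nuxt generate` pre-renders pages to plain files,
//                     which is exactly what an S3 bucket can serve
const config = {
  ssr: true,        // keep server-side rendering at generate time...
  target: "static", // ...but emit pre-rendered static files: "SSR but static"
};

module.exports = config;
```

With this combination, `nuxt generate` runs the SSR step once at build time and outputs static HTML, so no Node.js server is needed on S3.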

Deploying AWS chatbot without the use of the S3 Bucket

I'm trying to integrate an AWS chatbot into my website with help from this GitHub repository: https://github.com/aws-samples/aws-lex-web-ui. I'm trying to get this deployed completely locally, which means the S3 bucket will not be used and only the "cognito id" will be used. Is that possible?
Yes, that is possible. Take a look at the repository's methods of integration: only Method 1 uses an S3 bucket. You probably want Method 3 to create a stand-alone page or an embedded iframe. Here are the links to those directions:
Stand-alone Page
Embeddable iframe
Note that Method 3 says to use the libraries from the dist folder; that step is commonly overlooked.

Google Cloud Run service URL (discovery)

I am running several Cloud Run services, which are automatically assigned URLs in the following format:
https://SERVICE_NAME-XXXXXXX-ew.a.run.app/
These URLs are not particularly easy to work with or to pass to clients. The alternative is to use a custom domain, but that requires hardcoding subdomains in DNS records (as far as I understand), and I would like to avoid that and use the default URLs.
What is the best practice for working with these URLs? I can imagine keeping a service-to-URL mapping and passing it to clients, but I would like to avoid reinventing the wheel.
Edit: I've released an external tool called runsd that lets you do this. Check it out: https://github.com/ahmetb/runsd
Thanks for this question! The "Service discovery by name" for Cloud Run is very much an active area of work. Though, there are no active timelines we can share yet.
You can see a prototype of me running this on Cloud Run here: https://twitter.com/ahmetb/status/1233147619834118144
APIs like the linked Google Cloud Service Directory are geared more towards custom/DIY service discovery that you might want to build into your RPC stack, such as gRPC. It's more of a managed domain name directory that you can integrate with your RPC layer.
If you are interested in participating in an alpha for this feature in the future, drop me an email at ahmetb at google.
You can use Service Directory, a beta service.
At service deployment:
- Create your service entry with a name and the URL as metadata.
In your code:
- Request the service metadata by its name, and get the URL.
- Use the URL.
You can't use the endpoint feature of the service because you don't have an IP/port.
Note that, for now, there is no client library, so you have to call the API directly.
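The lookup steps above can be sketched as follows. This assumes the beta REST API's `:resolve` method and the URL being stored under the service's metadata at deploy time, as described in the answer; the project, location, and namespace names are placeholders, and `fetchFn` is injectable so the HTTP/auth layer stays out of the sketch.

```javascript
const SD_BASE = "https://servicedirectory.googleapis.com/v1beta1";

// Build the :resolve call URL for a named service entry.
// Project, location, and namespace here are placeholders.
function resolveEndpoint(serviceName) {
  return `${SD_BASE}/projects/my-project/locations/europe-west1/namespaces/run/services/${serviceName}:resolve`;
}

// Pull the deploy-time URL back out of a resolve response body.
function urlFromResolved(body) {
  return body.service.metadata.url;
}

// Putting it together; fetchFn would be e.g. global fetch wrapped to add
// an OAuth bearer token.
async function lookupServiceUrl(serviceName, fetchFn) {
  const res = await fetchFn(resolveEndpoint(serviceName), { method: "POST" });
  return urlFromResolved(await res.json());
}

module.exports = { resolveEndpoint, urlFromResolved, lookupServiceUrl };
```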

Parse Dashboard on AWS and adding cloud code

I configured a Parse Server on AWS Elastic Beanstalk using this guide. I've tested it and it all works fine.
Now I can't find a way to deploy Parse Dashboard on my server.
I did deploy Parse Dashboard on my localhost and connected it to the application on the server, but this way I cannot manage (add and remove) my apps.
Another problem is that Parse Dashboard is missing cloud code by default. I found this on Git, but I can't understand where to add the requested endpoints. Is it something like adding app.use('/scripts', express.static(path.join(__dirname, '/scripts'))); to the index.js file?
In order to deploy parse-dashboard to your EC2 instance, you need to follow the Deploying Parse Dashboard section on the parse-dashboard GitHub page.
Please make sure that when you deploy parse-dashboard you are using HTTPS and also basic authentication (this is also part of the guide).
Now regarding the cloud code: the ability to deploy cloud code via the Parse CLI, and to view the Node.js code in the dashboard, are not available in parse-server; those were parse.com features. Cloud code in parse-server is handled by modifying the main.js file under the cloud folder, and deployment must be done manually by you. The big advantage of parse-server cloud code is that you can use any Node.js module you want from there; you are not restricted to the modules that parse.com allowed.
Another point about the dashboard: you can create an Express application, add parse-server and parse-dashboard as middleware to it, and deploy the whole application to AWS. You can then enjoy both parse-server (available under the /parse path, unless you changed it to something else) and parse-dashboard (available under the /dashboard path).
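A minimal sketch of that combined Express app (parse-server 4.x-style mounting; the app id, keys, database URI, and port are placeholders):

```javascript
const express = require('express');
const { ParseServer } = require('parse-server');
const ParseDashboard = require('parse-dashboard');

const app = express();

// Parse Server, with cloud code loaded from cloud/main.js as described above.
const api = new ParseServer({
  databaseURI: 'mongodb://localhost:27017/dev', // placeholder
  cloud: __dirname + '/cloud/main.js',
  appId: 'myAppId',                             // placeholder
  masterKey: 'myMasterKey',                     // placeholder
  serverURL: 'http://localhost:1337/parse',
});

// Dashboard pointed at the same app; add HTTPS + basic auth in production,
// as the guide requires.
const dashboard = new ParseDashboard({
  apps: [{
    serverURL: 'http://localhost:1337/parse',
    appId: 'myAppId',
    masterKey: 'myMasterKey',
    appName: 'My App',
  }],
});

app.use('/parse', api);           // parse-server under /parse
app.use('/dashboard', dashboard); // parse-dashboard under /dashboard

app.listen(1337);
```

Note that newer parse-server releases changed this wiring (you start the server with `await api.start()` and mount `api.app`), so check the docs for the version you deploy.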
Enjoy :)