Hello, I'm starting with AWS and Amplify. I would like to view the mock tables that amplify mock creates, but when I try to access them it fails; AppSync itself works fine. I'm following this tutorial, and it says I should be able to access DynamoDB using localhost.
this is the doc link https://docs.amplify.aws/cli/usage/mock/#api-mocking-setup
Have you checked inside the amplify/mock-data directory?
For DynamoDB storage, setup is done automatically when creating a GraphQL API; no action is needed on your part. Resources for the mocked data, such as the DynamoDB Local database files or objects uploaded using the local S3 endpoint, are stored inside your project under amplify/mock-data.
I have a React application with AWS Amplify as its backend. I'm using the AppSync API and a DynamoDB database to save data. The AppSync API is the only category that I provisioned in my project.
| Category | Resource name | Operation | Provider plugin   |
| -------- | ------------- | --------- | ----------------- |
| Api      | testAPI       | No Change | awscloudformation |
I need to clone this same AWS Amplify backend to another AWS account easily.
Yes, I could create another Amplify project and provision resources one by one. But is there any other easy method to move this Amplify backend to another AWS account?
I found a solution through this GitHub issue thread: https://github.com/aws-amplify/amplify-cli/issues/3350. But I'm not 100% sure whether this is the recommended method to migrate Amplify resources.
These are the steps that I followed.
First, I pushed the project into a GitHub repo. This will push only the relevant files inside the amplify directory. (Amplify automatically populates .gitignore when we initialize our backend using amplify init).
Clone this repo to a new directory.
Next, I removed the amplify/team-provider-info.json file.
Run amplify init, and you can choose your new AWS profile or enter the accessKeyId and secretAccessKey for the new AWS account. (Refer to this guide to create and save an IAM user with AWS Amplify access.)
This will create backend resources locally. Now to push those resources, you can execute amplify push.
If you want to export the Amplify backend using a CDK pipeline, you can refer to this guide: https://aws.amazon.com/blogs/mobile/export-amplify-backends-to-cdk-and-use-with-existing-deployment-pipelines/
My AWS Amplify app requires some "seed" data that needs to start in the database. The mechanism by which this runs should not be accessible to users of the app. What is the most idiomatic way to load data into DynamoDB for this purpose?
I have looked into creating a Lambda function for this purpose (i.e., amplify function add), which is well integrated into Amplify. However, there is no easy way to actually invoke this Lambda: Amplify doesn't tell you the Lambda's ID for use with the aws command, and there is no amplify command that relates to invoking a Lambda.
There are a lot of alternative ways to do that. One could be using CDK, CloudFormation, Terraform, or another IaC tool to create the initial DynamoDB tables and items.
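As a concrete sketch of the seed-script approach: DynamoDB's BatchWriteItem accepts at most 25 items per request, so any seeding code has to chunk its items. A minimal sketch in Node, with a hypothetical table name and item shape; the aws-sdk call itself is left commented out so the chunking logic stands on its own:

```javascript
// Chunk seed items into BatchWriteItem-shaped requests. DynamoDB caps each
// BatchWriteItem call at 25 items, so a larger seed set must be split.
const ITEMS_PER_BATCH = 25;

function buildBatchRequests(tableName, items) {
  const requests = [];
  for (let i = 0; i < items.length; i += ITEMS_PER_BATCH) {
    const chunk = items.slice(i, i + ITEMS_PER_BATCH);
    requests.push({
      RequestItems: {
        [tableName]: chunk.map((item) => ({ PutRequest: { Item: item } })),
      },
    });
  }
  return requests;
}

// Inside a seed Lambda you would then do something like (table name and
// seedItems are assumptions for illustration):
// const AWS = require('aws-sdk');
// const ddb = new AWS.DynamoDB.DocumentClient();
// for (const params of buildBatchRequests('Todo-dev', seedItems)) {
//   await ddb.batchWrite(params).promise();
// }
```

Note that batchWrite can also return UnprocessedItems that a production seed script should retry.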
I am writing an application where I have to upload media files to GCS. So, I created a storage bucket and also created a service account which is being used by the application to put and get images from the bucket. To access this service account from the application I had to generate a private key as a JSON file.
I have tested my code and it is working fine. Now, I want to push this code to my GitHub repository, but I don't want this service account key to be in GitHub.
How do I keep this service account key secret while still letting all my colleagues use it?
I am going to put my application on GCP Container Instance and I want it to work there as well.
As I understand it, if your application runs inside GCP and uses a custom service account, you might not need any private keys (as JSON files) at all.
The custom service account used by your application should get the relevant IAM roles/permissions on the corresponding GCS bucket. And that's all you might need to do.
You can assign those IAM roles/permissions either manually (through the console UI), using CLI commands, or as part of your CI/CD deployment pipeline.
I am using AWS Amplify to set up an AppSync GraphQL API. I have a schema with an #model annotation and I am trying to write a lambda resolver that will read/write to the DynamoDB table that #model generates. However, when I attempt to test locally using amplify mock my JS function throws
error { UnknownEndpoint: Inaccessible host: `dynamodb.us-east-1-fake.amazonaws.com'. This service may not be available in the `us-east-1-fake' region.
I can't seem to find much documentation around this use case at all (most examples of lambda resolvers read from other tables / APIs that are not part of the amplify app) so any pointers are appreciated. Is running this type of setup even supported or do I have to push to AWS in order to test?
New Answer:
Amplify now has documentation on this use case: https://docs.amplify.aws/cli/usage/mock#connecting-to-a-mock-model-table
You can set environment variables for mock that will point the DDB client in the mock Lambda to the local DDB instance.
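For example, the Lambda can branch on those variables when constructing its client. A minimal sketch, assuming the env-var convention from the linked docs (the MOCK_DYNAMODB_ENDPOINT name is taken from that guide; confirm it against the variables your CLI version actually sets):

```javascript
// Choose DynamoDB client options depending on whether the function is
// running under `amplify mock`. MOCK_DYNAMODB_ENDPOINT follows the linked
// docs; treat the exact variable name as an assumption to verify.
function dynamoDbOptions(env) {
  if (env.MOCK_DYNAMODB_ENDPOINT) {
    return {
      region: 'us-fake-1',                  // region used by DynamoDB Local
      endpoint: env.MOCK_DYNAMODB_ENDPOINT, // e.g. "http://localhost:62224/"
      accessKeyId: 'fake',
      secretAccessKey: 'fake',
    };
  }
  return {}; // deployed in AWS: let the SDK use its defaults
}

// Usage:
// const AWS = require('aws-sdk');
// const ddb = new AWS.DynamoDB.DocumentClient(dynamoDbOptions(process.env));
```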
=====================================================================
Original Answer:
After some digging into the Amplify CLI code, I have found a solution that will work for now.
Here is where amplify mock initializes DynamoDB Local. As you can see, it does not set the --sharedDb flag, which, based on the docs, means that the created database files will be prefixed with the access key id of the request and then the region. The access key id of requests from Amplify will be "fake" and the region is "us-fake-1", as defined here. Furthermore, the port of the DynamoDB Local instance started by Amplify is 62224, defined here.
Therefore, to connect to the tables created by Amplify, the following DynamoDB config is needed:
const ddb = new AWS.DynamoDB({
  region: 'us-fake-1',
  endpoint: 'http://172.16.123.1:62224/',
  accessKeyId: 'fake',
  secretAccessKey: 'fake',
});
If you want to use the AWS CLI with the tables created by Amplify, you'll have to create a new profile with the region and access keys above.
I'll still need to do some additional work to figure out a good way to have those config values switch between the local mock values and the actual ones, but this unblocks local testing for now.
As for another question that I had, about where the AWS::Region of "us-east-1-fake" was being set: that gets set here, but it does not appear to be used anywhere else. That is, it gets set as a placeholder value when running amplify mock, but using it as a region elsewhere for local testing doesn't seem to work.
Please try the setting below; it's working fine for me:
const AWS = require('aws-sdk');

// Local
const dynamoDb = new AWS.DynamoDB.DocumentClient({
  region: 'us-fake-1',
  endpoint: 'http://localhost:62224/',
  accessKeyId: 'fake',
  secretAccessKey: 'fake',
});

// Live
// const dynamoDb = new AWS.DynamoDB.DocumentClient();
Your DynamoDB host is incorrect: dynamodb.us-east-1-fake is not a valid host. Please update it with the real DynamoDB host name.
If you are running locally, set up aws configure on the CLI first.
You can change the data source of the AppSync service in the AWS console, but I am not sure whether it will keep working after I run the command
amplify push api
and I haven't found a way of changing the data source with aws-amplify.
There isn't a direct way, no, as AppSync doesn't have native Postgres data sources. You could theoretically do anything with a Lambda data source, though, very much including Postgres.
An AppSync dev created a sample app that shows a way to do this, via what Amplify calls custom resolvers. You can find that repo here: https://github.com/mikeparisstuff/amplify-cli-nested-api-sample