I am trying to write a trigger for the before insert event on the Account object.
trigger DeduplicationAccount on Account (before insert) {
    // Get all the accounts being inserted.
    Account[] inputAccountList = Trigger.new;
I am trying to find related accounts within my input list of accounts. Say, for example, I want the accounts in Trigger.new whose last name is 'XXX'. So I am writing this:
// Here, listOfSurname contains the surnames to match, including 'XXX'
for (Account ac : Trigger.new) {
    List<Account> accountDuplicate = [SELECT rr_First_Name__c, rr_Last_Name__c
                                      FROM Account
                                      WHERE rr_Last_Name__c IN :listOfSurname];
    System.debug('accountDuplicate: ' + accountDuplicate);
}
But this list always comes back empty, even though my input contains an account with the surname 'XXX'.
Trigger.new has all the information for the records, and you can check any condition against it directly. Note that in a before insert trigger the incoming records have not been written to the database yet, so a SOQL query against Account cannot see them; that is why your query finds nothing. I rephrased your check below to test whether each account's last name in Trigger.new is part of the list.
for (Account ac : Trigger.new) {
    for (String s : listOfSurname) {
        if (ac.rr_Last_Name__c == s) {
            System.debug('accountDuplicate: ' + ac);
            break;
        }
    }
}
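If listOfSurname grows large, a Set makes the membership check constant-time; this is just a small variant of the same logic:

// Same check as above, but a Set lookup avoids the inner loop.
Set<String> surnames = new Set<String>(listOfSurname);
for (Account ac : Trigger.new) {
    if (surnames.contains(ac.rr_Last_Name__c)) {
        System.debug('accountDuplicate: ' + ac);
    }
}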
I'm a bit new to this so I apologize if it's basic.
We have an AWS Parameter Store managed by Terraform, and we use a module for it.
The values of the keys are not stored, so each time we run terraform apply we need to enter them manually.
Suppose I'd like to add a new key: I need to edit the .tf files to add it, and when I run terraform apply I have to re-enter all the values of the existing keys before entering the value for the new one.
I'm looking for a way to add another key such that, when I run terraform apply, Terraform sees that only the new key was added and leaves all the existing keys unmodified. I was thinking about passing null for the variables of the existing keys, so that when Terraform sees null it disregards that parameter and doesn't touch it.
resource "aws_ssm_parameter" "default" {
count = var.enabled ? 1 : 0
name = "${module.label.id}-parameter"
type = var.type
value = var.value
description = var.description
tier = var.tier
key_id = var.key_id
overwrite = var.overwrite
allowed_pattern = var.allowed_pattern
}
This is the module I'm using.
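For what it's worth, one pattern that might fit here (an assumption on my part, not something verified against this module) is to have the resource ignore drift on value, so applies never try to rewrite values for keys that already exist:

resource "aws_ssm_parameter" "default" {
  count = var.enabled ? 1 : 0

  name  = "${module.label.id}-parameter"
  type  = var.type
  value = var.value

  lifecycle {
    # Hypothetical tweak: once the parameter exists, later applies
    # won't try to overwrite its value, so only new keys get written.
    ignore_changes = [value]
  }
}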
I know there are multiple ways to get AWS account name by its ID, but is the opposite possible? Is there a way to programmatically (API, CLI, terraform etc.) get AWS account ID by its name?
Update: Forgot to mention that these accounts exist under an organization structure in a specific OU; maybe this could help.
While this is not ideal, I realized that the aws organizations list-accounts-for-parent command is the best compromise. It gives me all accounts within a given OU, which I can then filter by account name.
Given that my solution will ultimately be implemented in Terraform, I came up with something like this:
data "external" "accounts" {
program = ["aws", "organizations", "list-accounts-for-parent", "--parent-id", local.ou, "--query", "Accounts[?Name==`${local.account_name}`] | [0]"]
}
locals {
ou = "ou-12345678"
account_name = "my-cool-account"
account_id = lookup(data.external.tools_accounts.result, "Id", null)
}
It executes the AWS CLI command, which returns a map of key/values if the account is found, and the lookup function then retrieves the account ID.
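For reference, the same lookup works directly from the CLI outside Terraform (using the same placeholder OU and account name as above):

aws organizations list-accounts-for-parent \
  --parent-id ou-12345678 \
  --query 'Accounts[?Name==`my-cool-account`].Id' \
  --output text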
I was able to solve it with the following:
data "aws_organizations_organization" "main" {}
locals {
account-name = "account1"
account-index = index(data.aws_organizations_organization.main.accounts.*.name, local.account-name)
account-id = data.aws_organizations_organization.main.accounts[local.account-index].id
}
output "account_id" {
value = local.account-id
}
I am working on a Lambda function that gets called from API Gateway and updates information in DynamoDB. I have half of this working really dynamically, and I'm a little stuck on updating. Here is what I'm working with:
a DynamoDB table with a partition key of guild_id
The dummy JSON I'm using:
{
  "guild_id": "126",
  "guild_name": "Posted Guild",
  "guild_premium": "true",
  "guild_prefix": "z!"
}
Finally, the Lambda code:
import json
import boto3

def lambda_handler(event, context):
    client = boto3.resource("dynamodb")
    table = client.Table("guildtable")
    itemData = json.loads(event['body'])
    guild = table.get_item(Key={'guild_id': itemData['guild_id']})
    # If guild exists, update
    if 'Item' in guild:
        table.update_item(Key=itemData)
        responseObject = {}
        responseObject['statusCode'] = 200
        responseObject['headers'] = {}
        responseObject['headers']['Content-Type'] = 'application/json'
        responseObject['body'] = json.dumps('Updated Guild!')
        return responseObject
    # New guild, insert guild
    table.put_item(Item=itemData)
    responseObject = {}
    responseObject['statusCode'] = 200
    responseObject['headers'] = {}
    responseObject['headers']['Content-Type'] = 'application/json'
    responseObject['body'] = json.dumps('Inserted Guild!')
    return responseObject
The insert part is working wonderfully. How would I accomplish a similar approach with update_item? I want this to be as dynamic as possible so I can throw any JSON (within reason) at it and have it stored in the database. I also want the update method to account for fields added down the road and handle those.
I get the following error:
Lambda execution failed with status 200 due to customer function error: An error occurred (ValidationException) when calling the UpdateItem operation: The provided key element does not match the schema.
A "The provided key element does not match the schema" error means something is wrong with Key (= primary key). Your schema's primary key is guild_id: string. Non-key attributes belong in the AttributeUpdate parameter. See the docs.
Your itemdata appears to include non-key attributes. Also ensure guild_id is a string "123" and not a number type 123.
goodKey = {"guild_id": "123"}
table.update_item(Key=goodKey, UpdateExpression="SET ...")
The docs have a full update_item example.
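Since you want to throw arbitrary JSON at it, here is a minimal sketch of how the update branch could build the expression dynamically. It assumes guild_id is the only key attribute, and the upsert_guild helper name is mine, not part of your code:

def upsert_guild(table, item_data):
    # Split the key from the non-key attributes (assumes guild_id is the only key).
    key = {'guild_id': item_data['guild_id']}
    attrs = {k: v for k, v in item_data.items() if k != 'guild_id'}
    if not attrs:
        return  # Nothing to update beyond the key.
    # Build "SET #k0 = :v0, #k1 = :v1, ..." with placeholders so attribute
    # names can't collide with DynamoDB reserved words.
    update_expr = 'SET ' + ', '.join(f'#k{i} = :v{i}' for i in range(len(attrs)))
    names = {f'#k{i}': k for i, k in enumerate(attrs)}
    values = {f':v{i}': v for i, v in enumerate(attrs.values())}
    table.update_item(
        Key=key,
        UpdateExpression=update_expr,
        ExpressionAttributeNames=names,
        ExpressionAttributeValues=values,
    )

Because update_item creates the item if it doesn't exist, this also covers fields you add down the road without code changes.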
I created the schema and resources (the DynamoDB tables), and I also attached the needed resolvers. Here is the screenshot of the User table with some sample data.
Sample data
For the first user, only id and username are set; the other fields are empty. When I query this user in the AppSync console, I get an error.
Query Error
Here is the query I used:
query getUser {
  getUser(id: "5d0a2154-b828-4bcb-a34a-07503fe4b458") {
    id
    username
    userProfile {
      id
      lastName
      firstName
    }
    school {
      id
      schoolName
      schoolAddress
    }
    role {
      name
    }
    studentCourseSessions {
      id
      notes
    }
    programs {
      name
      courses {
        name
      }
    }
  }
}
And here is the VTL for the getUser query
It seems that if a field is empty, there will be an error. I want the query to return null when a field is not filled. It can still return the error, but I need the null as well.
I am new to AppSync and DynamoDB.
I am trying to access my AWS DataPipelines using AWS Java SDK v1.7.5, but listPipelines is returning an empty list in the code below.
I have DataPipelines that are scheduled in the US East region, which I believe I should be able to list using the listPipelines method of the DataPipelineClient. I am already using the ProfilesConfigFile to authenticate and connect to S3, DynamoDB and Kinesis without a problem. I've granted the PowerUserAccess Access Policy to the IAM user specified in the config file. I've also tried applying the Administrator Access policy to the user, but it didn't change anything. Here's the code I'm using:
// Establish credentials for connecting to AWS.
File configFile = new File(System.getProperty("user.home"), ".aws/config");
ProfilesConfigFile profilesConfigFile = new ProfilesConfigFile(configFile);
AWSCredentialsProvider awsCredentialsProvider = new ProfileCredentialsProvider(profilesConfigFile, "default");

// Set up the AWS DataPipeline connection.
DataPipelineClient dataPipelineClient = new DataPipelineClient(awsCredentialsProvider);
Region usEast1 = Region.getRegion(Regions.US_EAST_1);
dataPipelineClient.setRegion(usEast1);

// List all pipelines we have access to.
ListPipelinesResult listPipelinesResult = dataPipelineClient.listPipelines(); // empty list returned here
for (PipelineIdName p : listPipelinesResult.getPipelineIdList()) {
    System.out.println(p.getId());
}
Make sure to check whether there are more results. I've noticed the API sometimes returns only a few pipelines (the first page can even be empty) but sets a flag indicating there are more. You can retrieve them like this:
void listPipelines(DataPipelineClient dataPipelineClient, String marker) {
    ListPipelinesRequest request = new ListPipelinesRequest();
    if (marker != null) {
        request.setMarker(marker);
    }
    ListPipelinesResult listPipelinesResult = dataPipelineClient.listPipelines(request);
    for (PipelineIdName p : listPipelinesResult.getPipelineIdList()) {
        System.out.println(p.getId());
    }
    // Call recursively if there are more results:
    if (listPipelinesResult.getHasMoreResults()) {
        listPipelines(dataPipelineClient, listPipelinesResult.getMarker());
    }
}
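To start the listing from the setup code in the question, the first call simply passes a null marker (the wiring just follows the snippet above):

// First page: no marker yet; the recursion fetches the rest.
listPipelines(dataPipelineClient, null);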