DynamoDB multiple filter conditions, gives error - buildTree error: unset parameter: ConditionBuilder - amazon-web-services

I am building REST APIs, using Lambda and DynamoDB in GO.
I need to query the data based on multiple filters.
The number of filters varies based on the number of query parameters the user provides when calling the REST API.
Based on the post below, I developed the code to add multiple conditions.
AWS SDK for Go - DynamoDb - Add multiple conditions to FilterExpression
But when I invoke the function, I get the following error in the logs:
buildTree error: unset parameter: ConditionBuilder
The Filter expression is not applied and the scan returns all the results.
Here is the code snippet.
for queryParam, queryParamValue := range searchParams {
    fmt.Println("queryParam:", queryParam, "=>", "queryParamValue:", queryParamValue)
    if queryParam == "param1" {
        param1Condition = expression.Name("param1").Equal(expression.Value(queryParamValue))
    }
    if queryParam == "param2" {
        param2Condition = expression.Name("param2").Equal(expression.Value(queryParamValue))
    }
}
sampleExpr, errSample := expression.NewBuilder().
    WithCondition(param1Condition.Or(param2Condition)).
    Build()
if errSample != nil {
    fmt.Println("Error in building Sample Expr ", errSample)
} else {
    fmt.Println("sampleExpr ", sampleExpr)
}
input := &dynamodb.ScanInput{
    ExpressionAttributeNames:  sampleExpr.Names(),
    ExpressionAttributeValues: sampleExpr.Values(),
    FilterExpression:          sampleExpr.Filter(),
    TableName:                 aws.String(deviceInfotable),
}
But if I create the expression in a different way, it works:
filt := expression.Name("param1").Equal(expression.Value("valu1")).Or(expression.Name("param2").Equal(expression.Value("value2")))

ConditionBuilder has a mode field:
type ConditionBuilder struct {
    operandList   []OperandBuilder
    conditionList []ConditionBuilder
    mode          conditionMode
}
The zero value of mode is unsetCond. When the condition is built, unsetCond raises the error:
https://github.com/aws/aws-sdk-go/blob/7798c2e0edc02ba058f7672d32f4ebf6603b5fc6/service/dynamodb/expression/condition.go#L1415
case unsetCond:
    return exprNode{}, newUnsetParameterError("buildTree", "ConditionBuilder")
In your code, if queryParam != "param1" and queryParam != "param2", then param1Condition and param2Condition remain the zero value of ConditionBuilder, which fails on build.
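A minimal sketch of one way to avoid the zero values, assuming only the searchParams map and the table name from the question: collect the conditions that are actually set into a slice, OR them together, and attach a filter only when at least one condition exists. It uses WithFilter rather than WithCondition, since expr.Filter() is what feeds the Scan's FilterExpression.
package main

import (
    "fmt"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/service/dynamodb"
    "github.com/aws/aws-sdk-go/service/dynamodb/expression"
)

// buildScanInput attaches a FilterExpression only for the query parameters that
// were actually provided, so no zero-value ConditionBuilder ever reaches Build.
func buildScanInput(searchParams map[string]string, table string) (*dynamodb.ScanInput, error) {
    var conditions []expression.ConditionBuilder
    for queryParam, queryParamValue := range searchParams {
        switch queryParam {
        case "param1":
            conditions = append(conditions, expression.Name("param1").Equal(expression.Value(queryParamValue)))
        case "param2":
            conditions = append(conditions, expression.Name("param2").Equal(expression.Value(queryParamValue)))
        }
    }

    input := &dynamodb.ScanInput{TableName: aws.String(table)}
    if len(conditions) == 0 {
        // No recognised query parameters: scan without a filter.
        return input, nil
    }

    // OR the collected conditions together.
    filter := conditions[0]
    for _, c := range conditions[1:] {
        filter = filter.Or(c)
    }

    expr, err := expression.NewBuilder().WithFilter(filter).Build()
    if err != nil {
        return nil, fmt.Errorf("building expression: %w", err)
    }
    input.ExpressionAttributeNames = expr.Names()
    input.ExpressionAttributeValues = expr.Values()
    input.FilterExpression = expr.Filter()
    return input, nil
}

func main() {
    // "deviceInfo" stands in for the deviceInfotable variable from the question.
    input, err := buildScanInput(map[string]string{"param1": "value1"}, "deviceInfo")
    if err != nil {
        fmt.Println("Error in building Sample Expr ", err)
        return
    }
    fmt.Println(input)
}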

Related

How to update one attribute value using ConditionExpression without affecting other attributes - DynamoDB

I need to update attribute values of DynamoDB table items. The repeats section should be updated only if the usersIDs array contains the current user's ID.
So I created a ConditionExpression and ran it:
var metricsParams = {
  TableName: table,
  Key: {
    "metricsID": metricsID,
  },
  UpdateExpression: "SET fans.orgID = :orgIDNew, fans.orgName = :orgNameNew, fans.noOfGamesPlayed = fans.noOfGamesPlayed + :val, Moment.datePlayed = :dateNew, Moment.monthPlayed = :monthNew, Moment.week = :weekNew, Moment.usersIDs = list_append(Moment.usersIDs, :usersNew), Moment.repeats = list_append(Moment.repeats, :repeateUsers)",
  ConditionExpression: "contains(Moment.repeats, :repeateUsers)",
  ExpressionAttributeValues: {
    ":orgIDNew": body.team.id,
    ":orgNameNew": body.team.domain,
    ":val": 1,
    ":dateNew": Moment().format('LL'),
    ":monthNew": Moment().format("MMMM"),
    ":weekNew": Moment().format('WW'),
    ":usersNew": [body.user.id],
    ":repeateUsers": [body.user.id]
  },
  ReturnValues: "UPDATED_NEW"
};
console.log("Attempting a conditional update...");
metricsDoc.update(metricsParams, function(err, data) {
  if (err) {
    console.error("Unable to update item. *from id update* Error JSON:", JSON.stringify(err, null, 2));
  } else {
    console.log("UpdateItem succeeded: FROM DYNAMODB METRICS**", JSON.stringify(data, null, 2));
  }
});
But when I add this ConditionExpression, the other attributes are also affected by it. How can I fix this? Do I need to create a separate UpdateExpression?
You are correct. If you have multiple conditions, you will need multiple operations. An operation can have 0-1 conditions, which apply to the operation as a whole:
Docs: If the condition expression evaluates to true, the operation succeeds; otherwise, the operation fails.
Here are two options, given your current structure:
Make two update operations, one unconditional and one conditional. (note that list_append(Moment.repeats, :repeateUsers) does not guard against adding duplicate entries)
Make a query to retrieve the current record and, after determining what needs to be written, make one unconditional update operation.
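A rough sketch of the conditional half of the first option, written in Go to match the other snippets on this page (the question itself uses the Node.js DocumentClient). Table, key, and attribute names come from the question; the condition assumes the intent described above, i.e. append to Moment.repeats only when Moment.usersIDs already contains the current user's ID. The unconditional half would be the existing update with the repeats part removed and without a condition.
package main

import (
    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/awserr"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/dynamodb"
)

// appendRepeat appends userID to Moment.repeats only when Moment.usersIDs
// already contains that user; otherwise the conditional check fails and the
// item is left untouched.
func appendRepeat(svc *dynamodb.DynamoDB, table, metricsID, userID string) error {
    _, err := svc.UpdateItem(&dynamodb.UpdateItemInput{
        TableName: aws.String(table),
        Key: map[string]*dynamodb.AttributeValue{
            "metricsID": {S: aws.String(metricsID)},
        },
        UpdateExpression:    aws.String("SET Moment.repeats = list_append(Moment.repeats, :u)"),
        ConditionExpression: aws.String("contains(Moment.usersIDs, :uid)"),
        ExpressionAttributeValues: map[string]*dynamodb.AttributeValue{
            ":u":   {L: []*dynamodb.AttributeValue{{S: aws.String(userID)}}},
            ":uid": {S: aws.String(userID)},
        },
    })
    if aerr, ok := err.(awserr.Error); ok && aerr.Code() == dynamodb.ErrCodeConditionalCheckFailedException {
        // Condition not met: the user is not in usersIDs, so repeats stays as-is.
        return nil
    }
    return err
}

func main() {
    svc := dynamodb.New(session.Must(session.NewSession()))
    // Placeholder values for illustration only.
    _ = appendRepeat(svc, "metrics", "some-metrics-id", "U123")
}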

Get the size of a folder in Amazon S3 using Go SDK 2

I know there are no folders in Amazon S3, but we can emulate them by using "/" on the key name.
Given that, is it possible using the AWS SDK for Go v2 to calculate the size of a folder? Or do I have to retrieve all the objects in the folder and then calculate the size one by one?
Given that example and the Object type documentation here, it is possible to compute the size occupied by the items within a bucket:
package main

import (
    "context"
    "flag"
    "fmt"
    "log"

    "github.com/aws/aws-sdk-go-v2/config"
    "github.com/aws/aws-sdk-go-v2/service/s3"
)

var (
    bucketName      string
    objectPrefix    string
    objectDelimiter string
    maxKeys         int
)

func init() {
    flag.StringVar(&bucketName, "bucket", "", "The `name` of the S3 bucket to list objects from.")
    flag.StringVar(&objectPrefix, "prefix", "", "The optional `object prefix` of the S3 Object keys to list.")
    flag.StringVar(&objectDelimiter, "delimiter", "",
        "The optional `object key delimiter` used by S3 List objects to group object keys.")
    flag.IntVar(&maxKeys, "max-keys", 0,
        "The maximum number of `keys per page` to retrieve at once.")
}

// Lists all objects in a bucket using pagination
func main() {
    flag.Parse()
    if len(bucketName) == 0 {
        flag.PrintDefaults()
        log.Fatalf("invalid parameters, bucket name required")
    }

    // Load the SDK's configuration from environment and shared config, and
    // create the client with this.
    cfg, err := config.LoadDefaultConfig(context.TODO())
    if err != nil {
        log.Fatalf("failed to load SDK configuration, %v", err)
    }
    client := s3.NewFromConfig(cfg)

    // Set the parameters based on the CLI flag inputs.
    params := &s3.ListObjectsV2Input{
        Bucket: &bucketName,
    }
    if len(objectPrefix) != 0 {
        params.Prefix = &objectPrefix
    }
    if len(objectDelimiter) != 0 {
        params.Delimiter = &objectDelimiter
    }

    // Create the Paginator for the ListObjectsV2 operation.
    p := s3.NewListObjectsV2Paginator(client, params, func(o *s3.ListObjectsV2PaginatorOptions) {
        if v := int32(maxKeys); v != 0 {
            o.Limit = v
        }
    })

    // Iterate through the S3 object pages, summing the size of each object returned.
    var i int
    var total int64
    log.Println("Objects:")
    for p.HasMorePages() {
        i++
        // NextPage takes a new context for each page retrieval. This is where
        // you could add timeouts or deadlines.
        page, err := p.NextPage(context.TODO())
        if err != nil {
            log.Fatalf("failed to get page %v, %v", i, err)
        }
        // Add up the sizes of the objects found.
        for _, obj := range page.Contents {
            // fmt.Println("Object:", *obj.Key)
            total += obj.Size
        }
    }
    fmt.Println("total", total)
}
Then, if I am correct in reading the s3.ListObjectsV2Input documentation, it appears that you can set the Prefix member of the s3.ListObjectsV2Input instance to select a specific folder. The example above already demonstrates that if you pass in the flag -prefix=...
Not sure if it is the easiest way, but you can iterate over the list of objects of interest - https://docs.aws.amazon.com/AmazonS3/latest/API/API_ListObjectsV2.html - and aggregate the size locally.
Another way:
enable AWS Storage Lens - advanced metrics - prefix aggregation for your bucket, with the metric exported as CSV to a bucket
get the CSV data from the file in the bucket
note: the metric is exported every 24 hours

AmazonCloudWatch PutMetricData request format parsing

How do I parse the PutMetricData sample request shown below?
I want to parse all the MetricData and store the values in a struct in Go.
https://monitoring.&api-domain;/doc/2010-08-01/
?Action=PutMetricData
&Version=2010-08-01
&Namespace=TestNamespace
&MetricData.member.1.MetricName=buffers
&MetricData.member.1.Unit=Bytes
&MetricData.member.1.Value=231434333
&MetricData.member.1.Dimensions.member.1.Name=InstanceID
&MetricData.member.1.Dimensions.member.1.Value=i-aaba32d4
&MetricData.member.1.Dimensions.member.2.Name=InstanceType
&MetricData.member.1.Dimensions.member.2.Value=m1.small
&MetricData.member.2.MetricName=latency
&MetricData.member.2.Unit=Milliseconds
&MetricData.member.2.Value=23
&MetricData.member.2.Dimensions.member.1.Name=InstanceID
&MetricData.member.2.Dimensions.member.1.Value=i-aaba32d4
&MetricData.member.2.Dimensions.member.2.Name=InstanceType
&MetricData.member.2.Dimensions.member.2.Value=m1.small
&AUTHPARAMS
I am not able to understand what format this is in or how to parse it. Is there any library available to generate and parse this kind of formatted message?
If you remove the newlines that is a URL. Start with url.Parse, then use the Query() function to get access to the url parameters:
package main

import (
    "fmt"
    "log"
    "net/url"
    "strings"
)

func main() {
    var input = `https://monitoring.&api-domain;/doc/2010-08-01/
?Action=PutMetricData
&Version=2010-08-01
&Namespace=TestNamespace
&MetricData.member.1.MetricName=buffers
&MetricData.member.1.Unit=Bytes
&MetricData.member.1.Value=231434333
&MetricData.member.1.Dimensions.member.1.Name=InstanceID
&MetricData.member.1.Dimensions.member.1.Value=i-aaba32d4
&MetricData.member.1.Dimensions.member.2.Name=InstanceType
&MetricData.member.1.Dimensions.member.2.Value=m1.small
&MetricData.member.2.MetricName=latency
&MetricData.member.2.Unit=Milliseconds
&MetricData.member.2.Value=23
&MetricData.member.2.Dimensions.member.1.Name=InstanceID
&MetricData.member.2.Dimensions.member.1.Value=i-aaba32d4
&MetricData.member.2.Dimensions.member.2.Name=InstanceType
&MetricData.member.2.Dimensions.member.2.Value=m1.small
&AUTHPARAMS`

    // possibly also needs to replace \r
    input = strings.ReplaceAll(input, "\n", "")

    uri, err := url.Parse(input)
    if err != nil {
        log.Fatal(err)
    }
    for key, val := range uri.Query() {
        fmt.Println(key, val)
    }
}
Playground
From here on out it's up to you how you want the target struct to look.
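As one possibility, here is a sketch that folds the MetricData.member.N.* keys from uri.Query() into per-metric structs; the MetricDatum and Dimension types and their field choices are purely illustrative, not types from the CloudWatch SDK.
package main

import (
    "fmt"
    "net/url"
    "strconv"
    "strings"
)

// Dimension and MetricDatum are illustrative target types, not SDK types.
type Dimension struct {
    Name  string
    Value string
}

type MetricDatum struct {
    MetricName string
    Unit       string
    Value      string
    Dimensions map[int]*Dimension
}

// parseMetricData folds keys of the form MetricData.member.N.<Field> (and
// MetricData.member.N.Dimensions.member.M.Name|Value) into MetricDatum values
// indexed by N.
func parseMetricData(query url.Values) map[int]*MetricDatum {
    data := map[int]*MetricDatum{}
    for key, vals := range query {
        parts := strings.Split(key, ".")
        if len(parts) < 4 || parts[0] != "MetricData" || parts[1] != "member" || len(vals) == 0 {
            continue
        }
        n, err := strconv.Atoi(parts[2])
        if err != nil {
            continue
        }
        d := data[n]
        if d == nil {
            d = &MetricDatum{Dimensions: map[int]*Dimension{}}
            data[n] = d
        }
        switch parts[3] {
        case "MetricName":
            d.MetricName = vals[0]
        case "Unit":
            d.Unit = vals[0]
        case "Value":
            d.Value = vals[0]
        case "Dimensions":
            if len(parts) < 7 || parts[4] != "member" {
                continue
            }
            m, err := strconv.Atoi(parts[5])
            if err != nil {
                continue
            }
            dim := d.Dimensions[m]
            if dim == nil {
                dim = &Dimension{}
                d.Dimensions[m] = dim
            }
            switch parts[6] {
            case "Name":
                dim.Name = vals[0]
            case "Value":
                dim.Value = vals[0]
            }
        }
    }
    return data
}

func main() {
    // In the snippet above this would be uri.Query().
    query := url.Values{
        "MetricData.member.1.MetricName":                {"buffers"},
        "MetricData.member.1.Unit":                      {"Bytes"},
        "MetricData.member.1.Dimensions.member.1.Name":  {"InstanceID"},
        "MetricData.member.1.Dimensions.member.1.Value": {"i-aaba32d4"},
    }
    for n, d := range parseMetricData(query) {
        fmt.Printf("member %d: %s (%s), dimensions: %d\n", n, d.MetricName, d.Unit, len(d.Dimensions))
    }
}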

How to retrieve distinct properties of documents

In our CouchDB document database, we have documents with different "status" property values like this:
doc1: {status: "available"},
doc2: {status: "reserved"},
doc3: {status: "available"},
doc4: {status: "sold"},
doc5: {status: "available"},
doc6: {status: "destroyed"},
doc7: {status: "sold"}
[...]
Now, I would like to write a map-reduce function that returns all distinct status values that exist over all documents: ["available", "reserved", "sold", "destroyed"].
My approach was to begin writing a map function that returns only the "status" property of each document:
function (doc) {
  if (doc.status) {
    emit(doc._id, doc.status);
  }
}
And now, I would like to compare all map rows to each other such that no status duplicates will be returned.
The official CouchDB documentation seems very detailed and technical, but cannot really be mapped onto our use case, which does not have any nested structures as in blog posts, but simply "flat objects" with a "status" property. Besides, our backend uses PouchDB as an adapter to connect to our remote CouchDB.
I discovered that when executing the reduce function below (which I implemented myself, trying to understand what happens under the hood), a rather strange result is returned.
function (keys, values, rereduce) {
  var array = [];
  if (rereduce) {
    return values;
  } else {
    if (array.indexOf(values[0]) === -1) {
      array.push(values[0]);
    }
  }
  return array;
}
Result:
{
  "rows": [
    {
      "key": null,
      "value": "[reduce] [status] available,available,[status] sold,unknown,[status] available,[status] available,[status] available,reserved,available,[status] reserved,available,[status] available,[status] sold,reserved,[status] sold,sold,[status] available,available,[status] reserved,[status] reserved,[status] available,[status] reserved,available"
    }
  ]
}
The reduce step seems to be executed exactly once, while the [status] groups sometimes contain only a single value, sometimes two or three, without any recognizable logic or pattern.
Could somebody please explain to me the following:
How to retrieve an array with all distinct status values
What is the logic (or workflow) of the reduce function of CouchDB? Why do status rows have an arbitrary number of status values?
Thanks to @chrisinmtown's comment, I was able to implement the distinct retrieval of status values using the following functions:
function map(doc) {
  if (doc.status) {
    emit(doc.status, null);
  }
}

function reduce(key, values) {
  return null;
}
It is important to send the query parameter group = true as well, otherwise the result will be empty:
// PouchDB request
return this.database.query('general/all-status', { group: true }).pipe(
  map((response: PouchDB.Query.Response<any>) => response.rows.map((row: any) => row.key))
);
See also the official PouchDB documentation for further information on how to use views and queries.

Exit Pipeline if user did not enter values for Active Choice Parameterized pipeline build

I wish to abort the pipeline if the user did not select any value for an Active Choices parameter (single/multi choice or string pipeline parameter).
For example, I have an Active Choices Reactive Parameter named "IPAddress" of type "Multi Select" with the Groovy script below:
if (Location.equals("MyTown")) {
    return ["DDL1", "DDL2", "DDL3", "DDL4"]
} else if (Location.equals("Your Town")) {
    return ["DDP1", "DDP2"]
} else {
    return ["Select an IP from the drop-down"]
}
Thus, once I run the pipeline, I see "Select an IP from the drop-down" for IPAddress.
Now, if the user does not select anything from the dropdown, the pipeline should fail and abort.
In the pipeline script I have written the condition check below, but it fails to catch the case where the user does not select any IPAddress.
def ex(param) {
    currentBuild.result = 'ABORTED'
    error('BAD PARAM: ' + param)
}

pipeline {
    agent any
    stages {
        stage("Pre-Check Parameters") {
            steps {
                echo "Pre-Check called in pipeline"
                script {
                    if ("${params.IPAddress}" == null) { ex("IPAddress") }
                    //if ("${params.myEnv}" == null) { ex("b") }
                    //if ("${params.myLoc}" == null) { ex("c") }
                }
            }
        }
    }
}
Can you please suggest what the issue could be here?
Do you have any constraint against using the input step?
def days = ''
pipeline {
    agent any;
    stages {
        stage('master') {
            steps {
                script {
                    try {
                        timeout(time: 10, unit: 'SECONDS') {
                            days = input message: 'Please enter the time window in number of days', ok: 'Fetch Statistics', parameters: [string(defaultValue: '90', description: 'Number of days', name: 'days', trim: true)]
                        }
                    }
                    catch (err) {
                        error("No custom value has been entered for number of days.")
                    }
                }
            }
        }
    }
}
To determine whether your string is empty, you can use the .trim() method, which removes leading and trailing spaces from the string. The two magic words are "Groovy Truth": an empty string is false in Groovy, which makes it easier to evaluate conditional expressions. In your case, if you use .trim() in combination with an if conditional, the Groovy Truth value of the string will be used for the evaluation.
Your pipeline should work if you change it to the following. It will check if your variable is null or empty:
script {
    if (!params.IPAddress?.trim()) {
        ex("IPAddress")
    }
}