With an existing Step Functions definition JSON file, how can I use it directly in CDK to create a Step Function?
Use the L1 CfnStateMachine construct. It has a definitionString prop that accepts a stringified JSON definition.
Here's the code snippet, in case it's useful to anyone.
// imports at the top of the file (path shown for CDK v2)
import * as fs from "fs";
import { aws_stepfunctions as stepfunctions } from "aws-cdk-lib";

private createStepFunction(props: {
  stepfunction_name: string;
  stepfunctions_role_arn: string;
}): stepfunctions.CfnStateMachine {
  const file = fs.readFileSync("../step_functions/definition.asl.json");
  const stepFunction = new stepfunctions.CfnStateMachine(this, "cfnStepFunction", {
    roleArn: props.stepfunctions_role_arn,
    definitionString: file.toString(),
    stateMachineName: props.stepfunction_name,
  });
  return stepFunction;
}
I have the following stack, which deploys the two constructs within it. The DynamoDBConstruct construct exports the table name, and the IAMRoleConstruct construct consumes it. However, deployment fails with No export named dbTableName found, despite the fact that the dependency is added/specified so that DynamoDBConstruct gets deployed first. Why?
Stack:
public AllStacks(Construct scope, string id, IStackProps props = null) : base(scope, id, props)
{
    var db = new DynamoDBConstruct(this, "DynamoDB");
    var iam = new IAMRoleConstruct(this, "IAMRole");
    iam.Node.AddDependency(db);
}
DynamoDBConstruct
public DynamoDBConstruct(Construct scope, string id) : base(scope, id)
{
    var dbTable = new Table(this, "dbTable", new TableProps()
    {
        PartitionKey = new Attribute
        {
            Name = "contactID",
            Type = AttributeType.STRING
        },
        TableClass = TableClass.STANDARD,
        TableName = (string)Node.TryGetContext("dbTableName"),
        RemovalPolicy = RemovalPolicy.DESTROY
    });

    new CfnOutput(this, "OutputTableName", new CfnOutputProps()
    {
        ExportName = "dbTableName",
        Value = dbTable.TableName
    });
}
IAMRoleConstruct
public IAMRoleConstruct(Construct scope, string id) : base(scope, id)
{
    var dbTableName = Fn.ImportValue("dbTableName");
    /*
        Some code
        ...
    */
}
With the disclaimer that I am not sure what language your code is in, I'm going to write in CDK's native language, TypeScript (which I recommend you use as well).
The problem most likely comes from the fact that you are using the export within the same CDK/CloudFormation stack. The export won't be available during stack creation, because creating it is part of that same stack creation.
When you're working within a single stack, the simplest, most intuitive way of "moving data" from one construct to another is to just expose values through a public member of your class, e.g.:
class DynamoDBConstruct extends Construct {
  public readonly tableName: string;

  constructor(scope: Construct, id: string, props: Whatever) {
    super(scope, id);

    const table = new Table(this, 'Table', {
      partitionKey: { name: 'id', type: AttributeType.STRING },
      billingMode: BillingMode.PAY_PER_REQUEST,
      // omitting table name on purpose - it will be generated by CDK
    });

    this.tableName = table.tableName;
  }
}
Now inside your stack, you can simply use that table name:
class MyStack extends Stack {
  constructor(scope: App, id: string, props: Whatever) {
    super(scope, id);

    const table = new DynamoDBConstruct(...);
    const myOtherConstruct = new MyOtherConstruct(this, 'myOtherConstruct', {
      // using table name here
      tableName: table.tableName,
    });
  }
}
The reason for the error is that you are trying to produce and consume a Stack Output in the same stack. That won't work:
Docs: Output values are available after the stack operation is complete. Stack output values aren't available while the stack is in any of the IN_PROGRESS statuses.
No worries! As @Victor says, there is a much easier alternative. Get rid of the outputs. Instead, share data between your custom constructs by declaring public fields (e.g. public Table table) in the producing class and passing the references as props to the consuming class. This is what the CDK constructs themselves do.
See the C# example stacks in the aws-cdk-examples repo.
I'm building the infrastructure for an application using AWS-CDK.
I have a construct that builds multiple S3 buckets and another construct that creates a lambda function that fetches data from these buckets.
In order to be able to give my lambda permission to fetch data from the buckets, I need the buckets' ARNs.
Is there a way in which I could export the bucket arn from the construct that produces the buckets and import it into the lambda construct?
Sure, maybe something like this:
export class ConsumingStack extends cdk.Stack {
  constructor(scope: cdk.Construct, id: string, props?: cdk.StackProps) {
    super(scope, id, props);
    const producingStack = new BucketProducingStack(this, 'BucketProducingStack');
    const { bucket1, bucket2 } = producingStack;
    // Create new lambda stack here
    // const lambdaStack = new LambdaStack(this, { bucket1, bucket2 });
  }
}

export class BucketProducingStack extends cdk.NestedStack {
  bucket1: string;
  bucket2: string;

  constructor(scope: cdk.Construct, id: string, props?: cdk.NestedStackProps) {
    super(scope, id, props);
    const bucket1 = new Bucket(this, 'BucketOne');
    const bucket2 = new Bucket(this, 'BucketTwo');
    this.bucket1 = bucket1.bucketArn;
    this.bucket2 = bucket2.bucketArn;
  }
}
No guarantee this compiles as it was written entirely in this window, but hopefully conveys the idea.
If you are using Python, you can add a property to the construct that owns the bucket:
@property
def main_source_bucket(self) -> _s3.IBucket:
    return self.bucket
Reference it in your app stack like this:
bucket = S3Construct(self, "bucket", "bucket1")
LambdaConstruct(self, "lambda1", "dev", bucket.main_source_bucket)
I followed the instructions in the GitHub README, imported the app into my AWS Mobile Hub project, and downloaded my project's aws-config.js to src/assets. When I attempt to serve the app I get a runtime error:
Runtime Error:
aws_cognito_region is not defined
Stack:
ReferenceError: aws_cognito_region is not defined
at new Cognito (http://localhost:8100/build/main.js:112:36)
at _createClass (http://localhost:8100/build/vendor.js:10975:20)
at _createProviderInstance$1 (http://localhost:8100/build/vendor.js:10949:26)
at resolveNgModuleDep (http://localhost:8100/build/vendor.js:10934:17)
at _createClass (http://localhost:8100/build/vendor.js:10977:29)
at _createProviderInstance$1 (http://localhost:8100/build/vendor.js:10949:26)
at resolveNgModuleDep (http://localhost:8100/build/vendor.js:10934:17)
at NgModuleRef_.get (http://localhost:8100/build/vendor.js:12159:16)
at resolveDep (http://localhost:8100/build/vendor.js:12655:45)
at createClass (http://localhost:8100/build/vendor.js:12525:32)
Any insight would be greatly appreciated.
Edit: I have added below my app.config.ts code as well as a segment of my aws-config.js file (omitting the constant declarations at the top that contain my AWS mobile hub project details)
app.config.ts:
import { Injectable } from '@angular/core';
declare var AWS: any;
declare const aws_mobile_analytics_app_id;
declare const aws_cognito_region;
declare const aws_cognito_identity_pool_id;
declare const aws_user_pools_id;
declare const aws_user_pools_web_client_id;
declare const aws_user_files_s3_bucket;
@Injectable()
export class AwsConfig {
public load() {
// Expects global const values defined by aws-config.js
const cfg = {
"aws_mobile_analytics_app_id": aws_mobile_analytics_app_id,
"aws_cognito_region": aws_cognito_region,
"aws_cognito_identity_pool_id": aws_cognito_identity_pool_id,
"aws_user_pools_id": aws_user_pools_id,
"aws_user_pools_web_client_id": aws_user_pools_web_client_id,
"aws_user_files_s3_bucket": aws_user_files_s3_bucket
};
AWS.config.customUserAgent = AWS.config.customUserAgent + ' Ionic';
return cfg;
}
}
aws-config.js:
const 'aws_cognito_region' = 'us-east-1';
... etc
AWS.config.region = aws_project_region;
AWS.config.credentials = new AWS.CognitoIdentityCredentials({
IdentityPoolId: aws_cognito_identity_pool_id
}, {
region: aws_cognito_region
});
AWS.config.update({customUserAgent: 'MobileHub v0.1'});
I fixed this issue by going to aws-config.js and removing the single quotes around each of the defined variable names. So if you have this:
const 'aws_cognito_region' = 'us-east-1';
Change to this:
const aws_cognito_region = 'us-east-1';
You need to configure src/app/app.config with your AWS information (app id, pool id, etc.).
I have an already-built json11 object:
Json my_json = Json::object {
{ "key1", "value1" },
{ "key2", false },
{ "key3", Json::array { 1, 2, 3 } },
};
And I want to add a new value to the key3 array, like this:
my_json["key3"].push_back(4);
How can I achieve that? I can't see any way to modify objects (all the operators to access values are const!)
Unfortunately, it seems you cannot directly modify an instance of Json; it's an opaque wrapper around a JsonValue that is inaccessible.
Anyway, note that a Json::object is a std::map<std::string, Json>. You can create a copy of your original Json::object as follows:
Json::object json_obj = my_json.object_items();
Then the key key3 contains a Json::array, which is nothing more than a std::vector<Json>. Since the map stores immutable Json values, copy the array out, extend it, and assign it back:
Json::array arr = json_obj["key3"].array_items();
arr.push_back(4);
json_obj["key3"] = arr;
Finally, create a new Json from your Json::object and that's all:
Json another_json = json_obj;
It's quite an expensive operation.
I suspect the right way is to create your objects step by step and at the very end of your process create an instance of a Json.
I found the following issues on GitHub about this question:
https://github.com/dropbox/json11/issues/20: more or less the same as what skypjack explains
The Json type is immutable, but the Json::object type is just a std::map, so your code would work if the first line created a Json::object instead. You can use that map to build whatever data you want, then wrap it as Json(data) when you're done modifying it. You can also extract the map from a Json using object_items(), copy it, mutate it, and use it to create a new Json, similar to a builder pattern.
https://github.com/dropbox/json11/issues/75: this one is very interesting because it explains why it's not possible to modify a Json
The Json type is intended to be an immutable value type, which has a number of advantages including thread safety and the ability to share data across copies. If you want a mutable array you can use a Json::array (which is just a typedef for a vector) and mutate it freely before putting it into a Json object.
If you are using json11 you can do it like this:
Json json = Json::object
{
{
"num_neurons_in_each_layer", Json::array{ 1000, 1000, 10, 10 }
},
{
"non_editable_data",
Json::object
{
{"train_error", -1.0 },
{"validation_error", -1.0 }
}
}
};
// object_items() returns a const reference to Json's internal map, so this
// const_cast mutates state the library treats as immutable and shared -
// it works with the current implementation, but it is fragile.
Json* p_error = const_cast<Json*>(&json["non_editable_data"]
    .object_items().find("validation_error")->second);
*p_error = Json(2.0); // "validation_error" has been modified to 2.0
p_error = nullptr;    // the pointer does not own the Json, so never delete it
In Node.js, is there any shortcut to export ALL functions in a given file? I want to do this for unit-testing purposes, as my unit tests are in a separate file from my production code.
I know I can go through and export each function manually, as in:
exports.myFunction = myFunction;
But I'm wondering if there is a simpler/slicker way to do this.
(And yes, I realize that for modularity reasons it isn't always a good idea to export all functions, but for unit-testing purposes you do want to see all the little functions so you can test them piece by piece.)
Thanks!
You could do something like this:
// save this into a variable, so it can be used reliably in other contexts
var self = this;
// the scope of the file is the `exports` object, so `this === self === exports`
self.fnName = function () { ... }
// call it the same way
self.fnName();
Or this:
// You can declare your exported functions here
var file = module.exports = {
fn1: function () {
// do stuff...
},
fn2: function () {
// do stuff...
}
}
// and use them like this in the file as well
file.fn1();
Or this:
// each function is declared like this. Have to watch for typos, as we're typing fnName twice
var fnName = exports.fnName = function () { ... }
// now you can use them as file-scoped functions, rather than as properties of an object
fnName();
Mixing in objects is the answer.
This lib can help you: https://github.com/shimondoodkin/nodejs-clone-extend
//file1.js
var _ = require('cloneextend');
_.extend(this, require('./file2.js'));
file1.js now has all the exports from file2.js.
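If you are on a reasonably recent Node version, you don't need a library for this: Object.assign can copy every export from one module onto another in a single call. A minimal sketch (the two plain objects here stand in for the module.exports of two files):

```javascript
// In real code `file2Exports` would be `require('./file2.js')`;
// it is inlined here so the sketch is self-contained.
const file2Exports = {
  add: (a, b) => a + b,
  mul: (a, b) => a * b,
};

const file1Exports = {}; // stands in for file1's module.exports

// Copy every exported member of file2 onto file1's exports in one call
Object.assign(file1Exports, file2Exports);

console.log(file1Exports.add(2, 3)); // 5
```

Inside file1.js itself this collapses to `Object.assign(module.exports, require('./file2.js'));`.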
Here's a simple way to do it: parse the AST, look for top-level function declarations, and export those.
const fs = require('fs')
const esprima = require('esprima')
const program = fs.readFileSync(__filename, 'utf8')
const parsed = esprima.parseScript(program)
for (let fn of parsed.body) {
  if (fn.type.endsWith('FunctionDeclaration')) {
    module.exports[fn.id.name] = eval(fn.id.name)
  }
}