Installing Windows applications/extensions with Amazon CloudFormation

Amazon's Windows Roles and Features template demonstrates how to install features on a server. But can anyone tell me how I would install an application/feature that is not present in this list, such as WebDeploy? I appreciate that I could create a custom AMI, but I'm looking to do this entirely at the template level.
Thanks in advance.

For your example of WebDeploy, you could add this:
"C:\\Webdeploy\\WebDeploy_amd64_en-US.msi" : {
"source" : "http://download.microsoft.com/download/1/B/3/1B3F8377-CFE1-4B40-8402-AE1FC6A0A8C3/WebDeploy_amd64_en-US.msi"
}
as an element of this section of the template:
"WindowsServer": {
"Type" : "AWS::EC2::Instance",
"Metadata" : {
"AWS::CloudFormation::Init" : {
"config" : {
"files" : {
You could then add this command:
"1-installwebdeploy" : {
"command" : "msiexec.exe /i C:\\Webdeploy\\WebDeploy_amd64_en-US.msi ADDLOCAL=ALL /qn /norestart"
}
to the "commands" section. Note that cfn-init runs commands in alphabetical order of their names, which is why the 1- prefix is useful once you have more than one.
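Putting both pieces together, a minimal sketch of the relevant part of the instance resource (Properties and the rest of the template omitted) would look like:
"WindowsServer" : {
  "Type" : "AWS::EC2::Instance",
  "Metadata" : {
    "AWS::CloudFormation::Init" : {
      "config" : {
        "files" : {
          "C:\\Webdeploy\\WebDeploy_amd64_en-US.msi" : {
            "source" : "http://download.microsoft.com/download/1/B/3/1B3F8377-CFE1-4B40-8402-AE1FC6A0A8C3/WebDeploy_amd64_en-US.msi"
          }
        },
        "commands" : {
          "1-installwebdeploy" : {
            "command" : "msiexec.exe /i C:\\Webdeploy\\WebDeploy_amd64_en-US.msi ADDLOCAL=ALL /qn /norestart"
          }
        }
      }
    }
  }
}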

There's a slightly easier mechanism if it's just a bog-standard MSI installation:
"packages" : {
"msi" : {
"urlrewrite" : "http://download.microsoft.com/download/6/7/D/67D80164-7DD0-48AF-86E3-DE7A182D6815/rewrite_2.0_rtw_x64.msi"
}
},
This means you don't need the "commands" section at all; cfn-init downloads the MSI and runs the install itself.
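For placement, the packages block is a sibling of files and commands under config; a minimal sketch:
"AWS::CloudFormation::Init" : {
  "config" : {
    "packages" : {
      "msi" : {
        "urlrewrite" : "http://download.microsoft.com/download/6/7/D/67D80164-7DD0-48AF-86E3-DE7A182D6815/rewrite_2.0_rtw_x64.msi"
      }
    }
  }
}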

Related

How to read SSM parameters in a shell script in AWS Data Pipeline?

I'm setting up a data pipeline in AWS and plan to use the "Getting started using ShellCommandActivity" template to run a shell script. How can I pass credentials stored in an SSM parameter to this script?
I haven't verified this, but ShellCommandActivity looks similar to ShellScriptConfig from what I can tell. Based on the examples provided for these commands, I would think you could pass the SSM parameter name as follows:
{
  "id" : "CreateDirectory",
  "type" : "ShellCommandActivity",
  "command" : "your-script.sh <name-of-your-parameter>"
}
or
{
  "id" : "CreateDirectory",
  "type" : "ShellCommandActivity",
  "scriptUri" : "s3://my-bucket/example.sh",
  "scriptArgument" : ["<name-of-your-parameter>"]
}
and in the example.sh you would use $1 to refer to the value of the argument passed.
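If it helps, here is a minimal sketch of what example.sh could look like, assuming the resource running the activity has an IAM role that allows ssm:GetParameter (the parameter name arrives as $1 via scriptArgument):
#!/bin/bash
# $1 is the SSM parameter name passed in via scriptArgument
PARAM_NAME="$1"

# Fetch the decrypted value; --with-decryption also covers SecureString parameters
CREDENTIAL=$(aws ssm get-parameter \
  --name "$PARAM_NAME" \
  --with-decryption \
  --query "Parameter.Value" \
  --output text)

# Use $CREDENTIAL below; avoid echoing the secret itself to logs
echo "Fetched value for parameter: $PARAM_NAME"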

Capture start and end of Lambda functions

Is it possible to capture the startTime and endTime of Lambda function executions, along with the parameters that were passed in?
I couldn't find any state-change event configuration that could be set up to send events when a Lambda function starts or terminates.
A crappy alternative is to record the parameters and start time in a database when the Lambda is invoked, and have the Lambda write the end time as the final step before it completes. This seems prone to failure scenarios, such as the function erroring out before updating the DB.
Are there other alternatives to capture this information?
AWS X-Ray may be a good solution here. It is easy to integrate and use, and you can enable it from the AWS console:
Go to your Lambda function's configuration tab.
Scroll down and, in the AWS X-Ray box, choose active tracing.
Without any configuration in the code, it will record the start_time and end_time of the function along with additional metadata. You can also integrate it as a library in your Lambda function and send additional subsegments, such as request parameters; see the AWS X-Ray documentation for details.
Here is a sample payload:
{
  "trace_id" : "1-5759e988-bd862e3fe1be46a994272793",
  "id" : "defdfd9912dc5a56",
  "start_time" : 1461096053.37518,
  "end_time" : 1461096053.4042,
  "name" : "www.example.com",
  "http" : {
    "request" : {
      "url" : "https://www.example.com/health",
      "method" : "GET",
      "user_agent" : "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/601.7.7",
      "client_ip" : "11.0.3.111"
    },
    "response" : {
      "status" : 200,
      "content_length" : 86
    }
  },
  "subsegments" : [
    {
      "id" : "53995c3f42cd8ad8",
      "name" : "api.example.com",
      "start_time" : 1461096053.37769,
      "end_time" : 1461096053.40379,
      "namespace" : "remote",
      "http" : {
        "request" : {
          "url" : "https://api.example.com/health",
          "method" : "POST",
          "traced" : true
        },
        "response" : {
          "status" : 200,
          "content_length" : 861
        }
      }
    }
  ]
}
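If you would rather enable this from a template than from the console, active tracing can also be switched on in CloudFormation. A minimal sketch, assuming a SAM function (the resource name, handler, and runtime here are placeholders, not from the question):
"MyTracedLambda" : {
  "Type" : "AWS::Serverless::Function",
  "Properties" : {
    "Handler" : "index.handler",
    "Runtime" : "nodejs12.x",
    "Tracing" : "Active"
  }
}
On a plain AWS::Lambda::Function resource, the equivalent property is "TracingConfig" : { "Mode" : "Active" }.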

How can I set MaximumRetryAttempts for an AWS Lambda in the CloudFormation template?

I have a serverless project created through Visual Studio, and I'm looking to set the MaximumRetryAttempts of a specific Lambda in the CloudFormation template.
I saw EventInvokeConfig, however the Lambda function name is generated automatically and differs for each environment. Is there an AWS-specific parameter to get the Lambda function name?
"EventInvokeConfig": {
"Type" : "AWS::Lambda::EventInvokeConfig",
"Properties" : {
"FunctionName" : "???",
"MaximumRetryAttempts" : 0,
"Qualifier" : "$LATEST"
}
}
Here is my serverless template:
{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Transform" : "AWS::Serverless-2016-10-31",
  "Description" : "An AWS Serverless Application that uses the ASP.NET Core framework running in Amazon Lambda.",
  "Resources" : {
    "MyFunctionLambda" : {
      "Type" : "AWS::Serverless::Function",
      "Properties" : {
        "Handler" : "MyPlatformServerless::MyPlatformServerless.Lambdas.MyFunctionLambda::FunctionHandler",
        "Runtime" : "dotnetcore2.1",
        "CodeUri" : "",
        "Description" : "Default function",
        "MemorySize" : 512,
        "Timeout" : 60,
        "Role" : null
      }
    }
  }
}
You can make use of the Ref intrinsic function. For a resource of type AWS::Serverless::Function, the returned value is the name of the function.
This can be referenced from other resources defined in the template. For EventInvokeConfig, the template would look like:
{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Transform" : "AWS::Serverless-2016-10-31",
  "Description" : "An AWS Serverless Application that uses the ASP.NET Core framework running in Amazon Lambda.",
  "Resources" : {
    "MyFunctionLambda" : {
      "Type" : "AWS::Serverless::Function",
      "Properties" : {
        "Handler" : "MyPlatformServerless::MyPlatformServerless.Lambdas.MyFunctionLambda::FunctionHandler",
        "Runtime" : "dotnetcore2.1",
        "CodeUri" : "",
        "Description" : "Default function",
        "MemorySize" : 512,
        "Timeout" : 60,
        "Role" : null
      }
    },
    "EventInvokeConfig" : {
      "Type" : "AWS::Lambda::EventInvokeConfig",
      "Properties" : {
        "FunctionName" : { "Ref" : "MyFunctionLambda" },
        "MaximumRetryAttempts" : 0,
        "Qualifier" : "$LATEST"
      }
    }
  }
}
Your issue might be resolved already, but you can simplify the code. Since you're using Serverless (SAM), you can specify EventInvokeConfig directly in the Lambda resource properties, with no need to create another resource. Please find the snippet below:
{
  "AWSTemplateFormatVersion" : "2010-09-09",
  "Transform" : "AWS::Serverless-2016-10-31",
  "Description" : "An AWS Serverless Application that uses the ASP.NET Core framework running in Amazon Lambda.",
  "Resources" : {
    "MyFunctionLambda" : {
      "Type" : "AWS::Serverless::Function",
      "Properties" : {
        "Handler" : "MyPlatformServerless::MyPlatformServerless.Lambdas.MyFunctionLambda::FunctionHandler",
        "Runtime" : "dotnetcore2.1",
        "CodeUri" : "",
        "Description" : "Default function",
        "MemorySize" : 512,
        "Timeout" : 60,
        "Role" : null,
        "EventInvokeConfig" : {
          "MaximumRetryAttempts" : 0
        }
      }
    }
  }
}
You can also specify other attributes like DestinationConfig and MaximumEventAgeInSeconds in the EventInvokeConfig object.
References:
https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/sam-property-function-eventinvokeconfiguration.html

Fn::If in a CF template and executing a command

I would greatly appreciate your help on this.
Inside an AWS CF template, I am trying to run a command only when Env=QA.
This is what I'm trying to achieve, but I can't get it to work:
"Conditions" : {
"CreateQaResources" : {"Fn::Equals" : [{"Ref" : "Env"}, "AwsQaUs"]}
},
and then
"Fn::If": [
"CreateQaResources",
{
"echo \"10.0.0.0 DOMAIN.COMPANY.com \" >>/etc/hosts \n",
},
{
"Ref" : "Env::AwsQaUs"
}
]
Can you please tell me where my mistake is?
Thank you.
Everything seems to be okay, but it would be nice to see more of your template to make sure it looks right. Are you getting some sort of error, or is the end result just not what you're hoping for? I realize this isn't an answer; apparently I don't have permission to add comments yet. Sorry!
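For what it's worth, the usual pattern for this is to embed the Fn::If inside the UserData's Fn::Join, returning the extra line when the condition holds and an empty string otherwise. A hedged sketch, reusing the CreateQaResources condition above rather than a drop-in fix:
"UserData" : {
  "Fn::Base64" : {
    "Fn::Join" : [ "", [
      "#!/bin/bash\n",
      { "Fn::If" : [
        "CreateQaResources",
        "echo \"10.0.0.0 DOMAIN.COMPANY.com\" >> /etc/hosts\n",
        ""
      ] }
    ] ]
  }
}
Note that the two branches of Fn::If are plain values; "Env::AwsQaUs" is not something Ref can resolve.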

Can you connect a SqlActivity to a JdbcDatabase in Amazon Data Pipeline?

Using Amazon Data Pipeline, I'm trying to use a SqlActivity to execute some SQL on a non-Redshift data store (SnowflakeDB, for the curious). It seems like it should be possible to do that with a SqlActivity that uses a JdbcDatabase. My first warning was when the WYSIWYG editor on Amazon didn't even let me try to create a JdbcDatabase, but I plowed on anyway and just wrote and uploaded a JSON definition by hand (here's the relevant bit):
{
  "id" : "ExportToSnowflake",
  "name" : "ExportToSnowflake",
  "type" : "SqlActivity",
  "schedule" : { "ref" : "DefaultSchedule" },
  "database" : { "ref" : "SnowflakeDatabase" },
  "dependsOn" : { "ref" : "ImportTickets" },
  "script" : "COPY INTO ZENDESK_TICKETS_INCREMENTAL_PLAYGROUND FROM #zendesk_incremental_stage"
},
{
  "id" : "SnowflakeDatabase",
  "name" : "SnowflakeDatabase",
  "type" : "JdbcDatabase",
  "jdbcDriverClass" : "com.snowflake.client.jdbc.SnowflakeDriver",
  "username" : "redacted",
  "connectionString" : "jdbc:snowflake://redacted.snowflakecomputing.com:8080/?account=redacted&db=redacted&schema=PUBLIC&ssl=on",
  "*password" : "redacted"
}
When I upload this into the designer, it refuses to activate, giving me this error message:
ERROR: 'database' values must be of type 'RedshiftDatabase'. Found values of type 'JdbcDatabase'
The rest of the pipeline definition works fine without any errors. I've confirmed that it activates and runs to success if I simply leave this step out.
I am unable to find a single mention on the entire Internet of someone actually using a JdbcDatabase from Data Pipeline. Does it just plain not work? Why is it even mentioned in the documentation if there's no way to actually use it? Or am I missing something? I'd love to know if this is a futile exercise before I blow more of the client's money trying to figure out what's going on.
In your JdbcDatabase you need to have the following property:
"jdbcDriverJarUri" : "[S3 path to the driver JAR file]"
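Applied to the definition above, the database object would look something like this (the S3 path is a placeholder; point it at wherever you uploaded the Snowflake JDBC driver JAR):
{
  "id" : "SnowflakeDatabase",
  "name" : "SnowflakeDatabase",
  "type" : "JdbcDatabase",
  "jdbcDriverClass" : "com.snowflake.client.jdbc.SnowflakeDriver",
  "jdbcDriverJarUri" : "s3://your-bucket/drivers/snowflake-jdbc.jar",
  "username" : "redacted",
  "connectionString" : "jdbc:snowflake://redacted.snowflakecomputing.com:8080/?account=redacted&db=redacted&schema=PUBLIC&ssl=on",
  "*password" : "redacted"
}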