I'm trying to schedule a task on ECS (running on Fargate), via CDK. When it runs, my task uses the LATEST platform version (which is currently 1.3.0, despite 1.4.0 having been released). I'd like to use version 1.4.0, but I can't see a way to specify that in the CDK objects - it only seems to be supported when creating a FargateService, whereas I'm using a FargateTaskDefinition. Does anyone know if this is possible? (It's possible via the direct API, and via the console).
You have to upgrade your aws-cdk first.
npm install -g aws-cdk
Then upgrade the dependencies in your package.json:
npx npm-check -u
The latest "@aws-cdk/aws-ecs" (1.72.0) lets you specify the platform version on the ECS task target:
import { EcsTask } from '@aws-cdk/aws-events-targets';
import { FargatePlatformVersion } from '@aws-cdk/aws-ecs';

new EcsTask({
  ...
  platformVersion: FargatePlatformVersion.VERSION1_4,
})
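For a scheduled task specifically, a rough sketch of the full wiring could look like the following (this assumes CDK v1-style @aws-cdk packages at 1.72 or later; 'cluster' and 'taskDefinition' are placeholder names for constructs assumed to exist elsewhere in your stack):

import * as cdk from '@aws-cdk/core';
import * as events from '@aws-cdk/aws-events';
import * as targets from '@aws-cdk/aws-events-targets';
import * as ecs from '@aws-cdk/aws-ecs';

// 'cluster' is assumed to be an existing ecs.Cluster and
// 'taskDefinition' an existing ecs.FargateTaskDefinition.
const rule = new events.Rule(this, 'ScheduledTaskRule', {
  schedule: events.Schedule.rate(cdk.Duration.hours(1)),
});

rule.addTarget(new targets.EcsTask({
  cluster,
  taskDefinition,
  platformVersion: ecs.FargatePlatformVersion.VERSION1_4, // pin 1.4.0 instead of LATEST
}));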
I am trying to deploy a service to AWS using Serverless. I am deploying it through GitLab CI/CD instead of doing it locally. Initially my Serverless version was the latest (I had not pinned any specific version), but when I pushed my code to GitLab I got a few errors in the pipeline because the latest version is not stable, so I had to change to a stable version. Now when I push my code changes to GitLab, my deployment fails and I get
Serverless Error ----------------------------------------
Cannot run local installation of the Serverless Framework by the outdated global version. Please upgrade via:
npm install -g serverless
Note: Latest release can run any version of the locally installed Serverless Framework.
I don't want to upgrade my Serverless version.
In my gitlab-ci.yml I have changed
- npm install -g serverless
to this
- npm install -g serverless@2.69.1
Is there any way I can fix this?
Any help would be appreciated, thank you.
In your case, the most likely reason is that you have a local installation of Serverless, or some of your plugins/other dependencies have Serverless v3 in their peer dependencies and install it by default on npm@7 and higher.
To resolve it, either remove the local installation or pin the version of the locally installed Serverless (in the devDependencies of your project's package.json).
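For instance, a minimal sketch of that pin in package.json (2.69.1 is just the version from the question; use whichever v2 release you are standardizing on):

{
  "devDependencies": {
    "serverless": "2.69.1"
  }
}

With that in place, npm install in CI pins the local copy to that exact version.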
./node_modules/.bin/sls deploy does the trick.
However, the proper answer is in the docs:
There are 2 scenarios:
Using v3 globally, and v2 in specific projects.
This is the simplest. Upgrade the global version to v3, and install v2 in specific projects (via NPM). The serverless command will automatically run the correct version (v3 can run v2).
Using v2 globally, and v3 in specific projects.
To achieve that, install v3 in specific projects (via NPM). Then, use serverless for v2 projects, and npx serverless for v3 projects.
https://www.serverless.com/framework/docs/guides/upgrading-v3#using-v2-and-v3-in-different-projects
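In command form, the two scenarios from those docs look roughly like this (a sketch; the major versions shown are illustrative):

# Scenario 1: v3 globally, v2 in a specific project
npm install -g serverless             # global v3
npm install --save-dev serverless@2   # run inside the v2 project
serverless deploy                     # global v3 detects and runs the local v2

# Scenario 2: v2 globally, v3 in a specific project
npm install --save-dev serverless@3   # run inside the v3 project
serverless deploy                     # in v2 projects, uses the global binary
npx serverless deploy                 # in the v3 project, runs the local v3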
I have issues with CDK while trying to bundle Lambdas with esbuild, working in WSL2 Debian.
esbuild is installed as a global npm package and also in the devDependencies of my CDK project.
node --version
v14.16.0
cdk --version
1.95.1
esbuild --version
0.11.2
Example of a Lambda definition:
from aws_cdk import aws_lambda as _lambda
from aws_cdk import aws_lambda_nodejs as _lambda_node

lex_create_bot = _lambda_node.NodejsFunction(
    self,
    id="lambda-lex-create-bot",
    entry="lambdas_fns/lex_create_bot/lex-create-bot.ts",
    handler="handler",
    runtime=_lambda.Runtime.NODEJS_14_X,
    bundling={"minify": True},
)
Every time I try to deploy or check a diff, CDK tries to bundle the Lambdas with Docker instead of esbuild.
I have worked on this stack for a while and everything was fine until I switched from Remote Containers to WSL2 to manage my dev environment in VS Code.
Docker is really slow for bundling and creates diffs for already-deployed Lambdas that have no code changes.
Any idea how to solve this?
EDIT
Same issue with Ubuntu-20.04 WSL2
I upgraded to CDK 1.97.0 and esbuild 0.11.5 this morning and everything is working well now.
Still a strange behavior that I want to avoid in the future; if anyone has a more generic solution to this problem ...
I wish to implement the answer that is outlined here: https://stackoverflow.com/a/50397276/1980516
However, I find that I keep running into Unable to import module 'index' at exactly this line:
const _archiver = require('archiver');
So, I'm guessing that I cannot do this via the online console. Instead, I probably have to create a deployment package.
How do I go about this? I apparently need AWS CLI, Node.js, npm and I'm new to all of it. In the Amazon docs I can't find a practical list of how to set up my local development environment.
What tools do I install, which versions and in what order exactly?
Edit: Windows :)
My guess is that you need to npm install archiver and package the node_modules dependencies along with your index.js (the handler file for your Lambda entry point). You can then zip it up and deploy/upload it to your Lambda.
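A rough sketch of that flow from a terminal (the folder and file names are placeholders; since you're on Windows, you can also build the zip with Explorer's "Send to > Compressed (zipped) folder" instead of the zip command):

cd my-lambda             # folder containing your index.js
npm init -y              # create a package.json if you don't have one
npm install archiver     # puts archiver into node_modules
zip -r function.zip index.js node_modules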
Also have a look at the https://github.com/serverless/serverless framework, which makes these kinds of things easier.
Have a look at AWS SAM, the Serverless Application Model. It provides a local development setup for things like Lambda functions and API Gateway endpoints, and a way to easily package and deploy things. The exact steps you need are:
Create an AWS account and an IAM user with admin privileges
Install node.js
Install the AWS CLI (and configure it with aws configure)
Install the SAM CLI and Docker (the local instances run in Docker containers)
Initialize a new SAM project with sam init --runtime nodejs (or another runtime if needed)
Run through the quickstart to get an idea of how to define a SAM template, build a SAM app, and deploy (a minimal template sketch follows this list).
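As a rough illustration, a minimal SAM template for a Node.js function could look like the following (the resource name, paths, and runtime are placeholders; pick a runtime your account supports):

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  HelloFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler   # index.js exporting a function named 'handler'
      Runtime: nodejs14.x
      CodeUri: ./src           # directory containing index.js and node_modules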
If you don't want to use the framework or local development environment and just want to create the source bundle, there are docs. The gist is:
Install Node.js (e.g. via the Windows installer, or Homebrew on a Mac)
npm install the modules you need
Zip up your code including the node_modules folder
Upload the zip via the AWS Console
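If you would rather use the AWS CLI you installed than the console for that last step, the equivalent upload is roughly (the function name is a placeholder for one that already exists):

aws lambda update-function-code --function-name my-function --zip-file fileb://function.zip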
I have been trying to install Spark on the latest EMR (5.13.x) cluster via bootstrapping, using the following with Terraform, but have not been successful. Is there any ready-to-use bootstrap script for the latest Spark/EMR version, or another solution using Terraform?
bootstrap_action = {
  path = "s3://support.elasticmapreduce/spark/install-spark"
  name = "install-spark"
  args = ["instance.isMaster=true", "echo running on master node"]
}
That install-spark bootstrap action hasn't worked since before Spark was officially supported as an application on AMI version 3.9.0 about three years ago. Also, bootstrap actions built for AMI version 3.x and earlier do not work at all with release labels emr-4.x and emr-5.x+.
Instead, to install Spark on emr-4.x or emr-5.x, you simply include "Spark" in the list of Applications of the RunJobFlowRequest.
I have not used Terraform to create an EMR cluster, but the example I found at https://www.terraform.io/docs/providers/aws/r/emr_cluster.html shows exactly how to create a cluster with Spark.
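To make that concrete, a minimal sketch of the relevant parts of an aws_emr_cluster resource (the IAM role and instance profile references are placeholders for resources you would define yourself, and other required arguments are omitted):

resource "aws_emr_cluster" "spark" {
  name          = "spark-cluster"
  release_label = "emr-5.13.0"   # a release label, not a 3.x AMI version
  applications  = ["Spark"]      # EMR installs Spark itself; no bootstrap action needed

  master_instance_type = "m4.large"
  core_instance_type   = "m4.large"
  core_instance_count  = 1

  ec2_attributes {
    instance_profile = aws_iam_instance_profile.emr_profile.arn  # placeholder
  }

  service_role = aws_iam_role.emr_service_role.arn  # placeholder
}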
I am using Cloud Composer and I noticed that it selects the version of Apache Airflow and Python (2.7.x) for me. I want to use a different version of Airflow and/or Python. How can I change this?
Cloud Composer deploys the latest stable build of Airflow. New versions of Airflow are usually deployed by Composer within a few weeks of their stable release. The Airflow version deployed and the Python version installed cannot be changed at this time. A future release of Cloud Composer may offer the ability to select the Airflow and/or Python version for new environments.
If you want to deploy a specific version of Airflow you will need to use the gcloud CLI tool in order to specify this. It is not currently possible to do this from the web front end.
Have a look at the following page to see the available versions: https://cloud.google.com/composer/docs/concepts/versioning/composer-versions
If you would like to deploy, say, Airflow 1.10 and Python 3 to your environment, you would use the
--image-version
--python-version
flags to set this. For example, the following would create an environment with Composer 1.4.1, Airflow 1.10 and Python 3:
gcloud beta composer environments create ENV_NAME --image-version composer-1.4.1-airflow-1.10.0 --python-version 3
You will need to specify all the other parameters and arguments required for the environment as well. The above only shows the two arguments to set the Airflow and Python versions.