I want to execute some commands on ECS container instances before the tasks start, such as installing something on the EC2 instances. I am using the ECS CLI; is there any way to achieve this?
Not sure exactly where and when you want to run the commands, but you can use ecs-cli up, which has this option:
--extra-user-data string - Specifies additional user data for your container instance. Files can be shell scripts or cloud-init directives. They are packaged into a MIME multipart archive along with user data provided by the Amazon ECS CLI that directs instances to join your cluster. For more information, see Specifying User Data.
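As a minimal sketch of that option (the cluster name, key pair, instance settings, and the contents of bootstrap.sh are all placeholders):

```bash
# bootstrap.sh is an assumed file containing whatever should run before
# tasks are placed on the instance, e.g. installing a package or agent.
cat > bootstrap.sh <<'EOF'
#!/bin/bash
yum install -y htop    # example package; replace with your own setup
EOF

# ecs-cli merges bootstrap.sh into the instance user data alongside the
# script that joins the instance to the cluster.
ecs-cli up \
  --cluster my-cluster \
  --keypair my-keypair \
  --capability-iam \
  --size 2 \
  --instance-type t3.medium \
  --extra-user-data bootstrap.sh
```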
I would like to perform a domain join on any servers provisioned in my AWS environment.
For ASG and EKS, can a launch template be used to run custom scripts as soon as new nodes are launched?
Yes, with an ASG you can run custom scripts. You need to modify the user data in your launch template or launch configuration. You can retrieve scripts saved in S3 or pass commands or a script directly. If it's just a join to FreeIPA, for example, you can do it with a simple one-line command (don't forget to add #!/bin/bash at the beginning of the user data) instead of a full script.
More information can be found here and here.
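As a rough sketch of such user data (the domain, server, principal, and password are placeholders; in practice the credentials should come from a secrets store rather than being embedded in plain text):

```bash
#!/bin/bash
# User-data sketch for a launch template: join the instance to a FreeIPA
# domain on first boot. All names and credentials below are placeholders.
yum install -y ipa-client
ipa-client-install --unattended \
  --domain example.internal \
  --server ipa.example.internal \
  --principal admin \
  --password 'REPLACE_ME'
```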
Is there any way to install EXE/MSI agents on AWS EC2 instances in an automated way? Specifically, I am looking for a counterpart of Azure's Custom Script Extension. [Free of cost]
Scenario:
I want to install the BigFix and Datadog agents on 1000 EC2 instances. This is a one-time job, so I am not looking for any solution that involves Chef, Puppet, etc.
Yes, you can pass a script to the instance that will be executed on the first boot (but not thereafter). It is often referred to as a User Data script.
See:
Running Commands on Your Windows Instance at Launch - Amazon Elastic Compute Cloud
Running Commands on Your Linux Instance at Launch - Amazon Elastic Compute Cloud
If you wish to install after the instance has started, use the AWS Systems Manager Run Command.
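A sketch of that Run Command approach (the tag filter and MSI path are placeholders, and it assumes the SSM agent is running on the instances with an instance profile that allows SSM):

```bash
# One-time, fleet-wide agent install via SSM Run Command, targeting
# instances by tag instead of listing 1000 instance IDs.
aws ssm send-command \
  --document-name "AWS-RunPowerShellScript" \
  --targets "Key=tag:InstallAgents,Values=true" \
  --comment "One-time BigFix/Datadog agent install" \
  --parameters '{"commands":["msiexec /i C:\\agents\\datadog-agent.msi /qn"]}'
```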
I have a release of my project. I build a Docker image and deploy it on an EC2 instance.
Later, when I have a new release, I would like to update the Docker container on the EC2 instance remotely (without accessing the machine, just by calling some service).
Is there a way to do this without ECS and Elastic Beanstalk?
If that's not possible, can I somehow re-run the cfn-init script?
My Research
https://aws.amazon.com/blogs/aws/new-ec2-run-command-remote-instance-management-at-scale/
You can manage your instances remotely (i.e. make changes without manually SSHing into the instance and typing commands) by using any of the many system management services out there. AWS offers Simple Systems Manager (SSM) of which the Run Command you linked is part. AWS also offers the OpsWorks service which uses Chef. You also have other products like Ansible and SaltStack, and you can optionally integrate the use of those services with the AWS SSM service.
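As a rough sketch of the Run Command route (the instance ID, image, and container names are placeholders; it assumes the SSM agent is installed and the instance profile allows SSM):

```bash
# Redeploy a new release on the instance without SSH: pull the new image,
# then replace the running container.
aws ssm send-command \
  --document-name "AWS-RunShellScript" \
  --instance-ids i-0123456789abcdef0 \
  --comment "Roll out new release" \
  --parameters '{"commands":[
    "docker pull myrepo/myapp:latest",
    "docker stop myapp || true",
    "docker rm myapp || true",
    "docker run -d --name myapp -p 80:8080 myrepo/myapp:latest"
  ]}'
```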
I would like to create a Managed Compute Environment for AWS Batch, but use EC2 User Data to configure the instances as they are brought into the ECS fleet that Batch is scheduling jobs onto.
It shouldn't matter, but the purpose of the User Data script is to pull down large data files onto an InstanceStore that the Docker containers will reference.
This is possible in ECS, but I have found no way to pass User Data to a Managed Batch Compute Environment.
At most, I can specify the AMI. But since we're going with Managed, we must use the Amazon ECS-optimized AMI.
I'd prefer to use EC2 User Data as the solution, as it gives an entry point for any other bootstrapping we wish to perform. But I'm open to other hacks or solutions, so long as they are applicable to a Managed Compute Environment.
You can create an AMI based on the AWS provided AMI, and customize it. It will still be managed since the Batch and/or ECS daemon is running on it.
As a side note, I'm trying to do the same thing but have had no luck so far. I may end up creating a custom AMI and including the configure script in the AMI itself in /etc/rc.local. Not ideal, but I don't think Batch can pass a user data script other than what it needs. I am still looking into this.
You can create a launch template containing your user data, then assign this launch template to your compute environment. Keep in mind that you might have to clean the cloud-init directory in your AMI, since it was probably already spun up once (at AMI creation).
Launch template userguide
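A rough sketch of that approach (the bucket name, data path, and template name are placeholders; ECS-optimized AMIs expect launch-template user data in MIME multipart format, and base64 -w0 is the GNU coreutils form):

```bash
# User data that pulls large data files onto the instance before Batch/ECS
# schedules jobs, wrapped in the MIME multipart envelope ECS expects.
cat > userdata.txt <<'EOF'
MIME-Version: 1.0
Content-Type: multipart/mixed; boundary="==BOUNDARY=="

--==BOUNDARY==
Content-Type: text/x-shellscript; charset="us-ascii"

#!/bin/bash
mkdir -p /data
aws s3 sync s3://my-large-data-bucket/ /data/
--==BOUNDARY==--
EOF

# Create the launch template; the user data must be base64-encoded here.
aws ec2 create-launch-template \
  --launch-template-name batch-userdata \
  --launch-template-data "{\"UserData\":\"$(base64 -w0 userdata.txt)\"}"
```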
We are trying to automate running multiple PowerShell scripts on an AWS EC2 instance. Any suggestions on how this can be done? The execution has to be sequential.
To configure state on a fresh EC2 instance, you can use 'User Data' to:
a. Fully provision and configure state using PowerShell; see:
http://docs.aws.amazon.com/AWSEC2/latest/WindowsGuide/ec2-instance-metadata.html#instancedata-add-user-data
b. Install an agent for a configuration management tool such as Puppet, Chef, or Ansible (as mentioned in the comment above). An example of provisioning stock Windows AMIs can be followed here: http://blog.rolpdog.com/2015/09/manage-stock-windows-amis-with-ansible.html
If you wish to trigger scripts on an existing Amazon EC2 instance (or a whole fleet of instances), also consider Running PowerShell Commands or Scripts with Amazon EC2 Run Command.
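As a sketch of option (a), the PowerShell steps can be baked into user data so they run in order at first boot (the AMI ID, key name, and script paths are placeholders; the <powershell> tags tell the Windows launch agent to run the block with PowerShell):

```bash
# Write the Windows user data: the scripts inside the <powershell> block
# execute sequentially at first boot.
cat > userdata.txt <<'EOF'
<powershell>
& C:\scripts\step1.ps1
& C:\scripts\step2.ps1
& C:\scripts\step3.ps1
</powershell>
EOF

# Launch the instance with that user data; the CLI encodes the file for you.
aws ec2 run-instances \
  --image-id ami-0123456789abcdef0 \
  --instance-type t3.medium \
  --key-name my-key \
  --user-data file://userdata.txt
```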