I created one ELB and attached a few instances to it. When I log in to one of these instances, I would like to type a command or run a Node.js script that returns the name of the ELB it is attached to. Is that possible? I know I can look it up in the AWS console, but I'm looking for a way to look it up programmatically. If possible, I would like to see how it is done with the AWS Node.js SDK.
You do not run Node.js on the ELB itself. The ELB is a proxy that load-balances client requests to the app servers where you run Node.js.
You could use the AWS command line tools (http://aws.amazon.com/cli/):
aws elb describe-load-balancers
Parse the JSON output for the instance ID (which you can get using this answer: Find out the instance id from within an ec2 machine) and look for whichever ELB it's attached to.
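If you would rather do the same lookup from a script, here is a minimal sketch using boto3 (the question asked about the Node.js SDK; the equivalent describeLoadBalancers call exists there too, and the metadata URL is the same):

import boto3
import urllib.request

# The instance ID comes from the EC2 instance metadata service
# (plain IMDSv1 call here; newer instances may require an IMDSv2 token first).
instance_id = urllib.request.urlopen(
    "http://169.254.169.254/latest/meta-data/instance-id", timeout=2
).read().decode()

elb = boto3.client("elb")  # classic ELB; use "elbv2" for ALB/NLB

# Print the name of every load balancer that lists this instance.
for lb in elb.describe_load_balancers()["LoadBalancerDescriptions"]:
    if any(i["InstanceId"] == instance_id for i in lb.get("Instances", [])):
        print(lb["LoadBalancerName"])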
Related
In my development environment, we constantly create and destroy applications for testing. I would like all Application Load Balancers (ALB) to be placed behind our single AWS Web Application Firewall (WAF) daily through a Lambda script. I want a trigger to run a script that finds all ALBs and then assigns them to the WAF. I can't seem to figure out which boto3 client I should be using to execute this in Python 3.8. Any guidance? Thanks
I suggest using CloudTrail events: with a Lambda function, look for the ELB creation events and connect the new load balancers to the WAF.
And I assume you are using SAM or the Serverless Framework for the Lambda deployment.
Some useful links:
https://docs.aws.amazon.com/elasticloadbalancing/latest/application/load-balancer-cloudtrail-logs.html
https://docs.aws.amazon.com/lambda/latest/dg/with-cloudtrail-example.html
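Whichever trigger you use, the boto3 clients you need are elbv2 (to list the Application Load Balancers) and wafv2 (to attach each one to your existing Web ACL). A rough sketch of the Lambda handler, with the Web ACL ARN as a placeholder:

import boto3

elbv2 = boto3.client("elbv2")
wafv2 = boto3.client("wafv2")

# Placeholder: the ARN of your existing regional Web ACL.
WEB_ACL_ARN = "arn:aws:wafv2:us-east-1:123456789012:regional/webacl/my-web-acl/example-id"

def handler(event, context):
    # List every load balancer in the region and keep only ALBs.
    paginator = elbv2.get_paginator("describe_load_balancers")
    for page in paginator.paginate():
        for lb in page["LoadBalancers"]:
            if lb["Type"] == "application":
                # Associate the ALB with the Web ACL.
                wafv2.associate_web_acl(
                    WebACLArn=WEB_ACL_ARN,
                    ResourceArn=lb["LoadBalancerArn"],
                )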
I set up my EC2 instance and got HTTPS working for a bit, only to realize I need TLS 1.2 by default, and to do that I had to configure my code to read my cert file directly. The problem is I don't know which file it is, as there are 269 files in the directory /etc/ssl/certs. I have googled for a couple of hours hoping something would tell me where to look for the file Amazon generated for me. Otherwise I'm shooting in the dark, trying PEMs one at a time.
// Serve the app over HTTPS, restricted to TLS 1.2
https.createServer({
  secureProtocol: 'TLSv1_2_server_method',
  pfx: fs.readFileSync("/etc/ssl/certs/FILENAME.PEM")
}, app).listen(443);
Help is greatly appreciated.
Please refer to the EC2 instance details in the AWS management console.
Steps:
Log in to the AWS management console and go to EC2 -> Instances.
Select the instance to which we need to connect and scroll through the description in the bottom pane, which shows the EC2 instance details.
Check for "Key pair name"; this is the key pair that needs to be used to securely connect to the respective EC2 instance.
I assume that you got a certificate from Amazon ACM.
ACM certificates can be used with:
Elastic Load Balancing
Amazon CloudFront
Amazon API Gateway
AWS Elastic Beanstalk
AWS CloudFormation (for email validation only)
The certificate issued by ACM cannot be installed directly on an EC2 instance.
If you want to install an SSL certificate directly on your EC2 instance, you will need to obtain an SSL certificate through a third party.
Therefore, you cannot find any files related to the certificate issued by ACM inside your EC2 instance.
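As a quick way to confirm that the certificate lives in ACM rather than on the instance's filesystem, a small boto3 sketch (region assumed) can list the certificates and where they are in use:

import boto3

acm = boto3.client("acm", region_name="us-east-1")  # region where the cert was requested

# ACM certificates are API objects, not files on the instance.
for cert in acm.list_certificates()["CertificateSummaryList"]:
    detail = acm.describe_certificate(CertificateArn=cert["CertificateArn"])
    print(cert["DomainName"], detail["Certificate"]["InUseBy"])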
Hope this helps.
I am new to AWS ECS. I am developing two services in Java Spring Boot, Service 1 and Service 2. I have created two ECS services with one task each, in the same cluster.
I can see that there is a "Service Discovery Endpoint" Service2.local and a "Service discovery name" Service2. I can also see SRV and Type A records in Route 53 for Service 2. I do not know how to call Service 2 from Service 1. Before trying from Spring Boot, I tried the following curl command to get the status from Service 2.
curl service2.local/status
I get the error could not resolve host: service2.local. I want to understand how to use the service discovery endpoint or name correctly.
Edit:
I have tried to execute the following command, but it returns nothing.
dig +short service2.local
If you have the entry in your hosted zone, and you can curl using the IP posted inside your hosted zone (so the ports are correct, the security groups work, and the app is up), then:
Check that your VPC has both DNS hostnames and DNS resolution enabled, otherwise AWS will not resolve the DNS name correctly. (NB it can take a while for it to come online; go brew a cuppa while you wait.)
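If you want to check (or flip) those two VPC attributes programmatically rather than in the console, a minimal boto3 sketch (the VPC ID is a placeholder) could look like this:

import boto3

ec2 = boto3.client("ec2")
VPC_ID = "vpc-0123456789abcdef0"  # placeholder: the VPC your ECS tasks run in

# Both attributes must be enabled for the Route 53 private zone (service2.local) to resolve.
support = ec2.describe_vpc_attribute(VpcId=VPC_ID, Attribute="enableDnsSupport")
hostnames = ec2.describe_vpc_attribute(VpcId=VPC_ID, Attribute="enableDnsHostnames")
print("enableDnsSupport:", support["EnableDnsSupport"]["Value"])
print("enableDnsHostnames:", hostnames["EnableDnsHostnames"]["Value"])

# Turn them on if either is false (one attribute per call).
ec2.modify_vpc_attribute(VpcId=VPC_ID, EnableDnsSupport={"Value": True})
ec2.modify_vpc_attribute(VpcId=VPC_ID, EnableDnsHostnames={"Value": True})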
I am trying to run my Spring Boot API on AWS; however, when I try to connect to the endpoint I get the error "Site cannot be reached, IP refused to connect". This is my first time working with AWS.
I created a Linux instance and connected to it using FileZilla. Afterwards I added my jar to a folder that I created on the Linux instance using FileZilla. I started the Spring Boot project and it is running, but the problem is that I can't seem to connect to the endpoints. Am I missing something? How do I connect to my endpoints?
The other thing to note is that I enabled HTTPS on my API and also added Swagger.
You need to open the relevant ports in the instance's security group.
Look at this to create a new inbound rule for the specific port.
You can go to the AWS console (here I am assuming you have deployed to us-east-1; if it is something else, go to the relevant region).
Open up the relevant security group, and then click Edit inbound rules.
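Equivalently, the rule can be added programmatically; a rough boto3 sketch (the security group ID is a placeholder, and port 443 is assumed since HTTPS is enabled):

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # use your instance's region

# Placeholder: the security group attached to your EC2 instance.
SG_ID = "sg-0123456789abcdef0"

# Open the port your Spring Boot app listens on (443 here).
ec2.authorize_security_group_ingress(
    GroupId=SG_ID,
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 443,
        "ToPort": 443,
        "IpRanges": [{"CidrIp": "0.0.0.0/0", "Description": "HTTPS from anywhere"}],
    }],
)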
I want to run a Python script (which I have run on a Docker Ubuntu installation) on AWS. It sends data from Twitter to Elasticsearch. I want to run it on Amazon Elasticsearch Service. I have set up Amazon Elasticsearch Service on AWS, but I don't know how to get the script into the system and get it running.
What would the SSH command be to access the Elasticsearch server?
Once I am able to access it, where would I place a Python script in order to feed data into the Elasticsearch server?
I tried
PS C:\Users\hpBill> ssh root#search-wbcelastic-*******.us-east-1.es.amazonaws.com/
but just get this:
ssh.exe": search-wbcelastic-**********.us-east-1.es.amazonaws.com/: no address associated with name
I have this information:
Domain status: Active
Endpoint: search-wbcelastic-*********.us-east-1.es.amazonaws.com
Domain ARN: arn:aws:es:us-e******1:domain/wbcelastic
Kibana: search-wbcelastic-********.us-east-1.es.amazonaws.com/_plugin/kibana/
You cannot SSH directly into Amazon Elasticsearch Service, so that SSH command will never work. You have two options for running the Python script: launch an EC2 instance and run it there with the AWS CLI, or store and run the script from your local machine with the AWS CLI. Here is the developer guide for the command-line tools:
http://docs.aws.amazon.com/cloudsearch/latest/developerguide/using-cloudsearch-command-line-tools.html
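If the domain's access policy allows your IP, the script can simply POST documents to the domain's HTTPS endpoint. A minimal sketch (the endpoint placeholder, index name, and document fields are all assumptions; a restricted access policy would additionally require SigV4-signed requests):

import json
import requests

# Placeholder: use the "Endpoint" value shown for your domain.
ES_ENDPOINT = "https://search-wbcelastic-xxxx.us-east-1.es.amazonaws.com"

# Hypothetical tweet document.
doc = {"user": "someone", "text": "a tweet", "created_at": "2021-01-01T00:00:00Z"}

# Index the document into a "tweets" index (the URL path may vary by Elasticsearch version).
resp = requests.post(
    f"{ES_ENDPOINT}/tweets/_doc",
    data=json.dumps(doc),
    headers={"Content-Type": "application/json"},
)
print(resp.status_code, resp.text)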