We'd like to collect IIS logs through Logstash and display them in Kibana based on this doc. Any advice would be appreciated.
Kibana alone is just a UI. What you need is an ELK stack.
You need Elasticsearch, Kibana and Logstash.
Logstash is the component that ingests the IIS logs and forwards them to Elasticsearch, which makes them visible in Kibana.
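Roughly, a Logstash pipeline for this could look like the sketch below. This is a minimal, untested example that assumes the default IIS W3C log location and field set, plus an Elasticsearch node on localhost:9200; adjust the path and the grok pattern to the fields your IIS site actually logs.

input {
  # Default location of IIS W3C logs; Logstash on Windows needs forward slashes.
  file {
    path => "C:/inetpub/logs/LogFiles/W3SVC*/u_ex*.log"
    start_position => "beginning"
  }
}

filter {
  # IIS writes comment header lines starting with '#' at the top of each file.
  if [message] =~ "^#" {
    drop {}
  }
  # Matches the default W3C fields: date time s-ip cs-method cs-uri-stem
  # cs-uri-query s-port cs-username c-ip cs(User-Agent) cs(Referer)
  # sc-status sc-substatus sc-win32-status time-taken
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{IPORHOST:s_ip} %{WORD:method} %{URIPATH:uri_stem} %{NOTSPACE:uri_query} %{NUMBER:s_port} %{NOTSPACE:username} %{IPORHOST:c_ip} %{NOTSPACE:user_agent} %{NOTSPACE:referer} %{NUMBER:status} %{NUMBER:substatus} %{NUMBER:win32_status} %{NUMBER:time_taken}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "iis-logs-%{+YYYY.MM.dd}"
  }
}

Once documents arrive in the iis-logs-* indices, point a Kibana index pattern at them to search and visualize.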
I am planning to deploy ELK to monitor my application running in AWS. My apps use AWS X-Ray for trace data. I am reading the Elastic APM docs to see how to ingest AWS X-Ray data into Elasticsearch, but I can't find a solution.
I have read the agent doc https://www.elastic.co/guide/en/apm/agent/nodejs/3.x/intro.html, but X-Ray is not listed as a supported framework. Does this mean I need to build an X-Ray agent and send the trace data to the APM server myself? Or is there an easier way to do that?
We are in the process of migrating from ActiveMQ to AmazonMQ on AWS. ActiveMQ and AmazonMQ internally use KahaDB as their data store.
Earlier we were able to see the KahaDB log files while running ActiveMQ in our data center. Is there a similar way of seeing the KahaDB log files on AWS while using AmazonMQ?
We tried enabling CloudWatch logs, but they only contain the general and audit logs of AmazonMQ.
I checked with the AWS technical team; they don't allow access to the KahaDB logs.
We'd like to use the ELK service provided by the Swisscom cloud. Because the applications we want to log are not hosted with Swisscom but externally, we'd like to connect to the ELK service from outside. Is this possible at all? Or is the ELK service only available to apps hosted in the Swisscom cloud?
When you create and bind the ELK service, you will receive a connection string and credentials like this:
$ cf env $APP
Getting env variables for app $APP in org $ORG / space $SPACE as $USER...
OK
System-Provided:
{
  "VCAP_SERVICES": {
    "elk": [
      {
        "credentials": {
          "elasticSearchHost": "9zz2ulprvgzlepa5.service.consul",
          "elasticSearchPassword": "$PASSWORD",
          "elasticSearchPort": 48783,
          "elasticSearchUsername": "$USERNAME",
          "kibanaPassword": "$PASSWORD",
          "kibanaUrl": "http://xjcv9zh0jer2s44q.service.consul:59664",
          "kibanaUsername": "$USERNAME",
          "logstashHost": "gew5qn71sxcz49gd.service.consul",
          "logstashPort": 46611,
          "syslog": "syslog://uew5qn71sxcz49gd.service.consul:46611"
        },
        "label": "elk",
        "name": "example-so",
        "plan": "small",
        "provider": null,
        "syslog_drain_url": "syslog://gew5qn71sxcz49gd.service.consul:46611",
        "tags": []
      }
    ]
  }
}
You can't reach the *.service.consul addresses from outside (that DNS is only resolvable inside the Swisscom Cloud). You can only reach those addresses from your app (running in a Cloud Foundry container).
There is a workaround, but I recommend it only for development purposes. You can create a tunnel from your local desktop to the Elasticsearch or Kibana web interface.
See Administrating Service Instances with Service Connector. This is a CF CLI plugin developed by Swisscom.
After creating a service instance, you'll eventually need to administrate the service. For example you might need to create data tables in a database or backup/restore your data. For these use cases, we created the Cloud Foundry CLI Plugin Service Connector, which is a local proxy app through which you can connect to your service instances using your preferred locally installed tools.
Example for the Kibana web interface:
cf service-connector 80 xjcv9zh0jer2s44q.service.consul:59664
You can also access Elasticsearch from your desktop this way and use its API to insert or query documents.
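For example, first open a tunnel to Elasticsearch using the host and port from the credentials above (the local port 9200 is an arbitrary choice), then talk to the REST API through it; the index pattern in the second query is just an assumption:

cf service-connector 9200 9zz2ulprvgzlepa5.service.consul:48783

# In a second terminal, through the tunnel:
curl -u "$USERNAME:$PASSWORD" "http://localhost:9200/_cat/indices?v"
curl -u "$USERNAME:$PASSWORD" "http://localhost:9200/logstash-*/_search?q=status:500"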
The ELK stack has three components:
Elasticsearch - storage and indexing
Logstash - receives and processes log messages (e.g. syslog, JSON, plain text)
Kibana - web UI to search and visualize
As written by @Fydor, you cannot reach the ELK service endpoints from outside. This is also an issue if you want to access the logs of your CF-hosted apps: you do not always want to have to use Swisscom's Service Connector to access Kibana.
Thus you would normally deploy a small proxy application; Swisscom provides a sample for this.
Alternatively there is the possibility to use a proxy app like the Swisscom Kibana Proxy to make your Kibana dashboard publicly available.
As Elasticsearch exposes a REST interface, you can use the same approach to publish the Elasticsearch endpoint. You should also take the chance to build some security measures into the proxy app. A sketch of how such a proxy could be pushed follows below.
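As a rough sketch (app name and memory are placeholders, not Swisscom's actual sample), a Cloud Foundry manifest for such a proxy might simply bind the ELK service instance so the proxy can read the Kibana URL and credentials from VCAP_SERVICES:

# Hypothetical manifest.yml for a Kibana proxy app.
applications:
- name: kibana-proxy
  memory: 128M
  services:
  - example-so   # the ELK service instance from the output above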
There are already many logging frameworks which directly support forwarding to Elasticsearch. If you need to integrate existing logging sources (like syslog, text logs, ...), you might want to use Logstash.
As Cloud Foundry currently supports publishing only HTTP and HTTPS endpoints, you cannot use Swisscom's provided Logstash instance for that; you must deploy your own instance and configure it to use your published Elasticsearch endpoint.
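The output section of such a self-hosted Logstash could look roughly like this, assuming a hypothetical HTTPS route for the published Elasticsearch endpoint and credentials supplied via environment variables:

output {
  elasticsearch {
    # Placeholder route; replace with the endpoint your proxy publishes.
    hosts => ["https://my-elasticsearch-proxy.scapp.io:443"]
    user => "${ES_USERNAME}"
    password => "${ES_PASSWORD}"
  }
}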
I want to host my app on a VPS/VPC and am currently leaning towards AWS EC2. Looking at the console, I see a bunch of services on offer, like CloudSearch (a managed search service) and SES (an email sending service).
Considering that I have already written code that does these things (at least the search) and works locally, should I still use these services? If so, why and how?
You do not need to use these services, but note that there are limits on sending email directly from EC2 instances (see http://aws.amazon.com/ec2/faqs/#general and search for "Are there any limitations in sending email from EC2 instances?").
If you intend to send a large volume of email, you will want to use SES.
For setting up web hosting on an EC2 instance you can use EasyEngine; refer to the link below:
http://docs.rtcamp.com/easyengine/install/aws.html
If you intend to send emails from your website, you can use either Amazon SES or any other mailing service.
To send email using Amazon SES, follow the steps below:
Step 1) Verify the email address from which you need to send emails.
Step 2) Use the credentials you get from step 1 in your web application to send email to your users.
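For illustration, the two steps with the AWS CLI might look like this (addresses and region are placeholders):

# Step 1: verify the sender address; SES emails a confirmation link to it.
aws ses verify-email-identity --email-address sender@example.com --region us-east-1

# Step 2: once verified, send a test message.
aws ses send-email \
  --from sender@example.com \
  --destination ToAddresses=recipient@example.com \
  --message 'Subject={Data=Test},Body={Text={Data=Hello from SES}}' \
  --region us-east-1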
I have a Java web application running on Tomcat, deployed on an EC2 instance. Is there any way I can monitor and set alarms for when the web application goes down or stops responding? Essentially, I would like to check whether an HTTP request to the web app responds with status 200; if it does not respond with 200 (a few times in a row), it should raise an alarm and send an email to some ops people.
I know there are third-party options like Nagios or UptimeRobot that I could use, but I wanted to know if there are any AWS offerings for this. Is it possible to set up such automated monitoring using Amazon CloudWatch? I could not find a way to do this based on what I read about CloudWatch. If this isn't the sort of thing CloudWatch can handle, is there another AWS service suited for this?
I think a health monitoring feature is available under AWS Elastic Beanstalk.
You can consider checking http://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.healthstatus.html
Ashutosh,
EC2 is an IaaS offering from AWS, so there is no built-in AWS feature that monitors your Tomcat server itself. There are custom-built solutions, but I think that is not what you are looking for here.
However, if you are using an Application Load Balancer or Elastic Beanstalk, you do get options to trigger alarms; a sketch for the load balancer case follows below.
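For example, with the instance behind an Application Load Balancer, a CloudWatch alarm on the target group's unhealthy-host count can notify an SNS topic that emails your ops team (all names and ARNs below are placeholders):

aws cloudwatch put-metric-alarm \
  --alarm-name tomcat-unhealthy-hosts \
  --namespace AWS/ApplicationELB \
  --metric-name UnHealthyHostCount \
  --dimensions Name=LoadBalancer,Value=app/my-alb/50dc6c495c0c9188 \
               Name=TargetGroup,Value=targetgroup/my-tg/73e2d6bc24d8a067 \
  --statistic Maximum --period 60 --evaluation-periods 3 \
  --threshold 1 --comparison-operator GreaterThanOrEqualToThreshold \
  --alarm-actions arn:aws:sns:us-east-1:123456789012:ops-alerts

The ALB health check itself does the HTTP probing: configure the target group's health check to expect status 200 on your app's URL.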
Yes, you can achieve it with CloudWatch: collect your logs with the CloudWatch agent and upload them to a CloudWatch log stream. Below is the reference URL for configuring the CloudWatch agent.
https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/Install-CloudWatch-Agent.html
After that, with "Create Metric Filter" you can set up an email trigger as per your requirements.
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/MonitoringPolicyExamples.html
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/Counting404Responses.html
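Roughly, the equivalent AWS CLI steps look like this (log group name, filter pattern, and SNS topic ARN are placeholders; the pattern must match your access-log format):

# Count non-200 responses in the Tomcat access-log group.
aws logs put-metric-filter \
  --log-group-name my-tomcat-access-logs \
  --filter-name non-200-responses \
  --filter-pattern '[ip, id, user, timestamp, request, status != 200, bytes]' \
  --metric-transformations \
      metricName=Non200Count,metricNamespace=WebApp,metricValue=1

# Alarm when several non-200s occur in a row; the SNS topic emails ops.
aws cloudwatch put-metric-alarm \
  --alarm-name webapp-non-200 \
  --namespace WebApp --metric-name Non200Count \
  --statistic Sum --period 300 --evaluation-periods 2 \
  --threshold 5 --comparison-operator GreaterThanOrEqualToThreshold \
  --alarm-actions arn:aws:sns:us-east-1:123456789012:ops-alerts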