ColdFusion Uptime Monitor Script

I need some help with creating a simple site status (uptime/downtime) monitor script in ColdFusion.
My guess is that it can be done using cfschedule, but I am not knowledgeable about it, so I would really appreciate any help.
Basically, I would like the script to check whether an application on my site (http://www.mysite.com/application) is accessible every 60 minutes. If the application is down during that 60-minute window, an e-mail should be sent to email@mysite.com.
Can anybody please help me with this? I am using ColdFusion 7.

Remember that checking your site/application with a script on the same server may not do much good: if the server or CF is down, your script will fail to run anyway.
Be that as it may, the easiest approach is to create a page in your application that returns something you can check - an XML packet, or simply the word "OK" if you like. In some cases you might run a DB query as well, since databases are at the top of the list of likely culprits when you have trouble. For example, you might do something like:
<cfsetting enablecfoutputonly="yes"/>
<cfquery name="checkQuery" datasource="myDSN">
SELECT getDate() AS myDate
</cfquery>
<cfoutput>OK</cfoutput>
And save the page as "test.cfm" in your application. You might do other things as well.
Then, in a CFM page that is NOT a part of your application - and preferably on a different server altogether - you would create a script that hits your test.cfm page and looks for return of "OK". Anything else would be a problem and you could log or send an email or whatever. That code might look like this.
<cfhttp
url="http://www.mysite.com/myapplication/test.cfm"
timeout="10">
</cfhttp>
<cfif trim(cfhttp.filecontent) IS NOT "OK">
    <!--- send an e-mail (cfmail), log, or take whatever action you want to handle the exception --->
</cfif>
Hope this helps :)
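If no second ColdFusion server is available to run the watchdog from, the same check can be sketched as a small shell script driven by cron on any other machine. This is only a sketch: the URL and alert address are the placeholders from the question, and the commented-out `mail` line assumes a configured mailer on that host.

```shell
#!/bin/sh
# Hourly uptime check, e.g. from cron:  0 * * * * /usr/local/bin/check_site.sh
# URL and alert address are placeholders from the question.
URL="http://www.mysite.com/myapplication/test.cfm"
ALERT="email@mysite.com"

# The health page is expected to return exactly "OK" (see test.cfm above).
is_healthy() {
    [ "$1" = "OK" ]
}

# Treat a failed or timed-out request the same as a bad response.
BODY=$(curl --silent --max-time 10 "$URL" || true)

if ! is_healthy "$BODY"; then
    MSG="Site check failed at $(date): got '$BODY'"
    echo "$MSG" >> /tmp/site_check.log
    # Send the alert; assumes a configured mailer on this host:
    # echo "$MSG" | mail -s "Site DOWN" "$ALERT"
fi
```

The point of `is_healthy` being a strict string comparison is that an error page, a timeout, or an empty body all count as "down".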

Related

How to share Newman htmlextra report?

This may be a basic question but I cannot figure out the answer. I have a simple postman collection that is run through newman
newman run testPostman.json -r htmlextra
That generates a nice dynamic HTML report of the test run.
How can I then share that with someone else, e.g. via email? The HTML report is just served from a local URL and I can't figure out how to save it so that it keeps its dynamic state. Right-clicking and Save As .html saves the file, but you lose the ability to click around in it.
I realize that I can change the export path so it saves to some shared drive somewhere, but aside from that is there any other way?
It has already been saved to newman/ in the current working directory, so there is no need to 'Save As' again. You can zip it and send it via email.
If you want to change the location of the generated report, check the reporter's export option.
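The packaging step can be sketched like this (plain tar is used here for portability; the directory name is htmlextra's default, so adjust it if you changed the export path):

```shell
# Pack the whole report directory; the HTML stays fully clickable once
# the recipient unpacks it, unlike a partial "Save As" copy of the page.
REPORT_DIR="newman"          # htmlextra writes here by default
ARCHIVE="report.tar.gz"

if [ -d "$REPORT_DIR" ]; then
    tar -czf "$ARCHIVE" "$REPORT_DIR"
    echo "wrote $ARCHIVE - attach it to an e-mail or drop it on a share"
else
    echo "no $REPORT_DIR directory - run newman first" >&2
fi
```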

Proc http with https url

So, I want to use the Google URL shortener API, and I try to use
proc http
So, when I run this code:
filename req "D:\input.txt";
filename resp "D:\output.txt";
proc http
    url="https://www.googleapis.com/urlshortener/v1/url"
    method="POST"
    in=req
    ct="application/JSON"
    out=resp;
run;
(where D:\input.txt looks like {"longUrl": "http://www.myurl.com"}) everything works great on my home SAS Base 9.3. But at work, on EG 4.3, I get:
NOTE: The SAS System stopped processing this step because of errors.
and it is not possible to debug. After googling, I found that I have to set a Java system option like this:
-jreoptions (-Djavax.net.ssl.trustStore=full-path-to-the-trust-store -Djavax.net.ssl.trustStorePassword=trustStorePassword)
But where can I get "the certificate of the service to be trusted" - and the password for it?
Edit: As noted in the comments below, my work SAS is installed on a server, so I don't have direct access to the configuration. Also, it isn't a good idea to change the server's config. So I googled some more and found a nice solution using cURL, without the X command (because it is blocked in my EG). The equivalent syntax is:
filename test pipe 'curl -X POST -d @D:\input.txt https://www.googleapis.com/urlshortener/v1/url --header "Content-Type:application/json"';
data _null_;
    infile test missover lrecl=32000;
    input;
    file resp;
    put _infile_;
run;
Hope it helps someone.
Where to get the certificate
Open the URL that you want the certificate for in Chrome. Click on the lock icon in the URL bar, click on the "Details" tab, and then click "Save as file" at the bottom right. You will need to know which trust store you are going to use at this stage - see the next step.
The password and trust store are defined by you. A trust store is in most cases nothing more than an encrypted zip file. There are a lot of tools out there that allow you to create a trust store, encrypt it, and import certificates into it. The choice will depend on which OS you are using. There are some Java-based tools that are OS independent, for example Portecle. It allows you to define various trust stores on different OSes, and you can administer them remotely.
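As a concrete sketch of that import step using the JDK's own keytool (the certificate file name, store name, alias, and password below are placeholders, not values from the question):

```shell
# Import the certificate saved from the browser into a Java trust store.
# google.cer, mytruststore.jks, the alias and the password are all
# placeholders - substitute your own values.
if [ -f google.cer ]; then
    keytool -importcert -noprompt \
        -alias googleapis \
        -file google.cer \
        -keystore mytruststore.jks \
        -storepass changeit
fi
```

The resulting store file and password are then exactly what -Djavax.net.ssl.trustStore and -Djavax.net.ssl.trustStorePassword expect.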

How can I write data about process assignees to database

I use Camunda 7.2.0 and I'm not very experienced with it. I'm trying to write data to a database about the users who have done something with a process instance (I'm using REST services), so that I can build reports later. The problem is that I don't know how to trigger my REST call (which sends information about the current user and assignee to the database) when a user assigns a task to somebody else or claims a task for himself. I see that the Camunda engine sends a request like:
link: engine/engine/default/task/5f965ab7-e74b-11e4-a710-0050568b5c8a/assignee
post: {"userId":"Tom"}
As a partial solution, I can think of creating a global variable "currentUser" and, on form load, checking whether the user differs from the current one; if he does, call the REST service and change the variable. But this solution doesn't look right to me. Is there a better way to do it? Thanks in advance.
You could use a task listener which updates your data when the assignee of a task is changed. If you want this behavior for every task, you could define a global task listener.

Lotus Notes throws NotesException: Database open failed (%1)

I am running a web service from an instance of Salesforce to our client's Lotus Notes server. I am able to get hard-coded content to return, so I feel certain that the connection itself is working as intended.
However, as noted in the subject, I am running into a NotesException. This is being thrown on the last line of the code below (db and path are simply parameters I pass into the function, I am able to view records from the nsf they correspond to):
s = WebServiceBase.getCurrentSession();
Database data = s.getDatabase(db, path);
data.open();
If I try not opening the database, I get an exception that says I need to open the database. We had been developing this web service for a while without ever opening the database or knowing that this was something to concern ourselves with. Obviously something changed, but as my office is full of Salesforce devs and not LN devs, we don't know what.
Any help in tracking down the root cause of this issue would be greatly appreciated.
Edit:
The comments asked what I meant by hard-coded content. The function returns a 2D String array, so it would be something like:
result[0][0] = "Hello World";
return result;
The problem is that the db is not found, so try changing the path to the db.
If your db is in a folder on the server, include that folder in the path you pass from Java.
Your path variable must have the right format.

Codeception - HTML report generation seems slow?

I am using Codeception to run three acceptance tests which basically are as follows:-
Check that the email address 'admin@admin.com' exists
Create a new user account
Login to the website
Obviously this requires the database, so I have added 'Db' to the list of modules in acceptance.suite.yml; however, the generation of the report takes some time. Is this normal, or is something wrong with my setup?
Below is the report (and time taken for each according to the html file it is generating)
check admin@admin.com account exists (AdminCept.php) (0.01s)
create new user account (CreateUserCept.php) (19.1s)
log in to the website (LoginCept.php) (21.72s)
Approx 40 seconds in total (although the command line states 1:02 - I guess because it also restores the mock database dump.sql into the database between tests)
Can anybody shed any light on the matter?
Not really an answer, but closing this off: simply put, the report generation takes time.