SAS auto log clear with rsubmit

I used to add dm "out;clear;log;clear;"; to clear the log and prevent the code from pausing for input. However, now I am using a WRDS remote connection. This line does not work after rsubmit, and I lost the connection to the server because I was not at the computer when the log filled up and waited for user input before it could be cleared. Is there a way to prevent the code from stopping? Here is what I am doing now.
options ls = 78 ps = 66;
******************** connect to WRDS **************************************;
%let wrds = wrds.utexas.edu 4016;
options comamid = TCP remote=WRDS;
signon username=_prompt_;
*************************************************************************;
rsubmit;
libname qa"F:\research2\transcripts";
libname cq '/wrds/nyse/sasdata/taqms/cq';
proc upload data=qa.daylist out=daylist; run;
data daylist;
    set daylist;
    traday2 = input(put(traday, yymmddn8.), 8.);
    drop traday;
    rename traday2 = traday;
run;
options errors=2;
data intraday;run;
%macro temp;
%do i = 1 %to 2215;
.......
dm "out;clear;log;clear;";
%end;
%mend;
%Temp;

One option (which avoids the need to clear the log) is to write the log to an external file using proc printto. The syntax is:
proc printto log='/path/to/your.log';
run;
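For example, placed near the top of the program (before the signon), this sends the client session's log to a file instead of the Log window, so the window never fills up and prompts to be cleared. A minimal sketch; the log path here is just an example:
* route this session's log to an external file, replacing any existing file;
proc printto log='F:\research2\transcripts\wrds_run.log' new;
run;
* ... signon, rsubmit block, and macro loop as above ...;
* restore the log to its default destination at the end;
proc printto;
run;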

Related

Paging through REST API results?

I am using the following code to pull a JSON file from a URL:
options NOQUOTELENMAX;
filename usage "/folders/myfolders/sasuser.v94/usage.json";
%let AccessKey = reallylongstring;
proc http
url="https://a.url"
method="GET" out=usage;
headers
"Authorization"="Bearer &AccessKey.";
run;
libname usage json "/folders/myfolders/sasuser.v94/usage.json";
data usage;
set usage.data;
run;
proc print data=usage noobs;
run;
However, the call now returns more than 1,000 results, and I need to check for a nextLink property somehow.
In .NET (PowerShell) I could use something like this:
$usageRest = Invoke-RestMethod -Uri $usageUrl -Headers $authHeaders -Method Get
while ($null -ne $usageRest.nextLink) {
$usageRest = Invoke-RestMethod -Uri $usageRest.nextLink -Headers $authHeaders -Method Get
}
Is something like this possible with proc http in SAS?
Here's a tree view of the actual JSON, if that helps (not shown here).
So far I have tried a quick and dirty version:
libname usage1 JSON fileref=resp1;
data x;
set usage1.root;
call symputx('nextLink',nextLink);
run;
proc http
url="%superq(nextLink)"
method="GET" out=resp2;
headers
"Authorization"="Bearer &AccessKey.";
run;
libname usage2 JSON fileref=resp2;
data y;
set usage2.root;
call symputx('nextLink',nextLink);
run;
proc http
url="%superq(nextLink)"
method="GET" out=resp3;
headers
"Authorization"="Bearer &AccessKey.";
run;
libname usage3 JSON fileref=resp3;
data z;
set usage3.root;
call symputx('nextLink',nextLink);
run;
(Screenshots of the libraries overview and of samples from usage1.data and work.x omitted.)
Thanks
Where the resolved macro variable needs to be double quoted, such as in the url= option value, place the %superq inside double quotes -- "%superq(<macro-var-name>)".
Try
…
url = "%superq(nextLink)"
If the link continues to be troublesome you can try
url = "%qsysfunc(urlencode(%superq(nextLink)))"
Invoking a macro once per row in a control data set
There are numerous ways, such as
Stack the 1,000 invocations via data _null_; and call execute(… -- see the sketch after the sample code below
Proc SQL select nextLink into :link1- to create 1,000 macro variables
Open the data set with %let ds=%sysfunc(open(… and loop over the 1,000 rows -- %do %while (%sysfunc(fetch(&ds)) = 0)
Sample code for second bullet item:
This sample passes a macro variable name from the 'looping' macro to the 'worker' macro. When the macro variable name is passed, the macro variable does not have to be resolved and quoted for passage into the worker. Instead the worker gets the macro variable name (aka symbol name) and has %superq resolve it in a quoted manner for use in source-code generation. Essentially, passing the macro variable name is akin to a traditional language's pass-by-reference concept.
data have;
do linknum = 1 to 25;
link = cats("place=", byte(64+linknum),'&extra="zoom=',linknum,'"&key=MYAPIKEY');
output;
end;
run;
%macro processLink(link_mvar=);
%put url="%sysfunc(urlencode(%superq(&link_mvar)))";
%mend;
%macro processLinks (data=);
proc sql;
reset noprint;
select link into :link1- from &data;
quit;
%local i;
%do i = 1 %to &sqlobs;
%* pass name of macro variable to macro;
%processLink (link_mvar=link&i);
%end;
%mend;
%processLinks(data=have)
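For the first bullet item, a minimal sketch of the call execute approach against the same have data set and %processLink worker defined above (the data step also creates one link&i macro variable per row, mirroring the :link1- naming, since the worker expects a macro variable name):
data _null_;
   set have;
   /* create one global macro variable per row, mirroring the :link1- names */
   call symputx(cats('link', _n_), link, 'G');
   /* stack one invocation per row; %nrstr defers execution of %processLink
      until after this data step has finished running */
   call execute(cats('%nrstr(%processLink(link_mvar=link', _n_, '))'));
run;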
The question is clearer now. The process can be macro-ized to call Proc HTTP repeatedly inside a %do loop. Each page of data can be appended to a data set that grows with each page retrieved.
Untested
%macro getThatPagedJsonData (url=, accesskey=, out=);
%local page guard;
%let page = 1;
%Let guard = 50; * just in case - prevent infinite/excessive looping during development/testing;
filename response temp;
%do %until (%length(%superq(url)) eq 0 or &page > &guard);
* clear libname, releasing any locks on json response file;
libname page;
* prior response will be over written;
proc http url="%superq(url)" method="GET" out=response;
headers "Authorization"="Bearer &AccessKey.";
run;
* magic json engine;
libname page JSON fileref=response;
if &page = 1 %then %do;
* first page starts the output data set;
data &out;
set page.data;
run;
%end;
%else %do;
* append subsequent pages of data;
proc append base=&out data=page.data;
run;
%end;
* track number of pages processed;
%let page = %eval (&page + 1);
* reset url for %until test;
%let url=;
* fetch the nextlink as the url for next iteration of %do %until;
* might need error handling here when last page has no nextlink;
data _null_;
set page.root;
call symput('url', trim(nextlink));
run;
%end;
%mend;
%getThatPagedJsonData (url=...., accesskey=...., out=serviceAgreements);

Error Handling in Proc SQL (like try/catch)

I am doing it like this:
libname DZ 'Path';
proc sql;
select * from DZ.some_table;
quit;
Here I need to add error handling: if something goes wrong in the select statement or within the block, I have to write an error message to a separate text file in a folder.
This is what I tried:
%macro sortclass;
Proc sql;
select * from DZ.some_table;
quit;
%if &SQLRC gt 0 %then %goto error;
%error:
proc export data=""
run;
%exit:
%mend;
%sortclass;
I am trying to do try/catch-like error handling. How can I do this in an effective way? Thanks in advance.
Base SAS does not have a try/catch mechanism, so an error raised by macro-generated code can itself interfere with attempts to log messages externally.
filename logfile 'mysolution.log' mod;
%macro mylogger(message);
* may fail due to a priori errors;
data _null_;
file logfile;
put "%superq(message)";
run;
* a more advanced version would instead use macro %sysfunc invocations of
  FOPEN, FPUT, FWRITE, and FCLOSE to append to a custom external log file;
%mend;
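A rough sketch of that more advanced variant (the macro name mylogger_fn is made up here): it appends to the logfile fileref defined above using only macro-level function calls, so no data step has to compile after an earlier error:
%macro mylogger_fn(message);
    %local fid rc;
    /* open the logfile fileref in append mode */
    %let fid = %sysfunc(fopen(logfile, a));
    %if &fid > 0 %then %do;
        /* put the message into the file data buffer, write the record, close */
        %let rc = %sysfunc(fput(&fid, %superq(message)));
        %let rc = %sysfunc(fwrite(&fid));
        %let rc = %sysfunc(fclose(&fid));
    %end;
    %else %put mylogger_fn: could not open the logfile fileref for appending.;
%mend;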
%macro mymacro(...);
... some SQL statement(s) ...
%if &SYSRC > 0 %then %do;
%mylogger(Failer at step 1);
%return;
%end;
... some more SQL statement(s) ...
%if &SYSRC > 0 %then %do;
%mylogger(Failer at step 2);
%return;
%end;
%mend;
Note: You can also submit SAS code from Java or .NET code that uses a SAS workspace session -- search for the SAS Integrated Object Model (IOM) for more info. Such a solution would have a rich environment for try/catch/throw and advanced logging models.

SAS libref not recognized in macro loop

I've run into an odd SAS quirk that I can't figure out - hopefully you can help.
I have a simple macro loop that imports CSV files, and for some reason, if I use a libref in the "out=" part of the import procedure, SAS doesn't recognize the libref as a valid name. But if I use the same libref in a data step, it works just fine.
The specific error it gives is: "ERROR: "TESTDB." is not a valid name."
I'd like to figure this out because I work with pretty big files and want to avoid reading through them more times than is necessary.
Here's the code that works, with some comments in it. I got around the issue by reading in the files, then writing them to permanent SAS datasets in a second step, but ideally I'd like to import the files directly into the "TESTDB" library. Any idea how to get SAS to recognize a libref in the "out=" statement of the import procedure?
libname testdb "C:\SAS test";
%let filepath = C:\SAS test\;
%macro loop(values);
%let count=%sysfunc(countw(&values));
%do i = 1 %to &count;
%let value = %qscan(&values,&i,%str(,));
proc import datafile = "&filepath.&value..csv"
out = &value dbms=csv replace; getnames=yes;
/*"out=testdb.&value" in the line above does not work*/
run;
data testdb.&value; set &value; run;
/*here the libref testdb works fine*/
%end;
%mend;
%loop(%str(test_a,test_b,test_c));
Thanks in advance for your help!
john
Perhaps try:
out=testdb.%unquote(&value)
Sometimes the macro language does not unquote values automatically, with the result that the extra quoting characters introduced by a quoting function (%qscan, %str, %bquote, %superq, etc.) cause problems.
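So the import step in the original loop might be written like this (a sketch based on the macro in the question, reusing its filepath and testdb names):
%do i = 1 %to &count;
    %let value = %qscan(&values, &i, %str(,));
    proc import datafile = "&filepath.&value..csv"
        /* %unquote strips the masking characters left by %qscan, so
           testdb.&value is seen as an ordinary two-level data set name */
        out = testdb.%unquote(&value) dbms=csv replace;
        getnames=yes;
    run;
%end;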
Strange error; I am not able to pin it down. My guess is that it has something to do with how the value macro variables are being created. When I moved the creation of the value variables into a data step and used call symputx, it worked.
%macro loop(files);
/* Create macro variables for files.*/
data _null_;
count = countw("&files.",",");
call symputx("count",count,"L");
do i = 1 to count;
call symputx(cats("file",i),scan("&files.",i,","),"L");
end;
run;
/* Read and save each CSV as a sas table. */
%do i=1 %to &count.;
proc import datafile = "&filepath.&&file&i...csv"
out = testdb.&&file&i. dbms=csv replace; getnames=yes;
run;
%end;
%mend;
%loop(%str(test_a,test_b));

Syslput, rsubmit & macro

I am trying to pass a local macro variable within a macro to a remote session as follows (this example assumes 'mynode' has already been signed on to):
%macro mytest;
%do i = 1 %to 3;
%syslput mynewval = &i;
rsubmit mynode;
%let mynewval2 = &mynewval;
%put &mynewval2;
endrsubmit;
%end;
%mend;
%mytest;
This looks like the correct syntax to me, however '&mynewval2' is resolving to blank when I attempt to print it to the log. Can anyone see what I am doing wrong?
Thanks
The %let mynewval2 = &mynewval; is being run on the client and not the server. IE, the local macro processor is running the code. It doesn't know what &mynewval is -- you defined it with the remote system.
Try wrapping the code inside the RSUBMIT in a macro. I don't have SAS/CONNECT licensed so I cannot test.
rsubmit mynode;
%macro run_on_server();
%let mynewval2 = &mynewval;
%put &mynewval2;
%mend;
%run_on_server();
endrsubmit;
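Another approach sometimes used with SAS/CONNECT is to mask the macro statements with %nrstr so the client-side macro processor leaves them alone and they execute in the server session instead; a sketch of that idea (also untested here):
rsubmit mynode;
/* %nrstr keeps the client macro processor from resolving these
   statements, so they are shipped to and run on the server */
%nrstr(
%let mynewval2 = &mynewval;
%put &mynewval2;
)
endrsubmit;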

Opening SAS datasets for viewing from within a .sas program

Is there a way to open a SAS dataset for viewing (i.e., in the "ViewTable" window) from within a .sas file?
I think this will do what you want:
dm log "vt sashelp.air";
Just change the "sashelp.air" part to your lib.table combo.
dw.mackie's answer is right on the money. That works great when submitted from the SAS editor window.
But I just want to caution you to be careful if you attempt it in batch mode (that is, having SAS run a .sas program directly from command-line using the -sysin option). It will indeed attempt to pop open the interactive SAS window environment upon execution.
But, if your batch code also attempts to build some graphs/charts, you'll be required to use the -noterminal option. And the -noterminal option isn't compatible with the dm command. You'd spot it right away in the log, but I just wanted to give you a heads-up.
Because of the size of some of my datasets, I just do a simple proc print and limit the output to 50 observations. I do this so often that I created the following macro, which dumps the output to an HTML file.
%Macro DPrt(Dset, obs=50, vars=, w=, Path="C:\output\");
%LET BKPATH = &Path;
%PUT BKPATH= &BKPATH;
options obs = &obs.;
title;
ods listing close;
ods html
path = &BKPATH.
body = "Debug-&Dset..htm"
style = THEME;
proc print data = &Dset n u split=' ';
%if &vars NE %THEN %DO;
var &vars.;
%END;
%if &w NE %THEN %DO;
&w;
%END;
Run;
ods html close;
ods listing;
options obs = MAX;
%Mend Dprt;
A sample call for dataset test looks like:
%dprt(test)
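And a hypothetical call that exercises the optional parameters might look like this (sashelp.class is just a convenient built-in table, the w= parameter passes a where statement through to proc print, and the default C:\output\ path is assumed to exist):
%dprt(sashelp.class, obs=10, vars=name age height, w=%str(where age > 12))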