How to skip code if created dataset has zero rows - sas

I have a job which at first imports some xlsx files, then connects to multiple DB tables. Based on conditions, the job selects rows to output, and creates an excel file to send on to the final end-user.
Sometimes, that job returns zero rows, which is acceptable; in that case, I would prefer to create an empty excel file with only the variables, but not run the other code (checking/cleaning code).
How can I conditionally execute code only when there are results?
Something like this:
I get 0 rows
If Result = 0 then go to "here"
Else "just run the code further"

You have a few useful things that can help you here.
First off, PROC SQL sets a macro variable SQLOBS, which is particularly useful in identifying how many records were returned from the last SQL query it ran.
proc sql;
select * from sashelp.class;
quit;
%put I returned &SQLOBS rows;
You might use this to drive further processing, either with %IF blocks as Tom notes in comments or other methods I will cover below.
You can also check how many rows are in a dataset explicitly, if you prefer a slightly more robust option.
proc sql;
select count(*) into :class_count from sashelp.class;
quit;
%put I returned &class_count rows;
For very large datasets, there are faster options (using the dataset descriptors, dictionary tables, or a few other options), but for most tables this is fine.
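If you do go the descriptor route, a minimal sketch using the OPEN/ATTRN functions might look like this; the macro name is my own, and NLOBS is the logical observation count (it excludes deleted rows):
%macro nobs(data);
  %local dsid n rc;
  %let dsid = %sysfunc(open(&data.));
  %if &dsid. %then %do;
    %let n  = %sysfunc(attrn(&dsid., NLOBS));
    %let rc = %sysfunc(close(&dsid.));
  %end;
  %else %let n = -1;  /* dataset could not be opened */
&n.
%mend nobs;
%put sashelp.class has %nobs(sashelp.class) rows;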
Either way, what I would typically do with a program I intended to run in production would be then to drive the rest of the program from macros.
%macro whatIWantToDo(params);
...
do stuff
...
%mend whatIWantToDo;
proc sql;
mySqlStuff;
quit;
%if &sqlobs. gt 0 %then %do;
%whatIWantToDo(params);
%end;
%else %do;
%put Nothing to do;
%end;
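For concreteness, here is one runnable version of that skeleton; the sashelp.class query, the WHERE filter, and the PROC PRINT body are placeholders of my own, and the conditional logic is wrapped in a macro so the %IF is valid on any release:
%macro whatIWantToDo(data);
  /* stand-in for the real checking/cleaning code */
  proc print data=&data.;
  run;
%mend whatIWantToDo;

%macro runIfRows;
  proc sql;
    create table work.teens as
      select * from sashelp.class
      where age >= 13;   /* SQLOBS reflects the rows this query returned */
  quit;

  %if &sqlobs. gt 0 %then %do;
    %whatIWantToDo(work.teens)
  %end;
  %else %do;
    %put Nothing to do;
  %end;
%mend runIfRows;

%runIfRows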
Another option is to use call execute; this is appropriate if your data drives the macro parameters. The big advantage of call execute is that it only runs if you have data rows - if you have zero, it won't do anything!
Say you have some datasets to run code on. You could have up to twelve - one per month - but only have them for the current calendar year, so in Jan you have one, Feb you have two, etc. You could do this:
data mydata_jan mydata_feb mydata_mar;
set sashelp.class;
run;
%macro printit(data=);
title "Printing &data.";
proc print data=&data;
run;
title;
%mend printit;
data _null_;
set sashelp.vtable;
where upcase(memname) like 'MYDATA_%' and nobs gt 0;
callstr = cats('%printit(data=',memname,')');
call execute(Callstr);
run;
First I make the datasets, with a name I can programmatically identify. Then I make the macro that I want to run on each (this could be checking, cleaning, whatever). Then I use sashelp.vtable, which shows which tables exist, and check that the nobs variable (number of observations) is greater than zero. Then I use call execute to run the macro on each such dataset!

Related

splitting a sas program into multiple programs and finally merging output

I created a file (metadata.sas7bdat) which has 100 entries with two columns, directory and filename:
Directory        filename
/home            abc.sas7bdat
/home            def.sas7bdat
/home/sub_dir    ghj.sas7bdat
Assume I have sample.sas which just gets the means of each dataset.
proc means data=abc;
output out=x;
run;
The above sample.sas should run on each dataset listed in the metadata file. Right now I have written it in a very traditional way: it loops through all the files in the metadata, runs proc means on each file, and appends all the data. The final dataset has the means of all the files present in the metadata.
I believe that if I can split the metadata file into 4 parts, each having 100/4 = 25 entries, submit them as 4 programs, and finally merge the output from all 4 programs, it would reduce the processing time by a large amount (think of 10,000 entries, and also assume there is more processing than proc means). It's just that I am not well versed in what kind of options to use to submit it as 4 programs and how to sync the output from the 4 different processes.
Can you provide a skeleton of how I should construct this program? I have my own vague thoughts, but I am sure I can take away some elegant answers.
This question appears to be very similar to Is it possible to loop over SAS datasets?
Breaking the work into four parts will not reduce the overall processing time. For each part, sure. If you think the overall processing will be reduced, what tests or evidence support that premise?
When there is a very large set of processing, P, broken into steps p(1), p(2), ..., p(N), you will need to construct a data structure that can store intermediate results, plus rules for not repeating past processing when restarting after some p(i) experiences an error or a prior attempt at P stops at p(i).
In your case, you will need to store, or accumulate, the intermediate means with a step key. The natural key would be the directory and filename.
A top-level dispatcher invokes a step macro for each item in the metadata. The step macro performs the process details when necessary and appends the results in an accumulating way.
Top-level: dispatch each step for the metadata items
%macro process_all (metadata=, results=);
data _null_;
set &metadata;
invoke = cats('%process_step(libname=',libname,',memname=',memname,",results=&results)");
put 'NOTE: ' _n_= invoke=;
call execute ('%nrstr(' || trim(invoke) || ')');
/* global parameter for testing 'restart' */
%if %symexist(test_param_1) %then %do;
if _n_ >= &test_param_1 then stop;
%end;
run;
%mend;
Step-level: core process for one metadata item. Skip step if done before, otherwise do actions and accumulate results.
%macro process_step(libname=, memname=, results=);
%local have_results;
%local step_done;
%let have_results = %sysfunc(exist(&results,data));
%let step_done = 0;
%if &have_results %then %do;
data _null_;
set &results; where libname="&libname" and memname="&memname";
call symput ('step_done', '1'); stop;
run;
%end;
%if &step_done %then %do;
%put NOTE: This step already done. &=libname &=memname;
%return;
%end;
proc delete data=_step_out;
run;
proc means data=&libname..&memname noprint;
var height weight;
output out=_step_out(label="step output for &libname..&memname.");
run;
data _step_out;
length libname $8 memname $32;
set _step_out;
libname = symget('libname');
memname = symget('memname');
run;
proc append base=&results data=_step_out force;
run;
%mend;
Test the scheme for various configurations of metadata
data _1 _2 _3 _4 _5 _6 _7 _8;
set sashelp.class;
run;
data configuration_1(keep=libname memname);
length libname $8 memname $32;
libname = 'work';
do index = 1,2,3,7,8; memname='_'||cats(index); output; end;
run;
options mprint;
/*
* reset results to start from scratch
proc delete data=results_1;run;
*/
%let test_param_1 = 3; %* force stoppage after first three items;
%process_all(metadata=configuration_1, results=results_1)
%symdel test_param_1; %* force rerun of all to ignore first three already done items;
%process_all(metadata=configuration_1, results=results_1)
However you actually program the scheme, you should be aware of, and account for, intermediate settings or data sets that might be left over from a prior run or step.
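For example, a small defensive cleanup of that kind of leftover might look like the sketch below; the _step_out name comes from the step macro above, and the options reset is purely illustrative:
%macro clean_leftovers;
  /* remove the intermediate step output if a prior run left it behind */
  %if %sysfunc(exist(work._step_out)) %then %do;
    proc datasets library=work nolist;
      delete _step_out;
    quit;
  %end;
  /* put back any options a prior run or test may have changed */
  options nomprint;
%mend clean_leftovers;
%clean_leftovers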

Using a series of values for a SAS macro parameter

I'm looking for a way to use a series of values for a macro parameter instead of a single value. I'm basically manipulating a series of files for consecutive months (May 2014 to Sept 2015) and I've written a macro to take advantage of the naming conventions. However, I'm still manually writing out the months to use the macro. I'm doing this many times over with lots of different files from this month. Is there a way to have the parameter reference a list of values and go through them like an array/do-loop? I've looked into %ARRAY as a possibility but that doesn't seem to do what I'm looking for, unless I'm not seeing its full capability. I've attached a sample of this code below.
%MACRO memmonth(monyr=);
proc freq data=work.both_&monyr ;
table var1/ out=work.mem_&monyr;
run;
data work.mem_&monyr;
set work.mem_&monyr;
count_&monyr=count;
run;
%MEND memmonth;
%memmonth(monyr=may14)
%memmonth(monyr=jun14)
%memmonth(monyr=jul14)
%memmonth(monyr=aug14)
%memmonth(monyr=sep14)
%memmonth(monyr=oct14)
%memmonth(monyr=nov14)
%memmonth(monyr=dec14)
%memmonth(monyr=jan15)
%memmonth(monyr=feb15)
%memmonth(monyr=mar15)
%memmonth(monyr=apr15)
%memmonth(monyr=may15)
%memmonth(monyr=jun15)
%memmonth(monyr=jul15)
%memmonth(monyr=aug15)
%memmonth(monyr=sep15)
In general I would recommend passing the list of values as a space-delimited list and adding looping logic to the macro. If spaces are valid characters in the values then use some other delimiter. Do not use a comma as the delimiter, as that means you will need macro quoting to call the macro.
So your basic macro is this.
%macro memmonth(monyr);
proc freq data=work.both_&monyr ;
table var1/ out=work.mem_&monyr (rename=(count=count_&monyr)) ;
run;
%mend memmonth;
%memmonth(may14)
%memmonth(jun14)
You could change it to this.
%macro memmonth(monyrlist);
%local i monyr;
%do i=1 %to %sysfunc(countw(&monyrlist));
%let monyr=%scan(&monyrlist,&i);
proc freq data=work.both_&monyr ;
table var1/ out=work.mem_&monyr (rename=(count=count_&monyr)) ;
run;
%end;
%mend memmonth;
%memmonth(may14 jun14)
If you always want to process all of the months in an interval then you could just pass in the start and end month of the interval.
%macro memmonth(start,end);
%local i monyr;
%do i=0 %to %sysfunc(intck(month,"01&start"d,"01&end"d));
%let monyr=%sysfunc(intnx(month,"01&start"d,&i),monyy5.);
proc freq data=work.both_&monyr ;
table var1/ out=work.mem_&monyr (rename=(count=count_&monyr)) ;
run;
%end;
%mend memmonth;
%memmonth(may14,sep15)
If you have a source list, whether it is a text file or a dataset, you can use a simple data step to generate macro calls. So if you have an input dataset with the variable MONYR then your driver program would look like this:
data _null_;
set mylist ;
call execute(cats('%nrstr(%memmonth)(',MONYR,')'));
run;
If the source is a file with the names then replace the SET statement with the appropriate INFILE and INPUT statements. If the source is a directory name then look at one of the many SAS ways to read the names of files in a directory into a dataset and use that to drive the macro call generation.
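For example, a minimal sketch of the text-file variant; the path and the one-value-per-line layout are assumptions:
data _null_;
  infile '/path/to/monthlist.txt' truncover;  /* one value per line, e.g. may14 */
  input monyr :$8.;
  call execute(cats('%nrstr(%memmonth)(', monyr, ')'));
run;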

Controlling export (ODS or PROC) based on Number of Observations

I currently have a SAS process that generates multiple data sets (whether they have observations or not). I want to determine a way to control the export procedure based on the total number of observations (if nobs > 0, then export). My first attempt was something primitive, using if/then logic to compare against a SELECT INTO macro variable (counting obs in a data set):
DATA _NULL_;
SET A_EXISTS_ON_B;
IF &A_E > 0 THEN DO;
FILE "C:\Users\ME\Desktop\WORKLIST_T &PDAY..xls";
PUT TASK;
END;
RUN;
The issue here is that I don't have a way to write multiple sets to the same workbook with multiple sheets (or do I?).
In addition, whenever I try to add another DO block with similar logic, the execution fails. If this cannot be done with a DATA _NULL_, would ODS be the answer?
The core of what you want to do, conditionally execute code, can be done one of a number of ways.
Let's imagine we have a short macro that exports a dataset to excel. Simple as pie.
%macro export_to_excel(data=,file=,sheet=);
proc export data=&data. outfile=&file. dbms=excel replace;
sheet=&sheet.;
run;
%mend export_to_excel;
Now let's say we want to do this conditionally. How we do it depends, to some degree, on how we call this macro in our code now.
Let's say you have:
%let wherecondition=1; *always true!;
data class;
set sashelp.class;
if &wherecondition. then output;
run;
%export_to_excel(data=class,file="c:\temp\class.xlsx", sheet=class1);
Now you want to make this so it only exports if class has some rows in it, right? So you get the # of obs in class:
proc sql;
select count(1) into :classobs from class;
quit;
And now you need to incorporate that somehow. In this case, the easiest way is to add a condition to the export macro. Open code doesn't allow conditional execution of code, so it needs to be in a macro.
So we do:
%macro export_to_excel(data=,file=,sheet=,condition=1);
%if &condition. %then %do;
proc export data=&data. outfile=&file. dbms=excel replace;
sheet=&sheet.;
run;
%end;
%mend export_to_excel;
And you add the count to the call:
%export_to_excel(data=class,file="c:\temp\class.xlsx", sheet=class1,condition=&classobs.)
Tada, now it won't try to export when it's 0. Great.
If this code is already in a macro, you don't have to alter the export macro itself. You can simply put that %if %then part around the macro call. But that's only if the whole thing is already a macro - %if isn't allowed outside of macros (sorry).
Now, if you're exporting a whole bunch of datasets, and you're generating your export calls from something, you can add the condition there, more easily and more smoothly than this.
Basically, either make it by hand (if that makes sense), or use proc sql, proc contents, or another method of your choice to make a dataset that contains one row per dataset-to-export, with four variables: dataset name, file to export, sheet to export (unless that's the same as the dataset name), and count of observations for that dataset. Often the first three would be made by hand and then merged/updated via SQL or something else with the count of obs per dataset.
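For instance, a sketch that builds such a driver dataset from dictionary.tables; the WORK library, the EXPORT name prefix, and the output folder are assumptions of my own, while the variable names match the query below:
proc sql;
  create table datasetwithnames as
    select memname                                   as dataname,
           quote(cats('c:\temp\', memname, '.xlsx')) as filename,  /* quoted, ready for the macro call */
           memname                                   as sheetname,
           nlobs                                     as obsnum     /* logical obs count */
      from dictionary.tables
     where libname = 'WORK' and memname like 'EXPORT%';
quit;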
Then you can generate calls to export, like so:
proc sql;
select cats('%export_to_excel(data=',dataname,',file=',filename,',sheet=',sheetname,')')
into :explist separated by ' '
from datasetwithnames
where obsnum>0;
quit;
&explist.; *this actually executes them;
Assuming obsnum is the new variable you created with the # of obs, and the other variables are obviously named, that won't pull a line for anything with 0 observations - so it never tries to execute the export. That works with the initial export macro just as well as with the modified one.
Suggest you google around for different approaches to writing XLS files.
Regarding using a DATA step or PROC step, the DATA step is tolerant of datasets that have 0 obs. If the SET statement reads a dataset that has 0 obs, it will simply end the step, so you don't need special logic. Most PROCs also accommodate a 0-obs dataset without throwing a warning or error.
For example:
1218 *Make a 0 obs dataset;
1219 data empty;
1220 x=1;
1221 stop;
1222 run;
NOTE: The data set WORK.EMPTY has 0 observations and 1 variables.
1223
1224 data want;
1225 put "I run before SET statement.";
1226 set empty;
1227 put "I do not run after SET statement.";
1228 run;
I run before SET statement.
NOTE: There were 0 observations read from the data set WORK.EMPTY.
NOTE: The data set WORK.WANT has 0 observations and 1 variables.
1229
1230 proc print data=empty;
1231 run;
NOTE: No observations in data set WORK.EMPTY.
But note, as Joe points out, PROC EXPORT will happily export a dataset with 0 obs and write a file with 0 records, overwriting it if it was already there, e.g.:
1582 proc export data=sashelp.class outfile="d:\junk\class.xls";
1583 run;
NOTE: File "d:\junk\class.xls" will be created if the export process succeeds.
NOTE: "CLASS" range/sheet was successfully created.
1584
1585 data class;
1586 stop;
1587 set sashelp.class;
1588 run;
NOTE: The data set WORK.CLASS has 0 observations and 5 variables.
1589
1590 *This will replace class.xls;
1591 proc export data=class outfile="d:\junk\class.xls" replace;
1592 run;
NOTE: "CLASS" range/sheet was successfully created.
ODS statements would likely do the same.
I use a macro to check if a dataset is empty; see SO answers like:
How to detect how many observations in a dataset (or if it is empty), in SAS?

what is wrong with the following sas code?

I am new to SAS. I have written some basic code but it is not working. Can somebody help me figure out what is wrong with the code? I wish to append the datasets.
options mprint mlogic symbolgen;
%macro temp();
%let count = 0;
%if &count = 0 %then %do;
data temp;
set survey_201106;
%let count = %eval(&count +1);
%end;
%else %do;
%do i = 201107 %to 201108;
data temp;
set temp survey_&i;
%end;
%end;
run;
%mend;
%temp;
You are setting &count to 0 at the beginning of the macro, so the %else clause will never be executed.
I'm not sure what your aim is, but it looks like you just want to concatenate 3 datasets and store in a new dataset. If so, will this not suffice:
data temp;
set survey_201106-survey_201108;
run;
This creates a dataset called temp and populates it with the contents of survey_201106, survey_201107, and survey_201108, in order. The - tells SAS that you want all the datasets named survey_20110* between survey_201106 and survey_201108 inclusive.
Details of the syntax.
options mprint mlogic symbolgen;
%macro temp();
proc sql noprint;
create table table_list as
select monotonic() as num,memname
from dictionary.tables
where libname = 'WORK' and memname contains 'SURVEY_';
quit;
proc sql noprint;
select count(*) into :cnt
from table_list;
quit;
%do i = 1 %to &cnt.;
%if &i eq 1 and %sysfunc(exist(work.temp)) %then %do;
proc sql;
drop table work.temp;
quit;
%end;
proc sql noprint;
select memname into :memname
from table_list
where num = &i.;
quit;
proc append base = temp data = &memname. force;
run;
%end;
%mend;
%temp;
Working: the above code will append all the WORK data sets whose names start with 'SURVEY_' to the temp data set.
dictionary.tables is used to create a data set which will contain the list of data sets whose names start with 'SURVEY_'.
table_list data set:
num  memname
1    survey_201106
2    survey_201107
3    survey_201108
The cnt macro variable is created to hold the number of such data sets.
Within the loop, each data set present in the list of data set names (in table_list) is appended to the work.temp data set.
What you're probably trying to do is create a macro that appends (without using proc append, for some reason) when the base dataset already exists, or creates it new when it doesn't.
SAS is not like R or other similar languages, where you largely have to control everything that happens. SAS's strength is that you can ask it to do common things with only a line or two of code. SAS is what's commonly called a 4th Generation Language for this reason: you're not supposed to control all of the little bits. That's a waste of time. Use the procedures (PROC ...) and constructs SAS provides you.
In this case, PROC APPEND does exactly what this whole macro does. It creates a dataset or appends new rows to it if it already exists.
proc append base=temp data=in_data;
run;
Now, if you're trying to learn the macro language and using this concept as a learning tool only, it is possible to do this in a macro that's not all that different from yours.
Note: This is not a good way to do this. It might be useful for learning macro concepts, but it should not be used as an example of good code. Despite my improvements, it is still not the way you should do this; proc append or SRSwift's example are better.
One thing I'm going to introduce here: a macro parameter. A good rule of macro programming is that all macros should have parameters. If there's no possible parameter, it should usually be possible to do without needing a macro. Parameters are what make macros useful, most of the time. In this case I'm going to rewrite your macro to take one append dataset as a parameter and one 'base' dataset. In your example, temp was the base dataset and survey_201106 etc. are the append datasets.
Also, &count needs to be a global macro variable. In SAS, variables created inside a macro are by default local in scope - i.e., they are only defined inside one run of the macro and then disappear. This is nearly identical to functions in C and similar languages (a bit different from R, which uses lexical scoping and which you might be expecting given how you wrote this). There are some funny rules, but for now we'll just go with this. Global macro variables, which include any variable that has already been defined in the global scope, are available in all macro invocations (and outside of macros).
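A tiny illustration of that scoping behavior (the variable names are arbitrary):
%let outer = 1;                       /* created in open code, so global */

%macro scope_demo;
  %local inner;                       /* exists only while the macro runs */
  %let inner = 2;
  %let outer = 3;                     /* updates the existing global variable */
  %put inside : inner=&inner. outer=&outer.;
%mend scope_demo;

%scope_demo
%put outside: outer=&outer. inner_exists=%symexist(inner);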
So:
%macro append_dataset(base=,append=);
%if &count=0 %then %do;
data &base.;
set &append.;
run;
%end;
%else %do;
data &base.;
set &base. &append.;
run;
%end;
%let count=%eval(&count.+1);
%mend append_dataset;
%let count=0;
%append_dataset(base=temp,append=survey_201106);
%append_dataset(base=temp,append=survey_201107);
%append_dataset(base=temp,append=survey_201108);
Now, you could generate those calls through an external method (such as dictionary.tables as in Harshad's example). You also could add another element to the macro, which is to iterate over all of the elements in a list provided as append. You could also hardcode the list in a %do loop, as you did in the initial example (but I think that's bad practice to get into). You could literally do that in my macro:
%macro append_dataset(base=,append=);
%do survey=201106 %to 201108;
%if &count=0 %then %do;
data &base.;
set survey_&survey.;
run;
%end;
%else %do;
data &base.;
set &base. survey_&survey.;
run;
%end;
%let count=%eval(&count.+1);
%end;
%mend append_dataset;
Notice the count increment is inside the do loop - that's one of the places you went wrong here. Otherwise this is just adding an outer loop and changing the append mentions to the calculated values from the loop. But again, this is fairly poor coding practice - the loop should at minimum be constructed from a macro parameter.
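As one last sketch of that point, here is the loop driven by start and end parameters instead of hardcoded values; it leans on PROC APPEND rather than the data step rewrite, and the macro name and bounds are my own:
%macro append_surveys(base=, start=, end=);
  %local survey;
  /* a plain numeric %DO only works while the yyyymm values are consecutive integers */
  %do survey = &start. %to &end.;
    proc append base=&base. data=survey_&survey. force;
    run;
  %end;
%mend append_surveys;

%append_surveys(base=temp, start=201106, end=201108)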

Dropping a table in SAS

What is the most efficient way to drop a table in SAS?
I have a program that loops and drops a large number of tables, and I would like to know if there is a performance difference between PROC SQL and PROC DATASETS for dropping a single table at a time.
Or is there perhaps another way?
If it is reasonable to outsource to the OS, that might be fastest. Otherwise, my unscientific observations seem to suggest that drop table in proc sql is fastest. This surprised me as I expected proc datasets to be fastest.
In the code below, I create 4000 dummy data sets, then try deleting them all with different methods. The first is with proc sql, and on my system it took about 11 seconds to delete the files.
The next two both use proc datasets. The first creates a delete statement for each data set and then deletes. The second just issues a blanket kill command to delete everything in the work directory. (I had expected this technique to be the fastest). Both proc datasets routines reported about 20 seconds to delete all 4000 files.
%macro create;
proc printto log='null';run;
%do i=1 %to 4000;
data temp&i;
x=1;
y="dummy";
output;run;
%end;
proc printto;run;
%mend;
%macro delsql;
proc sql;
%do i=1 %to 4000;
drop table temp&i;
%end;
quit;
%mend;
%macro deldata1;
proc datasets library=work nolist;
%do i=1 %to 4000;
delete temp&i.;
%end;
run;quit;
%mend;
%macro deldata2;
proc datasets library=work kill;
run;quit;
%mend;
option fullstimer;
%create;
%delsql;
%create;
%deldata1;
%create;
%deldata2;
I tried to fiddle with the OS-delete approach.
Deleting with the X command cannot be recommended; it took forever!
I then tried the system command in a data step:
%macro delos;
data _null_;
do i=1 to 9;
delcmd="rm -f "!!trim(left(pathname("WORK","L")))!!"/temp"!!trim(left(put(i,4.)))!!"*.sas7*";
rc=system(delcmd);
end;
run;
%mend;
As you can see, I had to split my deletes into 9 separate delete commands. The reason is, I'm using wildcards, "*", and the underlying operating system (AIX) expands these to a list, which then becomes too large for it to handle...
The program basically constructs a delete command for each of the nine filegroups "temp[1-9]*.sas7*" and issues the command.
Using the %create macro from cmjohns' answer to create 4000 data tables, I can delete those in only 5 seconds using this approach.
So a direct operating system delete is the fastest way to mass-delete, as I expected.
Are we discussing tables or datasets?
Tables implies database tables. To get rid of these quickly, the proc SQL pass-through facility would be fastest, specifically if you can connect to the database once, drop all of the tables, then disconnect.
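A minimal sketch of that pass-through pattern; the Oracle engine, the connection options, and the table names are all assumptions:
proc sql;
  connect to oracle as db (user=&user. password=&pass. path=mydbpath);
  execute (drop table scratch_table_1) by db;
  execute (drop table scratch_table_2) by db;
  disconnect from db;
quit;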
If we are discussing datasets in SAS, I would argue that proc sql and proc datasets are extremely similar. From an application standpoint, they both go through the same deduction to create a system command that deletes a file. All testing I have seen from SAS user groups or presentations has suggested that the difference between one method and the other is marginal and depends on many variables.
If it is imperative that you have the absolute fastest way to drop the datasets / tables, you may just have to test it. Each install and setup of SAS is different enough to warrant testing.
In terms of which is faster, excluding extremely large data, I would wager that there is little difference between them.
When handling permanent SAS datasets, however, I like to use PROC DATASETS rather than PROC SQL, simply because I feel better manipulating permanent datasets using the SAS-designed method rather than the SQL implementation.
Simple Solution for temporary tables that are named similarly:
If all of your tables start with the same prefix, for example p1_table1 and p1_table2, then the following code will delete any table that starts with p1:
proc datasets;
delete p1: ;
run;
proc delete is another, albeit undocumented, solution:
http://www.sascommunity.org/wiki/PROC_Delete
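For reference, PROC DELETE takes a list of datasets on its DATA= option; a minimal sketch (the dataset names are illustrative):
proc delete data=work.temp1 work.temp2 work.temp3;
run;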