SAS Running out of memory while calling macros

When I need to call a macro several times, I've been using CALL EXECUTE in a DATA _NULL_ step like so:
DATA _NULL_;
  DO i = 1 TO 1000;
    CALL EXECUTE('%mymacro');
  END;
RUN;
This has worked fine for me up until now. However, if I use this method to call %mymacro a million times (say), I get an "out of memory" error before it runs the macro even once.
My naive understanding is that SAS attempts to "write out" the macro a million times before executing, and thus runs out of memory during this process. Is this accurate? And what are good ways to get around this?

You just need to understand how CALL EXECUTE works.
Basically, CALL EXECUTE parses the macro code immediately, but it queues up the resulting SAS steps until after the current data step finishes. In other words, you are potentially building up millions upon millions of lines of SAS code in memory that SAS is storing up to execute once that data _null_ step finishes. Eventually this gets so large that SAS simply crashes.
Here are a couple of solutions:
1. Wrap the macro call in %nrstr() inside your CALL EXECUTE statement, so that only the short call text is queued rather than the code the macro generates (see the sketch after this list).
2. Or change your data _null_ step to generate a file with the code and %include the file.
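For option 1, a minimal sketch of the %nrstr form, reusing the loop from the question (the million-iteration count is just the scenario described above):
data _null_;
  do i = 1 to 1000000;
    /* %nrstr defers macro execution until the queued code runs, so only */
    /* this short call text is stacked up, not the code %mymacro generates */
    call execute('%nrstr(%mymacro)');
  end;
run;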

One option would be to change the data step so that it actually creates a .sas file that contains the macro calls... and then %include it. For example:
data _null_;
  file "myfile.sas";
  do i = 1 to 1000;
    put '%mymacro';
  end;
run;
%include "myfile.sas";
This may fix the issue. Then again, I'm not sure whether SAS would like a .sas program that contains a million lines of code either. If that turns out to be a problem, simply break the program up into, say, 10 .sas files with 100k lines of code each.
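A hedged sketch of the split-file idea (the myfile_N.sas names and the 100k chunk size are assumptions, not part of the original answer), using the FILEVAR= option to switch output files as the loop runs and then %including each chunk:
data _null_;
  length fname $40;
  do i = 1 to 1000000;
    /* start a new .sas file every 100,000 calls */
    fname = cats('myfile_', ceil(i / 100000), '.sas');
    file out filevar=fname;   /* OUT is just a placeholder fileref */
    put '%mymacro';
  end;
run;
%macro run_chunks;
  %do j = 1 %to 10;
    %include "myfile_&j..sas";
  %end;
%mend run_chunks;
%run_chunks;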


Call sas program in a loop

I have a SAS program. I need to call it multiple times, each time passing a different date parameter.
Am I correct that I first need to wrap the entire .sas file into some kind of macro and then call that macro repeatedly? Or is there a way to do it without wrapping it in a macro?
In short: maybe, yes.
Maybe:
If you have a specific program you wish to launch each time with a certain parameter, that can be done from the command line. There is the SYSPARM option, whose value is available inside the program as the automatic macro variable &SYSPARM, like the following:
> <path>SASHome\SASFoundation\9.4\sas.exe -sysparm "21537"
which in SAS code is equivalent to:
%let sysparm = 21537;
This enables you to restrict or label data with your input as much as needed, and you can run your program as many times as you like with any parameter you wish. What we do is parse SYSPARM so that multiple parameters can be passed (see the sketch below).
For more on SYSPARM, see the documentation.
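As a rough sketch of that parsing idea (the | delimiter and the names run_date and region are made up for illustration), a call like sas myprog.sas -sysparm "20240101|EMEA" could be split inside the program with %scan:
%let run_date = %scan(&sysparm., 1, |);
%let region   = %scan(&sysparm., 2, |);
%put NOTE: run_date=&run_date. region=&region.;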
Yes:
If you want to run your code multiple times in a session, you ideally want something like:
%macro do_stuff(your_date);
  %put Processing date &your_date.;
  data data_&your_date.;
    set someLib.begin;
    /* keep the 20-day window leading up to the supplied date */
    if (&your_date. - 20) < data_date < &your_date.;
  run;
  /*And so forth....*/
%mend do_stuff;
%do_stuff(date_1);
%do_stuff(date_2);
%do_stuff(date_3);

Killing an entire sas process

I have developed a SAS process in Enterprise Guide 7.1 that sends e-mails daily (if need be).
The way it works is this:
[external program] generates a file which specifies who needs to be emailed and the subject matter.
My SAS process then looks like this:
1. Import this file.
2. Manipulate this file.
3. Generate emails based on the contents of the manipulated file.
The problem is, everything crashes if the original file imported in step 1 is empty. Is there a way to run the import, check whether the dataset is empty, and if it is, terminate the entire SAS process tree?
Thank you in advance, I've been searching but to no avail.
The best way would be to put steps 2 and 3 completely in a macro and only execute it when the step 1 dataset is not empty.
/* step 1: import the file into dataset mydata */
data _null_;
  if 0 then set mydata nobs=number;      /* nobs= is assigned at compile time,      */
  call symputx('mydata_count', number);  /* so this works even when mydata is empty */
  stop;
run;
%macro m;
  %if &mydata_count > 0 %then %do;
    /* step 2: manipulate this file */
    /* step 3: generate emails      */
  %end;
%mend;
%m;
As an alternative you could use the ENDSAS or ABORT statements, which both terminate your job and session, but they can have unwanted side effects; you can easily find these statements and more information about them by searching for them together with the keyword SAS.
Although the two statements do what you originally asked for, I would recommend the logical approach I posted first, because you have more control over what is happening that way, and you can avoid some bad side effects of working with those statements.
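A minimal sketch of that statement-based alternative, assuming the &mydata_count macro variable created in the answer above:
%macro stop_if_empty;
  %if &mydata_count. = 0 %then %do;
    %put NOTE: Input file is empty - ending this SAS session.;
    endsas;           /* terminates the whole SAS session           */
    /* %abort cancel;   softer alternative: cancels submitted code  */
  %end;
%mend stop_if_empty;
%stop_if_empty;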
IMO a better way is to start using a macro like %runquit. See my answer here: https://stackoverflow.com/a/31390442/214994
Basically, instead of using run; or quit; at the end of a step, you use %runquit;. If any errors occurred during that step, the rest of the SAS process is aborted. If running in batch, the entire process is killed. If running interactively, code execution stops, but your interactive session remains open.
EDIT: This assumes you get some kind of error message or warning if the file is empty.
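The linked answer has the full version; a minimal sketch of the idea looks roughly like this (the exact behavior of %abort differs between batch and interactive sessions, as noted above):
%macro runquit;
  ; run; quit;                    /* close whatever step is open          */
  %if &syserr. ne 0 %then %do;    /* did the last step end with an error? */
    %abort cancel;                /* stop executing the submitted code    */
  %end;
%mend runquit;
/* usage: end each step with %runquit; instead of run; or quit; */
data work.emails;
  set work.maybe_empty_import;    /* hypothetical dataset name */
%runquit;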

Why does my macro behave differently with call execute()?

Using SAS, I often want to perform an action on each row of a dataset. To do so, I use a command I found in a tutorial: call execute(). As I'm not very familiar with the SAS environment, I tend to use macro functions for anything I don't know how to do, and execute them with call execute().
However, I have difficulties understanding exactly how the macro language works in SAS. I know that all macro references are resolved first, which produces Base SAS code that is then executed (or am I already wrong?). But I don't understand how this applies with call execute().
Let's consider the following code:
%macro prog1; /* %prog1 defines a macrovariable mv1 */
  %global mv1;
  data _null_;
    call symputx("mv1","plop");
  run;
%mend;
%macro prog2(var); /* prog2 puts it on the log */
  %put PUT &var;
%mend;
%macro prog_glob; /* prog_glob executes prog1 then prog2 */
  %prog1;
  %prog2(&mv1);
%mend;
I know there is no need for three macros here but this is a minimal version of my real code, which has this structure.
Now if I execute prog_glob :
%prog_glob;
I get PUT plop on the log, as expected.
But if I use it with call execute() (even though there is no need for a loop here):
data _null_;
mac=%NRSTR("%prog_glob");
call execute(mac);
run;
I get only PUT.
There is no error suggesting that the macro variable is not defined, so the %global statement worked. But somehow prog2 was executed before the Base SAS part of prog1 (at least I think so), and mv1 is not defined yet.
My questions are:
Is my interpretation correct?
Why does the result change when I use call execute?
Depending on the answer to the previous question, how should I fix it, or is there a more convenient way to loop through a column's values?
EDIT: My original code intends to rename the variables of several tables, which I have listed in a dataset. For each listed table, I want the following algorithm executed:
prog1: store a list of all variables in a macro variable (this is where I define the mv1 equivalent)
prog2: add a common suffix to these variable names
There is probably a more clever way to do this. Again, I'm not so familiar with SAS and I tend to over-use macros. If you want to show me a better way to do this, I'd be happy to chat, but I don't expect you guys to rewrite all my code, so an alternative to call execute would be enough for me to be grateful! :)
Let's have a look at the documentation
http://support.sas.com/documentation/cdl/en/mcrolref/61885/HTML/default/viewer.htm#a000543697.htm
If an EXECUTE routine argument is a macro invocation or resolves to one, the macro executes immediately. However, any SAS statements produced by the EXECUTE routine do not execute until after the step boundary has been passed.
Note: Because macro references execute immediately and SAS statements do not execute until after a step boundary, you cannot use CALL EXECUTE to invoke a macro that contains references for macro variables that are created by CALL SYMPUT in that macro. See Interfaces with the Macro Facility, for an example.
This means that if you call it via call execute:
1. Macro statements are executed immediately. Those are:
1.1. First, in %prog1: %global mv1; - so mv1 is defined but empty, and there is no "Apparent symbolic reference not resolved" warning.
1.2. The SAS statements from %prog1 (the data _null_ with call symputx) are still deferred.
2. Now %prog2 - here the only macro statement is %PUT, which writes the (still empty) &mv1 variable. That is what you see in the log.
3. Now everything that executes immediately has been done. The data step which contains call execute ends.
4. The SAS statements deferred by call execute are now executed:
4.1. The data step from prog1 sets the value of mv1.
And that's all :-)
EDIT: Regarding your edit, try looking at this: http://support.sas.com/kb/48/674.html
data _null_;
mac='%nrstr(%prog_glob)';
call execute(mac);
run;
or, more plainly, as you would see in the documentation...
data _null_;
call execute('%nrstr(%prog_glob)');
run;
or
%let prog=%nrstr(%prog_glob);
data _null_;
mac="&prog.";
call execute(mac);
run;
or (and I wouldn't really recommend this one) you could also manually concatenate the macro quoting characters
data _null_;
mac=cats('01'x,'%prog_glob','02'x);
call execute(mac);
run;
The way you are running it, the macro statements get executed at run time and the generated data step executes after the calling data step completes. You're not using %NRSTR properly for this context, as described in the documentation. You need to pass the macro call, along with the quoting, as text to the call routine.

Dynamically call macro from sas data step

This code executes fine when Run as a SAS program:
%MyMacro(foo_val, bar_val, bat_val);
I have created a table using:
DATA analyses;
input title : $32. weight : $32. response : $32.;
datalines;
foo1 bar1 bat1
foo2 bar2 bat2
;
I want to execute MyMacro once for each row of the analyses table.
The following code appears to pass only the string values title, weight and response (rather than the data values foo1 etc.) to my macro (tested with calls to the %put command):
DATA _NULL_ ;
set analyses;
%MyMacro(title, weight, response);
RUN;
How can I invoke the macro once per record of the analyses table whilst passing data values as arguments to the macro? The intention is to actually run this for a very large number of analyses so the solution must scale appropriately to many more records in the analyses table.
This in part depends on what your macro is doing. If we assume that your macro is doing something that is intended to be run outside of a data step (ie, it's not just assigning a data step variable), then you have several options.
CALL EXECUTE
PROC SQL: SELECT INTO macro variable
Write macro calls into an %INCLUDE file
DOSUBL
CALL EXECUTE has already been explained, and is a good option for some cases. It has some downsides, however, particularly around macro timing, which require some extra care in certain cases - particularly when you are creating macro variables inside your macro. Quentin shows in his comments a way to get around this (adding %NRSTR to the call), but I prefer to use CALL EXECUTE only when there's an advantage over the other methods - particularly if I want to use SAS data step techniques (such as FIRST or LAST processing, or some form of looping) to build my macro calls, or when I have to do things in a data step anyway and can avoid the overhead of reading the file another time. If I'm just writing a data step like yours above - data something, set something, call execute, run - I wouldn't use it.
PROC SQL SELECT INTO is typically what I use for list processing (which is largely what this is). I like SQL's simplicity a bit better when doing things that aren't too complicated; for example, you can get just one version of each macro call easily with DISTINCT without having to explicitly write a proc sort nodupkey or use first/last processing. It also has the advantage for debugging that you can write all of your macro calls to your results window (if you don't add noprint), which is a bit easier to read than the log for me if I'm trying to see why my calls didn't get generated properly (and doesn't take any extra PUT statements).
proc sql;
  select cats('%MyMacro(', catx(',', arg1, arg2, arg3), ')')
    into :mvarlist separated by ' '
    from dataset;
quit;
&mvarlist.
That runs them quite simply, and has no timing issues (As you're just writing a bunch of macro calls out).
The main downside to this method is that you have a maximum of 64k characters in a macro variable, so if you're writing a huge number of these you'll run into that. In that case use CALL EXECUTE or %INCLUDE files.
%INCLUDE files are largely useful either as replacement for SELECT INTO when the call is over the character limit, or if you find it useful to have a text file to look at with your calls (if you're running this in batch mode for example, this could be easier to get to and/or parse than log or listing output). You just write your calls out to a file, and then %INCLUDE that file.
filename myfile temp; * or a real file if you want to look at it;
data _null_;
  set dataset;
  file myfile;
  length str $200;
  str = cats('%MyMacro(', catx(',', arg1, arg2, arg3), ')');
  put str;
run;
%include myfile;
I don't really use this much anymore, but it's a common technique used particularly by older SAS programmers so good to know.
DOSUBL is a relatively new method, and to some extent can be used to replace CALL EXECUTE, as its default behavior is typically closer to what you intuitively expect than CALL EXECUTE's. The doc page has the best example of how it behaves differently; basically, it fixes the timing issue by letting each separate call import and export macro variables from/to the calling environment, meaning that each iteration of DOSUBL runs at a distinct time, versus CALL EXECUTE where everything runs in one bunch and the macro environment is 'fixed' (ie, any reference to a macro variable is resolved when the calling data step runs, unless you escape it messily with %NRSTR).
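As a rough sketch of how DOSUBL could be applied to the analyses table from the question (this code is not from the original answers), each row's call runs immediately in its own side session:
data _null_;
  set analyses;
  /* each call executes right away in a side session, so macro variables */
  /* created inside %MyMacro resolve the way they would in open code     */
  rc = dosubl(cats('%MyMacro(', catx(',', title, weight, response), ')'));
run;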
One more thing worth mentioning is RUN_MACRO, part of the FCMP language. It lets you run a macro to completion and bring the resulting values back into the data step, which is an interesting option in some cases (for example, you could wrap a call around a PROC SQL step that selects a count of something, and then import that count into the dataset as a variable, all in one data step). It's applicable when you're calling a macro in order to assign a data step variable, rather than to run a process whose results don't need to come back into the data step, but it's worth considering if you do want that data back in the dataset that called the process.
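A hedged sketch of that RUN_MACRO pattern (the macro, function, and dataset names here are invented for illustration): a small macro counts the rows of a table with PROC SQL, and an FCMP function brings that count back into the calling data step.
%macro count_rows();                /* reads &table, fills in &n_rows */
  proc sql noprint;
    select count(*) into :n_rows trimmed from &table.;
  quit;
%mend count_rows;

proc fcmp outlib=work.funcs.demo;
  function row_count(table $);
    n_rows = .;                                   /* will receive the count           */
    rc = run_macro('count_rows', table, n_rows);  /* runs the macro in a side session */
    if rc = 0 then return(n_rows);
    return(.);
  endsub;
run;

options cmplib=work.funcs;
data counted;
  set table_list;                   /* hypothetical: one table name per row */
  n_obs = row_count(table);
run;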
You could use CALL EXECUTE:
data _null_;
  set analyses;
  call execute('%nrstr(%MyMacro('||strip(title)||','||strip(weight)||','||strip(response)||'))');
run;
You can put the variable values into macro variables and then call %MyMacro once per observation in your dataset, with those macro variables as arguments:
Data:
DATA analyses;
input title : $32. weight : $32. response : $32.;
datalines;
foo1 bar1 bat1
foo2 bar2 bat2
;
run;
Code to run the macro:
data _null_;
  set analyses end=fine;
  call symputx(cats("ARGUMENT", _N_), catx(",", title, weight, response));
  if fine then call symputx("NLOOPS", _N_);
run;
%*PUT &ARGUMENT1;
%*PUT &ARGUMENT2;
%MACRO MAIN;
  %DO L = 1 %TO &NLOOPS;
    %MyMacro(&&ARGUMENT&L);
  %END;
%MEND MAIN;
%MAIN;