I have a project that utilizes two separate EGP files for two different stages of data processing. In the first file, I calculate a specific number and store it in a macro variable. Is there a way to import that macro variable into the second file, similar to how you might import various objects with an .RDA file in R?
Thanks!
Does an Enterprise Guide project file even store values of macro variables?
Write the value of the macro variable into a dataset
data '/some directory/that SAS server can see/my_mvars.sas7bdat';
length mname $32 value $300 ;
mname='MYMACRO';
value=symget(mname);
output;
run;
or a text file
data _null_ ;
file '/some directory/that SAS server can see/my_mvars.sas';
put '%let MYMACRO=' "&MYMACRO" ;
run;
when the first project runs.
Then read it back in the second project from the dataset:
data _null_;
set '/some directory/that SAS server can see/my_mvars.sas7bdat';
call symputx(mname,value);
run;
or the text file
%include '/some directory/that SAS server can see/my_mvars.sas' / source2;
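If you need to hand over several macro variables at once, a variation on the same idea (just a sketch, assuming the variables are global, that their values fit in a single row of the view, and that they share a prefix such as MY_, which is purely an example) is to dump them from the SASHELP.VMACRO view and restore them the same way:
/* first project: save every global macro variable whose name starts with MY_ (a hypothetical prefix);
   note that values longer than 200 characters span several rows of SASHELP.VMACRO, which this sketch ignores */
data '/some directory/that SAS server can see/my_mvars.sas7bdat';
  set sashelp.vmacro;
  where scope='GLOBAL' and substr(name,1,3)='MY_';
  keep name value;
run;
/* second project: restore them into the global symbol table */
data _null_;
  set '/some directory/that SAS server can see/my_mvars.sas7bdat';
  call symputx(name, value, 'G');
run;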
How does one input all variables/columns within a data step using INPUT but without naming every variable? This can be done by naming each variable, for example:
DATA dataset;
INFILE '/folders/myfolders/file.txt';
INPUT variable1 variable2 variable3 variable4 $ variable5;
RUN;
However, this is very tedious for large datasets containing 200+ variables.
The original question implied that you already had a SAS data set. In that case all variables are automatically included when you SET the dataset.
data copy ;
set '/folders/myfolders/file.sas7bdat';
run;
Or just reference it in the analysis you want to do.
proc means data='/folders/myfolders/file.sas7bdat';
run;
If you actually have a TEXT file and you want to read it into a SAS dataset you could use PROC IMPORT to guess what is in the file. If it has a header row then proc import will try to convert those into valid variable names. It will also try to guess how to define the variables based on what values it sees in the text file.
proc import out=want datafile='/folders/myfolders/file.txt' dbms=dlm ;
delimiter=',';
run;
Or if the issue is that it is too hard to create 200 unique variable names, you could just use a variable list with numeric suffixes to save a lot of typing.
DATA dataset;
INFILE '/folders/myfolders/file.txt' dsd ;
length var1-var200 $20 ;
input var1-var200 ;
RUN;
I have a list of ~100 files. The first file contains header information for the other 98 data files. The information should be in table format; however, each table is a different size (in terms of column and row counts).
My goal is to import these files such that the column headers from the first file are correctly assigned.
Additional information:
I am told this list of files was generated using SAS (however, I am not familiar with the file format). Furthermore, the "CIMPORT" command does not work on these files.
The files are "|" delimited.
Thank you very much for any help.
This was a fun issue. I came up with the following way:
First, let's load up some data.
proc import datafile = "\\Datadrive\mydata.csv"
out=w_headers dbms=dlm replace;
delimiter=";";
guessingrows=32767;
run;
proc import datafile = "\\Datadrive\no_headers.csv"
out=no_headers dbms=dlm replace;
delimiter=";";
guessingrows=32767;
run;
Then I extract the column names and variable numbers to a dataset.
proc contents data=w_headers out=meta(keep=NAME VARNUM) noprint ; run ;
Then I create commands to rename the unnamed columns (VAR1, VAR2, ...) to the proper names taken from the existing ones.
data meta;
set meta;
cmd = cats('VAR',VARNUM,'=', name);
run;
Here comes the kicker: I put the commands into a macro variable. Then that variable is fed to PROC DATASETS to rename the columns.
proc sql noprint;
select cmd into :cmd_list separated by ' ' from meta;
quit;
proc datasets library = work nolist;
modify no_headers;
rename &cmd_list;
quit;
At this point my two datasets have identical column names. The method is a bit tricky, but it works. I'm sure there is another way, but this was a fun one. :)
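If you want to sanity-check the generated rename list before trusting the result (a small optional step, using the same dataset names as above), you can print the macro variable to the log and list the new variable names:
%put NOTE: rename list = &cmd_list;
proc contents data=no_headers short;
run;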
I created a stored process but I want to export the output to Excel. My usual export statement doesn't work in the stored process.
%let _ODSDEST=none;
%STPBEGIN();
data x;
set sashelp.class;
run;
proc export data=x outfile = "//my documents/sp_test.xlsx" dbms=xlsx replace;
sheet="table1";
run;
* Begin EG generated code (do not edit this line);
;*';*";*/;quit;
%STPEND;
Is there a way to get this to work in the stored process?
One way to have a stored process return an Excel file (actually, in this case it's an XML file that Excel will happily open) is to use ODS to output tagsets.excelxp (XML).
When you do this, you can use stpsrv_header to modify the HTML header. The first statement tells the browser to open the file with Excel; the second tells it the file name. I believe for this header modification to work the stored process needs to deliver streaming results, not package results. But I could be wrong.
When I run the code below, I get a file download dialog box from the browser, allowing me to open or save the file. I'm running from the Stored Process Web App, but it should work fine when called from the Information Delivery Portal.
%let _odsdest=tagsets.excelxp;
%let rc=%sysfunc(stpsrv_header(Content-type,application/vnd.ms-excel));
%let rc=%sysfunc(stpsrv_header(Content-disposition,attachment%str(;) filename=MyExcelFile.xls));
%stpbegin()
proc print data=sashelp.shoes (obs=&obs);
run;
%stpend()
Did you check the spelling of proc export and your outfile path, e.g. outfile='mypath/my documents/myoutpt.xlsx' dbms=xlsx or outfile='mypath/my documents/myoutpt.xls' dbms=xls? You can also try ODS.
You can also try setting your STP up as a streaming web service, removing the %STPBEGIN and %STPEND macros, and sending to _webout using this macro: https://core.sasjs.io/mp__streamfile_8sas.html
The benefit of this is that your code will subsequently work on Viya as well.
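For reference, a minimal streaming sketch without that macro might look like the following (this assumes SAS 9.4M3 or later so ODS EXCEL is available, a stored process set to stream its results, and that the EG-generated %STPBEGIN/%STPEND calls have been removed):
/* tell the browser it is receiving an .xlsx attachment */
%let rc=%sysfunc(stpsrv_header(Content-type,application/vnd.openxmlformats-officedocument.spreadsheetml.sheet));
%let rc=%sysfunc(stpsrv_header(Content-disposition,attachment%str(;) filename=sp_test.xlsx));
ods _all_ close;
ods excel file=_webout options(sheet_name="table1");
data x;
  set sashelp.class;
run;
proc print data=x;
run;
ods excel close;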
I found a macro and have been using it to import datasets that are given to me in CSV format. Now I need to edit it, because the files have an ID number in their names and I want SAS datasets with the same names.
The CSVs are named things like IDSTUDY233_first.csv, so I want the SAS dataset to be IDSTUDY233_first. It should appear in my WORK library.
I thought it would just create a SAS dataset for each CSV, named IDSTUDY233_first or something like that (and so on for each additional study). However, it names them this way:
IDSTUDY_FIRST
and overwrites itself for every ID. I am newer to macros and have been trying to figure out WHY it does this and how to fix it. Suggestions?
%let subdir=Y:\filepath\; *MACRO VARIABLE FOR FILEPATH;
filename dir "&subdir.*.csv"; *fileref DIR points at every csv file in that path;
data new; *create dataset NEW holding the name of each csv file;
length filename fname $ 200;
infile dir eof=last filename=fname;
input ;
last: filename=fname;
run;
proc sort data=new nodupkey; *sort but don't keep duplicate files;
by filename;
run;
data _null_; *data _null_ runs the step without creating a dataset;
set new;
call symputx(cats('filename',_n_),filename); *store this file's name in macro variable filename1, filename2, ...;
call symputx(cats('dsn',_n_),compress(scan(filename,-2,'\.'), ,'ka')); *derive the dataset name for this file from its file name;
call symputx('nobs',_n_); *save the number of files;
run;
%put &nobs.; *write the number of files to the log;
%macro import; *start the macro import;
%do i=1 %to &nobs; *do for each file, up to the number of observations;
proc import datafile="&&filename&i" out=&&dsn&i dbms=csv replace;
getnames=yes;
run;
%end;
%mend import;
%import
*call import macro;
As you can see, I added comments reflecting my understanding. Like I said, macros are new to me, so I may be incorrect. I am guessing the problem is either in
call symputx(cats('dsn',_n_),compress(scan(filename,-2,'\.'), ,'ka'));
or in the import statement, probably out=&&dsn&i, since it rapidly overwrites the previous SAS dataset until it has done every one. I need all the SAS datasets, not just the last one.
My guess is that you are right, it is to do with this line:
call symputx(cats('dsn',_n_),compress(scan(filename,-2,'\.'), ,'ka'));
The gotcha is in the arguments passed to compress. Compress can be used to remove or keep certain characters in a string. In the above example, they are using it to just keep alphabetic characters by passing in the 'ka' modifiers. This is effectively causing files with different names (because they have different numbers) to be treated as the same file.
You can modify this behaviour to keep alphabetic characters, digits, and the underscore character by changing the parameters from ka to kn.
This change does mean that you also need to make sure that none of your file names begin with a number (as SAS dataset names can't begin with a number).
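Concretely, the DATA _NULL_ step from the question would become the following (only the 'ka' to 'kn' change; everything else is unchanged):
data _null_;
  set new;
  call symputx(cats('filename',_n_),filename);
  /* 'kn' keeps letters, digits, and underscores, so IDSTUDY233_first survives intact */
  call symputx(cats('dsn',_n_),compress(scan(filename,-2,'\.'), ,'kn'));
  call symputx('nobs',_n_);
run;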
The documentation for the compress function is here:
http://support.sas.com/documentation/cdl/en/lrdict/64316/HTML/default/viewer.htm#a000212246.htm
An easy way to debug this would be to take the dataset with all of the call symput statements, and in addition to storing these values in macro variables, write them to variables in the dataset. Also change it from a data _null_ to a data tmp statement. You can then see for each file what the destination table name will be.
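A rough sketch of that debugging step (tmp is just a throwaway dataset name used for illustration):
data tmp; /* keep the output so it can be inspected */
  set new;
  length dsn $32;
  dsn = compress(scan(filename,-2,'\.'), ,'kn'); /* the derived table name, side by side with the file name */
  call symputx(cats('filename',_n_),filename);
  call symputx(cats('dsn',_n_),dsn);
  call symputx('nobs',_n_);
run;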
I'm trying to use a macro that retrieves a single value from a CSV file. I've written a macro that works perfectly fine if there is only one CSV file, but it does not deliver the expected results when I have to run it against more than one file: it returns the value of the last file on every iteration.
%macro reporting_import( full_file_route );
%PUT The Source file route is: &full_file_route;
%PUT ##############################################################;
PROC IMPORT datafile = "&full_file_route"
out = file_indicator_tmp
dbms = csv
replace;
datarow = 3;
RUN;
data file_indicator_tmp (KEEP= lbl);
set file_indicator_tmp;
if _N_ = 1;
lbl = "_410 - ACCOUNTS"n;
run;
proc sql noprint ;
select lbl
into :file_indicator
from file_indicator_tmp;
quit;
%PUT The Source Reporting period states: &file_indicator;
%PUT ##############################################################;
%mend;
This is where I execute the macro. Each file's full path exists as a separate record in a dataset called "HELPERS.RAW_WAITLIST".
data _NULL_;
set HELPERS.RAW_WAITLIST;
call execute('%reporting_import('||filename||')');
run;
In the example I just ran, one file contains 01-JUN-2015 and the other 02-JUN-2015, but what the code returns in the LOG is:
The Source file route is: <route...>\FOO1.csv
##############################################################
The Source Reporting period states: Reporting Date:02-JUN-2015
##############################################################
The Source file route is: <route...>\FOO2.csv
##############################################################
The Source Reporting period states: Reporting Date:02-JUN-2015
##############################################################
Does anybody understand why this is happening? Or is there perhaps a better way to solve this?
UPDATE:
If I remove the code from the macro and run it manually for each input file, it works perfectly. So it must have something to do with the macro overwriting values.
CALL EXECUTE has tricky timing issues. When it invokes a macro, if that macro generates macro variables from data set variables, it's a good idea to wrap the macro call in %NRSTR(). That way CALL EXECUTE generates the macro call, but the macro doesn't actually execute until after the DATA step has finished. So try changing your call execute statement to:
call execute('%nrstr(%%)reporting_import('||filename||')');
I posted a much longer explanation here.
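An equivalent way to write it, masking the whole macro call rather than just the percent sign, is:
data _null_;
  set HELPERS.RAW_WAITLIST;
  /* the %NRSTR-masked call is only executed after this step has finished */
  call execute('%nrstr(%reporting_import('||filename||'))');
run;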
I'm not too clear on the connections between your files. But instead of importing the CSV files and then searching for your string, couldn't you use a pipe command to save the results of a grep search on your CSV files to a dataset and then just read in the results?
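A rough sketch of that pipe idea (assuming a Unix SAS server where grep is available, that each CSV contains a single line with the reporting date, and with the directory path left as a placeholder):
filename grepout pipe "grep -H 'Reporting Date' /some/route/*.csv";
data file_indicators;
  infile grepout truncover;
  length filename $256 lbl $200;
  input;
  filename = scan(_infile_, 1, ':'); /* grep -H prefixes each match with the file name */
  lbl = substr(_infile_, index(_infile_, 'Reporting Date'));
run;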
Update:
I tried replicating your issue locally and it works for me if I set file_indicator with a call symput as below instead of your into :file_indicator:
data file_indicator_tmp (KEEP= lbl);
set file_indicator_tmp;
if _N_ = 1;
lbl = "_410 - ACCOUNTS"n;
run;
data _null_ ;
set file_indicator_tmp ;
if _n_=1 then call symput('file_indicator',lbl) ;
run;