I want to get a few details from my server, like the last reboot time, using a batch file.
I am using the command
systeminfo | findstr "Time:"
to get the last boot time of the server.
Now my issue is that I want to compare that boot time against the current computer date and time and show "Success" if the server rebooted less than 15 days ago, or "Alert" if it has not rebooted within the last 15 days.
There are ways to convert an MM/DD/YYYY date to a Julian day number. If you do that, you can subtract the boot date from today's date to find the difference in days.
http://www.dostips.com/DtTipsDateTime.php
Adapting the Date2JDate formula, I was able to get the date the server booted and today's date, and calculate the elapsed days:
@echo off
SETLOCAL
:: Grab the boot date from the "Statistics since MM/DD/YYYY hh:mm:ss" line and today's date
for /f "tokens=3,* delims= " %%i in ('net statistics server ^| find /i "Statistics since"') do set _boot=%%i
for /f "tokens=2,* delims= " %%i in ('date /t') do set _now=%%i
:: Convert both dates to Julian day numbers and take the difference
call :Date2JDate %_boot% _bootJD
call :Date2JDate %_now% _nowJD
set /a _Elapsed=%_nowJD%-%_bootJD%
if %_Elapsed% LSS 15 (echo Success) else (echo Alert)
ENDLOCAL
goto TheEnd
:: Date2JDate: convert an MM/DD/YYYY date (%1) to a Julian day number stored in the variable named by %2
:Date2JDate
SETLOCAL
for /f "tokens=1,2,3 delims=/" %%i in ('echo %1') do (set mm=%%i&set dd=%%j&set yy=%%k)
if %yy% LSS 100 set /a yy+=2000
set /a JD=dd-32075+1461*(yy+4800+(mm-14)/12)/4+367*(mm-2-(mm-14)/12*12)/12-3*((yy+4900+(mm-14)/12)/100)/4
ENDLOCAL & SET %~2=%JD%
goto TheEnd
:TheEnd
Of course this assumes that the date format of the server is MM/DD/YYYY.
I'm migrating some SAS software from a Unix server to a Linux server.
Currently, in a SAS program, I have the following instruction:
Filename myname pipe "ls -le &mypath." ;
(the fileref myname is then used in a data step as InFile myname truncover end=fine;).
The -e option of ls on Unix produces a list of files where the year is always printed, even for recently created files.
For example, compare:
myserver.myuser:/mypath> ls -l myfile
-rwxr-xr-x 1 auser auser 24422893965 Nov 5 06:17 myfile
with:
myserver.myuser:/mypath> ls -le myfile
-rwxr-xr-x 1 auser auser 24422893965 Nov 5 06:17:27 2021 myfile
On Linux, the -e option of ls does not exist, but you can get the same result with the command:
ls -l --time-style="+%b %d %T %Y"
The problem is how to use this command, which contains special characters like " and %, in the SAS instruction.
I tried:
Filename itt pipe "ls -l --time-style='+%%b %%d %%T %%Y' &mypath." ;
but I get:
total 1420
-rwxrwxr-x 1 myuser mygroup 1450000 052130864ov %d %T %Y myfile
I tried:
Filename itt pipe "ls -l --time-style='+%str(%)b %str(%)d %str(%)T %str(%)Y' &mypath." ;
but I get
total 1420
-rwxrwxr-x 1 myuser mygroup 1450000 )b )d )T )Y myfile
Is there a way to make that command work?
Any alternative solution to get the same result is welcome.
It is much easier to deal with strings that include macro trigger characters, & and %, in SAS code than it is in macro code.
If you really want to define the fileref MYNAME, you could use the FILENAME() function call in a data step. Or use the QUOTE() function to generate a macro variable that has single quotes on the outside to protect the macro triggers, which you could then use in your FILENAME statement.
data _null_;
length cmd $300;
cmd = catx(' ','ls -l --time-style="+%b %d %T %Y"',"&mypath");
call symputx('lscmd',quote(trim(cmd),"'"));
run;
filename myname pipe &lscmd;
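For the first option, a minimal sketch of the FILENAME() function call inside a data step might look like this (it reuses the same &mypath macro variable and ls command as above, and passes 'pipe' as the device type):
data _null_;
  length cmd $300;
  /* single quotes keep the % signs away from the macro processor */
  cmd = catx(' ','ls -l --time-style="+%b %d %T %Y"',"&mypath");
  /* assign the fileref directly; the third argument is the device type */
  rc = filename('myname', cmd, 'pipe');
  if rc ne 0 then putlog 'WARNING: fileref not assigned, rc=' rc;
run;
The fileref myname can then be used in your existing infile myname truncover end=fine; statement unchanged.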
But why not just build the ls command as part of the data step that reads the results? You could also use a style for the datetime value that is easier for SAS to read, such as something the DATETIME informat understands.
data ls_output;
length cmd $300;
cmd = catx(' ','ls -l --time-style="+%d%b%Y:%T"',"&mypath");
infile lscmd pipe filevar=cmd truncover ;
input mode :$11. links owner :$20. group :$20. size lastmod :datetime. file $256.;
format lastmod datetime19.;
run;
Use %nrstr. It keeps the % signs from being treated as macro triggers at compile time.
Filename itt pipe "ls -l --time-style='%nrstr(+%b %d %T %Y)' &mypath.";
I have a folder which has some tables. When I open it, it shows the table name, date modified, type and size.
I am trying to read all of that information (table name, date modified, type and size) using SAS, so I tried a pipe first:
filename tbl pipe "dir /abc/sales";
data new;
infile tbl pad;
input all $500.;
run;
The result only has the table name, but no date modified, type or size.
So I'm just wondering how to fix it.
An example folder 'sales' below:
table name size date modified type
sales1 490k 10/28/2020 9:32:50 am sas7bdat
sales2 85k 11/12/2020 4:28:23 pm sas7bdat
sales3 307k 12/17/2020 1:55:09 pm sas7bdat
From your path it looks like SAS is running on Unix. I'm not sure what the dir command does on your flavor of Unix, but ls -l should get the file details on any flavor of Unix.
data new;
infile "ls -l /abc/sales/" pipe truncover ;
input all $500.;
run;
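If you also need the date modified, size and file type as separate variables rather than one long line, here is a rough sketch of parsing the ls -l columns (it assumes GNU ls on Linux, so that --time-style is available, and file names without embedded spaces):
data sales_files;
  length file_name $256;
  /* ask ls for a datetime layout that the DATETIME informat can read */
  infile 'ls -l --time-style="+%d%b%Y:%T" /abc/sales' pipe truncover;
  input @;                               /* hold the record so it can be inspected first */
  if _infile_ =: 'total' then delete;    /* drop the "total NNN" summary line */
  input mode :$11. links owner :$20. group :$20. size lastmod :datetime. file_name $;
  format lastmod datetime19.;
run;
The file type can then be derived from the name, for example with scan(file_name,-1,'.').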
I created and ran a stored procedure in Redshift successfully, but it is not working as expected.
For example, I'd like to delete data in the period set by the arguments.
-- Stored Procedure
CREATE OR REPLACE PROCEDURE sp_test(parm0 varchar(100), parm1 date, parm2 date)
AS '
BEGIN
EXECUTE
$_$ DELETE FROM test_table_b
WHERE $_$|| parm0 ||$_$
between $_$|| parm1 ||$_$ and $_$|| parm2 ||$_$ $_$;
end;
' language plpgsql;
-- Run Stored procedure
Begin;
Call sp_test('opsdt', '2021-01-16', '2021-01-17');
Commit;
-- Result
BEGIN executed successfully
Execution time: 0.07s
Statement 1 of 3 finished
0 rows affected
Call executed successfully
Execution time: 0.18s
Statement 2 of 3 finished
COMMIT executed successfully
Execution time: 0.13s
Statement 3 of 3 finished
Script execution finished
Total script execution time: 0.38s
The script ran successfully, but the records for '2021-01-16' and '2021-01-17' still remain in the table.
Any advice would be appreciated. Thanks in advance.
Thanks to John Rotenstein, I can now run the stored procedure as expected.
Here is a simple example for anyone who has the same issue.
-- Revised Procedure
CREATE OR REPLACE PROCEDURE sp_del_test(tbl_name varchar(50), col_name varchar(50), start_dt date, end_dt date)
AS $PROC$
DECLARE
sql VARCHAR(MAX) := '';
BEGIN
sql := 'DELETE FROM ' || tbl_name || ' WHERE ' || col_name || ' BETWEEN ''' || start_dt || ''' AND ''' || end_dt || '''';
RAISE INFO '%', sql;
EXECUTE sql;
END;
$PROC$ language plpgsql;
-- Executed Commands
Begin;
Call sp_del_test('test_table_b', 'opsdt', '2021-01-23', '2021-01-24');
Commit;
-- Return Message
BEGIN executed successfully
Execution time: 0.05s
Statement 1 of 3 finished
Warnings:
DELETE FROM test_table_b WHERE opsdt BETWEEN '2021-01-23' AND '2021-01-24'
0 rows affected
Call executed successfully
Execution time: 0.2s
Statement 2 of 3 finished
COMMIT executed successfully
Execution time: 0.12s
Statement 3 of 3 finished
Script execution finished
Total script execution time: 0.38s
I am trying to use SAS to read multiple files from a directory that were created before a certain date.
I have used the code below to read all the files, and it works perfectly. Now I have found out that I only need the files that were created before a certain date. I think that could be done either with FILENAME PIPE dir options or with INFILE statement options, but I cannot find the answer.
code source:
http://support.sas.com/kb/41/880.html
filename DIRLIST pipe 'dir "C:\_today\file*.csv" /b ';
data dirlist ;
infile dirlist lrecl=200 truncover;
input file_name $100.;
run;
data _null_;
set dirlist end=end;
count+1;
call symputx('read'||put(count,4.-l),cats('c:\_today\',file_name));
call symputx('dset'||put(count,4.-l),scan(file_name,1,'.'));
if end then call symputx('max',count);
run;
options mprint symbolgen;
%macro readin;
%do i=1 %to &max;
data &&dset&i;
infile "&&read&i" lrecl=1000 truncover dsd;
input var1 $ var2 $ var3 $;
run;
%end;
%mend readin;
%readin;
Currently you are reading in just the file names using the dir command. The existing /b switch says to print just the file name and nothing else. You want to change it so that it returns both the file name and the created date of each file, which gets a little messy. You will need to change that pipe command from:
filename DIRLIST pipe 'dir "C:\_today\file*.csv" /b ';
...to this... :
filename DIRLIST pipe 'dir "C:\_today\file*.csv" /tc ';
The output will change from something like this:
file1.csv
file2.csv
...
...to something like this... :
Volume in drive C has no label.
Volume Serial Number is 90ED-A122
Directory of C:\_today
01/13/2017 09:14 AM 1,991 file1.csv
01/11/2017 11:43 AM 169 file2.csv
...
...
...
01/11/2017 11:43 AM 169 file99.csv
99 File(s) 6,449 bytes
0 Dir(s) 57,999,806,464 bytes free
So you will then need to modify your data step that creates dirlist to clean up the results returned by the new dir statement. You will need to ignore the header and footer and read in the date and time etc. Once you have that date and time in the appropriate SAS format, you can then just use a SAS where clause to keep the rows you are interested in. I will leave this as an exercise for you to do. If you have trouble with it you can always open a new question.
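If you get stuck, a rough sketch of that cleanup step could look like the one below; it assumes the MM/DD/YYYY layout shown above, uses a hypothetical cutoff date of 01JAN2017, and only grabs the first word of each file name:
data dirlist;
  infile dirlist lrecl=200 truncover;
  length file_name $100;
  input @;                                /* hold the record so it can be inspected first */
  /* detail lines start with a date such as 01/13/2017; anything else is header or footer */
  created = input(scan(_infile_,1,' '), ?? mmddyy10.);
  if missing(created) then delete;
  file_name = scan(_infile_,5,' ');       /* fifth token on a detail line is the file name */
  format created mmddyy10.;
  if created < '01jan2017'd;              /* keep only files created before the cutoff */
run;
Because this still produces a file_name variable, the data _null_ step and the %readin macro from the question can stay exactly as they are.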
If you need more information on the dir command, you can open up a command prompt (Start Menu->Run->"cmd"), and then type in dir /? to see a list of available switches for the dir command. You may find a slightly different combination of switches for it that better suits your task than what I listed above.
You can use PowerShell to leverage the features of the operating system.
filename get_them pipe
" powershell -command
""
dir c:\temp
| where {$_.LastWriteTime -gt '3/19/2019'}
| select -property name
| ft -hidetableheader
""
";
data _null_;
infile get_them;
input;
putlog _infile_;
run;
This is my first time asking a question, so my apologies if I skipped over some of the basics before posting.
Basically my question is fairly simple: I have a file that gets written to very often, and the first string/column always has the word "CLEAR" or "CRITICAL", sometimes "WARNING", but I want to ignore those entries.
Around the 17th column there is a specific 32-character alphanumeric # that accompanies each entry. I'm trying to find a way, without modifying the original file, to write out just the 1st column and the 32-character alphanumeric # into a new file for starters. Unfortunately the 32-character # is not always in column 17, or else I could do this on my own.
Here is a glance at a portion of the log file I'm referring to. Please don't bash me too hard on my ignorance if my question is not detailed enough or has already been answered before.
CLEAR ; lnx20162.csxt.csx.com ; Database Instance ; actd ; Dec 14,
2012 4:46:31 PM EST ; D0C53D1FB19075C2E0405C0A6FF002BF ; Metric Alert
; Response:State ; The database status is OPEN.
CRITICAL ; lnx20016.csxt.csx.com ; Database Instance ; GISP_GISP2 ;
Dec 14, 2012 4:39:54 PM EST ; D0C53D32C0E53F85E0405C0A6FF002C9 ;
Metric Alert ; alertLog:genericErrStack ; ORA-error stack (4,031)
logged in
/oramisc01/oracle/diag/rdbms/gisp/GISP2/trace/alert_GISP2.log.
CRITICAL ; lnx20016.csxt.csx.com ; Database Instance ; GISP_GISP2 ;
Dec 14, 2012 4:40:00 PM EST ; D0C53D32C1093F85E0405C0A6FF002C9 ;
Metric Alert ; alertLog:genericErrStack ; ORA-error stack (04031,
04031) logged in
/oramisc01/oracle/diag/rdbms/gisp/GISP2/trace/alert_GISP2.log.
CRITICAL ; lnx20016.csxt.csx.com ; Database Instance ; GISP_GISP2 ;
Dec 14, 2012 4:39:55 PM EST ; D0C53D32C0EB3F85E0405C0A6FF002C9 ;
Metric Alert ; alertLog:genericErrStack ; ORA-error stack (04031,
04031, 04031, 04031, 04031) logged in
/oramisc01/oracle/diag/rdbms/gisp/GISP2/trace/alert_GISP2.log.
grep -E -o "EST ;.{0,33}" file1 | cut -d ";" -f2 > outputfile
You need to find a consistent "hook", which here is "EST ;".
If you want this to run all the time, say every minute, put the command in a script and add it to crontab.