SAS parameter logconfigloc and conventional log output

I have been tasked with logging user access to datasets in certain libraries.
To solve this I use the SAS Audit logger, which already provides the desired output.
To get that output, I use the start-up parameter logconfigloc with the following XML file:
<?xml version="1.0" encoding="UTF-8"?>
<logging:configuration xmlns:logging="http://www.sas.com/xml/logging/1.0/">
<!-- Log file appender with immediate flush set to true -->
<appender name="AuditLog" class="FileAppender">
<param name="File" value="logconfig.xml.win.audit.file.xml.log"/>
<param name="ImmediateFlush" value="true" />
<filter class="StringMatchFilter">
<param name="StringToMatch" value="WORK"/>
<param name="AcceptOnMatch" value="false"/>
</filter>
<filter class="StringMatchFilter">
<param name="StringToMatch" value="Libref"/>
<param name="AcceptOnMatch" value="true"/>
</filter>
<!-- The DenyAllFilter filters all events not fullfilling the criteria of at least one filters before -->
<filter class="DenyAllFilter">
</filter>
<layout>
<param name="ConversionPattern"
value="%d - %u - %m"/>
</layout>
</appender>
<!-- Audit message logger -->
<logger name="Audit" additivity="false">
<appender-ref ref="AuditLog"/>
<level value="trace"/>
</logger>
<!-- root logger events not enabled -->
<root>
</root>
</logging:configuration>
My problem is that when I use the logconfigloc parameter, the log parameter no longer works, so I get no "conventional" SAS log.
I already tried enabling the root logger, but its output only looks similar to the original log files and has some differences.
Is there an (easy) way to get the "conventional" SAS log in addition to the aforementioned special access-logging output?
Kind Regards,
MiKe

I found the answer to the question of how to obtain the conventional log.
For this purpose the SAS logger named "App" with message level "info" can be used.
So the following XML does the trick:
<?xml version="1.0" encoding="UTF-8"?>
<logging:configuration xmlns:logging="http://www.sas.com/xml/logging/1.0/">
<appender name="AppLog" class="FileAppender">
<param name="File" value="D:\Jobs_MK\SAS_Logging_Facility\Advanced_Logging_Test_with_XML\logconfig_standard_log.log"/>
<param name="ImmediateFlush" value="true" />
<layout>
<param name="ConversionPattern"
value="%m"/>
</layout>
</appender>
<!-- Application message logger -->
<logger name="App" additivity="false">
<appender-ref ref="AppLog"/>
<level value="info"/>
</logger>
<!-- root logger events not enabled -->
<root>
</root>
</logging:configuration>
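For completeness, here is a sketch of how such a configuration file is passed at start-up via the logconfigloc option (the paths below are placeholders, not the ones from my environment):

"C:\Program Files\SASHome\SASFoundation\9.4\sas.exe" -sysin "D:\jobs\my_job.sas" -logconfigloc "D:\config\logconfig_standard_log.xml"

As described in the question, the conventional -log parameter stops working once logconfigloc is in effect, which is why the "App" logger above has to take over that role.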

Related

WSO2 MI Validators and Array query parameters for a Data Service

I don't know how to create a validator for two variables:
Q1. A user code that needs to be checked for existence before running the SQL query. I have an endpoint to check whether the user code exists; how do I turn that into a validator?
Q2. An array of dates; the best I could do was to use a pattern validator. I know there is also the possibility of setting a parameter with paramType="array" and sqlType="date", but how can I configure this in the query parameters of the resource?
Please check the Data Service definition below:
<data name="APIDataService" serviceNamespace="" serviceGroup="" transports="http https local">
<description />
<config id="postgresDataService">
<property name="carbon_datasource_name">APIPostgres</property>
</config>
<!-- TODO: Q1. Here is the resource I want to use to validate the user code -->
<query id="validateUserCode" useConfig="postgresDataService">
<sql>
SELECT ( EXISTS ( SELECT 1
FROM user
WHERE code ILIKE :user_code ) )::INT AS does_user_exist
</sql>
<param name="user_code" sqlType="string" />
<result element="result">
<element column="does_user_exist" name="doesUserExist" xsdType="integer" />
</result>
</query>
<resource method="GET" path="validateUserCode">
<call-query href="validateUserCode">
<with-param name="user_code" query-param="user_code" />
</call-query>
</resource>
<query id="apiReportUserTotalEventsByDay" useConfig="postgresDataService">
<sql>
SELECT
calendar_date,
record_date,
user_code,
event_type_id,
event_count
FROM user_event
WHERE 1=1
AND calendar_date BETWEEN :calendar_date_start AND :calendar_date_end
AND record_date = ANY( ('{'||:record_date_array||'}')::DATE[] )
AND user_code = :user_code
</sql>
<!-- TODO: Q1. Here is the user code I want to validate -->
<param name="user_code" paramType="scalar" sqlType="string" />
<param name="calendar_date_start" paramType="scalar" sqlType="date" />
<param name="calendar_date_end" paramType="scalar" sqlType="date" />
<!-- TODO: Q2. Here is the array of dates that I would like to know
how to convert into a paramType="array" sqlType="date" parameter -->
<param name="record_date_array" paramType="scalar" sqlType="string" >
<validatePattern pattern="\s*\d{4}-\d{2}-\d{2}\s*(,\s*\d{4}-\d{2}-\d{2}\s*)*" />
</param>
<result element="result">
<element column="calendar_date" name="calendar_date" xsdType="string" />
<element column="record_date" name="record_date" xsdType="string" />
<element column="user_code" name="user_code" xsdType="string" />
<element column="event_type_id" name="event_type_id" xsdType="integer" />
<element column="event_count" name="event_count" xsdType="integer" />
</result>
</query>
<resource method="GET" path="apiReportUserTotalEventsByDay">
<call-query href="apiReportUserTotalEventsByDay">
<with-param name="user_code" query-param="user_code" />
<with-param name="calendar_date_start" query-param="calendar_date_start" />
<with-param name="calendar_date_end" query-param="calendar_date_end" />
<with-param name="record_date_array" query-param="record_date_array" />
</call-query>
</resource>
</data>
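For context on Q2: the pattern-validated string above is expanded into a PostgreSQL DATE[] by the cast in the query. That cast can be checked in isolation with plain SQL (the dates here are just example values):

-- A comma-separated list wrapped in braces casts to a PostgreSQL DATE[]
SELECT ('{' || '2024-01-01, 2024-01-02' || '}')::DATE[] AS record_dates;
-- => {2024-01-01,2024-01-02}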

How to mask the userName and password in requestBody logs

Below is my request body XML; I am making a REST call with this request. I have a custom LoggingInterceptor to log the request and response, and I want to mask the user and password in the logs.
<login><credentials user="user" Password="pass"/></login>
private void traceRequest(final HttpRequest request, final byte[] body) throws IOException {
    logger.trace(
        String.format(
            "REQUEST uri=%s, method=%s, requestBody=%s",
            request.getURI(),
            request.getMethod(),
            new String(body, "UTF-8")));
}
Currently my logs are printed like this:
LoggingRequestInterceptor - REQUEST uri=http://localhost:8080/, method=POST, requestBody=<login><credentials user="user" Password="pass"/></login>
Below is my logback.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true" scanPeriod="30 seconds">
<property name="logFile" value="logs/employee.log" />
<property name="logFile-WS" value="logs/employee-ws.log" />
<appender name="employee" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${logFile}</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>${logFile}.%d{yyyy-MM-dd}.gz</fileNamePattern>
<maxHistory>30</maxHistory>
</rollingPolicy>
<encoder>
<pattern>%d [%thread] %-5level %logger{64} - %msg%n</pattern>
</encoder>
</appender>
<appender name="mainAppender" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${logFile-WS}</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>${logFile-WS}.%d{yyyy-MM-dd}.gz</fileNamePattern>
<maxHistory>30</maxHistory>
</rollingPolicy>
<encoder>
<pattern>%d [%thread] %-5level %logger{64} - %replace(%msg){'having masking logic for other property'}%n</pattern>
</encoder>
</appender>
<appender name="stdout" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<logger name="org.springframework.ws.client.MessageTracing" level="TRACE" additivity="false">
<appender-ref ref="mainAppender" />
</logger>
<logger name="org.springframework.ws.server.MessageTracing" level="TRACE" additivity="false">
<appender-ref ref="mainAppender" />
</logger>
<logger name="com.employee.LoggingRequestInterceptor" level="TRACE" additivity="false">
<appender-ref ref="mainAppender" />
</logger>
<root level="${root-log-level:-INFO}">
<appender-ref ref="stdout"/>
<appender-ref ref="mainAppender"/>
</root>
</configuration>
Can someone please help me solve this? Note: I am using Spring Boot 2 and the SLF4J logger.
Referring to Mask sensitive data in logs with logback:
Add logback-spring.xml to your project.
Customize the regular expression in the <patternsProperty> value to match the content you want to mask.
Add the MaskingPatternLayout class from the answer above (use the updated version; the one at the beginning does not work). A hedged sketch of such a class is shown after the configuration below.
logback-spring.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="Console"
class="ch.qos.logback.core.ConsoleAppender">
<encoder
class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
<layout class="com.example.springboot.MaskingPatternLayout">
<patternsProperty>(?:user|Password)="([a-zA-Z0-9]+)"
</patternsProperty>
<pattern>%d [%thread] %-5level %logger{35} - %msg%n</pattern>
</layout>
</encoder>
</appender>
<!-- LOG everything at INFO level -->
<root level="info">
<appender-ref ref="Console" />
</root>
</configuration>
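MaskingPatternLayout (a minimal sketch, assuming the <patternsProperty> element maps to a setPatternsProperty setter and that each capturing group of the regex is overwritten with asterisks; the full class from the linked answer may differ in details):

package com.example.springboot;

import ch.qos.logback.classic.PatternLayout;
import ch.qos.logback.classic.spi.ILoggingEvent;

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MaskingPatternLayout extends PatternLayout {

    private Pattern maskPattern;

    // Called by logback's configurator for the <patternsProperty> element
    public void setPatternsProperty(String patternsProperty) {
        this.maskPattern = Pattern.compile(patternsProperty.trim(), Pattern.MULTILINE);
    }

    @Override
    public String doLayout(ILoggingEvent event) {
        return mask(super.doLayout(event));
    }

    private String mask(String message) {
        if (maskPattern == null) {
            return message;
        }
        StringBuilder sb = new StringBuilder(message);
        Matcher matcher = maskPattern.matcher(sb);
        while (matcher.find()) {
            // Overwrite every capturing group (the sensitive values) with asterisks
            for (int group = 1; group <= matcher.groupCount(); group++) {
                if (matcher.group(group) != null) {
                    for (int i = matcher.start(group); i < matcher.end(group); i++) {
                        sb.setCharAt(i, '*');
                    }
                }
            }
        }
        return sb.toString();
    }
}

Because the matched groups are overwritten in place, the masked values keep their original length, which is consistent with the expected output shown below.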
HelloController class to test
@RestController
public class HelloController {

    private static final Logger logger = LoggerFactory.getLogger(HelloController.class);

    @RequestMapping("/")
    public String index() {
        logger.info("<login><credentials user=\"user\" Password=\"pass\"/></login>");
        return "Greetings from Spring Boot!";
    }
}
Expected output
2020-04-13 12:38:47,511 [http-nio-8080-exec-1] INFO c.e.springboot.HelloController - <login><credentials user="****" Password="****"/></login>
Update
Please check if "console" should be "stdout"
<root level="${root-log-level:-INFO}">
<appender-ref ref="console"/>
<appender-ref ref="mainAppender"/>
</root>
As no appender with name "console" is found.
Suppose the logger is in LoggingRequestInterceptor, you need to add the "stdout" appender also.
<logger name="com.employee.LoggingRequestInterceptor"
level="TRACE" additivity="false">
<appender-ref ref="stdout" />
<appender-ref ref="mainAppender" />
</logger>
I added a pattern with %replace in logback.xml. It masks the user and password:
<encoder>
<pattern>%d [%thread] %-5level %logger{64} - %replace( %replace( %replace(%msg){'user="[^"]+"', 'user=*****'} ){'Password="[^"]+"', 'Password=*****'} ){'my another pattern', 'replacement'}%n</pattern>
</encoder>
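With the request body from the question, the resulting log line then looks roughly like this (illustrative, not copied from an actual run):
LoggingRequestInterceptor - REQUEST uri=http://localhost:8080/, method=POST, requestBody=<login><credentials user=***** Password=*****/></login>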

AWS CloudWatch log and disable log on server itself with springboot

In my Spring Boot application I configured logging to write to AWS CloudWatch, but the application also generates a log file on the server itself under /var/log/; by now that file has grown to more than 19 GB.
How can I disable the log file on the server and write logs only to CloudWatch?
The following is my current logback-spring.xml configuration. Any ideas are appreciated. Thanks in advance.
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<include resource="org/springframework/boot/logging/logback/base.xml" />
<springProperty scope="context" name="ACTIVE_PROFILE" source="spring.profiles.active" />
<property name="clientPattern" value="payment" />
<logger name="org.springframework">
<level value="INFO" />
</logger>
<logger name="com.payment">
<level value="INFO" />
</logger>
<logger name="org.springframework.ws.client.MessageTracing.sent">
<level value="TRACE" />
</logger>
<logger name="org.springframework.ws.client.MessageTracing.received">
<level value="TRACE" />
</logger>
<logger name="org.springframework.ws.server.MessageTracing">
<level value="TRACE" />
</logger>
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [${HOSTNAME}:%thread] %-5level%replace([${clientPattern}] ){'\[\]\s',''}%logger{50}: %msg%n
</pattern>
</layout>
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>TRACE</level>
</filter>
</appender>
<springProfile name="local,dev">
<root level="INFO">
<appender-ref ref="CONSOLE" />
</root>
</springProfile>
<springProfile name="prod,uat">
<timestamp key="date" datePattern="yyyy-MM-dd" />
<appender name="AWS_SYSTEM_LOGS" class="com.payment.hybrid.log.CloudWatchLogsAppender">
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>TRACE</level>
</filter>
<layout>
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [${HOSTNAME}:%thread] %-5level%replace([${clientPattern}] ){'\[\]\s',''}%logger{50}:
%msg%n
</pattern>
</layout>
<logGroupName>${ACTIVE_PROFILE}-hybrid-batch</logGroupName>
<logStreamName>HybridBatchLog-${date}</logStreamName>
<logRegionName>app-northeast</logRegionName>
</appender>
<appender name="ASYNC_AWS_SYSTEM_LOGS" class="ch.qos.logback.classic.AsyncAppender">
<appender-ref ref="AWS_SYSTEM_LOGS" />
</appender>
<root level="INFO">
<appender-ref ref="ASYNC_AWS_SYSTEM_LOGS" />
<appender-ref ref="CONSOLE" />
</root>
</springProfile>
</configuration>
The most likely fix is to remove this line:
<appender-ref ref="CONSOLE" />
I say "most likely" because this is just writing output to the console. Which means that there's something else that redirects the output to /var/log/whatever, probably in the startup script for your application.
It's also possible that the included default file, org/springframework/boot/logging/logback/base.xml, because this file defines a file appender. I don't know if the explicit <root> definition will completely override or simply update the included default, but unless you know you need the default I'd delete the <include> statement.
If you need to recover space from the existing logfile, you can truncate it:
sudo truncate -s 0 /var/log/WHATEVER
Deleting it is not the correct solution, because the space won't actually be freed until the application closes the file (which means restarting your server).
As one of the commenters suggested, you can use logrotate to prevent the on-disk file from getting too large.
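If you go the logrotate route, a minimal drop-in sketch could look like this (the file name is an assumption; copytruncate matters because, as noted above, the application keeps the file open):

# /etc/logrotate.d/myapp -- hypothetical example; adjust the path to the actual file under /var/log
/var/log/myapp.log {
    daily
    rotate 7
    compress
    missingok
    copytruncate
}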
But by far the most important thing you should do is read the Logback documentation.

Need to create WSO2 DSS service which can read parameter from url

I have created a simple DSS service which, given a cust_id as input, gives me the customer data.
I have exposed this web service as an HTTP GET resource with the following URL:
GET /services/getCustDetailDSS/getDetail?cid=101
The corresponding XML code for my service is as follows:
<data name="getCustomerDetailDSS" serviceGroup="" serviceNamespace="">
<description/>
<config id="mydb">
<property name="carbon_datasource_name">mydb</property>
</config>
<query id="get_customer_detail" useConfig="mydb">
<sql>select identifier,user_status from customer_detail where identifier = :cid</sql>
<param name="cid" paramType="SCALAR" sqlType="STRING"/>
<result element="customer">
<element column="identifier" name="cid" xsdType="xs:string"/>
<element column="user_status" name="status" xsdType="xs:string"/>
</result>
</query>
<operation name="get_customer_detail_operation">
<call-query href="get_customer_detail">
<with-param name="cid" query-param="identifier"/>
</call-query>
</operation>
<resource method="GET" path="/getDetail">
<call-query href="get_customer_detail">
<with-param name="cid" query-param="cid"/>
</call-query>
</resource>
</data>
But now I want the DSS service to read cust_id from the URL path instead of passing it as a query parameter,
i.e. I want to call the DSS service as
GET /services/getCustDetailDSS/cid/101/getDetail
How can I do this in DSS?
What changes do I need to make to my DSS definition?
For GET /services/getCustDetailDSS/getDetail/cid/101
You have to edit your resource path as follows.
<resource method="GET" path="getDetail/cid/{cid}">
With WSO2 DSS 3.2.1, you can now define your query string parameter as below:
<param name="cid" sqlType="QUERY_STRING"/>
instead of:
<param name="cid" paramType="SCALAR" sqlType="STRING"/>
and your URL should look like:
.../getDetail?cid=101

Is it possible to have a new file for each new day with log4cxx

I know about the rollingPolicy parameter for the log4cxx config file, but I can't manage to write a config file that tells the logger to create a new file each day. How can I achieve this?
Yes. Use a rolling style of Composite, like this:
<appender name="LogAppender" type="log4net.Appender.RollingFileAppender">
<file type="log4j.Util.PatternString" value="LogFile.log" />
<appendToFile value="true" />
<rollingStyle value="Composite" />
<datePattern value="yyyyMMdd" />
<maxSizeRollBackups value="7" />
<maximumFileSize value="100MB" />
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%date{ISO8601}: [%2thread] %-5level %logger: '%P{network}.%P{node}' %message%newline" />
</layout>
</appender>
Ref.:
Short introduction to Apache log4cxx
log4net Config Examples
I think the following appender will do the trick (I can't test it on this PC):
<!-- With the following appender, the active file is named "TimeBasedLog.log"; every night, a few seconds after midnight, the old log is renamed with the date appended to the filename and a new log file named "TimeBasedLog.log" is created.
Notice that this RollingFileAppender is in the "org.apache.log4j.rolling" namespace.
-->
<appender name="MyRollingAppenderDaily" class="org.apache.log4j.rolling.RollingFileAppender">
<rollingPolicy class="org.apache.log4j.rolling.TimeBasedRollingPolicy">
<param name="FileNamePattern" value="TimeBasedLog.%d{yyyy-MM-dd}.log"/>
<param name="activeFileName" value="TimeBasedLog.log"/>
</rollingPolicy>
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern" value="%d{yyyy-MM-dd HH:mm:ss,SSS} %x [%p] (%F:%L) %m%n"/>
</layout>
<param name="file" value="TimeBasedLog.log"/>
<param name="append" value="true"/>
</appender>
I am wondering whether it is possible to combine, inside one appender, both a TimeBasedRollingPolicy and the MaxFileSize/MaxBackupIndex feature:
<param name="MaxFileSize" value="5KB"/>
<param name="MaxBackupIndex" value="5"/>