In my Spring Boot application I have configured logging to AWS CloudWatch, but the application also writes a log file on the server itself under /var/log/, and that file has now grown to more than 19 GB.
How can I disable logging to a file on the server and write logs only to CloudWatch?
The following is my current logback-spring.xml configuration. Any ideas are appreciated. Thanks in advance.
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<include resource="org/springframework/boot/logging/logback/base.xml" />
<springProperty scope="context" name="ACTIVE_PROFILE" source="spring.profiles.active" />
<property name="clientPattern" value="payment" />
<logger name="org.springframework">
<level value="INFO" />
</logger>
<logger name="com.payment">
<level value="INFO" />
</logger>
<logger name="org.springframework.ws.client.MessageTracing.sent">
<level value="TRACE" />
</logger>
<logger name="org.springframework.ws.client.MessageTracing.received">
<level value="TRACE" />
</logger>
<logger name="org.springframework.ws.server.MessageTracing">
<level value="TRACE" />
</logger>
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
<layout class="ch.qos.logback.classic.PatternLayout">
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [${HOSTNAME}:%thread] %-5level%replace([${clientPattern}] ){'\[\]\s',''}%logger{50}: %msg%n
</pattern>
</layout>
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>TRACE</level>
</filter>
</appender>
<springProfile name="local,dev">
<root level="INFO">
<appender-ref ref="CONSOLE" />
</root>
</springProfile>
<springProfile name="prod,uat">
<timestamp key="date" datePattern="yyyy-MM-dd" />
<appender name="AWS_SYSTEM_LOGS" class="com.payment.hybrid.log.CloudWatchLogsAppender">
<filter class="ch.qos.logback.classic.filter.ThresholdFilter">
<level>TRACE</level>
</filter>
<layout>
<pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [${HOSTNAME}:%thread] %-5level%replace([${clientPattern}] ){'\[\]\s',''}%logger{50}:
%msg%n
</pattern>
</layout>
<logGroupName>${ACTIVE_PROFILE}-hybrid-batch</logGroupName>
<logStreamName>HybridBatchLog-${date}</logStreamName>
<logRegionName>app-northeast</logRegionName>
</appender>
<appender name="ASYNC_AWS_SYSTEM_LOGS" class="ch.qos.logback.classic.AsyncAppender">
<appender-ref ref="AWS_SYSTEM_LOGS" />
</appender>
<root level="INFO">
<appender-ref ref="ASYNC_AWS_SYSTEM_LOGS" />
<appender-ref ref="CONSOLE" />
</root>
</springProfile>
</configuration>
The most likely fix is to remove this line:
<appender-ref ref="CONSOLE" />
I say "most likely" because this appender just writes output to the console, which means something else is redirecting that output to /var/log/whatever, probably the startup script for your application.
It's also possible that the culprit is the included default file, org/springframework/boot/logging/logback/base.xml, because that file defines a file appender. I don't know whether the explicit <root> definition completely overrides the included default or merely updates it, but unless you know you need the default I'd delete the <include> statement.
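For illustration, here is a trimmed sketch of how the prod/uat part could look with the <include> removed and the CONSOLE reference dropped; the CloudWatch appender definitions are assumed to stay exactly as they are in your current file:
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- no <include resource="org/springframework/boot/logging/logback/base.xml" /> -->
    <springProperty scope="context" name="ACTIVE_PROFILE" source="spring.profiles.active" />
    <springProfile name="prod,uat">
        <!-- AWS_SYSTEM_LOGS and ASYNC_AWS_SYSTEM_LOGS declared exactly as in your current configuration -->
        <root level="INFO">
            <appender-ref ref="ASYNC_AWS_SYSTEM_LOGS" />
        </root>
    </springProfile>
</configuration>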
If you need to recover space from the existing logfile, you can truncate it:
sudo truncate -s 0 /var/log/WHATEVER
Deleting it is not the correct solution, because the space won't actually be reclaimed until the application closes its handle on the file (which means restarting your server).
As one of the commenters suggested, you can use logrotate to prevent the on-disk file from getting too large.
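If you do decide to keep a local file as a safety net, logback itself can also cap the disk usage instead of relying on logrotate. A minimal sketch using a standard RollingFileAppender (the appender name, path, and limits are illustrative, not taken from your setup):
<appender name="LOCAL_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <file>/var/log/payment/application.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.SizeAndTimeBasedRollingPolicy">
        <!-- %i is required because several rolled files can be produced per day -->
        <fileNamePattern>/var/log/payment/application.%d{yyyy-MM-dd}.%i.log.gz</fileNamePattern>
        <maxFileSize>100MB</maxFileSize>
        <maxHistory>7</maxHistory>
        <totalSizeCap>1GB</totalSizeCap>
    </rollingPolicy>
    <encoder>
        <pattern>%d{yyyy-MM-dd HH:mm:ss.SSS} [%thread] %-5level %logger{50}: %msg%n</pattern>
    </encoder>
</appender>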
But by far the most important thing you should do is read the Logback documentation.
Related
I was given the task of logging user access to datasets in certain libraries.
To solve this I use the SAS Audit logger, which already provides the desired output.
To get this output, I use the start parameter logconfigloc with the following XML file:
<?xml version="1.0" encoding="UTF-8"?>
<logging:configuration xmlns:logging="http://www.sas.com/xml/logging/1.0/">
<!-- Log file appender with immediate flush set to true -->
<appender name="AuditLog" class="FileAppender">
<param name="File" value="logconfig.xml.win.audit.file.xml.log"/>
<param name="ImmediateFlush" value="true" />
<filter class="StringMatchFilter">
<param name="StringToMatch" value="WORK"/>
<param name="AcceptOnMatch" value="false"/>
</filter>
<filter class="StringMatchFilter">
<param name="StringToMatch" value="Libref"/>
<param name="AcceptOnMatch" value="true"/>
</filter>
<!-- The DenyAllFilter discards all events that do not fulfill the criteria of at least one of the filters above -->
<filter class="DenyAllFilter">
</filter>
<layout>
<param name="ConversionPattern"
value="%d - %u - %m"/>
</layout>
</appender>
<!-- Audit message logger -->
<logger name="Audit" additivity="false">
<appender-ref ref="AuditLog"/>
<level value="trace"/>
</logger>
<!-- root logger events not enabled -->
<root>
</root>
</logging:configuration>
My problem is that when I use the logconfigloc parameter, the log parameter no longer works, so I get no "conventional" SAS log.
I already tried enabling the root logger, but its output only looks similar to the original log files and has some differences.
Is there an (easy) way to get the "conventional" SAS log in addition to the aforementioned special access logging output?
Kind Regards,
MiKe
I found the answer to the question of how to obtain the conventional log.
For this purpose the SAS logger named "App" with message level "info" can be used.
So the following XML does the trick:
<?xml version="1.0" encoding="UTF-8"?>
<logging:configuration xmlns:logging="http://www.sas.com/xml/logging/1.0/">
<appender name="AppLog" class="FileAppender">
<param name="File" value="D:\Jobs_MK\SAS_Logging_Facility\Advanced_Logging_Test_with_XML\logconfig_standard_log.log"/>
<param name="ImmediateFlush" value="true" />
<layout>
<param name="ConversionPattern"
value="%m"/>
</layout>
</appender>
<!-- Application message logger -->
<logger name="App" additivity="false">
<appender-ref ref="AppLog"/>
<level value="info"/>
</logger>
<!-- root logger events not enabled -->
<root>
</root>
</logging:configuration>
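For completeness, a merged configuration that keeps both outputs (the audit appender from the question plus the App logger above) might look like the sketch below, assuming the two appender/logger pairs can simply sit side by side in one file:
<?xml version="1.0" encoding="UTF-8"?>
<logging:configuration xmlns:logging="http://www.sas.com/xml/logging/1.0/">
    <!-- Audit appender and filters exactly as in the question -->
    <appender name="AuditLog" class="FileAppender">
        <param name="File" value="logconfig.xml.win.audit.file.xml.log"/>
        <param name="ImmediateFlush" value="true"/>
        <filter class="StringMatchFilter">
            <param name="StringToMatch" value="WORK"/>
            <param name="AcceptOnMatch" value="false"/>
        </filter>
        <filter class="StringMatchFilter">
            <param name="StringToMatch" value="Libref"/>
            <param name="AcceptOnMatch" value="true"/>
        </filter>
        <filter class="DenyAllFilter"/>
        <layout>
            <param name="ConversionPattern" value="%d - %u - %m"/>
        </layout>
    </appender>
    <!-- Conventional log appender as in the answer -->
    <appender name="AppLog" class="FileAppender">
        <param name="File" value="D:\Jobs_MK\SAS_Logging_Facility\Advanced_Logging_Test_with_XML\logconfig_standard_log.log"/>
        <param name="ImmediateFlush" value="true"/>
        <layout>
            <param name="ConversionPattern" value="%m"/>
        </layout>
    </appender>
    <!-- Audit message logger -->
    <logger name="Audit" additivity="false">
        <appender-ref ref="AuditLog"/>
        <level value="trace"/>
    </logger>
    <!-- Application message logger -->
    <logger name="App" additivity="false">
        <appender-ref ref="AppLog"/>
        <level value="info"/>
    </logger>
    <!-- root logger events not enabled -->
    <root>
    </root>
</logging:configuration>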
Below is my request body XML, and I am making a REST call with this request. I have a custom LoggingInterceptor to log the request and response, and I want to mask the user and password in the logs.
<login><credentials user="user" Password="pass"/></login>
private void traceRequest(final HttpRequest request, final byte[] body) throws IOException {
    logger.trace(
        String.format(
            "REQUEST uri=%s, method=%s, requestBody=%s",
            request.getURI(),
            request.getMethod(),
            new String(body, "UTF-8")));
}
Currently I am printing my logs like below:
LoggingRequestInterceptor - REQUEST uri=http://localhost:8080/, method=POST, requestBody=<login><credentials user="user" Password="pass"/></login>
Below is my logback.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration scan="true" scanPeriod="30 seconds">
<property name="logFile" value="logs/employee.log" />
<property name="logFile-WS" value="logs/employee-ws.log" />
<appender name="employee" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${logFile}</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>${logFile}.%d{yyyy-MM-dd}.gz</fileNamePattern>
<maxHistory>30</maxHistory>
</rollingPolicy>
<encoder>
<pattern>%d [%thread] %-5level %logger{64} - %msg%n</pattern>
</encoder>
</appender>
<appender name="mainAppender" class="ch.qos.logback.core.rolling.RollingFileAppender">
<file>${logFile-WS}</file>
<rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
<fileNamePattern>${logFile-WS}.%d{yyyy-MM-dd}.gz</fileNamePattern>
<maxHistory>30</maxHistory>
</rollingPolicy>
<encoder>
<pattern>%d [%thread] %-5level %logger{64} - %replace(%msg){'having masking logic for other property'}%n</pattern>
</encoder>
</appender>
<appender name="stdout" class="ch.qos.logback.core.ConsoleAppender">
<encoder>
<pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
</encoder>
</appender>
<logger name="org.springframework.ws.client.MessageTracing" level="TRACE" additivity="false">
<appender-ref ref="mainAppender" />
</logger>
<logger name="org.springframework.ws.server.MessageTracing" level="TRACE" additivity="false">
<appender-ref ref="mainAppender" />
</logger>
<logger name="com.employee.LoggingRequestInterceptor" level="TRACE" additivity="false">
<appender-ref ref="mainAppender" />
</logger>
<root level="${root-log-level:-INFO}">
<appender-ref ref="stdout"/>
<appender-ref ref="mainAppender"/>
</root>
</configuration>
Could someone please help me solve this? Note: I am using Spring Boot 2 and the SLF4J logger.
Referring to Mask sensitive data in logs with logback
Add logback-spring.xml in your project.
Customize the regular expression in the <patternsProperty> value to match the content you want to mask.
Add the MaskingPatternLayout class from the answer referenced above (use the updated version; the one at the beginning of that answer does not work).
logback-spring.xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
<appender name="Console"
class="ch.qos.logback.core.ConsoleAppender">
<encoder
class="ch.qos.logback.core.encoder.LayoutWrappingEncoder">
<layout class="com.example.springboot.MaskingPatternLayout">
<patternsProperty>(?:user|Password)="([a-zA-Z0-9]+)"
</patternsProperty>
<pattern>%d [%thread] %-5level %logger{35} - %msg%n</pattern>
</layout>
</encoder>
</appender>
<!-- LOG everything at INFO level -->
<root level="info">
<appender-ref ref="Console" />
</root>
</configuration>
HelloController class to test
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class HelloController {

    private static final Logger logger = LoggerFactory.getLogger(HelloController.class);

    @RequestMapping("/")
    public String index() {
        logger.info("<login><credentials user=\"user\" Password=\"pass\"/></login>");
        return "Greetings from Spring Boot!";
    }
}
Expected output
2020-04-13 12:38:47,511 [http-nio-8080-exec-1] INFO c.e.springboot.HelloController - <login><credentials user="****" Password="****"/></login>
Update
Please check whether "console" should be "stdout":
<root level="${root-log-level:-INFO}">
<appender-ref ref="console"/>
<appender-ref ref="mainAppender"/>
</root>
No appender named "console" is defined in your logback.xml.
Assuming the logger is in LoggingRequestInterceptor, you also need to add the "stdout" appender:
<logger name="com.employee.LoggingRequestInterceptor"
level="TRACE" additivity="false">
<appender-ref ref="stdout" />
<appender-ref ref="mainAppender" />
</logger>
I added a pattern with %replace in logback.xml. It masks the user and password:
<encoder>
<pattern>%d [%thread] %-5level %logger{64} - %replace( %replace( %replace(%msg){'user="[^"]+"', 'user=*****'} ){'Password="[^"]+"', 'Password=*****'} ){'my another pattern', 'replacement'}%n</pattern>
</encoder>
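With that pattern, the request line shown earlier comes out roughly as follows (the nested %replace calls also add a few literal spaces around the message, which you can remove by tightening the padding inside the parentheses):
LoggingRequestInterceptor - REQUEST uri=http://localhost:8080/, method=POST, requestBody=<login><credentials user=***** Password=*****/></login>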
I have been breaking my head trying regular expressions. I want to extract the SSL connector from a Tomcat server.xml file.
This is the input from my file.
<?xml version='1.0' encoding='utf-8'?>
<Server port="${shutdown.port}" shutdown="5ijXSyVl4Y9r">
<Listener className="org.apache.catalina.startup.VersionLoggerListener" />
<Listener className="org.apache.catalina.core.AprLifecycleListener" SSLEngine="on" />
<Listener className="org.apache.catalina.core.JasperListener" />
<Listener className="org.apache.catalina.core.JreMemoryLeakPreventionListener" />
<Listener className="org.apache.catalina.mbeans.GlobalResourcesLifecycleListener" />
<Listener className="org.apache.catalina.core.ThreadLocalLeakPreventionListener" />
<GlobalNamingResources>
<Resource name="UserDatabase" auth="Container"
type="org.apache.catalina.UserDatabase"
description="User database that can be updated and saved"
factory="org.apache.catalina.users.MemoryUserDatabaseFactory"
pathname="conf/tomcat-users.xml" />
</GlobalNamingResources>
<Service name="Catalina">
<Connector port="${http.port}" protocol="HTTP/1.1"
connectionTimeout="20000"
redirectPort="${https.port}" />
<Connector port="${https.port}" protocol="org.apache.coyote.http11.Http11Protocol"
maxThreads="150" SSLEnabled="true" scheme="https" secure="true"
keystoreFile="/opt/ais/install/tomcat/security/ais.jks" keystorePass="a1ss3cr3t"
clientAuth="false" sslEnabledProtocols="TLSv1.1,TLSv1.2" />
<Connector port="${ajp.port}" protocol="AJP/1.3" redirectPort="${https.port}" connectionTimeout="20000"/>
<Connector port="${ajp.port}" protocol="AJP/1.3" redirectPort="${https.port}" connectionTimeout="20000"/>
<Engine name="Catalina" defaultHost="localhost" jvmRoute="${tomcat.node.name}">
<Realm className="org.apache.catalina.realm.LockOutRealm">
<Realm className="org.apache.catalina.realm.UserDatabaseRealm"
resourceName="UserDatabase"/>
</Realm>
<Host name="localhost" appBase="webapps"
unpackWARs="true" autoDeploy="true">
<Valve className="org.apache.catalina.valves.AccessLogValve" directory="logs"
prefix="access_log" pattern="%h %l %u %t %r %s %b %D %I %{JSESSIONID}c" resolveHosts="false" rotatable="false"/>
</Host>
</Engine>
</Service>
</Server>
I was trying with this sed command: "sed -n '/<.[Cc]onnector./>/p'" but no luck; I'm only able to get the AJP connector.
Any ideas?
The following (GNU) sed might work:
sed -n '/<Connector /,/\/>/p'
It prints from the line containing <Connector up to the line containing the next /> (which can be the same line).
It works with your sample data but could fail under many conditions, such as the tag having children and being closed by a </Connector> rather than self-closing. If your data has enough variation you'll probably want to use an XML selection language such as XPath, in which //Connector would be enough to get you all the Connector nodes whatever their format is.
I've been wasting a full day on this problem.
This has never happened to me before.
My service is running on an Azure machine.
I'm trying to generate a .svclog file to trace what's preventing my iOS app from logging in over HTTPS, but the .svclog file is not being generated.
I tried to use Fiddler to capture the requests, but I get nothing when I try to log in from the app.
If I use the app with an HTTP address I am able to log in, but the .svclog is still not generated.
If I use tcpTrace I can see a request and a response, but it is encrypted.
I have given permissions to "Everyone" on the folder at c:\logs\.
Any idea what I might be missing here? Is there some sort of prerequisite to make the tracer work?
This is my system.diagnostics section in web.config:
<system.diagnostics>
<sources>
<source name="System.ServiceModel.MessageLogging" switchValue="Verbose">
<listeners>
<add type="System.Diagnostics.DefaultTraceListener" name="Default">
<filter type="" />
</add>
<add initializeData="c:\logs\myMessages.svclog" type="System.Diagnostics.XmlWriterTraceListener"
name="messagelistener" traceOutputOptions="DateTime, Timestamp">
<filter type="" />
</add>
</listeners>
</source>
</sources>
<sharedListeners>
<add initializeData="C:\logs\Traces.svclog" type="System.Diagnostics.XmlWriterTraceListener"
name="xmlTraceListener" traceOutputOptions="Timestamp">
<filter type="" />
</add>
</sharedListeners>
<trace autoflush="true" />
</system.diagnostics>
In our production system the log files are pretty big (around 300 MB). I need smaller files for analysis, so I need these Sitecore log files broken down into 50 MB chunks. How can I do this?
Sitecore 6.6
Can you try modifying the log4net configuration in the web.config file?
You will have:
<appender name="LogFileAppender" type="log4net.Appender.SitecoreLogFileAppender, Sitecore.Logging">
<file value="$(dataFolder)/logs/log.{date}.txt"/>
<appendToFile value="true"/>
<!--Add this property maximumFileSize -->
<maximumFileSize value="50MB" />
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%4t %d{ABSOLUTE} %-5p %m%n"/>
</layout>
</appender>
<appender name="WebDAVLogFileAppender" type="log4net.Appender.SitecoreLogFileAppender, Sitecore.Logging">
<file value="$(dataFolder)/logs/WebDAV/WebDAV.log.{date}.txt"/>
<appendToFile value="true"/>
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%4t %d{ABSOLUTE} %-5p %m%n"/>
</layout>
</appender>
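If the Sitecore-specific appender in your version does not pick up maximumFileSize, a possible fallback is the stock log4net RollingFileAppender, which rolls by size and keeps a bounded number of backups. The sketch below is an assumption to verify against the log4net build bundled with Sitecore 6.6, not something confirmed for your install:
<appender name="LogFileAppender" type="log4net.Appender.RollingFileAppender, Sitecore.Logging">
    <file value="$(dataFolder)/logs/log.{date}.txt"/>
    <appendToFile value="true"/>
    <!-- roll by size and keep at most 20 backups of 50MB each -->
    <rollingStyle value="Size"/>
    <maximumFileSize value="50MB"/>
    <maxSizeRollBackups value="20"/>
    <staticLogFileName value="true"/>
    <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%4t %d{ABSOLUTE} %-5p %m%n"/>
    </layout>
</appender>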