
cf-java-logging-support's Introduction

Java Logging Support for Cloud Foundry


Note: The default branch has been renamed from master to main. Please execute on your local repository:

git branch -m master main
git fetch origin
git branch -u origin/main main
git remote set-head origin -a

Summary

This is a collection of support libraries for Java applications (Java 8 and above) that serves three main purposes:

  1. Provide means to emit structured application log messages,
  2. Instrument parts of your application stack to collect request metrics, and
  3. Allow production of custom metrics.

The libraries started out to support applications running on Cloud Foundry. This integration has become optional. The library can be used in any runtime environment such as Kubernetes or Kyma.

When we say structured, we actually mean in JSON format. In that sense, it shares ideas with logstash-logback-encoder, but takes a simpler approach as we want to ensure that these structured messages adhere to standardized formats. With such standardized formats in place, it becomes much easier to ingest, process and search such messages in log analysis stacks such as ELK.
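To make the idea concrete, here is a minimal sketch of what such a structured message looks like, modeled on the test output quoted in the issues further down this page (field set abbreviated):

{ "written_at":"2021-02-28T09:19:06.653Z","component_type":"application","component_instance":"0","type":"log","logger":"com.sap.hcp.cf.logging.common.TestAppLog","thread":"main","level":"INFO","categories":[],"msg":"Running test()" }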

If you're interested in the specifications of these standardized formats, you may want to have a closer look at the fields.yml files in the beats folder.

While logstash-logback-encoder is tied to logback, we've tried to stay implementation-neutral and have implemented the core functionality on top of slf4j, but provide implementations for both logback and log4j2 (and we're open to contributions that would support other implementations).

The instrumentation part currently focuses on providing request filters for Java Servlets, but again, we're open to contributions for other APIs and frameworks.

The custom metrics instrumentation allows users to easily define and emit custom metrics. The different modules configure all necessary components and make it possible to define custom metrics with minimal code changes. Once collected, custom metrics are sent as special log messages.

Lastly, there is also a project on node.js logging support.

Features and dependencies

As you can see from the structure of this repository, we're not providing one uber JAR that contains everything, but provide each feature separately. We also try to stay away from wiring up too many dependencies by tagging almost all of them as provided. As a consequence, it's your task to get all runtime dependencies resolved in your application POM file.

All in all, you should do the following:

  1. Make up your mind which features you actually need.
  2. Adjust your Maven dependencies accordingly.
  3. Pick your favorite logging implementation.
  4. Adjust your logging configuration accordingly.

Let's say you want to make use of the servlet filter feature, then you need to add the following dependency to your POM with property cf-logging-version referring to the latest nexus version (currently 3.8.3):

<properties>
	<cf-logging-version>3.8.3</cf-logging-version>
</properties>
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-java-logging-support-servlet</artifactId>
  <version>${cf-logging-version}</version>
</dependency>

This feature only depends on the servlet API, which you have included in your POM anyhow. You can find more information about the servlet filter feature (e.g. how to adjust the web.xml) in the Wiki. Note that we provide two different servlet instrumentations:

  • cf-java-logging-support-servlet linked against javax.servlet
  • cf-java-logging-support-servlet-jakarta linked against jakarta.servlet

Both modules build on the same code but use the respective API.
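For orientation, here is a minimal web.xml sketch for registering the filter (the Wiki is the authoritative reference; the fully qualified class name is taken from the request logs quoted in the issues below):

<filter>
  <filter-name>request-logging</filter-name>
  <filter-class>com.sap.hcp.cf.logging.servlet.filter.RequestLoggingFilter</filter-class>
</filter>
<filter-mapping>
  <filter-name>request-logging</filter-name>
  <url-pattern>/*</url-pattern>
</filter-mapping>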

If you want to use custom metrics, just define one of the following dependencies:

  • Spring Boot Support:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-spring-boot</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
  • Plain Java Support:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-java</artifactId>
  <version>${cf-logging-version}</version>
</dependency>

Implementation variants and logging configurations

The core feature (on which all other features rely) is just using the org.slf4j API, but to actually get logs written, you need to pick an implementation feature. As stated above, we have two implementations:

  • cf-java-logging-support-logback based on logback, and
  • cf-java-logging-support-log4j2 based on log4j2.

Again, we don't include dependencies to those implementation backends ourselves, so you need to provide the corresponding dependencies in your POM file:

Using logback:

<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-java-logging-support-logback</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.2.11</version>
</dependency>

Using log4j2:

<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-java-logging-support-log4j2</artifactId>
  <version>${cf-logging-version}</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-slf4j-impl</artifactId>
  <version>2.20.0</version>
</dependency>
<dependency>
  <groupId>org.apache.logging.log4j</groupId>
  <artifactId>log4j-core</artifactId>
  <version>2.20.0</version>
</dependency>

As they differ slightly in configuration, you will again need to do that yourself. But we hope we've found an easy way to accomplish that. The one thing you have to do is pick our encoder in your logback.xml if you're using logback, or our layout in your log4j2.xml if you're using log4j2.

Here are the minimal configurations you'd need:

logback.xml:

<configuration debug="false" scan="false">
    <appender name="STDOUT-JSON" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="com.sap.hcp.cf.logback.encoder.JsonEncoder"/>
    </appender>
    <!-- for local development, you may want to switch to a more human-readable layout -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%date %-5level [%thread] - [%logger] [%mdc] - %msg%n</pattern>
        </encoder>
    </appender>
    <root level="${LOG_ROOT_LEVEL:-WARN}">
        <!-- Use 'STDOUT' instead for human-readable output -->
        <appender-ref ref="STDOUT-JSON" />
    </root>
    <!-- request metrics are reported using INFO level, so make sure the instrumentation loggers are set to that level -->
    <logger name="com.sap.hcp.cf" level="INFO" />
</configuration>

log4j2.xml:

<Configuration
    status="warn" strict="true"
    packages="com.sap.hcp.cf.log4j2.converter,com.sap.hcp.cf.log4j2.layout">
    <Appenders>
        <Console name="STDOUT-JSON" target="SYSTEM_OUT" follow="true">
            <JsonPatternLayout charset="utf-8"/>
        </Console>
        <Console name="STDOUT" target="SYSTEM_OUT" follow="true">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} [%mdc] - %msg%n"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="${LOG_ROOT_LEVEL:-WARN}">
            <!-- Use 'STDOUT' instead for human-readable output -->
            <AppenderRef ref="STDOUT-JSON" />
        </Root>
        <!-- request metrics are reported using INFO level, so make sure the instrumentation loggers are set to that level -->
        <Logger name="com.sap.hcp.cf" level="INFO"/>
    </Loggers>
</Configuration>

Custom Metrics

With the custom metrics feature you can send metrics defined inside your code. Metrics are emitted as log messages when your application is bound to a service called application-logs. This is done because of the special format supported by the SAP BTP Application Logging Service and for compatibility with the prior approach. To use the feature, you need one of the following dependencies:

  1. Instrumenting Spring Boot 2 applications:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-spring-boot</artifactId>
  <version>${cf-logging-version}</version>
</dependency>

The Spring Boot instrumentation uses Spring Boot Actuator, which allows you to read predefined metrics and write custom metrics. The Actuator supports Micrometer, which is part of the Actuator's dependencies. In your code, you work directly with Micrometer. Define your custom metrics and interact with them:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import io.micrometer.core.instrument.Counter;
import io.micrometer.core.instrument.LongTaskTimer;
import io.micrometer.core.instrument.Metrics;
import io.micrometer.core.instrument.Tag;

@RestController
public class DemoController {

	private static final Logger LOGGER = LoggerFactory.getLogger(DemoController.class);

	private Counter counter;
	private AtomicInteger concurrentHttpRequests;
	private LongTaskTimer longTimer;

	DemoController() {
		this.counter = Metrics.counter("demo.controller.number.of.requests", "unit", "requests");
		List<Tag> tags = new ArrayList<Tag>(Arrays.asList(new Tag[] { Tag.of("parallel", "clients") }));
		this.concurrentHttpRequests = Metrics.gauge("demo.controller.number.of.clients.being.served", tags,
				new AtomicInteger(0));
		this.longTimer = Metrics.more().longTaskTimer("demo.controller.time.spends.in.serving.clients");
	}

	@RequestMapping("/")
	public String index() {
		longTimer.record(() -> {
			this.counter.increment();
			concurrentHttpRequests.addAndGet(1);
			try {
				Thread.sleep(1000);
			} catch (InterruptedException e) {
				Thread.currentThread().interrupt(); // restore the interrupt flag
				LOGGER.error("Interrupted while serving request", e);
			} finally {
				concurrentHttpRequests.addAndGet(-1);
			}
		});

		return "Greetings from Custom Metrics!";
	}
}

In the example above, three custom metrics are defined and used. The metrics are Counter, LongTaskTimer and Gauge.

  2. Instrumenting plain Java applications:
<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-custom-metrics-clients-java</artifactId>
  <version>${cf-logging-version}</version>
</dependency>

The Java instrumentation uses Dropwizard, which lets you define all kinds of metrics supported by Dropwizard. The following metrics are available: com.codahale.metrics.Gauge, com.codahale.metrics.Counter, com.codahale.metrics.Histogram, com.codahale.metrics.Meter and com.codahale.metrics.Timer. More information about the metric types and their usage can be found in the Dropwizard documentation. Define your custom metrics and interact with them:

import java.io.IOException;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.codahale.metrics.Counter;
import com.codahale.metrics.Meter;
import com.sap.cloud.cf.monitoring.java.CustomMetricRegistry;

public class CustomMetricsServlet extends HttpServlet {
    private static Counter counter = CustomMetricRegistry.get().counter("custom.metric.request.count");
    private static Meter meter = CustomMetricRegistry.get().meter("custom.metric.request.meter");

    @Override
    public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
        counter.inc(3);
        meter.mark();
        response.getWriter().println("<p>Greetings from Custom Metrics</p>");
    }
}

Custom Metrics Configuration

This library supports the following configuration options for sending custom metrics:

  • interval: the interval for sending metrics, in milliseconds. Default value: 60000
  • enabled: enables or disables the sending of metrics. Default value: true
  • metrics: array of whitelisted metric names. Only the listed metrics are processed and sent; if the array is empty, all metrics are sent. Default value: []
  • metricQuantiles: enables or disables the sending of metric quantiles such as the median, 95th percentile, and 99th percentile. Spring Boot does not support this option. Default value: false

Configuration is read from an environment variable named CUSTOM_METRICS. To change the default values, override the environment variable with your custom values. Example:

{
    "interval": 30000,
    "enabled": true,
    "metrics": [
        "my.whitelist.metric.1",
        "my.whitelist.metric.2"
    ],
    "metricQuantiles":true
}
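On Cloud Foundry, for instance, the variable can be set with the cf CLI (the app name my-app is a placeholder) and takes effect after a restage:

cf set-env my-app CUSTOM_METRICS '{"interval": 30000, "enabled": true, "metrics": ["my.whitelist.metric.1"], "metricQuantiles": true}'
cf restage my-app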

Dynamic Log Levels

This library provides the possibility to change the log-level threshold for a single thread by adding a token to the header of a request. A detailed description of how to apply this feature can be found here.
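As a minimal sketch, and assuming the default token header name SAP-LOG-LEVEL (check the detailed description for the authoritative name), a request could carry the token like this, with URL and token as placeholders:

curl -H "SAP-LOG-LEVEL: <jwt-token>" https://my-app.example.com/some/endpoint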

Logging Stacktraces

Stacktraces can be logged within one log message. Further details can be found here.

Sample Applications

In order to illustrate how the different features are used, this repository includes two sample applications.

Documentation

More info on the actual implementation can be found in the Wiki.

Licensing

Please see our LICENSE for copyright and license information. Detailed information including third-party components and their licensing/copyright information is available via the REUSE tool.

cf-java-logging-support's People

Contributors

altenhof, anklei, c-otto, christiand93, christopheichhorn, dependabot[bot], deyanzhelyazkov, haraldfuchs, hariharan-gandhi, istvanballok, j-denner, juergen-walter, karstenschnitter, kiranponnuswamy, mofterdinger, nicklas-dohrn, pc-jedi, rahuldeepattri, stefan0001, timgerlach, torax242, vtintillier, wolfgangtheilmann


cf-java-logging-support's Issues

Info logs are not reported, if instrumented via Servlet(RequestLoggingFilter)

Description: Our service is integrated with sap-app-logging-service (latest version). We are able to see error logs, but not info logs, in the Kibana dashboard.

The logback.xml looks something like this:

<configuration debug="false" scan="false">
    <appender name="STDOUT-JSON" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="com.sap.hcp.cf.logback.encoder.JsonEncoder" />
    </appender>
    <!-- for local development, you may want to switch to a more human-readable layout -->
    <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d %-5level [%thread] %logger{0} [%mdc]>: %msg %replace(%xEx){'\n', ' | '}%nopex%n</pattern>
        </encoder>
    </appender>
    <root level="${LOG_ROOT_LEVEL:-INFO}">
        <!-- Use 'STDOUT' instead for human-readable output -->
        <appender-ref ref="STDOUT-JSON" />
    </root>
    <!-- request metrics are reported using INFO level, so make sure the instrumentation loggers are set to that level -->
    <logger name="com.sap.hcp.cf" level="INFO" />
    <turboFilter class="com.sap.hcp.cf.logback.filter.CustomLoggingTurboFilter" />
</configuration>

Can you let us know why we are not able to see info logs?

Why "X-CorrelationID" instead of "X-Correlation-ID"?

A quick Google search shows a lot of results for "X-Correlation-ID", but not many for "X-CorrelationID". The Wikipedia article about HTTP headers even lists "X-Correlation-ID" as a common non-standard header for correlating requests between clients and servers. I know that this header is not standard and therefore there's no limitation on its naming, but why are we not going for the common approach?
Also, is it possible to support both names?

Not able to create executable Spring jar; getting exception:

Exception in thread "main" java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
	at com.movie.cataloge.moviecatalogservice.MovieCatalogServiceApplication.main(MovieCatalogServiceApplication.java:16)
Caused by: java.lang.ClassNotFoundException: org.springframework.boot.SpringApplication
	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
	... 1 more


<project>
	<modelVersion>4.0.0</modelVersion>
	<parent>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-parent</artifactId>
		<version>2.4.0</version>
	</parent>
	<groupId>com.movie.cataloge</groupId>
	<artifactId>movie-catalog-service</artifactId>
	<version>0.0.1</version>
	<name>movie-catalog-service</name>
	<description>Demo project for Spring Boot</description>
	<properties>
		<start-class>com.movie.cataloge.moviecatalogservice.MovieCatalogServiceApplication</start-class>
		<java.version>11</java.version>
		<spring-cloud.version>2020.0.0</spring-cloud.version>
	</properties>
<dependencies>
	<dependency>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-web</artifactId>
	</dependency>
	<dependency>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-webflux</artifactId>
	</dependency>


	<dependency>
		<groupId>org.projectlombok</groupId>
		<artifactId>lombok</artifactId>
		<optional>true</optional>
	</dependency>

	<dependency>
		<groupId>org.springframework.cloud</groupId>
		<artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
	</dependency>
	<!--<dependency>
		<groupId>javax.xml.bind</groupId>
		<artifactId>jaxb-api</artifactId>
		<version>2.3.0</version>
	</dependency>-->
	<dependency>
		<groupId>javax.xml.bind</groupId>
		<artifactId>jaxb-api</artifactId>
		<version>2.3.0</version>
	</dependency>
	<dependency>
		<groupId>org.glassfish.jaxb</groupId>
		<artifactId>jaxb-runtime</artifactId>
		<version>2.3.0</version>
		<scope>runtime</scope>
	</dependency>
	<dependency>
		<groupId>javax.activation</groupId>
		<artifactId>javax.activation-api</artifactId>
		<version>1.2.0</version>
	</dependency>
	<dependency>
		<groupId>javax.xml.ws</groupId>
		<artifactId>jaxws-api</artifactId>
		<version>2.3.0</version>
		<scope>test</scope>
	</dependency>

	<dependency>
		<groupId>org.springframework.boot</groupId>
		<artifactId>spring-boot-starter-test</artifactId>
		<scope>test</scope>
	</dependency>
</dependencies>

<dependencyManagement>
	<dependencies>
		<dependency>
			<groupId>org.springframework.cloud</groupId>
			<artifactId>spring-cloud-dependencies</artifactId>
			<version>${spring-cloud.version}</version>
			<type>pom</type>
			<scope>import</scope>
		</dependency>
	</dependencies>
</dependencyManagement>

<build>
	<plugins>
		<plugin>
			<groupId>org.springframework.boot</groupId>
			<artifactId>spring-boot-maven-plugin</artifactId>
		</plugin>
	</plugins>
</build>
<repositories>
	<repository>
		<id>spring-milestones</id>
		<name>Spring Milestones</name>
		<url>https://repo.spring.io/milestone</url>
	</repository>
</repositories>
</project>

This is the pom.xml file, using JDK 11 and Maven 3.2.0.

Wrong output for async servlets

Summary

Currently, this library produces misleading output if the servlet is async-enabled. This is because the filter is called twice: once on the REQUEST dispatch, where the async processing starts, and once on the ASYNC dispatch.

I understand that the logging context / correlation ID propagation is out of scope, but the request / response logging is actually faulty in this particular case.

Steps to reproduce:

It is easiest to reproduce this with Spring (a lot easier to set up all the pieces):

  1. Create a new Spring Boot App.
  2. Add the filter.
@Bean
public FilterRegistrationBean loggingFilter() {
	FilterRegistrationBean bean = new FilterRegistrationBean(new RequestLoggingFilter());
	bean.addUrlPatterns("/*");
	return bean;
}
  3. Create a controller which returns a Future.
@RestController
class DemoController {
	private static final Logger LOG = LoggerFactory.getLogger(DemoController.class);

	@GetMapping("/something")
	public CompletableFuture<ResponseEntity<String>> doSomething() {
		LOG.info("Inside sync method.");
		return CompletableFuture.supplyAsync(() -> {
			LOG.info("Inside async supply.");
			return new ResponseEntity<>("Something!", HttpStatus.CREATED);
		});
	}
}

Expected behaviour:

The response is logged only once, with the correct size, status, etc.

Actual behaviour:

We get two logs from the RequestLoggingFilter (one for each dispatcher type). The first one has an incorrect status, size, etc.:

c.s.h.c.l.c.LogContext                   : generated new correlation id
c.e.d.DemoController                     : Inside sync method.
c.e.d.DemoController                     : Inside async supply.
c.s.h.c.l.s.f.RequestLoggingFilter       : {"request":"/something","referer":"-","response_sent_at":"2018-05-14T05:16:12.781Z","response_status":200,"method":"GET","response_size_b":-1,"request_size_b":-1,"remote_port":"redacted","layer":"[SERVLET]","remote_host":"redacted","x_forwarded_for":"-","remote_user":"-","protocol":"HTTP/1.1","remote_ip":"redacted","response_content_type":"-","request_received_at":"2018-05-14T05:16:12.759Z","response_time_ms":22.124503,"direction":"IN"}
c.s.h.c.l.c.LogContext                   : generated new correlation id
c.s.h.c.l.s.f.RequestLoggingFilter       : {"request":"/something","referer":"-","response_sent_at":"2018-05-14T05:16:12.817Z","response_status":201,"method":"GET","response_size_b":10,"request_size_b":-1,"remote_port":"redacted","layer":"[SERVLET]","remote_host":"redacted","x_forwarded_for":"-","remote_user":"-","protocol":"HTTP/1.1","remote_ip":"redacted","response_content_type":"text/plain;charset=UTF-8","request_received_at":"2018-05-14T05:16:12.803Z","response_time_ms":13.84298,"direction":"IN"}

Configure default dashboard

Hello,

Is there a way to set the default dashboard (homepage) shown when accessing Kibana from SCP? Currently, it always displays the standard dashboards, and I would like to show a custom one we have created for our application.

Hugs

Provide default Filters used in RequestLoggingFilter by static method

When customising the RequestLoggingFilter, a developer cannot get the default filters. This makes it harder to replace one of the filters with their own implementation. There should be an additional static method to retrieve the filters, e.g.:

public static Filter[] getDefaultFilters() {
    return new Filter[] {
        new AddVcapEnvironmentToLogContextFilter(),
        new AddHttpHeadersToLogContextFilter(),
        new CorrelationIdFilter(),
        new DynamicLogLevelFilter(),
        new GenerateRequestLogFilter()
    };
}
 
public RequestLoggingFilter() {
    super(getDefaultFilters());
}

A custom subclass of RequestLoggingFilter could use this method to substitute the DynamicLogLevelFilter, for example:

public CustomRequestLoggingFilter() {
  super(Stream.of(getDefaultFilters())
    .map(f -> f instanceof DynamicLogLevelFilter ? new CustomDynamicLogLevelFilter() : f)
    .toArray(Filter[]::new)
  );
}

Provide option to not emit fields with default values "-".

cf-java-logging-support attaches a default set of fields to every log message, defined as CTX_FIELDS in LogContext:

private static Map<String, String> CTX_FIELDS = new HashMap<String, String>() {
    {
        put(Fields.CORRELATION_ID, Defaults.UNKNOWN);
        put(Fields.TENANT_ID, Defaults.UNKNOWN);
        put(Fields.TENANT_SUBDOMAIN, Defaults.UNKNOWN);
        put(Fields.COMPONENT_ID, Defaults.UNKNOWN);
        put(Fields.COMPONENT_NAME, Defaults.UNKNOWN);
        put(Fields.COMPONENT_TYPE, Defaults.COMPONENT_TYPE);
        put(Fields.COMPONENT_INSTANCE, Defaults.COMPONENT_INDEX);
        put(Fields.CONTAINER_ID, Defaults.UNKNOWN);
        put(Fields.ORGANIZATION_ID, Defaults.UNKNOWN);
        put(Fields.ORGANIZATION_NAME, Defaults.UNKNOWN);
        put(Fields.SPACE_ID, Defaults.UNKNOWN);
        put(Fields.SPACE_NAME, Defaults.UNKNOWN);
    }
};

These fields are populated from environment variables using VcapEnvReader, except for CORRELATION_ID, TENANT_ID and TENANT_SUBDOMAIN. If the variables are unknown, cf-java-logging-support will still add the fields to the generated log message with a value of "-". This behaviour enlarges the log messages without giving any benefit.

Compare the following test messages, the first containing the default values, the second with the respective fields removed:

{ "written_at":"2021-02-28T09:18:33.452Z","written_ts":1614503913455790000,"tenant_id":"-","component_type":"application","component_id":"-","space_name":"-","component_name":"-","component_instance":"0","organization_id":"-","correlation_id":"-","organization_name":"-","space_id":"-","container_id":"-","tenant_subdomain":"-","type":"log","logger":"com.sap.hcp.cf.logging.common.TestAppLog","thread":"main","level":"INFO","categories":[],"msg":"Running test()" }
{ "written_at":"2021-02-28T09:19:06.653Z","written_ts":1614503946656499000,"component_type":"application","component_instance":"0","type":"log","logger":"com.sap.hcp.cf.logging.common.TestAppLog","thread":"main","level":"INFO","categories":[],"msg":"Running test()" }

Their sizes are 465 vs. 268 characters; in this example, about 200 bytes of unnecessary log volume were created. There should be an option to suppress the generation of context fields with default values, so that the library would generate the second message of the example instead of the first.
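A hypothetical configuration sketch of such a switch for the logback encoder, mirroring the sendDefaultValues attribute that the log4j2 layout exposes as of v3.5.3 (see the issue on invalid JSON below); the element name here is illustrative, not an existing API:

<appender name="STDOUT-JSON" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="com.sap.hcp.cf.logback.encoder.JsonEncoder">
        <!-- hypothetical switch: drop context fields that would only carry the "-" default -->
        <sendDefaultValues>false</sendDefaultValues>
    </encoder>
</appender>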

Is there a way to use this library and keep the console logs sane

Our project is using logback and we are trying to improve the Kibana logging for our CF application. We have bound the usual application-logs service, but out-of-the-box the log level of our Java server logs is not properly recognized and stack traces get logged as multiple records.

Now when I follow the configuration here:

    <appender name="STDOUT-JSON" class="ch.qos.logback.core.ConsoleAppender">
        <encoder class="com.sap.hcp.cf.logback.encoder.JsonEncoder"/>
    </appender>

I always see this JSON log on the console, which is not acceptable for me. I actually don't want to change my console logs in any way.

Do you provide an appender that writes into logstash directly?

Like
https://github.com/logstash/logstash-logback-encoder

net.logstash.logback.appender.LogstashSocketAppender
net.logstash.logback.appender.LogstashTcpSocketAppender

Or even populate logstash from files persisted by the application:
https://www.baeldung.com/java-application-logs-to-elastic-stack

Generated correlation ID overwritten

In our new project I used version 3.0.0, and later 3.0.2, but no correlation ID appeared in the logs, although I had configured RequestLoggingFilter as a bean in our Spring Boot 2 app.
Debugging revealed that a correlation ID is generated (and a log is written documenting that), but then the correlation ID in the MDC is overwritten by accessing the request again; as the request has no correlation ID header, UNKNOWN (i.e. "-") is written to the MDC (which is NOT logged).
I downgraded to version 2.2.3, which we used in an older project, and that works without any other changes: correlation IDs are generated, written to the MDC, and appear in the logs.
Is this a new feature? Should I do something in our code to recover the "old" behavior?
Thanks & best regards
D042555

Allow dynamic log level based on Logger names

Introduction

Log levels are usually configured by logger names. Typically the fully qualified class name is used. This allows configuration of log levels on a package basis. The current dynamic log level feature only allows changing the log level globally, without any finer control. There should be a way to control for which classes the log level is changed dynamically.

Proposal

Provide new configurable filters for logback and log4j2 that can be used to apply the dynamic log levels. The JWT token should be extended by a field containing a comma-separated list of prefixes, by which the logger names are filtered. The dynamic log level should only be applied to loggers that match one of the prefixes.

Log Tenant ID with custom Filter

Hi colleagues,

In our project, we are using SAP's XSUAA library to parse JWT tokens. We want to add a custom Filter to add the Zone ID to the LogContext. Our implementation, aligned with what your sample application does, looks like this:

@Configuration
class LoggingConfiguration {
  private static final Logger LOGGER = LoggerFactory.getLogger(LoggingConfiguration.class);
  private static final Marker MARKER = LoggingMarker.LOGGING.getMarker();

  /**
   * Adds a servlet filter which extracts the correlation id from the request
   * header and adds it to the log context. Filter is set to be applied last
   * (after spring security filter chain)
   */
  @Bean
  public FilterRegistrationBean<Filter> loggingFilter() {
    final FilterRegistrationBean<Filter> filterRegistrationBean = new FilterRegistrationBean<>();
    filterRegistrationBean.setFilter(new LoggingFilter());
    filterRegistrationBean.setName("request-logging");
    filterRegistrationBean.addUrlPatterns("/*");
    filterRegistrationBean.setDispatcherTypes(DispatcherType.REQUEST);
    filterRegistrationBean.setOrder(Integer.MAX_VALUE);
    return filterRegistrationBean;
  }

  private static class TenantIdFilter extends AbstractLoggingFilter {
    @Override
    protected void beforeFilter(HttpServletRequest request, HttpServletResponse response) {
      try {
        String tenantId = SpringSecurityContext.getToken().getZoneId();
        LogContext.add(Fields.TENANT_ID, tenantId);
      } catch (AccessDeniedException ade) {
        LOGGER.info(MARKER, "tenant_id not available, so not written to log context: {}", ade.getMessage());
      }
    }

    @Override
    protected void cleanup(HttpServletRequest request, HttpServletResponse response) {
      LogContext.remove(Fields.TENANT_ID);
    }
  }

  private static class LoggingFilter extends CompositeFilter {
    public LoggingFilter() {
      super(new AddVcapEnvironmentToLogContextFilter(), new AddHttpHeadersToLogContextFilter(),
          new CorrelationIdFilter(), new TenantIdFilter(), new DynamicLogLevelFilter(), new GenerateRequestLogFilter());
    }
  }
}

However, we found that the GenerateRequestLogFilter somehow overrides the tenant_id field in the LogContext. It seems the corresponding header is marked as propagated, and this line of code here overwrites the field from the headers:

This seems strange -- we would have expected that our custom fields are not overwritten. We tried to work around this by setting the header on the HttpServletRequest, but this isn't possible unless you (uglily) subclass HttpServletRequestWrapper. We found that when we set the tenantid header on the response, though, this somehow works (as it does for the correlation id) -- but we don't understand why:

    @Override
    protected void beforeFilter(HttpServletRequest request, HttpServletResponse response) {
      try {
        final String tenantId = SpringSecurityContext.getToken().getZoneId();
        LogContext.add(Fields.TENANT_ID, tenantId);
        if (!response.isCommitted())
          response.addHeader(HttpHeaders.TENANT_ID.getName(), tenantId);
      } catch (AccessDeniedException ade) {
        LOGGER.info(MARKER, "tenant_id not available, so not written to log context: {}", ade.getMessage());
      }
    }

Is this intended (and stable behavior in your api)? Is there a better way to do this?
To me, it seems like addContextTag shouldn't overwrite the LogContext if a field has already been set, especially as the headers should have already been set there by the AddHttpHeadersToLogContextFilter -- but I don't know if there's something else at play here.

Thank you for your support!
Kind regards
Lukas

Custom metrics are not appearing in Kibana

Hey!

I might have missed a point, but how do I actually get my custom metrics into Kibana? I've got this in my service:

2020-08-17T11:30:48.61+0200 [APP/PROC/WEB/0] OUT 2020-08-17 09:30:48.612 INFO 6 --- [ main] c.s.c.c.m.spring.CustomMetricWriter : Starting custom metrics reporting with the following configuration: CustomMetricsConfiguration[interval=30000, enabled=true, metrics=[HelloController.number.of.requests]], metricQuantiles=false

this from my actuator
{"name":"HelloController.number.of.requests","description":null,"baseUnit":null,"measurements":[{"statistic":"COUNT","value":6.0}],"availableTags":[{"tag":"unit","values":["requests"]}]}

and "normal" log messages appear in Kibana.

It would be great if someone could point me in the right direction...

RequestLoggingFilter manipulates inputstream - Streaming not achievable

Hi colleagues,

We are from "C21 - Sustainability services". We are building an application on SCP CF. Previously, we noticed that the usage of the library cf-java-logging-support-servlet breaks the streaming ability of the application.

When we dug through the underlying code in RequestLoggingFilter.java, we noticed this piece:

            if (wrapResponse) {
                responseWrapper = new ContentLengthTrackingResponseWrapper(httpResponse);
            }
            if (wrapRequest) {
                requestWrapper = new ContentLengthTrackingRequestWrapper(httpRequest);
            }

This consumes the InputStream and OutputStream, which breaks the streaming ability.

Please check here : https://stackoverflow.com/questions/39748536/spring-upload-non-multipart-file-as-a-stream

Can you propose a solution for this?

The kinds of applications we deal with are:
-> CAP based : OData & REST
-> Spring-Boot based : REST

Thank You
Regards
Sreeram

requests - Support for field tenant_id

Hello All,

Despite the definition of the tenant_id field for requests, I could not find any evidence in the source code that this field is productively supported. Could you please confirm whether this is supported or not?

Best Regards,
Guilherme.

Log4j2 JsonPatternLayout might generate invalid JSON in v3.5.3

With v3.5.3 comes a new feature: fields with default values are no longer emitted in the generated log messages by default. For cf-java-logging-support-log4j2, this can result in invalid JSON messages. The feature can be disabled with the XML attribute sendDefaultValues="true":

<Configuration
    status="warn" strict="true"
    packages="com.sap.hcp.cf.log4j2.converter,com.sap.hcp.cf.log4j2.layout">
    <Appenders>
        <Console name="STDOUT-JSON" target="SYSTEM_OUT" follow="true">
            <JsonPatternLayout charset="utf-8" sendDefaultValues="true" />
        </Console>
        <Console name="STDOUT" target="SYSTEM_OUT" follow="true">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} [%mdc] - %msg%n"/>
        </Console>
    </Appenders>
    <Loggers>
        <Root level="${LOG_ROOT_LEVEL:-WARN}">
            <!-- Use 'STDOUT' instead for human-readable output -->
            <AppenderRef ref="STDOUT-JSON" />
        </Root>
        <!-- request metrics are reported using INFO level, so make sure the instrumentation loggers are set to that level -->
        <Logger name="com.sap.hcp.cf" level="INFO"/>
    </Loggers>
</Configuration>

TenantId not taken from Request

We want to use the field tenant_id in the MDC properties to add our tenant id to the logs. So far, we have set the tenant id in the MDC properties ourselves in our own servlet filter. Would it be possible to add the tenant id to the MDC properties automatically from the request, as is already done for the correlation id? This would be very helpful for our applications.

Dynamic setting of log level for individual tenants

Hi,

As a Developer, Quality Engineer or Operations Expert, I want to set the log level on a per-tenant basis in order to debug issues reported for one tenant only.

This allows me to:

  1. keep the amount of logs below the technical request limitations/plan of the backend logging service,
  2. reduce logs to focus on the problems reported by that specific tenant,
  3. not log potentially critical data of tenants which are not impacted by a dedicated issue (data avoidance/minimization).

Optional: As a Developer, Quality Engineer or Operations Expert, I want to set the log level on a per-tenant basis at runtime, without restarting any service instances, in order to avoid unexpected service interruptions.

Thanks in advance

Add support for Spring MVC

Adding a Servlet Filter does not seem to be the right approach when using Spring MVC (Spring Boot). Instead, a Spring Interceptor should be used.

Custom Dashboard/KQL in SAP SCP Kibana

Hi colleagues,

We are using a SAP CAP OData service on SAP SCP with the SAP SCP Kibana binding. All works well; however, we are not able to create custom KQL queries or custom dashboards. We have created IT ticket https://support.wdf.sap.corp/sap/support/message/2180098526.
They say that custom things are not available in this SAP SCP Kibana, and that we may check https://pages.github.tools.sap/perfx/cloud-logging-service/. However, I cannot find such information there.

Can you say how we can create a custom dashboard/KQL query in Kibana based on the logs of applications in SAP SCP?

Thanks
Regards Peter

Application crashes at startup with LogBack binding

Dear colleagues,

When I push my app into CF, it crashes with the following error messages:
2017-09-22T11:07:11.23+0200 [APP/PROC/WEB/0] OUT 09:07:11,218 |-ERROR in ch.qos.logback.core.joran.util.PropertySetter@26336332 - A "com.sap.hcp.cf.logback.encoder.JsonEncoder" object is not assignable to a "ch.qos.logback.core.encoder.Encoder" variable.
2017-09-22T11:07:11.23+0200 [APP/PROC/WEB/0] OUT 09:07:11,218 |-ERROR in ch.qos.logback.core.joran.util.PropertySetter@26336332 - The class "ch.qos.logback.core.encoder.Encoder" was loaded by
2017-09-22T11:07:11.23+0200 [APP/PROC/WEB/0] OUT 09:07:11,219 |-ERROR in ch.qos.logback.core.joran.util.PropertySetter@26336332 - [ParallelWebappClassLoader
2017-09-22T11:07:11.23+0200 [APP/PROC/WEB/0] OUT context: ROOT
2017-09-22T11:07:11.24+0200 [APP/PROC/WEB/0] OUT delegate: false
2017-09-22T11:07:11.24+0200 [APP/PROC/WEB/0] OUT ----------> Parent Classloader:
2017-09-22T11:07:11.24+0200 [APP/PROC/WEB/0] OUT java.net.URLClassLoader [id=17300,parents=System@14021]

My logback.xml file:

<configuration debug="false" scan="false">
  <appender name="STDOUT-JSON" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="com.sap.hcp.cf.logback.encoder.JsonEncoder" />
  </appender>
  <!-- for local development, you may want to switch to a more human-readable 
    layout -->
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%date %-5level [%thread] - [%logger] [%mdc] - %msg%n
      </pattern>
    </encoder>
  </appender>
  <root level="${LOG_ROOT_LEVEL:-WARN}">
    <!-- Use 'STDOUT' instead for human-readable output -->
    <appender-ref ref="STDOUT-JSON" />
  </root>
  <!-- request metrics are reported using INFO level, so make sure the instrumentation 
    loggers are set to that level -->
  <logger name="com.sap.hcp.cf" level="INFO" />
</configuration>

My pom file:

<dependency>
  <groupId>com.sap.hcp.cf.logging</groupId>
  <artifactId>cf-java-logging-support-logback</artifactId>
  <version>2.0.10</version>
</dependency>
<dependency>
  <groupId>ch.qos.logback</groupId>
  <artifactId>logback-classic</artifactId>
  <version>1.1.11</version>
</dependency>

I use logback 1.1.11 as I want to be on par with the one pulled in by Spring Boot.

Any help is appreciated.

Regards,

Remove dependency to jaxb

In the module cf-java-logging-support-servlet, the classes TokenCreator and PublicKeyReader introduce a dependency on jaxb solely for Base64 encoding. This leads to additional dependencies on Java 11.

Since Java 8 is a prerequisite, java.util.Base64 can be used instead, reducing the dependency graph.
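A minimal sketch of the swap, assuming the jaxb usage boils down to Base64 decoding via DatatypeConverter:

// before: javax.xml.bind pulls in jaxb as an extra dependency on Java 11
byte[] keyBytes = javax.xml.bind.DatatypeConverter.parseBase64Binary(encodedKey);

// after: built into the JDK since Java 8, no extra dependency; the MIME decoder
// tolerates line breaks in PEM-style input, matching DatatypeConverter's lenient parsing
byte[] keyBytes = java.util.Base64.getMimeDecoder().decode(encodedKey);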

Remove StringBuilder in Logback TimestampConverter

TimestampConverter in cf-java-logging-support-logback contains the following code:

    @Override
    public String convert(ILoggingEvent event) {
        StringBuilder appendTo = new StringBuilder();
        Instant now = Instant.now();
        long timestamp = now.getEpochSecond() * 1_000_000_000L + now.getNano();
        appendTo.append(timestamp);
        return appendTo.toString();
    }

Creating the StringBuilder is a time-consuming way to build the string representation of the long timestamp. This should be changed to a more efficient implementation.
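A minimal sketch of a more direct implementation, keeping the same nanosecond arithmetic but letting Long.toString produce the digits without the StringBuilder round-trip:

    @Override
    public String convert(ILoggingEvent event) {
        Instant now = Instant.now();
        // epoch time in nanoseconds, as before
        long timestamp = now.getEpochSecond() * 1_000_000_000L + now.getNano();
        return Long.toString(timestamp);
    }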

Version referenced in sample app is not in sync

We tend to forget to bump the reference to the library version in the sample pom.xml. This time, it's a real deal breaker, as the app will crash with:

Caused by: java.lang.NoClassDefFoundError: com/sap/hcp/cf/logging/common/converter/DefaultArgsConverter
	at com.sap.hcp.cf.log4j2.converter.ArgsConverter.<init>(ArgsConverter.java:26)
	at com.sap.hcp.cf.log4j2.converter.ArgsConverter.newInstance(ArgsConverter.java:38)
	... 69 more
Caused by: java.lang.ClassNotFoundException: com.sap.hcp.cf.logging.common.converter.DefaultArgsConverter
	at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1293)
	at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1145)
	... 71 more

I suggest that we fix this once and for all, either by removing this line or by changing it to

        <cf-logging.version>${project.version}</cf-logging.version>

Anybody have any preferences?

Failure to find com.sap.hcp.cf.logging:java-logging-support-log4j2

Hi Experts,

I am trying to build (clean install) a project within WebIDE. However, it fails. Below is the error.

(Java Build) [ERROR] Failed to execute goal on project cf_web_app: Could not resolve dependencies for project com.company.java_web_ide:cf_web_app:war:0.0.1-SNAPSHOT: Failure to find com.sap.hcp.cf.logging:java-logging-support-log4j2:jar:3.0.0 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced -> [Help 1]

Feature request: EclipseLink support

Currently, the library does not format log messages emitted by EclipseLink. It would be great if (debug) log messages like the one shown below were printed in JSON format. I'd appreciate native support, but some guidelines on how to modify the code so that the log messages are routed to SLF4J would also be appreciated.

[EL Fine]: sql: 2016-06-23 12:00:12.706--ServerSession(400836922)--Connection(1567361246)--Thread(Thread[http-apr-8080-exec-2,5,main])--SELECT COUNT(ID) FROM XXX
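One possible route, assuming EclipseLink 2.7 or later (where an SLF4J bridge ships in the org.eclipse.persistence.extension artifact), is to point EclipseLink's logging at SLF4J in persistence.xml; the JSON encoding then happens in the configured SLF4J backend as usual:

<persistence-unit name="my-unit">
    <properties>
        <!-- route EclipseLink log output to SLF4J; logger class from the extension artifact -->
        <property name="eclipselink.logging.logger" value="org.eclipse.persistence.logging.slf4j.SLF4JLogger"/>
    </properties>
</persistence-unit>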

log4j2 doesn't work in spring

Even though we disable the standard logging in Spring and add the log4j2 implementation, the SAP Java Buildpack already loads logback via its BOM. So whenever you try to use a log4j2 configuration/setup, it only works locally. In CF you will get errors about conflicting implementations, because both logback and log4j2 are on the classpath.

How to add Request Context fields (such as remote_user) on all log message performed in the context of a Request?

Hi.

I would like the logs written in the context of request processing to include some fields from the request context (such as remote_user).

Let's say I have a controller that calls a service that performs some operation, and in the middle of this operation I want to log that something went wrong. I would like this log in Kibana to have the field remote_user filled with the user that called the endpoint defined in the controller.
In this project I'm using Spring Security, and authentication is done using JWT tokens from SAP XSUAA.

Is there a configuration that I can use to add this information (and maybe others in the future) to all logs written this way?

custom_fields was not included in request log

We want to use some custom_fields to filter request logs in Kibana, but custom_fields are not included in the request logs generated by RequestLoggingFilter.

Is it possible to include custom_fields in request logs?

Logging request header and body

Hi

Do we have the capability of logging request information like request headers and the request body? If yes, how do we enable it? If not, are there any plans for it?

Regards

Micrometer custom metrics on Kibana

Hi experts,

I followed the sample code below to create some custom metrics using micrometer.

this.counter = Metrics.counter("demo.controller.number.of.requests", "unit", "requests");
List<Tag> tags = new ArrayList<>(Arrays.asList(Tag.of("parallel", "clients")));
this.concurrentHttpRequests = Metrics.gauge("demo.controller.number.of.clients.being.served", tags,
        new AtomicInteger(0));
this.longTimer = Metrics.more().longTaskTimer("demo.controller.time.spends.in.serving.clients");

I am able to deploy and run the application. However, I can't find these custom metrics in the Kibana dashboard. Any thoughts?

Deprecation of cf-java-logging-support-jersey

Hi,

as the maintainer of cf-java-logging-support I want to deprecate the Jersey support of the library. This means everything in the artifact cf-java-logging-support-jersey. This part has not been extended in the same way the servlet instrumentation has been improved. For any new features you already need to use cf-java-logging-support-servlet instead.

If you use cf-java-logging-support-jersey and need it to be maintained, please comment on this issue.

Best Regards,
Karsten

Errors on application startup

Hi Guys,
from time to time we get some errors on application startup:

2018-03-15T14:28:32.486+0000 [APP/PROC/WEB/0] OUT 14:28:32,486 |-ERROR in ch.qos.logback.core.joran.spi.Interpreter@53:30 - no applicable action for [springProfile], current ElementPath is [[configuration][root][springProfile]]
2018-03-15T14:28:32.495+0000 [APP/PROC/WEB/0] OUT 14:28:32,495 |-ERROR in ch.qos.logback.core.joran.spi.Interpreter@54:38 - no applicable action for [appender-ref], current ElementPath is [[configuration][root][springProfile][appender-ref]]
2018-03-15T14:28:32.496+0000 [APP/PROC/WEB/0] OUT 14:28:32,495 |-ERROR in ch.qos.logback.core.joran.spi.Interpreter@57:31 - no applicable action for [springProfile], current ElementPath is [[configuration][root][springProfile]]
2018-03-15T14:28:32.496+0000 [APP/PROC/WEB/0] OUT 14:28:32,495 |-ERROR in ch.qos.logback.core.joran.spi.Interpreter@58:33 - no applicable action for [appender-ref], current ElementPath is [[configuration][root][springProfile][appender-ref]]

After the errors occur, Cloud Foundry restarts our application and then it works (no errors the second time). Hopefully you can help me.

Best regards,
Steve

Maven warning for com.sun.xml.bind on JDK11

I am using latest version v3.0.7 on SapMachine 11.0.6.0.1+10-LTS.

mvn clean install (-X for debug output) leads to the following output:

[WARNING] The POM for com.sun.xml.bind:jaxb-core:jar:2.2.11 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] The POM for com.sun.xml.bind:jaxb-impl:jar:2.2.11 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details

with debug info

[WARNING] The POM for com.sun.xml.bind:jaxb-core:jar:2.2.11 is invalid, transitive dependencies (if any) will not be available: 1 problem was encountered while building the effective model for com.sun.xml.bind:jaxb-core:2.2.11
[ERROR] 'dependencyManagement.dependencies.dependency.systemPath' for com.sun:tools:jar must specify an absolute path but is ${tools.jar} @

[WARNING] The POM for com.sun.xml.bind:jaxb-impl:jar:2.2.11 is invalid, transitive dependencies (if any) will not be available: 1 problem was encountered while building the effective model for com.sun.xml.bind:jaxb-impl:2.2.11
[ERROR] 'dependencyManagement.dependencies.dependency.systemPath' for com.sun:tools:jar must specify an absolute path but is ${tools.jar} @

A similar issue was solved here authzforce/restful-pdp#4 (comment) by replacing it with the post-java8 dependencies.

<dependency>
    <groupId>javax.xml.bind</groupId>
    <artifactId>jaxb-api</artifactId>
    <version>2.3.0</version>
</dependency>
<dependency>
    <groupId>org.glassfish.jaxb</groupId>
    <artifactId>jaxb-runtime</artifactId>
    <version>2.3.0</version>
    <scope>runtime</scope>
</dependency>

Cross-Site Scripting: Reflected

@PostMapping("/log/{logger}/{logLevel}")
public ResponseEntity<String> generateLog(@PathVariable("logger") String loggerName,
        @PathVariable("logLevel") String logLevel,
        @RequestParam(name = "m", required = false, defaultValue = DEFAULT_LOG_MESSAGE) String message) {
    Logger logger = LoggerFactory.getLogger(loggerName);
    switch (logLevel.toLowerCase()) {
    case "error":
        logger.error(message);
        return ResponseEntity.ok().body("Generated error log with message: \"" + message + "\".");
    case "warn":
    case "warning":
        logger.warn(message);
        return ResponseEntity.ok().body("Generated warn log with message: \"" + message + "\".");
    case "info":
    case "informational":
        logger.info(message);
        return ResponseEntity.ok().body("Generated info log with message: \"" + message + "\".");
    case "debug":
        logger.debug(message);
        return ResponseEntity.ok().body("Generated debug log with message: \"" + message + "\".");
    case "trace":
        logger.trace(message);
        return ResponseEntity.ok().body("Generated trace log with message: \"" + message + "\".");
    }
    return ResponseEntity.badRequest().body("Unknown log level \"" + logLevel + "\".");
}

Sending unvalidated data to a web browser can result in the browser executing malicious code.

In line 33, 'message' is tainted; it could affect lines 38, 42, 46, 49, and 52.

Emit numbers with original format instead of strings in custom fields

When I use something like CustomField.customField("result", 23) and check the log message in Kibana, the output looks like:

"custom_fields": {
    "result": "23"
}

This makes it difficult to do numeric comparisons or aggregations like avg, sum, min, max in Kibana. It would be nice to emit numeric values as-is. As a proposal, it should be possible to add another overload CustomField.customField(String key, Number value) and emit numbers in their original format.
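A sketch of the proposed overload; the constructor call is illustrative, since this report only specifies the desired factory signature:

public static CustomField customField(String key, Number value) {
    // keep the value as a Number so the JSON encoder can emit 23 instead of "23"
    return new CustomField(key, value); // illustrative internals, not existing API
}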

Update logback-classic version

The logback configuration example references version 1.1.3 of logback-classic. With that, I get the exception shown below upon startup. With versions 1.1.6 or later, it works as expected.

Failed to instantiate [ch.qos.logback.classic.LoggerContext]
Reported exception:
java.lang.NoSuchMethodError: ch.qos.logback.core.util.Loader.getResourceOccurrenceCount(Ljava/lang/String;Ljava/lang/ClassLoader;)Ljava/util/Set;
    at ch.qos.logback.classic.util.ContextInitializer.multiplicityWarning(ContextInitializer.java:173)
    at ch.qos.logback.classic.util.ContextInitializer.statusOnResourceSearch(ContextInitializer.java:196)
    at ch.qos.logback.classic.util.ContextInitializer.getResource(ContextInitializer.java:143)
    at ch.qos.logback.classic.util.ContextInitializer.findURLOfDefaultConfigurationFile(ContextInitializer.java:137)
    at ch.qos.logback.classic.util.ContextInitializer.autoConfig(ContextInitializer.java:150)
    at org.slf4j.impl.StaticLoggerBinder.init(StaticLoggerBinder.java:85)
    at org.slf4j.impl.StaticLoggerBinder.<clinit>(StaticLoggerBinder.java:55)
    at org.slf4j.LoggerFactory.bind(LoggerFactory.java:150)
    at org.slf4j.LoggerFactory.performInitialization(LoggerFactory.java:124)
    at org.slf4j.LoggerFactory.getILoggerFactory(LoggerFactory.java:412)
    at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:357)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:155)
    at org.apache.commons.logging.impl.SLF4JLogFactory.getInstance(SLF4JLogFactory.java:132)
    at org.apache.commons.logging.LogFactory.getLog(LogFactory.java:273)
    at org.springframework.boot.SpringApplication.<clinit>(SpringApplication.java:190)

Provide extension points for custom JWT post-processings

Introduction

The dynamic log level feature uses JWT tokens from HTTP headers to specify the desired log level. These JWT tokens are validated and decoded to extract the relevant information. If a developer wants to extract further information from the same JWT token, this process cannot easily be accessed. This limits the possibility to extract more fields to enrich the log messages with, e.g. a tenant id. Furthermore, a developer may want to apply their own checks on the validity of the log level change and execute their own logic, e.g. create audit logs.

Proposal

Allow developers to access the decoded JWT tokens by subclassing DynamicLogLevelProcessor. Introduce overridable methods to allow further validation and actions by the subclass. Furthermore, extend RequestLoggingFilter to allow registration or creation of a custom extension of DynamicLogLevelProcessor.
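A hypothetical sketch of such a subclass; the hook method name and token type are illustrative, not existing API:

public class AuditingDynamicLogLevelProcessor extends DynamicLogLevelProcessor {

    @Override
    protected void processToken(DecodedJWT jwt) { // illustrative hook, not existing API
        super.processToken(jwt);
        // custom post-processing, e.g. copy a tenant id claim into the log
        // context or write an audit log entry for the level change
    }
}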

How to escape special characters like \n, \r, \t etc.

Hello,

We are using com.sap.hcp.cf.logback.encoder.JsonEncoder to encode our log messages before sending them to ELK. But when we view the logs in Kibana, special characters like \n, \t, \r get interpreted and the text breaks onto a new line. If I want to escape them and display them as-is, as a literal string (test new line \n), what property inside the encoder can I use?

Or is there a way to replace these special characters with something else, like %replace(p){'r','t'} in a normal logback conversion pattern?

I would also like to know whether we need to take any precautions to prevent log forging when we send log messages through the encoder com.sap.hcp.cf.logback.encoder.JsonEncoder?
Please let me know.
Thank you!

Performance issue with high concurrency

Hi,
We are developing a SaaS application based on sap_java_buildpack and deploying it on SCP CF.
The application uses 'com.sap.hcp.cf.logging:cf-java-logging-support-logback' and 'com.sap.hcp.cf.logging:cf-java-logging-support-servlet'. The application also binds the application-logs service provided by SCP.

Currently we are testing the performance of the application with JMeter.
Scenario 1 (application log level is info):
the ReentrantLock took up more than 2 seconds, and the average response time is about 3 seconds.

Scenario 2 (application log level is error):
Once we changed the log level to error, there is hardly any ReentrantLock contention, and the average response time is less than 1 second.

Considering the above two scenarios, it seems there is an I/O performance issue in the application-logs service under high concurrency, so that all logging threads need to wait for each other.

Could you take a look and give me some suggestions?

Thanks,
Simon
