outr / scribe

The fastest logging library in the world. Built from scratch in Scala and programmatically configurable.

License: MIT License


scribe's Introduction

scribe


Scribe is a completely different way of thinking about logging. Instead of wrapping existing logging frameworks and inheriting their performance and design flaws, Scribe is built from the ground up to provide fast and effective logging in Scala, Scala.js, and Scala Native without requiring configuration files or additional dependencies. All logging can be managed programmatically in Scala itself (classic logging configuration can still be used if desired), giving developers the freedom to use whatever configuration framework, if any, they choose.

Availability

Scribe is available on the JVM, Scala.js, and Scala Native, cross-compiled for Scala 2.12, 2.13, and 3.

Quick Start

For people who want to skip the explanations and see it in action, this is the place to start!

Dependency Configuration

libraryDependencies += "com.outr" %% "scribe" % "3.13.4"

For Cross-Platform projects (JVM, JS, and/or Native):

libraryDependencies += "com.outr" %%% "scribe" % "3.13.4"

Or, if you want SLF4J interoperability (so that existing libraries logging through other frameworks route into Scribe):

libraryDependencies += "com.outr" %% "scribe-slf4j" % "3.13.4"

Usage

scribe.info("Yes, it's that simple!")

SBT Tip

Scribe's default logger supports automatic line wrapping, but in SBT the [info] prefixes throw that alignment off. It's recommended to set:

outputStrategy := Some(StdoutOutput)

This will disable the [info] and [error] prefixes so logging looks correct when running your application within SBT.

Why Another Logging Framework

Yes, we know there are too many Java logging frameworks to count, and a fair number of decent logging frameworks in Scala, so why write yet another one? Because nearly every Scala logging framework is mostly just a wrapper around a Java logging framework (usually SLF4J, Log4J, or Logback). This comes with a few problems:

  1. No support for Scala.js
  2. No support for Scala Native
  3. Performance cost (Blog Post: https://matthicks.com/2018/02/06/scribe-2-0-fastest-jvm-logger-in-the-world/)
  4. Additional dependencies
  5. Substantial cost for logging method and line numbers
  6. Lack of programmatic configuration support

A few of the main features that Scribe offers (see the wiki for a complete list):

  1. Performance is a critical consideration. We leverage Macros to optimize everything possible at compile-time so logging doesn't slow down your production application. As far as we are aware, Scribe is the fastest logging framework on the JVM.
  2. Programmatic configuration. No need to be bound to configuration files to configure your logging. You can rely on any configuration framework, or make real-time changes to your logging configuration in your production environment. This comes in handy if you need to enable debug logging when something is going wrong in production: no need to restart your server, simply provide a mechanism to modify the logging configuration in real-time.
  3. Clean logging. Macros allow us to introduce logging into a class via an import instead of a mix-in or unnecessary setup code.
  4. Zero cost class, method, and line number logging built-in. Never worry about your logger working up the stack to figure out the position of the logging statement at runtime. With Macros we determine that information at compile-time to avoid any runtime cost.
  5. Asynchronous logging support. Scribe's logger is very fast, but if real-time performance is critical, the asynchronous logging support removes logging work from your application's threads entirely.
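The programmatic-configuration point above can be illustrated with a minimal, self-contained sketch. This is not Scribe's actual API (MiniLogging, setMinimumLevel, and the Level values are hypothetical names); it only demonstrates the idea of changing log levels at runtime from code, with no configuration file involved:

```scala
// Minimal sketch of programmatic logging configuration (NOT Scribe's real
// API): a registry whose minimum level can be changed at runtime, e.g. to
// enable debug output in production without a restart.
object MiniLogging {
  sealed abstract class Level(val value: Int)
  case object Debug extends Level(1)
  case object Info  extends Level(2)
  case object Error extends Level(3)

  @volatile private var minimumLevel: Level = Info
  private val buffer = scala.collection.mutable.ListBuffer.empty[String]

  // Reconfigure in code, at any point during the application's lifetime.
  def setMinimumLevel(level: Level): Unit = minimumLevel = level

  // Records below the minimum level are discarded.
  def log(level: Level, message: String): Unit =
    if (level.value >= minimumLevel.value) buffer += s"[$level] $message"

  def output: List[String] = buffer.toList
}
```

A production "enable debug now" switch is then just a call to `setMinimumLevel(Debug)` from whatever control channel the application already exposes.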

Documentation

Check out the wiki for complete documentation.

Community

The best way to receive immediate feedback for any questions is via our Gitter channel.

Acknowledgements

YourKit supports open source projects with its full-featured Java Profiler. YourKit, LLC is the creator of YourKit Java Profiler and YourKit .NET Profiler, innovative and intelligent tools for profiling Java and .NET applications.

scribe's People

Contributors

asakaev, blazingsiyan, ckipp01, cornerman, darkfrog26, densh, greyplane, keynmol, lolgab, philippus, quafadas, rlebran, shawjef3, sungjk, tindzk, yuriy-yarosh


scribe's Issues

Mapping over FormatBlock

Before release 2.0 I had my thread name truncated, replacing a prefix like akka.actor.default-dispatcher with something shorter. As of now I am not sure how I would implement something like this (short of implementing my own FormatBlock):

val formatter = FormatterBuilder()
    .add(l => "[" + l.threadName.replaceFirst("server-akka.actor.default-dispatcher-", "") + "]")
    .string(" ")
    .message.newLine

Can we have a def map(f: String => String): FormatBlock method in FormatBlock?

val truncatedThreadName = threadName.map(_.replaceFirst("server-akka.actor.default-dispatcher-", ""))
val logFormatter: Formatter = formatter"[${truncatedThreadName}] $message$newLine"
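The requested `map` combinator can be sketched on a simplified, self-contained block trait. Note this is an illustration only: Scribe's real FormatBlock formats a full LogRecord, while the `Block` trait below takes just the thread name as input:

```scala
// Sketch of a `map` combinator on a minimal FormatBlock-like trait
// (simplified; Scribe's actual FormatBlock operates on a LogRecord).
trait Block { self =>
  def format(threadName: String): String

  // Post-process this block's output with an arbitrary String => String.
  def map(f: String => String): Block = new Block {
    def format(threadName: String): String = f(self.format(threadName))
  }
}

object Blocks {
  val threadName: Block = new Block {
    def format(tn: String): String = tn
  }

  // Truncate the noisy dispatcher prefix, as in the issue's example.
  val truncatedThreadName: Block =
    threadName.map(_.replaceFirst("server-akka.actor.default-dispatcher-", ""))
}
```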

Better AsynchronousLogHandler

Better support for AsynchronousLogHandler using a ConcurrentLinkedQueue and a background thread pulling. Include support for overflow (block or drop) and max queue size.
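The proposed design can be sketched as follows. All names here are hypothetical (this is not Scribe's AsynchronousLogHandler), and the overflow handling is deliberately simplified, not fully race-free under many concurrent publishers:

```scala
import java.util.concurrent.ConcurrentLinkedQueue
import java.util.concurrent.atomic.AtomicInteger

// Sketch of the proposed handler: a bounded ConcurrentLinkedQueue drained by
// a daemon thread, with an overflow policy of either dropping the record or
// blocking the caller until space frees up. Hypothetical names throughout.
final class AsyncQueueHandler(maxQueueSize: Int,
                              dropOnOverflow: Boolean,
                              write: String => Unit) {
  private val queue = new ConcurrentLinkedQueue[String]()
  private val size  = new AtomicInteger(0)

  private val worker = new Thread(() => {
    while (true) {
      val record = queue.poll()
      if (record == null) Thread.sleep(1)
      else { write(record); size.decrementAndGet() }
    }
  })
  worker.setDaemon(true) // never keeps the JVM alive
  worker.start()

  def publish(record: String): Unit = {
    if (size.get() >= maxQueueSize) {
      if (dropOnOverflow) return         // drop policy: discard the record
      while (size.get() >= maxQueueSize) // block policy: wait for the worker
        Thread.sleep(1)
    }
    size.incrementAndGet()
    queue.add(record)
  }

  // Wait until every published record has been written.
  def flush(): Unit = while (size.get() > 0) Thread.sleep(1)
}
```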

Lighter Logger

We should support a simple Logger trait and a LoggerSettings coupled to the same namespace to support configuration (parent, handlers, levels, filters, etc.).

ExecutionContext implicit for Future traceback

Currently, there is no good way to trace back the original caller of a Future in Scala. Providing a Macro-based ExecutionContext could offer such a feature, exposing additional stack-trace information as well as non-stack-derived traceback.

FileNotFoundException with daily filewriter

At midnight, I just got a FileNotFoundException in my application from logging. It came from a rotating file writer (FileWriter.daily(nio = false)); somehow the file was not there yet.

[ERROR] [05/11/2018 00:00:25.108] [server-akka.actor.default-dispatcher-8] [akka://server/user/$l] logs/app.2018-05-11.log
core java.nio.file.NoSuchFileException: logs/app.2018-05-11.log
core 	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
core 	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
core 	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
core 	at sun.nio.fs.UnixFileSystemProvider.isSameFile(UnixFileSystemProvider.java:338)
core 	at java.nio.file.Files.isSameFile(Files.java:1504)
core 	at scribe.writer.FileIOWriter.$anonfun$validate$1(FileIOWriter.scala:27)
core 	at scribe.writer.FileIOWriter.$anonfun$validate$1$adapted(FileIOWriter.scala:27)
core 	at scala.Option.exists(Option.scala:240)
core 	at scribe.writer.FileIOWriter.validate(FileIOWriter.scala:27)
core 	at scribe.writer.FileIOWriter.write(FileIOWriter.scala:18)
core 	at scribe.handler.SynchronousLogHandler.$anonfun$log$3(SynchronousLogHandler.scala:19)
core 	at scribe.handler.SynchronousLogHandler.$anonfun$log$3$adapted(SynchronousLogHandler.scala:18)
core 	at scala.Option.foreach(Option.scala:257)
core 	at scribe.handler.SynchronousLogHandler.log(SynchronousLogHandler.scala:18)
core 	at scribe.Logger.$anonfun$log$4(Logger.scala:39)
core 	at scribe.Logger.$anonfun$log$4$adapted(Logger.scala:39)
core 	at scala.collection.immutable.List.foreach(List.scala:389)
core 	at scribe.Logger.$anonfun$log$3(Logger.scala:39)
core 	at scribe.Logger.$anonfun$log$3$adapted(Logger.scala:38)
core 	at scala.Option.foreach(Option.scala:257)
core 	at scribe.Logger.log(Logger.scala:38)
core 	at scribe.Logger.$anonfun$log$6(Logger.scala:40)
core 	at scribe.Logger.$anonfun$log$6$adapted(Logger.scala:40)
core 	at scala.Option.foreach(Option.scala:257)
core 	at scribe.Logger.$anonfun$log$3(Logger.scala:40)
core 	at scribe.Logger.$anonfun$log$3$adapted(Logger.scala:38)
core 	at scala.Option.foreach(Option.scala:257)
core 	at scribe.Logger.log(Logger.scala:38)
core 	at scribe.package$.log(package.scala:7)
core 	at covenant.ws.api.ApiRequestHandler.onClientConnect(ApiRequestHandler.scala:24)
core 	at mycelium.server.ConnectedClient.connected(ConnectedClient.scala:62)
core 	at mycelium.server.ConnectedClient$$anonfun$receive$1.applyOrElse(ConnectedClient.scala:67)
core 	at akka.actor.Actor.aroundReceive(Actor.scala:517)
core 	at akka.actor.Actor.aroundReceive$(Actor.scala:515)
core 	at mycelium.server.ConnectedClient.aroundReceive(ConnectedClient.scala:20)
core 	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:588)
core 	at akka.actor.ActorCell.invoke(ActorCell.scala:557)
core 	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
core 	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
core 	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
core 	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
core 	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
core 	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
core 	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)

FileWriterManager

We need to be able to couple lots of features like:

  • Maximum number of logs (LogFileManager)
  • Rolling logs (PathBuilder)
  • Maximum sized files (PathBuilder)
  • Date formatted files (PathBuilder)
  • Compression of files (LogFileManager)

In #67 we could easily write a rolling FileLoggingManager that rolls when it reaches a certain size, but it doesn't properly handle compression, a maximum number of files, etc. that may be useful to other types. We should create a builder pattern that allows mixing these features together in a more powerful way.

If we create a FileWriterManager that maintains paths, writers (removing the need for IO and NIO variants of Writer), etc. this would help both increase performance as well as maximize flexibility.

Misleading memory consumption bar chart

The readme shows a bar chart comparing memory consumption of scribe and scala-logging. Because the y-axis is cut off at some arbitrary value (590... in which unit, by the way?), it makes the difference seem enormous while in reality it is much smaller.

I really like this library and the performance information provided is nice, but I think we should show the data in an honest way. When recommending the library to other people, I already got negative feedback about this chart. Can we change this?

Exceptions are stringified too early

LogRecord has no trace of exceptions, except in the message.

This makes it hard to implement handlers that are interested in exceptions only.
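What the issue asks for amounts to carrying the Throwable as a separate field on the record instead of folding it into the message. A minimal sketch (the `Record` type and handler here are hypothetical, not Scribe's actual LogRecord):

```scala
// Sketch: keep the Throwable separate from the message so handlers can
// pattern-match on it directly instead of parsing the stringified message.
final case class Record(message: String, throwable: Option[Throwable])

object ExceptionOnlyHandler {
  // A handler interested only in records that carry an exception.
  def collect(records: List[Record]): List[String] =
    records.collect {
      case Record(msg, Some(t)) => s"$msg: ${t.getClass.getSimpleName}"
    }
}
```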

Asynchronous logging for JVM

My understanding is that JavaScript is always single-threaded, so I am unsure whether this only applies to the JVM implementation.

Logging frameworks such as Log4j use asynchronous logging. This has the benefits of

  • exceptions while logging happen in another thread, not your own
  • CPU time used for logging does not delay your thread (if a hardware thread is available)

I have a benchmark showing Scribe's overhead vs. log4j 2.10. Note that this measures the thread-local CPU time used when calling log methods, and is not intended to show anything else. Because log4j is asynchronous, I expect its logging calls to return more quickly.

[info] Benchmark                            Mode  Cnt   Score    Error  Units
[info] LocalThreadOverhead.baseLine         avgt    5   0.004 ±  0.001  ns/op
[info] LocalThreadOverhead.withJavaLogging  avgt    5  20.766 ±  0.635  ns/op
[info] LocalThreadOverhead.withLog4j        avgt    5   1.746 ±  0.019  ns/op
[info] LocalThreadOverhead.withScribe       avgt    5  47.656 ±  6.855  ns/op

See https://github.com/shawjef3/scribe/blob/benchmarks/benchmarks/src/main/scala/scribe/benchmarks/LocalThreadOverhead.scala for the implementation.

Duplicate date value in package scribe and scribe.format

With the new date implementation (:+1:), we have a conflicting symbol date in packages scribe and scribe.format.

This happens when using format and scribe import. There is the scribe.date package and the format value date:

import scribe._
import scribe.format._

val formatter = formatter"$date bla"

Can we do something about this?

Faster than Log4j

Currently, because of all the asynchronous magic log4j does, it's faster than Scribe. This should be dealt with. There's no reason Scribe shouldn't be faster than any Java solution.

Configuring scribe.slf4j.ScribeLoggerAdapter

All output from slf4j-dependent code goes through scribe with scribe-slf4j, but how does one configure the Logger? Clearing handlers on the root Logger and setting Level there seems to have no effect. ScribeLoggerAdapter logs on DEBUG level.

Rolling FileWriter with maximum size

I would like to have a logger which writes to a file but occupies only a fixed maximum size on disk. I just want to keep the most recent xxx mb of these logs.

Can we therefore have a rolling file writer with a maximum bytes per file and maximum number of files setting? Or is this already possible?
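The bookkeeping such a writer needs can be sketched as pure logic, independent of any file I/O or of Scribe's FileWriter API (both helpers below are hypothetical):

```scala
// Pure sketch of the size/count decisions a rolling file writer would make.
object Rolling {
  // Roll to a new file once the current one would exceed maxBytes.
  def shouldRoll(currentBytes: Long, nextRecordBytes: Long, maxBytes: Long): Boolean =
    currentBytes + nextRecordBytes > maxBytes

  // Keep only the newest maxFiles entries; return the ones to delete.
  def toDelete(filesOldestFirst: List[String], maxFiles: Int): List[String] =
    filesOldestFirst.dropRight(maxFiles)
}
```

A real implementation would call `shouldRoll` before each write and `toDelete` after each roll, deleting the returned paths from disk.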

FileWriter Builder Pattern

In 2.5 basic builder pattern support was added, but for the more complex scenarios, it is a pain to configure manually. We need to support clean setup of the FileWriter.

Add Better Date Formatting Consistency

In the Scala.js logging, the month and day will show only a single digit if the value is below 10, while on the JVM it will always format to two digits. Though a minor issue, they should be consistent.
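The inconsistency comes down to zero-padding: formatting month and day at a fixed width makes both platforms agree. A minimal illustration (the helper name is hypothetical):

```scala
// Fixed-width, zero-padded date formatting: identical output on JVM and JS.
object DateFormat {
  def ymd(year: Int, month: Int, day: Int): String =
    f"$year%04d-$month%02d-$day%02d"
}
```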

Timer Support

Add functionality to allow a scoped timer to track elapsed time and other stats easily in logs.

SLF4J: No SLF4J providers were found for SLF4J 1.8 beta.

SLF4J API cannot use scribe-slf4j as a backend.

Please note that slf4j-api version 1.8.x and later use the ServiceLoader mechanism. Earlier versions relied on the static binder mechanism which is no longer honored by slf4j-api.

https://www.slf4j.org/codes.html#noProviders
https://www.slf4j.org/faq.html#changesInVersion18

This has something to do with Jigsaw. The slf4j-simple implementation uses ServiceLoader mechanism:

https://github.com/qos-ch/slf4j/tree/master/slf4j-simple/src/main/resources/META-INF
https://github.com/qos-ch/slf4j/blob/master/slf4j-simple/src/main/java/org/slf4j/simple/SimpleServiceProvider.java

Problems with Interpolation

With s and f interpolation in Scribe 2.1, it seems to have problems sometimes, as if the Macros aren't playing well together (see #34). Though sfi is a better solution, we should still be able to support s and f.

Huge increase in fullOptJS file size

With version 2.0, we see a huge increase in the JS file size. Before the update to scribe, our bundled fullOptJS output was 1.4mb; with the new version of scribe it is 2.8mb.

I suspect this is due to the usage of scala-java-time and locales. We had a similar issue in our own app, which is why we removed these two packages from it. Now they are pulled in by scribe. Can we make this optional?

FormatBlock for filename

Using the position with class- and methodname is nice, but sometimes not uniquely identifying the log origin. Can we add a FormatBlock for the filename? In the best case, we would abbreviate the path or maybe just have the basename of the file.
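The core of such a block is just reducing the full source path to its basename. A self-contained sketch (the helper is hypothetical, not Scribe's FormatBlock API):

```scala
// Reduce a full source path to its basename for compact log output.
object FileNameBlock {
  def basename(path: String): String =
    path.substring(path.lastIndexOf('/') + 1)
}
```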

Multithread-Capable MDC

MDC sucks as a paradigm on Logback and SLF4J as it's a single-threaded / ThreadLocal nightmare and entirely useless in asynchronous operations. We should add a better core system (perhaps building upon ExecutionContext: #62 ) to easily and conveniently support thread-safe usage.

Level format block without padding

Currently we just have levelPaddedRight, but for me a level without padding would make more sense. Can we have one simple level format block?

Better Support for Modifying Loggers

Currently, Logger.update is used to modify an existing logger. This is a bit confusing and frustrating to use. A better, less side-effecting implementation would be ideal.

Merge AsynchronousLogHandler back into core

The asynchronous logging using Akka is not scaling well (ironic, I know). This should be simplified to just using a daemon thread + ConcurrentLinkedQueue so it has no additional dependencies and can scale better.

Compilation error after transition to 2.1

java.util.NoSuchElementException: key not found: value arg$macro$1
	at scala.collection.MapLike.default(MapLike.scala:232)
	at scala.collection.MapLike.default$(MapLike.scala:231)
	at scala.collection.AbstractMap.default(Map.scala:59)
	at scala.collection.mutable.HashMap.apply(HashMap.scala:65)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder$locals$.load(BCodeSkelBuilder.scala:391)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:356)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.$anonfun$genLoadArguments$1(BCodeBodyBuilder.scala:937)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:937)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:630)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:298)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genBlock(BCodeBodyBuilder.scala:815)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:368)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.$anonfun$genLoadArguments$1(BCodeBodyBuilder.scala:937)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:937)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:667)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:298)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.$anonfun$genLoadArguments$1(BCodeBodyBuilder.scala:937)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoadArguments(BCodeBodyBuilder.scala:937)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genApply(BCodeBodyBuilder.scala:667)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:298)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genStat(BCodeBodyBuilder.scala:82)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.$anonfun$genBlock$1(BCodeBodyBuilder.scala:814)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genBlock(BCodeBodyBuilder.scala:814)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:368)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:372)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genStat(BCodeBodyBuilder.scala:82)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.$anonfun$genBlock$1(BCodeBodyBuilder.scala:814)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genBlock(BCodeBodyBuilder.scala:814)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:368)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.emitNormalMethodBody$1(BCodeSkelBuilder.scala:603)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.genDefDef(BCodeSkelBuilder.scala:635)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.gen(BCodeSkelBuilder.scala:509)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.$anonfun$gen$7(BCodeSkelBuilder.scala:511)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.gen(BCodeSkelBuilder.scala:511)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.genPlainClass(BCodeSkelBuilder.scala:113)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase$Worker1.visit(GenBCode.scala:190)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase$Worker1.$anonfun$run$1(GenBCode.scala:139)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase$Worker1.run(GenBCode.scala:139)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase.buildAndSendToDisk(GenBCode.scala:381)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase.run(GenBCode.scala:350)
	at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1431)
	at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1416)
	at scala.tools.nsc.Global$Run.compileSources(Global.scala:1412)
	at scala.tools.nsc.Global$Run.compile(Global.scala:1515)
	at xsbt.CachedCompiler0.run(CompilerInterface.scala:131)
	at xsbt.CachedCompiler0.run(CompilerInterface.scala:106)
	at xsbt.CompilerInterface.run(CompilerInterface.scala:32)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sbt.internal.inc.AnalyzingCompiler.call(AnalyzingCompiler.scala:237)
	at sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:111)
	at sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:90)
	at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$3(MixedAnalyzingCompiler.scala:83)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
	at sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:134)
	at sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:74)
	at sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:117)
	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:305)
	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:305)
	at sbt.internal.inc.Incremental$.doCompile(Incremental.scala:101)
	at sbt.internal.inc.Incremental$.$anonfun$compile$4(Incremental.scala:82)
	at sbt.internal.inc.IncrementalCommon.recompileClasses(IncrementalCommon.scala:117)
	at sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:64)
	at sbt.internal.inc.Incremental$.$anonfun$compile$3(Incremental.scala:84)
	at sbt.internal.inc.Incremental$.manageClassfiles(Incremental.scala:129)
	at sbt.internal.inc.Incremental$.compile(Incremental.scala:75)
	at sbt.internal.inc.IncrementalCompile$.apply(Compile.scala:70)
	at sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:309)
	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:267)
	at sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:158)
	at sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:237)
	at sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:68)
	at sbt.Defaults$.compileIncrementalTaskImpl(Defaults.scala:1406)
	at sbt.Defaults$.$anonfun$compileIncrementalTask$1(Defaults.scala:1388)
	at scala.Function1.$anonfun$compose$1(Function1.scala:44)
	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:42)
	at sbt.std.Transform$$anon$4.work(System.scala:64)
	at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
	at sbt.Execute.work(Execute.scala:266)
	at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[error] Error while emitting Main.scala
[error] key not found: value arg$macro$1
java.util.NoSuchElementException: key not found: value name$1
	at scala.collection.MapLike.default(MapLike.scala:232)
	at scala.collection.MapLike.default$(MapLike.scala:231)
	at scala.collection.AbstractMap.default(Map.scala:59)
	at scala.collection.mutable.HashMap.apply(HashMap.scala:65)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder$locals$.load(BCodeSkelBuilder.scala:391)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:356)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:272)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genStat(BCodeBodyBuilder.scala:82)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.$anonfun$genBlock$1(BCodeBodyBuilder.scala:814)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genBlock(BCodeBodyBuilder.scala:814)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:368)
	at scala.tools.nsc.backend.jvm.BCodeBodyBuilder$PlainBodyBuilder.genLoad(BCodeBodyBuilder.scala:372)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.emitNormalMethodBody$1(BCodeSkelBuilder.scala:603)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.genDefDef(BCodeSkelBuilder.scala:635)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.gen(BCodeSkelBuilder.scala:509)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.$anonfun$gen$7(BCodeSkelBuilder.scala:511)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.gen(BCodeSkelBuilder.scala:511)
	at scala.tools.nsc.backend.jvm.BCodeSkelBuilder$PlainSkelBuilder.genPlainClass(BCodeSkelBuilder.scala:113)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase$Worker1.visit(GenBCode.scala:190)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase$Worker1.$anonfun$run$1(GenBCode.scala:139)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase$Worker1.run(GenBCode.scala:139)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase.buildAndSendToDisk(GenBCode.scala:381)
	at scala.tools.nsc.backend.jvm.GenBCode$BCodePhase.run(GenBCode.scala:350)
	at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1431)
	at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1416)
	at scala.tools.nsc.Global$Run.compileSources(Global.scala:1412)
	at scala.tools.nsc.Global$Run.compile(Global.scala:1515)
	at xsbt.CachedCompiler0.run(CompilerInterface.scala:131)
	at xsbt.CachedCompiler0.run(CompilerInterface.scala:106)
	at xsbt.CompilerInterface.run(CompilerInterface.scala:32)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at sbt.internal.inc.AnalyzingCompiler.call(AnalyzingCompiler.scala:237)
	at sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:111)
	at sbt.internal.inc.AnalyzingCompiler.compile(AnalyzingCompiler.scala:90)
	at sbt.internal.inc.MixedAnalyzingCompiler.$anonfun$compile$3(MixedAnalyzingCompiler.scala:83)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
	at sbt.internal.inc.MixedAnalyzingCompiler.timed(MixedAnalyzingCompiler.scala:134)
	at sbt.internal.inc.MixedAnalyzingCompiler.compileScala$1(MixedAnalyzingCompiler.scala:74)
	at sbt.internal.inc.MixedAnalyzingCompiler.compile(MixedAnalyzingCompiler.scala:117)
	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1(IncrementalCompilerImpl.scala:305)
	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileInternal$1$adapted(IncrementalCompilerImpl.scala:305)
	at sbt.internal.inc.Incremental$.doCompile(Incremental.scala:101)
	at sbt.internal.inc.Incremental$.$anonfun$compile$4(Incremental.scala:82)
	at sbt.internal.inc.IncrementalCommon.recompileClasses(IncrementalCommon.scala:117)
	at sbt.internal.inc.IncrementalCommon.cycle(IncrementalCommon.scala:64)
	at sbt.internal.inc.Incremental$.$anonfun$compile$3(Incremental.scala:84)
	at sbt.internal.inc.Incremental$.manageClassfiles(Incremental.scala:129)
	at sbt.internal.inc.Incremental$.compile(Incremental.scala:75)
	at sbt.internal.inc.IncrementalCompile$.apply(Compile.scala:70)
	at sbt.internal.inc.IncrementalCompilerImpl.compileInternal(IncrementalCompilerImpl.scala:309)
	at sbt.internal.inc.IncrementalCompilerImpl.$anonfun$compileIncrementally$1(IncrementalCompilerImpl.scala:267)
	at sbt.internal.inc.IncrementalCompilerImpl.handleCompilationError(IncrementalCompilerImpl.scala:158)
	at sbt.internal.inc.IncrementalCompilerImpl.compileIncrementally(IncrementalCompilerImpl.scala:237)
	at sbt.internal.inc.IncrementalCompilerImpl.compile(IncrementalCompilerImpl.scala:68)
	at sbt.Defaults$.compileIncrementalTaskImpl(Defaults.scala:1406)
	at sbt.Defaults$.$anonfun$compileIncrementalTask$1(Defaults.scala:1388)
	at scala.Function1.$anonfun$compose$1(Function1.scala:44)
	at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:42)
	at sbt.std.Transform$$anon$4.work(System.scala:64)
	at sbt.Execute.$anonfun$submit$2(Execute.scala:257)
	at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
	at sbt.Execute.work(Execute.scala:266)
	at sbt.Execute.$anonfun$submit$1(Execute.scala:257)
	at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:167)
	at sbt.CompletionService$$anon$2.call(CompletionService.scala:32)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
[error] Error while emitting RepresentationController.scala
[error] key not found: value name$1
[error] two errors found
[error] (compile:compileIncremental) Compilation failed

Have no idea where to dig further. The same code compiled ok with 1.4.6, except that

scribe.Logger.root.clearHandlers()
scribe.Logger.root.addHandler(scribe.LogHandler(cfg.logLevel))

is now

scribe.Logger.root.clearHandlers().withHandler(minimumLevel = cfg.logLevel)

Filter / Disable Support for LoggerAdapter

The ScribeLoggerAdapter works well for intercepting SLF4J log events, but some applications are incredibly chatty. It would be beneficial to support better filtering.

NPE on logger name from SLF4J via Netty.

First off, thanks again for this library, it's been tremendous, filling a gap in Scala.js.

I've been able to get everything working, but when I reference postgres-async, which is built on netty, I get the following stack trace:

Caused by: java.lang.NullPointerException: name
	at io.netty.util.internal.logging.AbstractInternalLogger.<init>(AbstractInternalLogger.java:41)
	at io.netty.util.internal.logging.Slf4JLogger.<init>(Slf4JLogger.java:30)
	at io.netty.util.internal.logging.Slf4JLoggerFactory.newInstance(Slf4JLoggerFactory.java:73)
	at io.netty.util.internal.logging.InternalLoggerFactory.getInstance(InternalLoggerFactory.java:97)
	at io.netty.util.internal.logging.InternalLoggerFactory.getInstance(InternalLoggerFactory.java:90)
	at io.netty.channel.MultithreadEventLoopGroup.<clinit>(MultithreadEventLoopGroup.java:34)
	at com.github.mauricio.async.db.util.NettyUtils$.DefaultEventLoopGroup$lzycompute(NettyUtils.scala:24)
	at com.github.mauricio.async.db.util.NettyUtils$.DefaultEventLoopGroup(NettyUtils.scala:24)
	at com.github.mauricio.async.db.postgresql.pool.PostgreSQLConnectionFactory$.$lessinit$greater$default$2(PostgreSQLConnectionFactory.scala:47)
	at services.database.Database$.open(Database.scala:33)

You can clone the scribe branch from https://github.com/KyleU/boilerplay/tree/scribe to test this. As is, the application runs fine, but uncommenting Database.open from https://github.com/KyleU/boilerplay/blob/scribe/app/util/Application.scala#L60 causes the above stack trace.

Attaching a debugger shows that the name isn't provided. Is this something netty should fix, or is there a way to fix it in scribe?
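The stack trace matches the null check at the top of Netty's logger wrapper constructor. A simplified sketch of that check (illustrative only; the real class is `io.netty.util.internal.logging.AbstractInternalLogger`):

```scala
// Simplified sketch of Netty's constructor check: any SLF4J backend
// whose logger reports a null name will trip this NPE.
abstract class AbstractInternalLogger(name: String) {
  if (name == null) throw new NullPointerException("name")
  def getName: String = name
}
```

So a likely fix on the scribe side is for the SLF4J adapter to guarantee a non-null logger name (falling back to the root logger name when none is supplied).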

Easier Support to Update Logger

Updating the root logger requires something like: Logger.update(Logger.rootName)(_...) which should be possible to simplify greatly. In addition, the updateLogger method is problematic.

Logging not happening in chronological order

While testing #57 I noticed that my log looks like this:

2018.05.01 23:59:55 [ajp-nio-8089-exec-5] [andreak] [F98914598614QIBY58] DEBUG
2018.05.01 23:59:55 [ajp-nio-8089-exec-5] [andreak] [F98914598614QIBY58] DEBUG
2018.05.01 23:59:55 [ajp-nio-8089-exec-5] [andreak] [F98914598614QIBY58] DEBUG
2018.05.01 23:59:55 [ajp-nio-8089-exec-5] [andreak] [F98914598614QIBY58] INFO
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598629HO0O94] INFO
2018.05.01 23:59:57 [ajp-nio-8089-exec-36] [andreak] [F98914598679GY3A3S] INFO
2018.05.01 23:59:58 [ajp-nio-8089-exec-33] [andreak] [F98914598710INYLOA] INFO
2018.05.01 23:59:58 [ajp-nio-8089-exec-50] [andreak] [F98914598720FV1XJT] INFO
2018.05.01 23:59:58 [ajp-nio-8089-exec-1] [andreak] [F98914598721IPR1VC] INFO
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] DEBUG
2018.05.01 23:59:57 [ajp-nio-8089-exec-39] [andreak] [F98914598717MRQ5M0] INFO
2018.05.01 23:59:58 [ajp-nio-8089-exec-51] [andreak] [F98914598731XRGSXA] INFO
2018.05.01 23:59:58 [ajp-nio-8089-exec-33] [andreak] [F98914598781CHX141] INFO
2018.05.01 23:59:59 [ajp-nio-8089-exec-5] [andreak] [F98914598812VTBXLT] INFO
2018.05.01 23:59:59 [ajp-nio-8089-exec-50] [andreak] [F98914598849GV2WYH] INFO
2018.05.01 23:59:59 [ajp-nio-8089-exec-36] [andreak] [F98914598867ZIMC2S] INFO
2018.05.01 23:59:59 [ajp-nio-8089-exec-1] [andreak] [F989145989016XEYOQ] INFO
2018.05.01 23:59:59 [ajp-nio-8089-exec-4] [andreak] [F9891459889859ADF0] INFO
2018.05.01 23:59:59 [ajp-nio-8089-exec-51] [andreak] [F98914598953SVMZ7B] INFO

(only the first 6 fields shown)

Notice that some lines logged at 23:59:57 appear after lines logged at 23:59:58. I don't recall seeing this with Log4J/Slf4J/Commons-logging etc.

Option to exclude "current-date" from FileWriter.daily

Using FileWriter.daily, the current ("today") logfile's date is part of the filename.
It would be nice to have an option for excluding the date from the current file. This would make it easier for existing monitoring jobs (which today monitor log4j/slf4j files) to continue monitoring the logfiles without needing to change them.
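The requested naming scheme can be sketched as a small helper (hypothetical application code, not scribe's FileWriter API): the active file keeps a fixed name, and the date is only appended when the file is rolled over.

```scala
import java.time.LocalDate

// The monitored file name never changes...
val currentName: String = "app.log"

// ...and the date is attached only to archived (rolled-over) files.
def archivedName(base: String, date: LocalDate): String =
  s"$base-$date.log" // e.g. app-2018-05-01.log
```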

Per call tracing support

Scenario: In production, we generally run with WARN as the log level. However, in case of issues with clients, we want to enable all logs (say, DEBUG and above) for a particular client.

Is it possible to achieve this in Scribe? Basically, can we pass an additional (implicit or normal) flag or trace ID to Scribe and control the logging?

If this is not available by default, any hints on how to implement this in the caller (application) code?
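One way to approximate this in application code, assuming scribe's `Logger`, `orphan()`, and `withHandler` (verify these against your scribe version; the names `default`/`traced` are illustrative), is to route flagged clients through a separate logger configured with a lower minimum level:

```scala
import scribe.{Level, Logger}

// Global logging stays at the configured WARN level...
val default: Logger = Logger("app")

// ...while flagged clients use an orphaned logger (no propagation to
// parents) with its own DEBUG-level handler.
val traced: Logger = Logger("app.traced")
  .orphan()
  .withHandler(minimumLevel = Some(Level.Debug))

def loggerFor(traceEnabled: Boolean): Logger =
  if (traceEnabled) traced else default
```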

Implicit stringify converter from log methods

Just stumbled on this error in my code which confused me:

[error] /home/cornerman/projects/wust2/util/shared/src/main/scala/package.scala:19:21: No implicit view available from Throwable => String.
[error]         scribe.error(e)

So, the log methods in LoggerSupport take an implicit converter from the given type to String. But I am not sure what the use case is: let's say I want to log something other than a String. Then I would need to put an implicit function Something => String in scope, which I would not want to have as it is too generic. Or I would need to supply it explicitly, which is kind of tedious.

How about using a typeclass Loggable instead of an implicit view? So we could define certain types as loggable and just pass them to the appropriate log methods, which would feel much better to have as an implicit.

trait Loggable[T] {
  def logString(t: T): String
}

Furthermore, we could have overloads just expecting a String, to avoid the typeclass overhead on every log line.
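A sketch of how the proposed typeclass might look at the call site (hypothetical signatures, not scribe's current API; `println` stands in for the real handler pipeline):

```scala
// The proposed typeclass, an instance for Throwable, and a log method
// that accepts any value with a Loggable instance in scope.
trait Loggable[T] {
  def logString(t: T): String
}

object Loggable {
  implicit val throwableLoggable: Loggable[Throwable] =
    (t: Throwable) => s"${t.getClass.getName}: ${t.getMessage}"
}

def error[T](value: T)(implicit loggable: Loggable[T]): Unit =
  println(loggable.logString(value)) // stand-in for the handler pipeline

// scribe.error(e) would then resolve Loggable[Throwable] implicitly
// instead of requiring an implicit Throwable => String view.
```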

Better positionAbbreviated support

The abbreviated position information is sometimes a bit too abbreviated, making it very hard to determine the actual package and class names.

FileWriter autoflush usage

How is the autoflush property of FileWriters used? I do not see any usages in the implementations FileIOWriter and FileNIOWriter. Am I missing something or is this just not implemented yet?
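For context, this is how an autoflush option is typically wired into a writer (a generic illustrative sketch, not scribe's FileIOWriter/FileNIOWriter implementation):

```scala
import java.io.{BufferedWriter, FileWriter => JFileWriter}

// A minimal file writer: when autoFlush is true, every line is flushed
// immediately; otherwise output may lag until the buffer fills or close().
class SimpleFileWriter(path: String, autoFlush: Boolean) {
  private val out = new BufferedWriter(new JFileWriter(path, true))

  def write(line: String): Unit = {
    out.write(line)
    out.newLine()
    if (autoFlush) out.flush()
  }

  def close(): Unit = out.close()
}
```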

Support Level Comparison

Add support for code like if (level >= Level.Info) { ... } instead of having to write if (level.value >= Level.Info.value). Additionally, we should be able to write if (record.value >= Level.Info) { ... } to properly handle boosted records.
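One way to support this is for Level to extend Ordered (a sketch; scribe's actual Level fields and numeric values may differ from those assumed here):

```scala
// Comparison support via Ordered[Level]: >=, <, etc. come for free.
case class Level(name: String, value: Double) extends Ordered[Level] {
  def compare(that: Level): Int = this.value.compare(that.value)
}

object Level {
  val Debug = Level("DEBUG", 200.0)
  val Info  = Level("INFO", 300.0)
}

// `level >= Level.Info` now compiles without reaching into .value:
assert(Level.Info >= Level.Debug)
```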
