bruno-garcia / log4net.elasticsearch

log4net appender to ElasticSearch

Home Page: https://bruno-garcia.github.io/log4net.ElasticSearch/

License: Other

Languages: Shell 0.70%, PowerShell 2.95%, C# 95.37%, Batchfile 0.97%
Topics: apm, c-sharp, dotnet, elasticsearch, log4net, logging, nuget

log4net.elasticsearch's People

Contributors: aateeque, bruno-garcia, hippasus, jc74, jptoto, mastoj, mickdelaney, moconnell, mpdreamz, nickcanz, sdragos, ttingen, tylerrbrown, vboctor, vishalpatel-te, wallymathieu


log4net.elasticsearch's Issues

ConversionPattern not enforced

Hello,

First, thanks for your contribution, which is really useful to us. :-)

Everything runs fine except that the ConversionPattern is not applied to messages.
In Elasticsearch we get the raw messages passed to logger.Debug, without any decoration.

I've tried with a minimal configuration:

<appender name="ElasticSearch" type="log4net.ElasticSearch.ElasticSearchAppender, log4net.ElasticSearch">
    <connectionString value="Server=localhost;Index=log;Port=9200;rolling=true"/>
    <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="*** %-4timestamp [%thread] %-5level %logger %ndc - %message%newline" />
    </layout>
</appender>

I have debugged your code and found that in Repository.Add the logEvents' text is still raw.
I'm not sure whether this is normal at that step, as I don't know when the log4net plumbing applies message formatting.
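
For reference, log4net applies the configured layout only when the appender renders the event; the layout is not applied to the LoggingEvent itself. A minimal, hypothetical sketch of rendering through the layout, assuming an appender derived from BufferingAppenderSkeleton (which may or may not match this appender's actual base class), would be:

    // Sketch only: render each event through the configured <layout> before shipping it,
    // instead of using the raw message text.
    protected override void SendBuffer(log4net.Core.LoggingEvent[] events)
    {
        foreach (var loggingEvent in events)
        {
            // RenderLoggingEvent is provided by AppenderSkeleton and applies the PatternLayout
            string formatted = RenderLoggingEvent(loggingEvent);
            // ... build the Elasticsearch document from "formatted"
        }
    }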

Your feedback will be greatly appreciated.

Thanks in advance.

Best regards.

Mickael

log4net elasticsearch logging with custom parameters

I am using log4net together with Elasticsearch and Kibana. Right now my web.config looks like this:

<log4net>
<appender name="ElasticSearchAppender" type="log4net.ElasticSearch.ElasticSearchAppender, log4net.ElasticSearch">
  <layout type="log4net.Layout.PatternLayout,log4net">
    <param name="ConversionPattern" value="%date - %level - %message %property{location} %property{label} %property{mstimeload} %property{applicationid} %property{page} 
           %property{ipclient} %property{browser} %property{browsersignature} %property{appversion} %property{sessionuniquecodetag} %property{globalcountertailsloaded} 
           %property{ipserveraddress} %newline" />
  </layout>
  <connectionString value="Server=myip;Index=logstash;Port=9200;rolling=true"/>
  <lossy value="true" />
  <bufferSize value="100" />
  <evaluator type="log4net.Core.LevelEvaluator">
    <threshold value="ERROR"/>
  </evaluator>
</appender>
<root>
  <level value="ALL"/>
  <appender-ref ref="ElasticSearchAppender" />
</root>
</log4net>

I have some custom parameters like location, label, mstimeload, applicationid, page, ipclient, ... Everything works fine, but all of these parameters end up as string type; I would like some of them to be integer or geo_point, but I don't know how to tell log4net which type each parameter is.
Then in C# I write my log like this:

private static readonly log4net.ILog log = log4net.LogManager.GetLogger(System.Reflection.MethodBase.GetCurrentMethod().DeclaringType);

log4net.ThreadContext.Properties["label"] = label;
log4net.ThreadContext.Properties["ipclient"] = ipaddress;
log4net.ThreadContext.Properties["browser"] = browserType;
log4net.ThreadContext.Properties["browsersignature"] = browserHashSignature;
log4net.ThreadContext.Properties["appversion"] = ASSettings.ApplicationVersion;
log4net.ThreadContext.Properties["mstimeload"] = msTime == null ? null : Convert.ToString(Convert.ToInt32(msTime.Value), CultureInfo.InvariantCulture);
log4net.ThreadContext.Properties["globalcountertailsloaded"] = globalCounter_tilesloaded == null ? null : Convert.ToString(globalCounter_tilesloaded.Value, CultureInfo.InvariantCulture);
log4net.ThreadContext.Properties["ipserveraddress"] = ipserveraddress;
log4net.ThreadContext.Properties["page"] = page;
log4net.ThreadContext.Properties["sessionuniquecodetag"] = sessionuniquecodetag;
log4net.ThreadContext.Properties["applicationid"] = applicationid;

log.Error(description, ex);

If I do it like this, then in both Elasticsearch and Kibana all of these properties are of string type, and I cannot change the type anymore.
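
One hedged option, assuming the appender's JSON serializer preserves the CLR type of property values, is to store numeric values as numbers instead of pre-converting them to strings, so the emitted JSON contains 123 rather than "123". Elasticsearch would still need a fresh index or an explicit mapping, since existing field mappings cannot be changed:

    // Sketch only: keep numeric values numeric; whether they are indexed as numbers
    // also depends on the serializer and on the index mapping already in place.
    log4net.ThreadContext.Properties["mstimeload"] =
        msTime == null ? (object)null : Convert.ToInt32(msTime.Value);
    log4net.ThreadContext.Properties["globalcountertailsloaded"] = globalCounter_tilesloaded;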

The process was terminated due to an unhandled exception. log4net.ElasticSearch.WebElasticClient.FinishGetResponse

Application: Crawler.exe
Framework Version: v4.0.30319
Description: The process was terminated due to an unhandled exception.
Exception Info: System.Net.WebException
Stack:
at System.Net.HttpWebRequest.EndGetResponse(System.IAsyncResult)
at log4net.ElasticSearch.WebElasticClient.FinishGetResponse(System.IAsyncResult)
at System.Net.LazyAsyncResult.Complete(IntPtr)
at System.Net.ContextAwareResult.CompleteCallback(System.Object)
at System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object, Boolean)
at System.Threading.ExecutionContext.Run(System.Threading.ExecutionContext, System.Threading.ContextCallback, System.Object)
at System.Net.ContextAwareResult.Complete(IntPtr)
at System.Net.LazyAsyncResult.ProtectedInvokeCallback(System.Object, IntPtr)
at System.Net.HttpWebRequest.Abort(System.Exception, Int32)
at System.Net.HttpWebRequest.AbortWrapper(System.Object)
at System.Threading.QueueUserWorkItemCallback.System.Threading.IThreadPoolWorkItem.ExecuteWorkItem()
at System.Threading.ThreadPoolWorkQueue.Dispatch()
at System.Threading._ThreadPoolWaitCallback.PerformWaitCallback()

update changed casing on property names

The older version of log4net.ElasticSearch camelCased property names, and it appears they now end up PascalCased. Not a huge issue, but if you were running with the previous version, Kibana no longer displays the entries unless you update the spec you are using. It would be nice to revert to camelCase.

It doesn't look like there is a good way of doing this with the .NET JavaScriptSerializer, but personally I wouldn't mind going back to the Json.NET serializer for that feature.

http://stackoverflow.com/questions/17700213/how-to-force-camelcase-with-javascriptserializer
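
For illustration, if the appender did go back to Json.NET, camelCasing can be forced with a contract resolver (Newtonsoft.Json assumed as the dependency; logEvent below stands for whatever object gets indexed):

    using Newtonsoft.Json;
    using Newtonsoft.Json.Serialization;

    // Sketch only: serialize with camelCased property names.
    var settings = new JsonSerializerSettings
    {
        ContractResolver = new CamelCasePropertyNamesContractResolver()
    };
    string json = JsonConvert.SerializeObject(logEvent, settings);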

Rolling appender

It would be nice to have an automatic cleanup of the logs in Elasticsearch, which of course would be optional.

What I am thinking is that you provide a parameter which defines the length of time each index covers and how many indexes to keep. Under the hood, an index is created for the defined length and an alias is put on top of those indexes. Example:

Alias: LogEvent
Index1: LogEvent-20131305121212-20131212121212
Index2: LogEvent-20131312121212-20131219121212

And when an event is triggered that requires a new index, you create the new index, delete the oldest one, and update the alias to include the new one.
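
As a hedged sketch of that alias swap (index and alias names taken from the example above), the whole switch can be a single call to Elasticsearch's _aliases endpoint; the oldest index itself would then be dropped with a DELETE on its name:

    // Sketch only: atomically point the "LogEvent" alias at the newest index.
    static void SwapAlias()
    {
        var body = @"{
          ""actions"": [
            { ""remove"": { ""index"": ""LogEvent-20131305121212-20131212121212"", ""alias"": ""LogEvent"" } },
            { ""add"":    { ""index"": ""LogEvent-20131312121212-20131219121212"", ""alias"": ""LogEvent"" } }
          ]
        }";
        using (var client = new System.Net.WebClient())
        {
            client.Headers[System.Net.HttpRequestHeader.ContentType] = "application/json";
            client.UploadString("http://localhost:9200/_aliases", "POST", body);
        }
    }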

If you have a lot of logs this would definitely save space, which might not be a problem :)

Indexing logs to Elasticsearch succeeds, but error is thrown

Hello, as a spin-off from #55, here is one issue I'm seeing. I'm running the following example code:

File source.cs:

using log4net;
using log4net.Config;
using System.Threading;

public class LogTest
{
   private static readonly ILog logger = 
       LogManager.GetLogger(typeof(LogTest));

   static void Main(string[] args)
   {
       log4net.Util.LogLog.InternalDebugging  = true;
       //BasicConfigurator.Configure();
       XmlConfigurator.Configure(new System.IO.FileInfo(args[0]));
       Thread.Sleep (2000);
       logger.Debug("Here is a debug log.");
       logger.Info("... and an Info log.");
       logger.Warn("... and a warning.");
       logger.Error("... and an error.");
       logger.Fatal("... and a fatal error.");
       Thread.Sleep (2000);
   }
}

With the following configuration (file called test.xml):

<log4net>
   <appender name="ElasticSearchAppender" type="log4net.ElasticSearch.ElasticSearchAppender, log4net.ElasticSearch">
       <layout type="log4net.Layout.PatternLayout,log4net">
           <param name="ConversionPattern" value="%d{ABSOLUTE} %-5p" />
       </layout>
       <connectionString value="Scheme=http;Server=localhost/;Index=aaa-bbb-ccc/syslog;Port:9200;rolling=true"/>
       <lossy value="false" />
       <evaluator type="log4net.Core.LevelEvaluator">
               <threshold value="ALL" />
       </evaluator>
       <bufferSize value="1000" />
   </appender>

   <root>
       <level value="DEBUG" />
       <appender-ref ref="ElasticSearchAppender" />
   </root>
</log4net>

Before starting the code I started a new Elasticsearch instance and checked that there were no indices:

$ curl -XGET '192.168.1.15:9200/_search?pretty'
{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 0,
    "successful" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : 0,
    "max_score" : 0.0,
    "hits" : [ ]
  }
}

Started the code and got the following result:

D:\>source.exe test.xml
log4net: configuring repository [log4net-default-repository] using file [test_local.xml]
log4net: configuring repository [log4net-default-repository] using stream
log4net: loading XML configuration
log4net: Configuring Repository [log4net-default-repository]
log4net: Configuration update mode [Merge].
log4net: Logger [root] Level string is [DEBUG].
log4net: Logger [root] level set to [name="DEBUG",value=30000].
log4net: Loading Appender [ElasticSearchAppender] type: [log4net.ElasticSearch.ElasticSearchAppender, log4net.ElasticSearch]
log4net: Converter [message] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [newline] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Setting Property [ConversionPattern] to String value [%d{ABSOLUTE} %-5p]
log4net: Converter [d] Option [ABSOLUTE] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [literal] Option [ ] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [p] Option [] Format [min=5,max=2147483647,leftAlign=True]
log4net: Setting Property [Layout] to object [log4net.Layout.PatternLayout]
log4net: Setting Property [ConnectionString] to String value [Scheme=http;Server=192.168.1.15;Index=aaa-bbb-ccc;Port=9200;rolling=true]
log4net: Setting Property [Lossy] to Boolean value [False]
log4net: Setting Property [Threshold] to Level value [ALL]
log4net: Setting Property [Evaluator] to object [log4net.Core.LevelEvaluator]
log4net: Setting Property [BufferSize] to Int32 value [1000]
log4net: Created Appender [ElasticSearchAppender]
log4net: Adding appender named [ElasticSearchAppender] to logger [root].
log4net: Hierarchy Threshold []
log4net:ERROR [ElasticSearchAppender] ErrorCode: GenericFailure. ElasticSearchAppender [ElasticSearchAppender]: Failed to addd logEvents to Repository in SendBufferCallback.
log4net:ERROR [ElasticSearchAppender] ErrorCode: GenericFailure. ElasticSearchAppender [ElasticSearchAppender]: Failed to addd logEvents to Repository in SendBufferCallback.
log4net:ERROR [ElasticSearchAppender] ErrorCode: GenericFailure. ElasticSearchAppender [ElasticSearchAppender]: Failed to addd logEvents to Repository in SendBufferCallback.
log4net:ERROR [ElasticSearchAppender] ErrorCode: GenericFailure. ElasticSearchAppender [ElasticSearchAppender]: Failed to addd logEvents to Repository in SendBufferCallback.
System.Net.WebException: Failed to post {"index" : {} }
{"timeStamp":"2016-01-15T19:50:12.7032935Z","message":"... and an Info log.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"INFO","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.7032935Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
 to http://192.168.1.15:9200/aaa-bbb-ccc-2016.01.15/logEvent/_bulk.
   w log4net.ElasticSearch.Infrastructure.HttpClient.PostBulk[T](Uri uri, T items)
   w log4net.ElasticSearch.Repository.Add(IEnumerable`1 logEvents, Int32 bufferSize)
   w log4net.ElasticSearch.ElasticSearchAppender.SendBufferCallback(Object state)
System.Net.WebException: Failed to post {"index" : {} }
{"timeStamp":"2016-01-15T19:50:12.7052929Z","message":"... and a fatal error.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"FATAL","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.7052929Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
 to http://192.168.1.15:9200/aaa-bbb-ccc-2016.01.15/logEvent/_bulk.
   w log4net.ElasticSearch.Infrastructure.HttpClient.PostBulk[T](Uri uri, T items)
   w log4net.ElasticSearch.Repository.Add(IEnumerable`1 logEvents, Int32 bufferSize)
   w log4net.ElasticSearch.ElasticSearchAppender.SendBufferCallback(Object state)
System.Net.WebException: Failed to post {"index" : {} }
{"timeStamp":"2016-01-15T19:50:12.7042935Z","message":"... and an error.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"ERROR","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.7042935Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
 to http://192.168.1.15:9200/aaa-bbb-ccc-2016.01.15/logEvent/_bulk.
   w log4net.ElasticSearch.Infrastructure.HttpClient.PostBulk[T](Uri uri, T items)
   w log4net.ElasticSearch.Repository.Add(IEnumerable`1 logEvents, Int32 bufferSize)
   w log4net.ElasticSearch.ElasticSearchAppender.SendBufferCallback(Object state)
System.Net.WebException: Failed to post {"index" : {} }
{"timeStamp":"2016-01-15T19:50:12.6562692Z","message":"Here is a debug log.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"DEBUG","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.6562692Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
 to http://192.168.1.15:9200/aaa-bbb-ccc-2016.01.15/logEvent/_bulk.
   w log4net.ElasticSearch.Infrastructure.HttpClient.PostBulk[T](Uri uri, T items)
   w log4net.ElasticSearch.Repository.Add(IEnumerable`1 logEvents, Int32 bufferSize)
   w log4net.ElasticSearch.ElasticSearchAppender.SendBufferCallback(Object state)
log4net: Shutdown called on Hierarchy [log4net-default-repository]

At the same time, the data did make it into the Elasticsearch index; I tested it as follows:

curl -XGET '192.168.1.15:9200/_search?pretty'
{
  "took" : 18,
  "timed_out" : false,
  "_shards" : {
    "total" : 5,
    "successful" : 5,
    "failed" : 0
  },
  "hits" : {
    "total" : 5,
    "max_score" : 1.0,
    "hits" : [ {
      "_index" : "aaa-bbb-ccc-2016.01.15",
      "_type" : "logEvent",
      "_id" : "AVJG1pVfJtwAF5WwlxaL",
      "_score" : 1.0,
      "_source":{"timeStamp":"2016-01-15T19:50:12.7042935Z","message":"... and a warning.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"WARN","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.7042935Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
    }, {
      "_index" : "aaa-bbb-ccc-2016.01.15",
      "_type" : "logEvent",
      "_id" : "AVJG1pVfJtwAF5WwlxaJ",
      "_score" : 1.0,
      "_source":{"timeStamp":"2016-01-15T19:50:12.7032935Z","message":"... and an Info log.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"INFO","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.7032935Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
    }, {
      "_index" : "aaa-bbb-ccc-2016.01.15",
      "_type" : "logEvent",
      "_id" : "AVJG1pVfJtwAF5WwlxaK",
      "_score" : 1.0,
      "_source":{"timeStamp":"2016-01-15T19:50:12.7052929Z","message":"... and a fatal error.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"FATAL","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.7052929Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
    }, {
      "_index" : "aaa-bbb-ccc-2016.01.15",
      "_type" : "logEvent",
      "_id" : "AVJG1pVkJtwAF5WwlxaM",
      "_score" : 1.0,
      "_source":{"timeStamp":"2016-01-15T19:50:12.6562692Z","message":"Here is a debug log.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"DEBUG","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.6562692Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
    }, {
      "_index" : "aaa-bbb-ccc-2016.01.15",
      "_type" : "logEvent",
      "_id" : "AVJG1pVfJtwAF5WwlxaI",
      "_score" : 1.0,
      "_source":{"timeStamp":"2016-01-15T19:50:12.7042935Z","message":"... and an error.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"ERROR","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:50:12.7042935Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}
    } ]
  }
}

So all 5 events were indexed properly, yet each indexing request threw an error.

To give more information, a single indexing request body looked like this (according to Fiddler):

{"index" : {} }
{"timeStamp":"2016-01-15T19:54:50.6778816Z","message":"... and an error.","messageObject":{},"exception":{},"loggerName":"LogTest","domain":"source.exe","identity":"","level":"ERROR","className":"LogTest","fileName":null,"lineNumber":"0","fullInfo":"LogTest.Main(:0)","methodName":"Main","fix":"LocationInfo, UserName, Identity, Partial","properties":{"log4net:HostName":"onyxia","log4net:Identity":"","log4net:UserName":"ONYXIA\\gr0","@timestamp":"2016-01-15T19:54:50.6778816Z"},"userName":"ONYXIA\\gr0","threadName":"1","hostName":"ONYXIA"}

And the response for that event was like this (according to Fiddler):

HTTP/1.1 200 OK
Content-Type: application/json; charset=UTF-8
Content-Length: 206

{"took":403,"errors":false,"items":[{"create":{"_index":"aaa-bbb-ccc-2016.01.15","_type":"logEvent","_id":"AVJG2tOTzwvYsJHkdP68","_version":1,"_shards":{"total":2,"successful":1,"failed":0},"status":201}}]}

Let me know if I can provide more information.

error caused by dependency on NEST version 0.12.0.0

I've upgraded my project to NEST 1.1.0-rc1 and, as a result, log4net throws an error when it attempts to locate Nest 0.12.0.0. With log4net debugging enabled, I see this error:
log4net:ERROR [ElasticSearchAppender] ErrorCode: GenericFailure. Failed in DoAppend
System.IO.FileLoadException: Could not load file or assembly 'Nest, Version=0.12.0.0, Culture=neutral, PublicKeyToken=null' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
File name: 'Nest, Version=0.12.0.0,...

I can see in log4net.ElasticSearch.nuspec that there is a dependency on NEST version 0.12.0.0. Do you have plans to update the code and package to NEST rc1? I will grab your project and compile it with a reference to NEST rc1, but considering the breaking changes in this NEST update, I wouldn't be surprised if updating requires a bit of refactoring.

Thanks,
Ethan

LogStash

I am new to ELK. I was wondering why you didn't use Logstash (i.e. write a Logstash log4net appender) and instead wrote an appender that writes directly to Elasticsearch?

Can't read host entries

If you add a hosts-file entry for your Elasticsearch server, the appender fails to connect to Elasticsearch when running in a console app. When running as a web application under IIS it can read the host entries and connects to Elasticsearch as expected.

The workaround I found is to use the IP address of the Elasticsearch server.

Dynamic index name on app.conf

Is it possible to have a dynamic index name in the app.config, something like:

<connectionString value="Server=localhost;Index={indexName};Port=9200;rolling=true"/>.

The idea is to use the same configuration for different classes.

Thanks.

Rolling appender does not create new index when under constant load

I have rolling=true in my connection string, and everything works OK for the first day (after starting).
The next day, when it should create a new index, it does not; it is still writing into the old index (today is 2014/12/23 but it writes into 2014/12/22).
I suspect it doesn't create a new index because the application I'm logging from is under constant load and writing to the log non-stop.

Appender works via iisexpress but not IIS

Hi,

I've got a project where the appender works when the site is hosted in IIS Express, but when I shift it to IIS (same box) it doesn't send the log events to Elasticsearch.

Elasticsearch is on another machine. I can see the call is made to client.IndexAsync, but in my Fiddler traffic no HTTP request was made. Any thoughts?

Documentation

@jptoto I'm looking for documentation; is there some place to find it?
My first need (I think it's a pretty common one) is the configuration part, where even a quick & dirty guide to all the possible parameters would be a great help for people like me, and it shouldn't be a huge effort for you (I hope).

Rolling option

Is there an option to specify the number of indices, like you can do with RollingFileAppender?

Performance

Hi,

Apologies if this isn't the correct forum for this query.

We're currently looking to use the ELK stack, and deciding whether or not to use your project/library or the UDP appender.

I notice you're creating an Elasticsearch connection and LogClient for every LoggingEvent. Is there a reason for this? Could the client/connection be re-used? Have you done any performance testing?
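
As a hedged aside, one common pattern for avoiding per-event connection cost is to keep a single client alive for the appender's lifetime; the names below are illustrative, not the library's actual types:

    // Sketch only: one client reused across LoggingEvents instead of one per event.
    internal static class SharedElasticClient
    {
        private static readonly System.Net.Http.HttpClient Client = new System.Net.Http.HttpClient();

        public static System.Net.Http.HttpClient Instance
        {
            get { return Client; }
        }
    }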

Joe

Issue with SQLException

Hi,

With the latest version of Elasticsearch (2.*), SQL exceptions are not logged.

I searched a little, and when I clear Exception.Data the exception is logged.

Here is a sample of the data serialized with Newtonsoft.Json:
{
"HelpLink.ProdName":"Microsoft SQL Server",
"HelpLink.ProdVer":"09.00.5000",
"HelpLink.EvtSrc":"MSSQLServer",
"HelpLink.EvtID":"2812",
"HelpLink.BaseHelpUrl":"http://go.microsoft.com/fwlink",
"HelpLink.LinkId":"20476"
}

The dictionary keys contain '.'

I made an extension method that clears all the data before logging, but it's not a good solution.
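
A hedged alternative to clearing the data entirely would be to rewrite the dotted keys before logging (the helper name here is hypothetical):

    using System;
    using System.Linq;

    // Sketch only: Exception.Data is a non-generic IDictionary, so copy the dotted
    // string keys first and then rename them in place.
    static void SanitizeDataKeys(Exception ex)
    {
        var dottedKeys = ex.Data.Keys.OfType<string>()
                           .Where(k => k.Contains("."))
                           .ToList();
        foreach (var key in dottedKeys)
        {
            var value = ex.Data[key];
            ex.Data.Remove(key);
            ex.Data[key.Replace('.', '_')] = value;   // "HelpLink.LinkId" -> "HelpLink_LinkId"
        }
    }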

cordially

Michael.

Sign assembly log4net.ElasticSearch

log4net.ElasticSearch, Version=2.2.0.0, Culture=neutral, PublicKeyToken=null

After signing, the PublicKeyToken should no longer be null.

All popular libraries are signed: Newtonsoft.Json, NEST, etc.

When I send messages to the appender that have first been sent through a RemotingAppender

I get no exception logged :(
When I debugged the code I found that the ExceptionObject is empty (but GetExceptionString() is not).

The documentation for ExceptionObject says:
"Gets the exception object used to initialize this event. Note that this event may not have a valid exception object. If the event is serialized the exception object will not be transferred. To get the text of the exception the GetExceptionString method must be used not this property."
http://logging.apache.org/log4net/release/sdk/log4net.Core.LoggingEvent.ExceptionObject.html

Is it possible to add a fallback so it gets the exception text from GetExceptionString if no ExceptionObject is available?

Or am I doing something wrong when I don't get an ExceptionObject?
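
For illustration, a hedged sketch of that fallback (both members exist on log4net's LoggingEvent):

    // Sketch only: serialized events (e.g. via RemotingAppender) lose the exception
    // object, so fall back to the rendered exception text.
    static string GetExceptionText(log4net.Core.LoggingEvent loggingEvent)
    {
        return loggingEvent.ExceptionObject != null
            ? loggingEvent.ExceptionObject.ToString()
            : loggingEvent.GetExceptionString();
    }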

log data retention

Hi
Great little library. How would you recommend handling log data retention?

Decompose HttpClient and serialization method

Hi,

I have built a new appender with custom logic based on your ElasticSearchAppender. It was easy to inherit and reuse most of the code, including the repository, but I was not able to easily reuse HttpClient. The reason is that a ToJson method is used internally and it can't easily be replaced by another implementation.

I think it would be better if HttpClient had a dependency on some IJsonSerializer or something like that. If you agree, I can do it and send you a pull request.
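
As a hedged sketch, the proposed seam could be as small as this (the interface name is the suggestion from this issue, not an existing library type):

    // Sketch only: inject the serialization strategy instead of hard-coding ToJson.
    public interface IJsonSerializer
    {
        string SerializeToJson<T>(T item);
    }

    public class HttpClient
    {
        private readonly IJsonSerializer serializer;

        public HttpClient(IJsonSerializer serializer)
        {
            this.serializer = serializer;
        }
    }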

Michal

Field Naming / Mapping in Elasticsearch 2.0 (emailed by Martin Beck)

Unfortunately, I have found a bug in log4net.ElasticSearch with Elasticsearch 2.0.

In Elasticsearch 2.0 it is no longer possible to add fields with a dot in the field name:

https://www.elastic.co/guide/en/elasticsearch/reference/2.0/_mapping_changes.html

The problem is the Data property of the Exception class (an IDictionary).

For example, SqlException uses keys such as "HelpLink.LinkId" in the Data dictionary.

Log events whose exceptions have a key containing a dot in the Data dictionary are no longer logged to Elasticsearch (and no error message is produced).

One solution could be to use a custom JSON serializer for the Data dictionary.

E.g. dictionaries could be serialized as:
[{"Key": "HelpLink.LinkId", "Value": "20476"}, {"Key": "HelpLink.BaseHelpUrl", "Value": "http:/…"}]
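
A hedged sketch of that key/value representation with Json.NET (assuming the Data dictionary were serialized with Newtonsoft.Json; the converter name is illustrative):

    using System;
    using System.Collections;
    using Newtonsoft.Json;

    // Sketch only: write any IDictionary as [{"Key": ..., "Value": ...}, ...] so that
    // dotted keys become plain string values instead of field names.
    public class DictionaryAsKeyValueListConverter : JsonConverter
    {
        public override bool CanConvert(Type objectType)
        {
            return typeof(IDictionary).IsAssignableFrom(objectType);
        }

        public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        {
            writer.WriteStartArray();
            foreach (DictionaryEntry entry in (IDictionary)value)
            {
                writer.WriteStartObject();
                writer.WritePropertyName("Key");
                writer.WriteValue(entry.Key.ToString());
                writer.WritePropertyName("Value");
                serializer.Serialize(writer, entry.Value);
                writer.WriteEndObject();
            }
            writer.WriteEndArray();
        }

        public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
        {
            throw new NotImplementedException();   // write-only converter for logging
        }
    }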

Maybe you could solve this problem in an upcoming version of log4net.ElasticSearch.

Many thanks.

Elastic search throws mapping exception for messageObject property

In the logEvent request to Elasticsearch, messageObject is sometimes a string and sometimes an object. Because of that, Elasticsearch throws a mapping exception:

MapperParsingException[object mapping for [logEvent] tried to parse as object, but got EOF, has a concrete value been provided to it? status 400

I suppose messageObject should always be a specific object, or always be a string; otherwise Elasticsearch cannot decide how to store it.

(screenshots omitted)

Bulk API not working with Logsene

I'm sending logs to Logsene (https://sematext.com/logsene/). I don't know much about Elasticsearch, which is why I'm using a provider like this.

Logsene creates a UUID index for each customer, which I've substituted here with "123456789".

I've managed to get it to work with bufferSize = 1, but with anything greater it gets an exception.

The console log with log4net debugging enabled shows:

System.Net.WebException: Failed to post {"create" : {} }
... to https://logsene-receiver.sematext.com/123456789/syslog/_bulk.

Note: I created a fork where the logEvent type can be overridden in the connection string, as I had thought it was failing to recognise the "/_bulk" suffix. But it was working before, sending to "/123456789/syslog/logEvent/", and bulk was still failing with:
https://logsene-receiver.sematext.com/123456789/syslog/logEvent/_bulk.

Any suggestions?

WCF and log4net.elasticsearch

Is it possible to use log4net.elasticsearch in a WCF service application? I was not able to log messages when using it in the WCF service application, but when I moved logging to the WCF host application (a self-hosted console) I was able to log messages. Why is that?

Doesn't work with Shield when SSL is not Enabled

@jptoto

This library doesn't work with Shield when using HTTP with credentials.

We get an unauthorized exception because the HttpWebRequest doesn't contain the Authorization header with the user information. In log4net.ElasticSearch.HttpClient:

            if (uri.Scheme == "https" && !string.IsNullOrWhiteSpace(uri.UserInfo))
            {
                httpWebRequest.Headers.Remove(HttpRequestHeader.Authorization);
                httpWebRequest.Headers.Add(HttpRequestHeader.Authorization, "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes(uri.UserInfo)));
            }

If Shield is enabled in Elasticsearch and we are sending the user information, we should set the Authorization header anyway, so we should remove uri.Scheme == "https" from the condition. No matter which protocol we are using, if the user is providing credentials, we should add them to the Authorization header.

If the Authorization header is provided and there is no authentication on the Elasticsearch side, the user information is simply ignored.

The code should look like:

            if (!string.IsNullOrWhiteSpace(uri.UserInfo))
            {
                httpWebRequest.Headers.Remove(HttpRequestHeader.Authorization);
                httpWebRequest.Headers.Add(HttpRequestHeader.Authorization, "Basic " + Convert.ToBase64String(Encoding.ASCII.GetBytes(uri.UserInfo)));
            }

Thanks!

Any plan to support elastic search _bulk operations?

I've been playing with the buffer, and when it flushes it makes individual calls to Elasticsearch. Do you have any plans to add the ability to use the Elasticsearch Bulk API to package many messages into a single "/_bulk" command?
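
For reference, the _bulk endpoint takes newline-delimited JSON: an action line per document followed by the document itself, each terminated by a newline. A hedged sketch of packaging a flushed buffer that way (serializedEvents stands for the per-event JSON strings):

    // Sketch only: build an NDJSON body for POSTing to http://<server>:9200/<index>/logEvent/_bulk
    static string BuildBulkPayload(System.Collections.Generic.IEnumerable<string> serializedEvents)
    {
        var body = new System.Text.StringBuilder();
        foreach (var json in serializedEvents)
        {
            body.Append("{\"index\":{}}").Append("\n");   // action line; index/type come from the URL
            body.Append(json).Append("\n");
        }
        return body.ToString();                            // ends with the required trailing newline
    }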

LogicalThreadContext.Properties int and long are indexed and analyzed as string

We are adding the following info to the LogEvent using the LogicalThreadContext.Properties

  • LogicalThreadContext.Properties["Url"] = requestUri;
  • LogicalThreadContext.Properties["TimespanInMillis"] = elapsedMilliseconds; // => long
  • LogicalThreadContext.Properties["StatusCode"] = statusCode; // => int

Yet in Elasticsearch this is all indexed and analyzed as string. Is there a way to control it so that we can create dashboards in Kibana?

Thanks for an otherwise excellent library!

Generalise approach for specifying index urls

Currently the appender has a single property for specifying a ConnectionString, which is then parsed into the index URL (where log entries are posted).

However, it might be convenient to support a ConnectionStringName which refers to an entry in the connectionStrings config element.

We could obviously just add another property, but a more extensible approach would be to introduce a nested element in the XML config. This would support different parsing mechanisms, including log4net's own properties.

A potential advantage to this approach is that we could support multiple indexes.

Thoughts?
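
For illustration, a hedged sketch of what the ConnectionStringName lookup could look like (the property is the proposal above, not an existing appender setting):

    // Sketch only: prefer a named connection string from the config file, falling back
    // to the existing ConnectionString property.
    public string ConnectionStringName { get; set; }

    private string ResolveConnectionString()
    {
        if (!string.IsNullOrEmpty(ConnectionStringName))
        {
            var entry = System.Configuration.ConfigurationManager
                            .ConnectionStrings[ConnectionStringName];
            if (entry != null)
                return entry.ConnectionString;
        }
        return ConnectionString;
    }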

MessageObject Not Populating

The logEvent class is not properly handling MessageObject population. To reproduce, log a custom application exception; no data is recorded to ES.

Timeout crashing application, shouldn't it just write to log4net InternalDebug?

System.InvalidOperationException: An unhandled exception was detected ---> System.Net.WebException: The operation has timed out
at System.Net.HttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
at log4net.ElasticSearch.WebElasticClient.FinishGetResponse(IAsyncResult result) in c:\source\log4stash\src\log4net.ElasticSearch\ElasticClient.cs:line 110
at System.Net.LazyAsyncResult.Complete(IntPtr userToken)
at System.Net.ContextAwareResult.CompleteCallback(Object state)
at System.Threading.ExecutionContext.RunInternal(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean preserveSyncCtx)
at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
at System.Net.ContextAwareResult.Complete(IntPtr userToken)
at System.Net.LazyAsyncResult.ProtectedInvoke...

If bufferSize not set, logging does not work

If the bufferSize setting is not set, messages are not logged to Elasticsearch, so the config is not backwards compatible with the earlier version.
Maybe it would make sense to set bufferSize = 1 by default?

Nuget packages in the repo

Hi,

This isn't an overwhelming concern, but have you thought about removing the NuGet packages from the repo and using package restore during the build? Typically this is only a problem when you use corporate build servers behind a proxy that aren't allowed external access.

Joe

Cannot log to ES instance on a different machine.

I can't get any log messages into ES with an MVC 4 / .NET 4.5 application.

log4net config looks like this:

  <log4net>
    <appender name="ElasticSearchAppender40" type="log4net.ElasticSearch.ElasticSearchAppender, log4net.ElasticSearch.Net40">
      <layout type="log4net.Layout.PatternLayout,log4net">
        <param name="ConversionPattern" value="%d{ABSOLUTE} %-5p %c{1}:%L -40 %m%n" />
      </layout>
      <connectionString value="Server=arcturus;Index=log;Port=9200;rolling=true" />
    </appender>

    <appender name="ElasticSearchAppender" type="log4net.ElasticSearch.ElasticSearchAppender, log4net.ElasticSearch">
      <layout type="log4net.Layout.PatternLayout,log4net">
        <param name="ConversionPattern" value="%d{ABSOLUTE} %-5p %c{1}:%L -not40 %m%n" />
      </layout>
      <connectionString value="Server=arcturus;Index=log;Port=9200;rolling=true" />
    </appender>

    <appender name="TestAppender" type="log4net.Appender.FileAppender">
      <file value="yeager.web.log" />
      <appendToFile value="true" />
      <layout type="log4net.Layout.PatternLayout">
        <conversionPattern value="%date [%thread] %-5level %logger [%property{NDC}] -- %message%newline" />
      </layout>
    </appender>

    <root>
      <level value="All" />
      <appender-ref ref="ElasticSearchAppender" />
      <!--<appender-ref ref="ElasticSearchAppender40" /> ----- This doesn't work either -->
      <appender-ref ref="TestAppender" />
    </root>

  </log4net>

My test logger is generating messages, though, so it must be something with the ES configuration?
I'm running a packet capture and I don't see any traffic headed to my ES machine.

log info content is missing

Hi,

I found an issue: I can't find the log Info content in ES.

Here's the log code:

public ActionResult Index()
{
    _log.Info("test!", new ApplicationException("shit went down and it wasn't good"));
    return View();
}

And here's the record in ES; you can see that the message is null!

{

"_index": "testlog",
"_type": "logEvent",
"_id": "AVSj2CU-lQLA0uGOM-km",
"_version": 1,
"_score": 1,
"_source": {
    "timeStamp": "2016-05-12T07:22:12.1540257Z",
    "message": null,
    "messageObject": { },
    "exception": {
        "Type": "System.ApplicationException",
        "Message": "shit went down and it wasn't good",
        "HelpLink": null,
        "Source": null,
        "HResult": -2146232832,
        "StackTrace": null,
        "Data": { },
        "InnerException": null
    },
    "loggerName": "ESLogger",
    "domain": null,
    "identity": "",
    "level": "INFO",
    "className": "Log4netWithElasticSearch.Controllers.HomeController",
    "fileName": "c:\Users\tu\Desktop\TuStudy\Log4netWithElasticSearch\Controllers\HomeController.cs",
    "lineNumber": "16",
    "fullInfo": "Log4netWithElasticSearch.Controllers.HomeController.Index(c:\Users\tu\Desktop\TuStudy\Log4netWithElasticSearch\Controllers\HomeController.cs:16)",
    "methodName": "Index",
    "fix": "None",
    "properties": {
        "log4net:Identity": "",
        "log4net:UserName": "tu-PC\tu",
        "log4net:HostName": "tu-PC",
        "@timestamp": "2016-05-12T07:22:12.1540257Z"
    },
    "userName": "tu-PC\tu",
    "threadName": "9",
    "hostName": "TU-PC"
}

}

Failed to log to logstash

Failed to log to a remote Logstash; the error log is below:

log4net:ERROR [ElasticSearchAppender] ErrorCode: GenericFailure. ElasticSearchAppender [ElasticSearchAppender]: Failed to send all queued events in OnClose.
log4net: No appenders could be found for logger [VCC] repository [log4net-default-repository]
log4net: Please initialize the log4net system properly.
log4net: Current AppDomain context information:
log4net: BaseDirectory : C:\Repo\vcc\AttendAnywhere\wwwroot
log4net: FriendlyName : /LM/W3SVC/5/ROOT-14-131225972062670747
log4net: DynamicDirectory: C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Temporary ASP.NET Files\root\7ea2a552\eb8ae04c

Adding log4net.ElasticSearch via NuGet

Good Morning,

I have added the package to my project via NuGet. I am receiving the following error when trying to log for the first time: Could not create Appender [ElasticSearchAppender] of type [log4net.ElasticSearch.ElasticSearchAppender, log4net.ElasticSearch]. Reported error follows.
System.IO.FileNotFoundException: Could not load file or assembly 'log4net.ElasticSearch' or one of its dependencies. The system cannot find the file specified.
File name: 'log4net.ElasticSearch'.

I can see the assembly in my project references. It is referenced as log4net.ElasticSearch40.

Add a *Context override to specify ES server

The %property{name} convention does not work for arbitrary properties, only in the conversionLayout node.

Add detection so that if ElasticsearchServer is set in the GlobalContext, ThreadContext, or LogicalThreadContext, it overrides the server in the connection string, as sketched below. This is useful when we need to target dev, staging, or prod servers.
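
A hedged sketch of the lookup order (the ElasticsearchServer property name is the proposal above); the returned value, if any, would replace the Server part of the parsed connection string:

    // Sketch only: most specific context wins.
    private static string GetServerOverride()
    {
        object value =
            log4net.LogicalThreadContext.Properties["ElasticsearchServer"]
            ?? log4net.ThreadContext.Properties["ElasticsearchServer"]
            ?? log4net.GlobalContext.Properties["ElasticsearchServer"];

        return value == null ? null : value.ToString();
    }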

@jc74 interested to hear what you think of this?

Support for elastic cloud

Hello there, I am wondering whether this project supports sending logs to an Elastic Cloud server.
If yes, how would you do that, and what configuration do I need on the cloud side and in the .config file?

Elasticsearch Centos 7 box

The log4net appender works on localhost, but when I try to use it across the network against a CentOS 7 Elasticsearch server, nothing happens. I can't see why, and it's making me sad.

Log4net throws circular reference exception

While trying to log an exception in log4net with debug mode enabled:

try
{
    var z = decimal.Divide(10, 0);
}
catch (Exception ex)
{
    Log.Error(ex);
}

a circular reference exception is thrown, originating in ExtensionMethods.ToJson (screenshots omitted).
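
As a hedged note, if ToJson ultimately serializes with Json.NET, or could be switched to it, reference loops can be ignored rather than thrown (Newtonsoft.Json assumed; ex is the caught exception from the snippet above):

    // Sketch only: skip members that would create a reference loop instead of throwing.
    var settings = new Newtonsoft.Json.JsonSerializerSettings
    {
        ReferenceLoopHandling = Newtonsoft.Json.ReferenceLoopHandling.Ignore
    };
    string json = Newtonsoft.Json.JsonConvert.SerializeObject(ex, settings);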
