elastic / logstash

Logstash - transport and process your logs, events, or other data

Home Page: https://www.elastic.co/products/logstash

License: Other

etl-framework streaming logging java jruby real-time-processing


Logstash

Logstash is part of the Elastic Stack, along with Beats, Elasticsearch, and Kibana. Logstash is a server-side data processing pipeline that ingests data from a multitude of sources simultaneously, transforms it, and then sends it to your favorite "stash" (ours is Elasticsearch, naturally). Logstash has over 200 plugins, and you can easily write your own as well.

For more info, see https://www.elastic.co/products/logstash

Documentation and Getting Started

You can find the documentation and getting started guides for Logstash on the elastic.co site.

For information about building the documentation, see the README in https://github.com/elastic/docs

Downloads

You can download officially released Logstash binaries, as well as Debian/RPM packages for the supported platforms, from the downloads page.

Need Help?

Logstash Plugins

Logstash plugins are hosted in separate repositories under the logstash-plugins github organization. Each plugin is a self-contained Ruby gem which gets published to RubyGems.org.

Writing your own Plugin

Logstash is known for its extensibility. There are hundreds of plugins for Logstash, and you can easily write your own! For more information on developing and testing these plugins, please see the working with plugins section.
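As a rough illustration of the shape of a filter plugin's core logic, here is a minimal sketch. It uses a stub Event class and a bare method; real plugins subclass LogStash::Filters::Base and are packaged as gems, so everything here is a simplified stand-in:

```ruby
# Minimal sketch of a filter's core logic, using a stub Event class.
# Real plugins subclass LogStash::Filters::Base; this is illustrative only.
Event = Struct.new(:data) do
  def get(key)
    data[key]
  end

  def set(key, value)
    data[key] = value
  end
end

# Hypothetical filter body: upcase the "message" field if present.
def upcase_filter(event)
  msg = event.get("message")
  event.set("message", msg.upcase) if msg
  event
end

e = upcase_filter(Event.new({ "message" => "hello world" }))
p e.get("message")  # "HELLO WORLD"
```

The real plugin API (register, filter, filter_matched, config_name, and gem packaging) is covered in the working with plugins documentation.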

Plugin Issues and Pull Requests

Please open new issues and pull requests for a plugin under its own repository.

For example, if you need to report an issue or enhancement for the Elasticsearch output, please do so here.

Logstash core will continue to exist under this repository and all related issues and pull requests can be submitted here.

Developing Logstash Core

Prerequisites

  • Install JDK version 11 or 17, and set the JAVA_HOME environment variable to the path of your JDK installation directory. For example: set JAVA_HOME=<JDK_PATH>
  • Install JRuby 9.2.x. It is recommended to use a Ruby version manager such as RVM or rbenv.
  • Install the rake and bundler tools with gem install rake and gem install bundler, respectively.
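As a quick sanity check of the JDK requirement, the major version reported by java -version should be 11 or 17. A small sketch of that check (supported_jdk? is a hypothetical helper, not part of the build):

```ruby
# Sketch: check whether a JDK version string has a supported major (11 or 17).
# supported_jdk? is a hypothetical helper for illustration only.
def supported_jdk?(version_string)
  major = version_string.split(".").first.to_i
  [11, 17].include?(major)
end

puts supported_jdk?("11.0.21")  # true
puts supported_jdk?("17.0.9")   # true
puts supported_jdk?("8.0.392")  # false
```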

RVM install (optional)

If you prefer to use rvm (ruby version manager) to manage Ruby versions on your machine, follow these directions. In the Logstash folder:

gpg --keyserver hkp://keys.gnupg.net --recv-keys 409B6B1796C275462A1703113804BB82D39DC0E3
\curl -sSL https://get.rvm.io | bash -s stable --ruby=$(cat .ruby-version)

Check Ruby version

Before you proceed, please check your Ruby version by running:

$ ruby -v

The printed version should be the same as in the .ruby-version file.
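Note that .ruby-version files typically carry a jruby- prefix (for example, jruby-9.2.20.1), while ruby -v prints something like "jruby 9.2.20.1 ...", so a literal comparison needs to strip the prefix. A sketch of that check (version_matches? is a hypothetical helper):

```ruby
# Sketch: compare `ruby -v` output against the contents of .ruby-version.
# version_matches? is a hypothetical helper; the "jruby-" prefix is stripped
# because `ruby -v` prints "jruby 9.2.20.1 ..." without it.
def version_matches?(ruby_v_output, dot_ruby_version)
  wanted = dot_ruby_version.strip.sub(/\Ajruby-/, "")
  ruby_v_output.include?(wanted)
end

puts version_matches?("jruby 9.2.20.1 (2.5.8) OpenJDK 64-Bit Server VM", "jruby-9.2.20.1")  # true
puts version_matches?("ruby 2.7.4p191", "jruby-9.2.20.1")                                   # false
```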

Building Logstash

The Logstash project includes the source code for all of Logstash, including the Elastic-licensed X-Pack features and functions. To run Logstash from source using only the OSS-licensed code, export the OSS environment variable with a value of true:

export OSS=true
  • Set up the location of the source code to build
export LOGSTASH_SOURCE=1
export LOGSTASH_PATH=/YOUR/LOGSTASH/DIRECTORY

Install dependencies with gradle (recommended) [1]

  • Install development dependencies
./gradlew installDevelopmentGems
  • Install default plugins and other dependencies
./gradlew installDefaultGems

Verify the installation

To verify your environment, run the following to start Logstash and send your first event:

bin/logstash -e 'input { stdin { } } output { stdout {} }'

This should start Logstash with a stdin input, waiting for you to enter an event:

hello world
2016-11-11T01:22:14.405+0000 0.0.0.0 hello world

Advanced: Drip Launcher

Drip is a tool that solves the slow JVM startup problem while developing Logstash. The drip script is intended to be a drop-in replacement for the java command. We recommend using drip during development, in particular for running tests. With drip, the first invocation of a command will not be faster, but subsequent invocations will be swift.

To tell Logstash to use drip, set the environment variable JAVACMD=`which drip`.

Example (but see the Testing section below before running rspec for the first time):

JAVACMD=`which drip` bin/rspec

Caveats

Drip does not work with STDIN; you cannot use drip to run configs that use the stdin plugin.

Building Logstash Documentation

To build the Logstash Reference (open source content only) on your local machine, clone the following repos:

logstash - contains main docs about core features

logstash-docs - contains generated plugin docs

docs - contains doc build files

Make sure you have the same branch checked out in logstash and logstash-docs. Check out master in the docs repo.

Run the doc build script from within the docs repo. For example:

./build_docs.pl --doc ../logstash/docs/index.asciidoc --chunk=1 -open

Testing

Most of the unit tests in Logstash are written using rspec for the Ruby parts; for the Java parts, we use junit. For testing, you can use the test rake tasks and the bin/rspec command; see the instructions below.

Core tests

1. To run the core tests, you can use the Gradle task:

./gradlew test

or use the rspec tool to run all tests or run a specific test:

bin/rspec
bin/rspec spec/foo/bar_spec.rb

Note that before running the rspec command for the first time you need to set up the RSpec test dependencies by running:

./gradlew bootstrap

2. To run only the subset of tests covering the Java codebase, run:

./gradlew javaTests

3. To execute the complete test suite, including the integration tests, run:

./gradlew check

4. To execute a single Ruby test, run:

SPEC_OPTS="-fd -P logstash-core/spec/logstash/api/commands/default_metadata_spec.rb" ./gradlew :logstash-core:rubyTests --tests org.logstash.RSpecTests

5. To execute a single spec for an integration test, run:

./gradlew integrationTests -PrubyIntegrationSpecs=specs/slowlog_spec.rb

Sometimes you might find that a change to a piece of Logstash code causes a test to hang. These can be hard to debug.

If you set LS_JAVA_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005", you can connect to a running Logstash with your IDE's debugger, which can be a great way of finding the issue.

Plugins tests

To run the tests of all currently installed plugins:

rake test:plugins

You can install the default set of plugins included in the logstash package:

rake test:install-default

Note that if a plugin is installed using the plugin manager (bin/logstash-plugin install ...), do not forget to also install the plugin's development dependencies by running the following command after the plugin installation:

bin/logstash-plugin install --development

Building Artifacts

Built artifacts will be placed in the LS_HOME/build directory, which will be created if it is not already present.

You can build a Logstash snapshot package as a tarball or a zip file:

./gradlew assembleTarDistribution
./gradlew assembleZipDistribution

OSS-only artifacts can similarly be built with their own gradle tasks:

./gradlew assembleOssTarDistribution
./gradlew assembleOssZipDistribution

You can also build .rpm and .deb packages, but the fpm tool is required:

rake artifact:rpm
rake artifact:deb

and:

rake artifact:rpm_oss
rake artifact:deb_oss

Using a Custom JRuby Distribution

If you want the build to use a custom JRuby you can do so by setting a path to a custom JRuby distribution's source root via the custom.jruby.path Gradle property.

For example:

./gradlew clean test -Pcustom.jruby.path="/path/to/jruby"

Project Principles

  • Community: If a newbie has a bad time, it's a bug.
  • Software: Make it work, then make it right, then make it fast.
  • Technology: If it doesn't do a thing today, we can make it do it tomorrow.

Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, complaints, and even something you drew up on a napkin.

Programming is not a required skill. Whatever you've seen about open source and maintainers or community members saying "send patches or die" - you will not see that here.

It is more important that you are able to contribute.

For more information about contributing, see the CONTRIBUTING file.

Footnotes


  1. Use bundle instead of gradle to install dependencies

    Alternatively, instead of using gradle you can also use bundle:

    • Install development dependencies

      bundle config set --local path vendor/bundle
      bundle install
      
    • Bootstrap the environment:

      rake bootstrap
      
    • You can then use bin/logstash to start Logstash, but there are no plugins installed yet. To install the default plugins, run:

      rake plugin:install-default
      

    This will install the 80+ default plugins, which makes Logstash ready to connect to multiple data sources, perform transformations, and send the results to Elasticsearch and other destinations.


logstash's People

Contributors

andrewvc, andsel, colinsurprenant, danhermann, dedemorton, dliappis, electrical, fetep, jakelandis, jordansissel, jsvd, kaisecheng, karenzone, kares, kurtado, lcawl, louiszuckerman, lusis, mashhurs, nickethier, original-brownbear, ph, piavlo, roaksoax, robbavey, suyograo, talevy, untergeek, wiibaa, yaauie


logstash's Issues

Showstopper bugs on Windows 7

On Windows 7 64-bit (and possibly other Windows versions), Logstash is currently useless for these reasons:

  1. Logstash locks logfiles when using the file input. This means that rolling logs doesn't work at all. Even editing a logfile monitored by Logstash is impossible.
  2. There are clearly line-ending issues, as I get events with the text "\r" and nothing more.
  3. If there is something wrong with the config file (such as specifying the wrong port number for an output), Ctrl+C doesn't work; the process must be killed manually.

grok filters are ignored for the first line when using multiline

Given the following configuration:

input { stdin {}
}

filter {
        multiline {
                pattern => "^\s+%{JAVASTACKTRACEPART}"
                what => "previous"
                add_field => { "multi" => "yes" }
        }

        # Parse logs.
        grok {
                match => [
                        "message", "^\[%{TIMESTAMP_ISO8601:log_time}\] %{WORD:level} (?<description>.*?)$"
                        ]
        }
}

output { stdout { codec => rubydebug } }

Running logstash with this configuration and pasting a few lines in:

* bin/logstash -f simple.conf
[2014-05-15 00:01:20,278] INFO Leader of the pack
[2014-05-15 00:01:20,278] INFO Leader of the pack 2
{
       "message" => "[2014-05-15 00:01:20,278] INFO Leader of the pack",
      "@version" => "1",
    "@timestamp" => "2014-05-15T17:01:28.132Z",
          "host" => "redacted"
}
[2014-05-15 00:01:20,278] INFO Leader of the pack 3
{
        "message" => "[2014-05-15 00:01:20,278] INFO Leader of the pack 2",
       "@version" => "1",
     "@timestamp" => "2014-05-15T17:01:30.526Z",
           "host" => "redacted",
       "log_time" => "2014-05-15 00:01:20,278",
          "level" => "INFO",
    "description" => "Leader of the pack 2"
}
[2014-05-15 00:01:20,278] INFO Leader of the pack 4
{
        "message" => "[2014-05-15 00:01:20,278] INFO Leader of the pack 3",
       "@version" => "1",
     "@timestamp" => "2014-05-15T17:01:32.718Z",
           "host" => "redacted",
       "log_time" => "2014-05-15 00:01:20,278",
          "level" => "INFO",
    "description" => "Leader of the pack 3"
}

The first line isn't parsed until I've sent the second line, which is expected. However, my grok filter is ignored for the first line! I don't get a "level" field, nor a _grokparsefailure.

init scripts do not have $HOME set

If you start logstash from the init scripts, then the $HOME variable is not defined.

This results in the following messages in the logs

No SINCEDB_DIR or HOME environment variable set, I don't know where to keep track of the files I'm watching. Either set HOME or SINCEDB_DIR in your environment, or set sincedb_path in in your logstash config for the file input with path

Full output

{:timestamp=>"2014-02-12T14:45:28.939000-0700", :message=>"Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.3.3/plugin-milestones", :level=>:warn}
{:timestamp=>"2014-02-12T14:45:29.155000-0700", :message=>"You are using a deprecated config setting \"debug_format\" set in stdout. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future.  If you have any questions about this, please visit the #logstash channel on freenode irc.", :name=>"debug_format", :plugin=><LogStash::Outputs::Stdout --->, :level=>:warn}
{:timestamp=>"2014-02-12T14:45:34.243000-0700", :message=>"No SINCEDB_DIR or HOME environment variable set, I don't know where to keep track of the files I'm watching. Either set HOME or SINCEDB_DIR in your environment, or set sincedb_path in in your logstash config for the file input with path '[\"/var/log/messages\"]'", :level=>:error}
{:timestamp=>"2014-02-12T14:45:34.253000-0700", :message=>"+---------------------------------------------------------+\n| An unexpected error occurred. This is probably a bug.   |\n| You can find help with this problem in a few places:    |\n|                                                         |\n| * chat: #logstash IRC channel on freenode irc.          |\n|     IRC via the web: http://goo.gl/TI4Ro                |\n| * email: [email protected]                |\n| * bug system: https://logstash.jira.com/                |\n|                                                         |\n+---------------------------------------------------------+\nThe error reported is: \n  "}

This results in a confusing out-of-the-box experience for new users, especially because many of the examples in the documentation refer to $HOME.

The init script should export the $HOME variable.

GELF input doesn't reach the elasticsearch output, but does reach the stdout output

Hello.
ES 1.1.1
Logstash 1.4

I'm running Logstash like so:

./bin/logstash --debug -v -e ' input { tcp { port => 5141 } gelf { } } output { stdout { } elasticsearch_http { host => "es-host" } }'

I'm sending a TCP message like so:

$ echo "Hello, Im from nc" | nc localhost 5141

And a GELF message like so:

gelf = GELF::Notifier.new("logstash-host", 12201, "WAN", { :facility => "appname" })
gelf.notify! "Hello, im a gelf message"

Logstash spits out the following on stdout:

Compiled pipeline code:
@inputs = []
@filters = []
@outputs = []
@input_tcp_1 = plugin("input", "tcp", LogStash::Util.hash_merge_many({ "port" => 5141 }))

@inputs << @input_tcp_1
@input_gelf_2 = plugin("input", "gelf")

@inputs << @input_gelf_2
@output_stdout_3 = plugin("output", "stdout")

@outputs << @output_stdout_3
@output_elasticsearch_http_4 = plugin("output", "elasticsearch_http", LogStash::Util.hash_merge_many({ "host" => ("spr-prod-dogo-es-01.my-domain.com".force_encoding("UTF-8")) }))

@outputs << @output_elasticsearch_http_4
  @filter_func = lambda do |event, &block|
    extra_events = []
    @logger.debug? && @logger.debug("filter received", :event => event.to_hash)
    extra_events.each(&block)
  end
  @output_func = lambda do |event, &block|
    @logger.debug? && @logger.debug("output received", :event => event.to_hash)
    @output_stdout_3.handle(event)
    @output_elasticsearch_http_4.handle(event)

  end {:level=>:debug, :file=>"logstash/pipeline.rb", :line=>"26"}
Using milestone 2 input plugin 'tcp'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.0/plugin-milestones {:level=>:warn, :file=>"logstash/config/mixin.rb", :line=>"209"}
config LogStash::Codecs::Line/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@port = 5141 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@debug = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@codec = <LogStash::Codecs::Line charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@add_field = {} {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@host = "0.0.0.0" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@data_timeout = -1 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@mode = "server" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@ssl_enable = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@ssl_verify = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Tcp/@ssl_key_passphrase = <password> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
Using milestone 2 input plugin 'gelf'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.0/plugin-milestones {:level=>:warn, :file=>"logstash/config/mixin.rb", :line=>"209"}
config LogStash::Codecs::Plain/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Gelf/@debug = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Gelf/@codec = <LogStash::Codecs::Plain charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Gelf/@add_field = {} {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Gelf/@host = "0.0.0.0" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Gelf/@port = 12201 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Gelf/@remap = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Inputs::Gelf/@strip_leading_underscore = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Codecs::Line/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::Stdout/@type = "" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::Stdout/@tags = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::Stdout/@exclude_tags = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::Line charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::Stdout/@workers = 1 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
Using milestone 2 output plugin 'elasticsearch_http'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.0/plugin-milestones {:level=>:warn, :file=>"logstash/config/mixin.rb", :line=>"209"}
config LogStash::Codecs::Plain/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@host = "spr-prod-dogo-es-01.my-domain.com" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@type = "" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@tags = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@exclude_tags = [] {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@codec = <LogStash::Codecs::Plain charset=>"UTF-8"> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@workers = 1 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@index = "logstash-%{+YYYY.MM.dd}" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@manage_template = true {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@template_name = "logstash" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@template_overwrite = false {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@port = 9200 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@user = nil {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@password = <password> {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@flush_size = 100 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@idle_flush_time = 1 {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@document_id = nil {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
config LogStash::Outputs::ElasticSearchHTTP/@replication = "sync" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
Starting tcp input listener {:address=>"0.0.0.0:5141", :level=>:info, :file=>"logstash/inputs/tcp.rb", :line=>"84"}
Pipeline started {:level=>:info, :file=>"logstash/pipeline.rb", :line=>"78"}
Starting gelf listener {:address=>"0.0.0.0:12201", :level=>:info, :file=>"logstash/inputs/gelf.rb", :line=>"70"}
Automatic template management enabled {:manage_template=>"true", :level=>:info, :file=>"logstash/outputs/elasticsearch_http.rb", :line=>"104"}
Template Search URL: {:template_search_url=>"http://spr-prod-dogo-es-01.my-domain.com:9200/_template/*", :level=>:debug, :file=>"logstash/outputs/elasticsearch_http.rb", :line=>"112"}
Accepted connection {:client=>"127.0.0.1:41478", :server=>"0.0.0.0:5141", :level=>:debug, :file=>"logstash/inputs/tcp.rb", :line=>"165"}
config LogStash::Codecs::Line/@charset = "UTF-8" {:level=>:debug, :file=>"logstash/config/mixin.rb", :line=>"105"}
output received {:event=>{"message"=>"Hello, Im from nc", "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:26.176Z", "host"=>"127.0.0.1:41478"}, :level=>:debug, :file=>"(eval)", :line=>"22"}
Connection closed {:client=>"127.0.0.1:41478", :level=>:debug, :file=>"logstash/inputs/tcp.rb", :line=>"120"}
2014-04-28T16:34:26.176+0000 127.0.0.1:41478 Hello, Im from nc
Flushing output {:outgoing_count=>1, :time_since_last_flush=>7.522, :outgoing_events=>{nil=>[[#<LogStash::Event:0xc2be308 @accessors=#<LogStash::Util::Accessors:0x15462876 @store={"message"=>"Hello, Im from nc", "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:26.176Z", "host"=>"127.0.0.1:41478"}, @lut={"host"=>[{"message"=>"Hello, Im from nc", "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:26.176Z", "host"=>"127.0.0.1:41478"}, "host"], "message"=>[{"message"=>"Hello, Im from nc", "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:26.176Z", "host"=>"127.0.0.1:41478"}, "message"]}>, @data={"message"=>"Hello, Im from nc", "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:26.176Z", "host"=>"127.0.0.1:41478"}, @cancelled=false>, "logstash-%{+YYYY.MM.dd}", ""]]}, :batch_timeout=>1, :force=>nil, :final=>nil, :level=>:debug, :file=>"stud/buffer.rb", :line=>"207"}
output received {:event=>{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, :level=>:debug, :file=>"(eval)", :line=>"22"}
2014-04-28T16:34:29.164+0000 lagbchch00-0016 Hello, Im a gelf message
Flushing output {:outgoing_count=>1, :time_since_last_flush=>2.851, :outgoing_events=>{nil=>[[#<LogStash::Event:0x38f28473 @accessors=#<LogStash::Util::Accessors:0x39d5b73a @store={"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, @lut={"source_host"=>[{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, "source_host"], "timestamp"=>[{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, "timestamp"], "@timestamp"=>[{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, "@timestamp"], "full_message"=>[{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, "full_message"], "short_message"=>[{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, "short_message"], "message"=>[{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, 
"@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, "message"], "host"=>[{"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, "host"]}>, @data={"facility"=>"appname", "version"=>"1.0", "host"=>"lagbchch00-0016.my-domain.com", "level"=>6, "file"=>"(irb)", "line"=>57, "@version"=>"1", "@timestamp"=>"2014-04-28T16:34:29.164Z", "source_host"=>"10.0.14.26", "message"=>"Hello, Im a gelf message"}, @cancelled=false>, "logstash-%{+YYYY.MM.dd}", ""]]}, :batch_timeout=>1, :force=>nil, :final=>nil, :level=>:debug, :file=>"stud/buffer.rb", :line=>"207"}

In Kibana I can see the netcat message but not the GELF one; the ES logs don't say anything either.

Any ideas?

Thanks!
Simon.

Support Nested Hashes in HTTP output mapping

I would like the ability to have nested hashes when using the mapping function of the HTTP output.

given:
mapping => [ "metric", "test", "tags", [ "name", "Help", "type", "Please" ] ]

The current json output would be:

{
   "metric": "test",
   "tags": "["name","Help","type","Please"]"
}

What I would expect would be:

{
  "metric": "test",
  "tags": {
    "name": "Help",
    "type": "Please"
  }
}
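The transformation being requested can be sketched as a recursive fold of the flat key/value array into a hash (to_nested below is a hypothetical helper, not part of the HTTP output):

```ruby
# Sketch of the requested behavior: recursively turn a flat key/value array
# into a nested hash. to_nested is a hypothetical helper, not Logstash code.
def to_nested(value)
  return value unless value.is_a?(Array)
  value.each_slice(2).map { |k, v| [k, to_nested(v)] }.to_h
end

mapping = ["metric", "test", "tags", ["name", "Help", "type", "Please"]]
p to_nested(mapping)
# {"metric"=>"test", "tags"=>{"name"=>"Help", "type"=>"Please"}}
```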

NoMethodError: undefined method `tv_sec' in outputs/elasticsearch

Copied bug report from a PR #1211 comment by @thuck.
Seems to happen after the multiline/tv_sec fix; to be validated.

NoMethodError: undefined method `tv_sec' for #Array:0x432c4c7a
sprintf at /opt/logstash/lib/logstash/event.rb:230
gsub at org/jruby/RubyString.java:3041
sprintf at /opt/logstash/lib/logstash/event.rb:216
receive at /opt/logstash/lib/logstash/outputs/elasticsearch.rb:324
handle at /opt/logstash/lib/logstash/outputs/base.rb:86
initialize at (eval):78
call at org/jruby/RubyProc.java:271
output at /opt/logstash/lib/logstash/pipeline.rb:266
outputworker at /opt/logstash/lib/logstash/pipeline.rb:225
start_outputs at /opt/logstash/lib/logstash/pipeline.rb:152

RPM doesn't create needed directory for logstash-web

When installing the RPM, the directory for the logstash-web pid file is not created:

/etc/init.d/logstash-web: line 92: /var/run/logstash-web/logstash-web.pid: No such file or directory

The directory is actually missing; it's not a permissions issue or anything similar.

A simple Vagrantfile to check this:

# -*- mode: ruby -*-
# vi: set ft=ruby :

# Vagrantfile API/syntax version. Don't touch unless you know what you're doing!
VAGRANTFILE_API_VERSION = "2"

Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|
  config.vm.box = "chef/centos-6.5"
  config.vm.provision :shell, :inline => "cd /tmp && curl -OL 'https://download.elasticsearch.org/logstash/logstash/packages/centos/logstash-1.4.0-1_c82dc09.noarch.rpm'"
  config.vm.provision :shell, :inline => "yum -y localinstall /tmp/logstash-1.4.0-1_c82dc09.noarch.rpm"
  config.vm.provision :shell, :inline => "sudo chkconfig --add logstash && sudo chkconfig --add logstash-web"
  config.vm.provision :shell, :inline => "sudo service logstash start && sudo service logstash-web start"
end

make test fails when locale is not English

When I run "make test" from a fresh clone, I get a failure in the date tests:

1) apache common log format "198.151.8.4 - - [29/Aug/2012:20:17:38 -0400] "GET /..." when processed
     Failure/Error: Unable to find matching line from backtrace
     Insist::Failure:
       Expected "2012-08-30T00:17:38.000Z", but got "2014-04-16T12:36:53.772Z"
     # ./spec/examples/parse-apache-logs.rb:57:in `(root)'
     # ./lib/logstash/runner.rb:82:in `run'
     # ./lib/logstash/runner.rb:160:in `run'
     # ./lib/logstash/runner.rb:199:in `run'
     # ./lib/logstash/runner.rb:116:in `main'
     # ./lib/logstash/runner.rb:239:in `(root)'

Finished in 3.74 seconds
89 examples, 1 failure

Failed examples:

rspec ./spec/test_utils.rb:109 # apache common log format "198.151.8.4 - - [29/Aug/2012:20:17:38 -0400] "GET /..." when processed

However, if I run "LANG=C make test", the tests pass:

Finished in 24.09 seconds
495 examples, 0 failures

Randomized with seed 39261

Locales:

$ locale
LANG=fr_FR.UTF-8
LANGUAGE=fr_FR
LC_CTYPE=fr_FR.UTF-8
LC_NUMERIC="fr_FR.UTF-8"
LC_TIME="fr_FR.UTF-8"
LC_COLLATE="fr_FR.UTF-8"
LC_MONETARY="fr_FR.UTF-8"
LC_MESSAGES="fr_FR.UTF-8"
LC_PAPER="fr_FR.UTF-8"
LC_NAME="fr_FR.UTF-8"
LC_ADDRESS="fr_FR.UTF-8"
LC_TELEPHONE="fr_FR.UTF-8"
LC_MEASUREMENT="fr_FR.UTF-8"
LC_IDENTIFICATION="fr_FR.UTF-8"
LC_ALL=

kv should always overwrite the target field

When kv has nothing to do, it does not write the target field.
If your source and target are the same field, this can be a problem: you expect the target to always be a hash, but when kv takes no action it remains a string.

This in turn breaks indexing into Elasticsearch when the mapping expects an object.
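A minimal Ruby sketch of the proposed behavior (the scan pattern and method name are illustrative, not the plugin's actual implementation): the target should be assigned a hash even when no pairs are found.

```ruby
# Hypothetical sketch of the fix: always write a Hash to the target
# field, even when no key=value pairs were found, so the field's type
# never flips between String and Hash across events.
def kv_filter(event, source, target)
  pairs = event[source].to_s.scan(/(\w+)=(\S+)/).to_h
  event[target] = pairs # always a Hash, possibly empty
end

event = { "message" => "no pairs in this line" }
kv_filter(event, "message", "message")
event["message"] # => {} (a Hash, not the original String)
```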

undefined local variable or method `org' at event.rb:218

A missing require, perhaps? This works under JRuby but fails on Ruby 1.9.1:

Failed to flush outgoing items {:outgoing_count=>1, :exception=>#<NameError: undefined local variable or method `org' for 2014-04-16T10:35:35Z devbox %{message}:LogStash::Event>, :backtrace=>["/opt/logstash/lib/logstash/event.rb:218:in `block in sprintf'", "/opt/logstash/lib/logstash/event.rb:209:in `gsub'", "/opt/logstash/lib/logstash/event.rb:209:in `sprintf'", ...}

Spec tests fail on non-English systems

Running "make tarball-test" on my French Debian system fails on these two spec tests:

  1) apache common log format "198.151.8.4 - - [29/Aug/2012:20:17:38 -0400] "GET /..." when processed
     Failure/Error: Unable to find matching line from backtrace
     Insist::Failure:
       Expected "2012-08-30T00:17:38.000Z", but got "2014-04-10T10:46:18.771Z"
     # ./spec/examples/parse-apache-logs.rb:57:in `(root)'
     # ./lib/logstash/runner.rb:82:in `run'

and

  1) LogStash::Filters::Date parsing with timezone parameter "{"mydate":"2013 Nov 24 01:29:01"}" when processed
     Failure/Error: Unable to find matching line from backtrace
     Insist::Failure:
       Expected "2013-11-24T09:29:01.000Z", but got "2014-04-10T12:54:14.411Z"
     # ./spec/filters/date.rb:278:in `(root)'
     # ./lib/logstash/runner.rb:82:in `run'
     # ./lib/logstash/runner.rb:165:in `run'
     # ./lib/logstash/runner.rb:204:in `run'
     # ./lib/logstash/runner.rb:116:in `main'
     # ./lib/logstash/runner.rb:244:in `(root)'

Adding locale => "en" to the spec conf solves the issue.
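For reference, the workaround looks like this in a filter configuration (a minimal sketch; the field name and format are illustrative):

```
filter {
  date {
    locale => "en"
    match  => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
}
```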

Collectd codec missing NaN handling from collectd input

In 1.4.0, the collectd input plugin included handling for NaN values. That handling landed after my fork that refactored the input into a codec, so it was unfortunately omitted from the codec.

NaN values are a reality, and a way to handle them needs to be added back.
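As a sketch of what the restored handling could look like (the method name is hypothetical, not the codec's actual code): collectd reports NaN for unknown readings, and NaN cannot be serialized to JSON, so non-finite floats can be mapped to nil before the event is built.

```ruby
# Hypothetical sketch: map NaN (and infinite) floats to nil so event
# serialization does not choke on non-finite values from collectd.
def sanitize(value)
  return nil if value.is_a?(Float) && !value.finite?
  value
end

sanitize(Float::NAN)      # => nil
sanitize(Float::INFINITY) # => nil
sanitize(42.0)            # => 42.0
```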

add robustness to avoid crashing on unexpected input

This is a follow-up to issue #1240, where shippers predating the multiline/tv_sec fix were paired, via redis, with indexers that included the fix. The indexer crashed on the tv_sec error because the shipper had emitted a malformed event.

Should we add robustness so we don't crash in such a condition?

Logstash upstart script not working for some users

/var/log/upstart/logstash.log only contains the following: "Sending logstash logs to /var/log/logstash/logstash.log." Sadly, there's no other information in it. So I tried to run the command that is executed as the logstash user:

$ /usr/bin/java -Djava.io.tmpdir=/var/lib/logstash -Xmx500m -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -Djava.awt.headless=true -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -jar /opt/logstash/vendor/jar/jruby-complete-1.7.11.jar  -I/opt/logstash/lib /opt/logstash/lib/logstash/runner.rb agent -f /etc/logstash/conf.d -l /var/log/logstash/logstash.log
LoadError: no such file to load -- i18n
  require at org/jruby/RubyKernel.java:1085
  require at file:/opt/logstash/vendor/jar/jruby-complete-1.7.11.jar!/META-INF/jruby.home/lib/ruby/shared/rubygems/core_ext/kernel_require.rb:55
  require at /opt/logstash/lib/logstash/JRUBY-6970.rb:27
   (root) at /opt/logstash/lib/logstash/runner.rb:50

It works if I set $GEM_HOME:

$ GEM_HOME="/opt/logstash/vendor/bundle/jruby/1.9/" /usr/bin/java -Djava.io.tmpdir=/var/lib/logstash -Xmx500m -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -Djava.awt.headless=true -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -jar /opt/logstash/vendor/jar/jruby-complete-1.7.11.jar  -I/opt/logstash/lib /opt/logstash/lib/logstash/runner.rb agent -f /etc/logstash/conf.d -l /var/log/logstash/logstash.log

I don't know if that is the actual problem, or if this is simply because I am running the command differently from how the upstart daemon does. If you have any more clues on how to dig into this, I will try…

improve error handling for elasticsearch_http

I had some issues setting up the elasticsearch_http output. I had several working inputs, but mysteriously nothing showed up in Elasticsearch. I turned on verbose logging and there were no errors. After double- and triple-checking everything else, I modified the plugin to inspect what was happening. It turned out that Elasticsearch was returning errors because the cluster was red (no quorum), because my index template required two replicas. This failed completely silently on the Logstash side.

From my point of view, a few things are wrong here:

  1. The plugin does not check the status code that Elasticsearch returns (500), nor does it inspect the returned message. It should verify that both the status and the message are as expected, and log an error if not.
  2. The plugin only catches EOF exceptions; surely other things can go wrong as well.
  3. With verbose logging on (-vvv) there was no evidence of anything Elasticsearch-related happening, when in fact the plugin was trying and failing. At debug level I would expect a message confirming when the plugin tries to push events to Elasticsearch, another debug message when that completes successfully, and an error/warning otherwise.
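A sketch of the check described in point 1 (the method and field names are illustrative, not the plugin's actual code): verify both the HTTP status and the bulk response body instead of assuming success.

```ruby
require "json"

# Hypothetical sketch: fail loudly on a non-2xx status or on per-item
# bulk errors instead of silently dropping events.
def check_bulk_response(status, body)
  unless (200..299).cover?(status)
    raise "elasticsearch returned HTTP #{status}: #{body}"
  end
  parsed = JSON.parse(body)
  raise "bulk request reported item errors: #{body}" if parsed["errors"]
  parsed
end

check_bulk_response(200, '{"errors":false,"items":[]}') # passes
# check_bulk_response(503, '{"error":"ClusterBlockException"}') would raise
```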

Adding max_length to multiline codec

Most of the time there is no control over the number of lines in a stack trace, and this is the main cause of Java heap exhaustion when the multiline codec is used. It would be better to add a max_length property to the multiline codec so that only the fewer, relevant lines get pushed to the input queue.

log4j, [2014-04-20T02:42:07.970]  WARN: org.elasticsearch.transport.netty: [Thor(App)] exception caught on transport layer [[id: 0x06f37535, /10.2.17.110:41482 => /10.2.17.143:9300]], closing connection
java.lang.OutOfMemoryError: Java heap space
log4j, [2014-04-20T02:49:26.843]  WARN: org.elasticsearch.transport.netty: [Thor(App)] exception caught on transport layer [[id: 0xb845a618, /10.2.17.143:35101 => /10.2.17.110:9303]], closing connection
java.lang.OutOfMemoryError: Java heap space
log4j, [2014-04-20T02:49:41.050]  WARN: org.elasticsearch.transport.netty: [Thor(App)] exception caught on transport layer [[id: 0xb845a618, /10.2.17.143:35101 :> /10.2.17.110:9303]], closing connection
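As a sketch, the proposed option might look like this in a config (max_length here is the proposed setting, which does not exist yet; the pattern and port are illustrative):

```
input {
  tcp {
    port  => 5000
    codec => multiline {
      pattern => "^\s"
      what    => "previous"
      # proposed option: stop appending after this many lines per event
      max_length => 500
    }
  }
}
```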

Option to set up permissions for unix socket

The Unix server socket is created with default permissions 755. Such a mode is practically useless, as it allows only the user Logstash runs as to write into the socket. Whether Logstash runs as root or as an unprivileged user, other unprivileged processes are unable to write data into it.
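A minimal Ruby sketch of the requested feature, assuming an explicit, configurable mode applied after the socket is created (0666 here is just an example value):

```ruby
require "socket"
require "tmpdir"

# Sketch: create the server socket, then chmod it to a configurable
# mode (0666 here) so other local users can write to it.
path = File.join(Dir.mktmpdir, "logstash.sock")
server = UNIXServer.new(path)
File.chmod(0666, path)
format("%o", File.stat(path).mode & 07777) # => "666"
```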

Allow Embedded ElasticSearch instance to install plugins like cloud-aws, cloud-gce, etc.

Request

Expose and document a way for the embedded ElasticSearch instance inside Logstash to install plugins (cloud-aws, cloud-gce, etc.).

Motivation

I tried connecting my Logstash node to an ElasticSearch cluster on AWS. The ES nodes in the cluster discover each other using the cloud-aws plugin. Connecting my LS node took a bit more time, mainly because I couldn't find a way to install the cloud-aws plugin from Logstash. While the ES nodes connected to each other happily, my Logstash node simply refused to connect. A separate local instance of ES connected to the cluster fine, but LS did not communicate with that local instance either (using output { elasticsearch { host => localhost cluster => "logstash" } }). I did eventually resolve this issue by rsyncing the cloud-aws/ plugin into logstash-1.4.0/vendor/jar/elasticsearch-1.0.1/plugins/.

Alternatively, I could have rsynced the elasticsearch-1.0.1/bin/ directory over to logstash-1.4.0/vendor/jar/elasticsearch-1.0.1/bin/, then installed plugins and treated it as a regular Elasticsearch installation.

My Approach

Some terminal output

ehtesh@ackee:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 12.04.4 LTS
Release:        12.04
Codename:       precise
ehtesh@ackee:~$ ls
cloud-aws elasticsearch-1.0.1  elasticsearch-1.0.1.tar.gz  logstash-1.4.0  logstash-1.4.0.tar.gz  logstash.conf configs
ehtesh@ackee:~$ elasticsearch-1.0.1/bin/plugin -install elasticsearch/elasticsearch-cloud-aws/2.0.0.RC1
Trying http://download.elasticsearch.org/elasticsearch/elasticsearch-cloud-aws/elasticsearch-cloud-aws-2.0.0.RC1.zip...
Downloading ......................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................
..............................................................................................................................................................DONE
Installed elasticsearch/elasticsearch-cloud-aws/2.0.0.RC1 into /home/ubuntu/elasticsearch-1.0.1/plugins/cloud-aws
ehtesh@ackee:~$ cat configs/elasticsearch.yml
node.name: logstash-indexer
cluster.name: logstash
plugin.mandatory: cloud-aws

cloud:
    aws:
        access_key: <snipped>
        secret_key: <snipped>
        region: us-east
discovery:
    type: ec2
    ec2:
        groups: prod-logstash
ehtesh@ackee:~$ cat logstash.conf
input {
    stdin { type => "foo" }
}

output {
    elasticsearch { cluster => "logstash" }
    stdout { codec => rubydebug }
}
ehtesh@ackee:~$ logstash-1.4.0/bin/logstash -f logstash.conf
... <snipped warnings> ...
log4j, [2014-05-05T03:51:27.690]  WARN: org.elasticsearch.discovery: [logstash-ackee-14405-2026] waited for 30s and no initial state was set by the discovery
Exception in thread ">output" org.elasticsearch.discovery.MasterNotDiscoveredException: waited for [30s]
        at org.elasticsearch.action.support.master.TransportMasterNodeOperationAction$3.onTimeout(org/elasticsearch/action/support/master/TransportMasterNodeOperationAction.java:180)
        at org.elasticsearch.cluster.service.InternalClusterService$NotifyTimeout.run(org/elasticsearch/cluster/service/InternalClusterService.java:491)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(java/util/concurrent/ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(java/util/concurrent/ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(java/lang/Thread.java:744)
^C
ehtesh@ackee:~$ cp configs/elasticsearch.yml .
ehtesh@ackee:~$ logstash-1.4.0/bin/logstash -f logstash.conf
... <snipped warnings> ...
Exception in thread ">output" org.elasticsearch.ElasticsearchException: Missing mandatory plugins [cloud-aws]
        at org.elasticsearch.plugins.PluginsService.<init>(org/elasticsearch/plugins/PluginsService.java:130)
        at org.elasticsearch.node.internal.InternalNode.<init>(org/elasticsearch/node/internal/InternalNode.java:143)
        at org.elasticsearch.node.NodeBuilder.build(org/elasticsearch/node/NodeBuilder.java:159)
        at org.elasticsearch.node.NodeBuilder.node(org/elasticsearch/node/NodeBuilder.java:166)
        at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:606)
        at RUBY.build_client(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch/protocol.rb:198)
        at RUBY.client(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch/protocol.rb:15)
        at RUBY.initialize(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch/protocol.rb:157)
        at RUBY.register(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch.rb:238)
        at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)
        at RUBY.outputworker(/home/ubuntu/logstash-1.4.0/lib/logstash/pipeline.rb:220)
        at RUBY.start_outputs(/home/ubuntu/logstash-1.4.0/lib/logstash/pipeline.rb:152)
^C
ehtesh@ackee:~$ grep -v "plugin.mandatory" elasticsearch.yml > temp && mv temp elasticsearch.yml
ehtesh@ackee:~$ logstash-1.4.0/bin/logstash -f logstash.conf
... <snipped warnings> ...
Exception in thread ">output" org.elasticsearch.common.settings.NoClassSettingsException: Failed to load class setting [discovery.type] with value [ec2]
        at org.elasticsearch.common.settings.ImmutableSettings.loadClass(org/elasticsearch/common/settings/ImmutableSettings.java:448)
        at org.elasticsearch.common.settings.ImmutableSettings.getAsClass(org/elasticsearch/common/settings/ImmutableSettings.java:436)
        at org.elasticsearch.discovery.DiscoveryModule.spawnModules(org/elasticsearch/discovery/DiscoveryModule.java:51)
        at org.elasticsearch.common.inject.ModulesBuilder.add(org/elasticsearch/common/inject/ModulesBuilder.java:44)
        at org.elasticsearch.node.internal.InternalNode.<init>(org/elasticsearch/node/internal/InternalNode.java:166)
        at org.elasticsearch.node.NodeBuilder.build(org/elasticsearch/node/NodeBuilder.java:159)
        at org.elasticsearch.node.NodeBuilder.node(org/elasticsearch/node/NodeBuilder.java:166)
        at java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:606)
        at RUBY.build_client(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch/protocol.rb:198)
        at RUBY.client(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch/protocol.rb:15)
        at RUBY.initialize(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch/protocol.rb:157)
        at RUBY.register(/home/ubuntu/logstash-1.4.0/lib/logstash/outputs/elasticsearch.rb:238)
        at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)
        at RUBY.outputworker(/home/ubuntu/logstash-1.4.0/lib/logstash/pipeline.rb:220)
        at RUBY.start_outputs(/home/ubuntu/logstash-1.4.0/lib/logstash/pipeline.rb:152)
        at java.lang.Thread.run(java/lang/Thread.java:744)
Caused by: java.lang.ClassNotFoundException: org.elasticsearch.discovery.ec2.Ec2DiscoveryModule
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at org.jruby.util.JRubyClassLoader.findClass(JRubyClassLoader.java:128)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.elasticsearch.common.settings.ImmutableSettings.loadClass(ImmutableSettings.java:446)
        at org.elasticsearch.common.settings.ImmutableSettings.getAsClass(ImmutableSettings.java:436)
        at org.elasticsearch.discovery.DiscoveryModule.spawnModules(DiscoveryModule.java:51)
        at org.elasticsearch.common.inject.ModulesBuilder.add(ModulesBuilder.java:44)
        at org.elasticsearch.node.internal.InternalNode.<init>(InternalNode.java:166)
        at org.elasticsearch.node.NodeBuilder.build(NodeBuilder.java:159)
        at org.elasticsearch.node.NodeBuilder.node(NodeBuilder.java:166)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:440
        at org.jruby.javasupport.JavaMethod.invokeDirect(JavaMethod.java:304)
        at org.jruby.java.invokers.InstanceMethodInvoker.call(InstanceMethodInvoker.java:52)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:306)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:136)
        at org.jruby.ast.CallNoArgNode.interpret(CallNoArgNode.java:60)
        at org.jruby.ast.CallNoArgNode.interpret(CallNoArgNode.java:60)
        at org.jruby.ast.ReturnNode.interpret(ReturnNode.java:92)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:182)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:198)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:326)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:170)
        at org.jruby.ast.FCallOneArgNode.interpret(FCallOneArgNode.java:36)
        at org.jruby.ast.InstAsgnNode.interpret(InstAsgnNode.java:95)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:139)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:182)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:306)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:136)
        at org.jruby.ast.VCallNode.interpret(VCallNode.java:88)
        at org.jruby.ast.InstAsgnNode.interpret(InstAsgnNode.java:95)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:204)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:206)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:336)
        at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:179)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:183)
        at org.jruby.RubyClass.newInstance(RubyClass.java:804)
        at org.jruby.RubyClass$INVOKER$i$newInstance.call(RubyClass$INVOKER$i$newInstance.gen)
        at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroOrOneOrNBlock.call(JavaMethod.java:297)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:326)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:170)
        at org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)
        at org.jruby.ast.InstAsgnNode.interpret(InstAsgnNode.java:95)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:112)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:164)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:286)
        at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:81)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:85)
        at org.jruby.RubySymbol$1.yieldInner(RubySymbol.java:445)
        at org.jruby.RubySymbol$1.yield(RubySymbol.java:465)
        at org.jruby.runtime.Block.yield(Block.java:142)
        at org.jruby.RubyArray.eachCommon(RubyArray.java:1606)
        at org.jruby.RubyArray.each(RubyArray.java:1613)
        at org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:316)
        at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:145)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:149)
        at org.jruby.ast.CallNoArgBlockPassNode.interpret(CallNoArgBlockPassNode.java:53)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
        at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
        at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:139)
        at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:182)
        at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:306)
        at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:136)
        at org.jruby.ast.VCallNode.interpret(VCallNode.java:88)
        at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
        at org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:112)
        at org.jruby.runtime.Interpreted19Block.evalBlockBody(Interpreted19Block.java:206)
        at org.jruby.runtime.Interpreted19Block.yield(Interpreted19Block.java:194)
        at org.jruby.runtime.Interpreted19Block.call(Interpreted19Block.java:125)
        at org.jruby.runtime.Block.call(Block.java:101)
        at org.jruby.RubyProc.call(RubyProc.java:290)
        at org.jruby.RubyProc.call(RubyProc.java:228)
        at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:99)
        at java.lang.Thread.run(Thread.java:744)
^C
ehtesh@ackee:~$ rsync -av --quiet elasticsearch-1.0.1/plugins/cloud-aws/ logstash-1.4.0/vendor/jar/elasticsearch-1.0.1/plugins/cloud-aws/
ehtesh@ackee:~$ logstash-1.4.0/bin/logstash -f logstash.conf
... <snipped warnings> ...
cool
{
       "message" => "cool",
      "@version" => "1",
    "@timestamp" => "2014-05-05T15:46:35.664Z",
          "type" => "foo",
          "host" => "ackee"
}
^C
ehtesh@ackee:~$ rsync -av --quiet elasticsearch-1.0.1/bin/ logstash-1.4.0/vendor/jar/elasticsearch-1.0.1/bin/
ehtesh@ackee:~$ logstash-1.4.0/vendor/jar/elasticsearch-1.0.1/bin/plugin -install lmenezes/elasticsearch-kopf
-> Installing lmenezes/elasticsearch-kopf...
Trying https://github.com/lmenezes/elasticsearch-kopf/archive/master.zip...
Downloading ......................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................
..................................................................................................................................................................................................................................................................................
..............................................................................................................................................................DONE
Installed lmenezes/elasticsearch-kopf into /home/ubuntu/logstash-1.4.0/vendor/jar/elasticsearch-1.0.1/plugins/kopf
Identified as a _site plugin, moving to _site structure ...
ehtesh@ackee:~$

logstash 1.3.3 is losing data with raw TCP input

When using raw TCP for input, I have noticed that you can lose data. This appears to happen when the input queue overflows. How it overflows I am not sure, since there is no entry in the log file and I am not a Ruby programmer, but I suspect it happens when a queue-processing thread dies.

Since there is no flow control, the sending program might as well dump the data to /dev/null.

When this happens, it would be nice if the socket were closed so the client could pause and reconnect.
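A sketch of the suggested behavior (not the plugin's actual code, and the queue size is arbitrary): use a bounded queue and close the client socket when it is full, so the sender sees the connection drop instead of writing into a black hole.

```ruby
require "thread"

# Sketch: a bounded in-memory queue; when it is full, close the client
# socket to signal backpressure so the sender can pause and reconnect.
QUEUE = SizedQueue.new(20)

def enqueue_or_close(socket, line)
  QUEUE.push(line, true) # non-blocking push
  true
rescue ThreadError       # raised when the queue is full
  socket.close           # drop the connection instead of losing data silently
  false
end
```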

Cisco ASA pattern error

Hi,

There is an issue with the built-in pattern for Cisco ASA firewalls. The line:

# ASA-6-302020, ASA-6-302021
CISCOFW302020_302021 %{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection for faddr %{IP:dst_ip}/%{INT:icmp_seq_num}(?:\(%{DATA:fwuser}\))? gaddr %{IP:src_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:src_ip}/%{INT:icmp_code}( \(%{DATA:user}\))?

should be replaced by:

# ASA-6-302020_302021 inbound
CISCOFW302020_302021_1 %{CISCO_ACTION:action}(?: (?<direction>inbound))? %{WORD:protocol} connection for faddr %{IP:src_ip}/%{INT:icmp_seq_num}(?:\(%{DATA:fwuser}\))? gaddr %{IP:dst_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:dst_ip}/%{INT:icmp_code}( \(%{DATA:user}\))?
# ASA-6-302020_302021 outbound
CISCOFW302020_302021_2 %{CISCO_ACTION:action}(?: (?<direction>outbound))? %{WORD:protocol} connection for faddr %{IP:dst_ip}/%{INT:icmp_seq_num}(?:\(%{DATA:fwuser}\))? gaddr %{IP:src_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:src_ip}/%{INT:icmp_code}( \(%{DATA:user}\))?

Indeed, src_ip and dst_ip are swapped depending on whether the direction is inbound or outbound.

You will also need to update the Logstash Cookbook page for Cisco ASA, because the single pattern CISCOFW302020_302021 is replaced by two patterns (CISCOFW302020_302021_1 and CISCOFW302020_302021_2).

Filter date, format "UNIX" defaults to 1970-01-01

Example date filter which I am currently using:

   date {
        match => [ "timestamp", 
            "yyyy-MM-dd HH:mm:ss,SSS",
            "yyyy-MM-dd'T'HH:mm:ss,SSS",
            "dd/MMM/yyyy:HH:mm:ss Z",
            "EEE MMM dd HH:mm:ss 'CET' yyyy",
            "UNIX"
        ]
        target => "@timestamp"
        remove_field  => [ "timestamp" ]
    }

This causes great confusion: if the timestamp field does not match any of the date formats, it always matches on UNIX, even when it is obviously not a unix timestamp (it does not match the regexp /^\d+$/), and so creates events dated 1970-01-01.
Would it be possible to confirm that the value could be a Unix timestamp before forcing a failed conversion?

Cheers,
-Robin-
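A sketch of the suggested guard (the method name is hypothetical, not the date filter's actual code): check that the value actually looks numeric before handing it to the UNIX parser.

```ruby
# Hypothetical guard: only treat the value as a UNIX timestamp when it
# is purely numeric; otherwise report no match instead of epoch 1970.
def parse_unix(value)
  return nil unless value.to_s =~ /\A\d+(\.\d+)?\z/
  Time.at(value.to_f).utc
end

parse_unix("1400000000")        # => 2014-05-13 16:53:20 UTC
parse_unix("29/Aug/2012:20:17") # => nil, so other formats can be tried
```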

clean up s3 output docs

See http://logstash.net/docs/1.4.0/outputs/s3.

It has a lot of TODOs, transitions into and out of italics for no reason that I can see, and generally needs some formatting improvements.

This bug is to clean up the look and put the TODOs in one constrained place so they don't interrupt the description of what is actually present.

Separate logstash package

I know many people with small setups use the full ELK stack, but even more people actually use only the L, so it would make sense for the packaging to reflect that instead of shipping one large monolithic RPM.

Thanks!

Space in directory name breaks file discovery

Installing Logstash in a directory whose full path contains a space causes file-not-found errors for the config and other files.

The bin/logstash start-up script is suspect.

Seen at training on April 25, 2014 with 1.4.0.

HTTPS + AUTH output to Elasticsearch

I'd like to use HTTPS plus basic auth with QBox.io (or any upcoming Elasticsearch-as-a-service provider) as a Logstash output.

This PR got closed a long time ago (#430) but I have no idea how jetty would be involved.

input plugin for batch processing of files

The file input plugin is primarily designed for tailing a file with sincedb. A problem arises when we download logs overnight, let Logstash process them, and then delete the files: the sincedb entry remains, and if a new file gets the same inode number it is either partially read or not read at all, depending on the size of the previously read file. A separate batch file input plugin would not affect the current one, but would give more options for this specific use case.

USER grok pattern should allow '@' to support email address as username

I just started tinkering with Logstash, Elasticsearch, Kibana et al. and
I am following the tutorials right now. So far so awesome!

When I was trying to process some Apache logs with the COMBINEDAPACHELOG
pattern, I ran into a couple of _grokparsefailure errors caused by
authenticated users that had email addresses as user names.

I extended the grok patterns to support an '@' in USER and wanted to
open a pull request if this small fix is okay. I was wondering whether I should add a test case to https://github.com/elasticsearch/logstash/blob/master/spec/examples/parse-apache-logs.rb or somewhere else.
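The change is tiny; as a Ruby sketch (the stock character class is reproduced from memory and should be treated as illustrative):

```ruby
# Sketch of the pattern change: the USERNAME character class with '@'
# added, so email-address usernames no longer cause _grokparsefailure.
USERNAME_OLD = /\A[a-zA-Z0-9._-]+\z/
USERNAME_NEW = /\A[a-zA-Z0-9._@-]+\z/

USERNAME_NEW.match?("alice@example.com") # => true
USERNAME_OLD.match?("alice@example.com") # => false
```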

Kind regards

Multiline filter still breaking

After the fix in #1211, the nightly build still crashes (only the filter, though :( ).

The multiline filter crashes with this exception:

Exception in filterworker {"exception"=>#<NoMethodError: undefined method `[]' for nil:NilClass>, "backtrace"=>["logstash/lib/logstash/event.rb:165:in `overwrite'", "logstash/lib/logstash/filters/multiline.rb:212:in `filter'", "(eval):23:in `initialize'", "org/jruby/RubyProc.java:271:in `call'", "logstash/lib/logstash/pipeline.rb:262:in `filter'", "logstash/lib/logstash/pipeline.rb:203:in `filterworker'", "logstash/lib/logstash/pipeline.rb:143:in `start_filters'"], :level=>:error}
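A sketch of the kind of nil guard Event#overwrite could use (class and attribute names here are assumptions based on the backtrace, not the actual Logstash source):

```ruby
# Hypothetical sketch: skip the overwrite when the replacement event
# carries no data hash, instead of raising NoMethodError on nil.
class SketchEvent
  attr_reader :data

  def initialize(data)
    @data = data
  end

  def overwrite(other)
    return self if other.nil? || other.data.nil?
    @data = other.data
    self
  end
end

e = SketchEvent.new({ "message" => "kept" })
e.overwrite(SketchEvent.new(nil))
e.data # => { "message" => "kept" }
```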

Don't distribute unnecessary packages with releases

For example, we ship 'coveralls', which depends on 'term-ansicolor', and this second gem is GPL2-licensed. I'm not sure what the legal ramifications are, but frankly we don't do it on purpose, and we shouldn't be shipping coveralls with our releases anyway since it is only used during testing, not during production runs.

zeromq errors out upon logstash restart

When restarting Logstash while it is receiving input from zeromq, Logstash reports this error:

{:timestamp=>"2014-05-01T00:38:36.503000+0200", :message=>"ZeroMQ error while in recv_string", :error_code=>-1, :level=>:error}

The error repeats for a while until Logstash has started up completely; it functions fine afterwards, though.
