logstash-filter-rest's People

Contributors

gandalfb · lucashenning · mschneider82

logstash-filter-rest's Issues

Some values types get transformed to string

This question was originally posted inside the comments for issue #22.

Is there a way to prevent stringifying the values?

If I set:
body => { offset => null pageSize => 1 saySomething => "hello" preferMethod => false }

I get this in my POST service request body:
{ "offset": "null", "pageSize": 1, "saySomething": "hello", "preferMethod": "false" }

Only the integer was left alone; the boolean and the null were transformed to strings.
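
A likely explanation (an assumption on my part, not confirmed in this thread): the plugin runs every body value through Logstash's event.sprintf, which always returns a String, so non-string literals are stringified on the way out. A minimal Ruby illustration:

# illustration only: event.sprintf always yields a String,
# even when the format contains no %{} reference
event = LogStash::Event.new
event.sprintf("false")   # => "false" (String, not FalseClass)
event.sprintf("null")    # => "null"  (String, not nil)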

Logstash 5.5 Support

Do you know what would need to be done to make this compatible with Logstash 5.5? It works on 5.x versions up to 5.4, but not on 5.5: the install fails.
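
The usual fix for plugins that break on newer 5.x releases is to depend on logstash-core-plugin-api instead of pinning logstash-core directly. A sketch of the gemspec line commonly used for this (whether it is sufficient here is an assumption):

# in logstash-filter-rest.gemspec
s.add_runtime_dependency "logstash-core-plugin-api", ">= 1.60", "<= 2.99"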

Unknown setting 'request' for rest

I'm using logstash 2.2.4. I keep getting an error when I try to run. Not sure what could be wrong 🤔

sudo /opt/logstash/bin/logstash -f rest.conf --configtest
Unknown setting 'request' for rest {:level=>:error}
Unknown setting 'target' for rest {:level=>:error}
Unknown setting 'fallback' for rest {:level=>:error}
Error: Something is wrong with your configuration. {:level=>:error}

Plugin is installed.

$ /opt/logstash/bin/plugin list | grep rest
logstash-filter-rest

rest.conf:

input {
  file {
    path => ["/var/log/file.json"]
    codec => json
    start_position => beginning
  }
}

filter {
  rest {
    request => {
      url => "http://example.com"
      method => "post"
      headers => {
        "key1" => "value1"
        "key2" => "value2"
      }
      auth => {
        user => "AzureDiamond"
        password => "hunter2"
      }
      params => {
        "key1" => "value1"
        "key2" => "value2"
        "key3" => "%{somefield}"
      }
    }
    json => true
    sprintf => true
    target => "my_key"
    fallback => {
      "key1" => "value1"
      "key2" => "value2"
    }
  }
}

output {
  stdout { codec => rubydebug }
}
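
A likely cause (an assumption, since the thread shows no resolution): the installed plugin version predates the request/target/fallback options. Checking and updating the version is a reasonable first step:

# show the installed plugin version (Logstash 2.x plugin tool)
/opt/logstash/bin/plugin list --verbose | grep rest
# update to the latest published release
/opt/logstash/bin/plugin update logstash-filter-rest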

body must be event field?

I use this plugin in a filter to clean data.
If I use it like this:
image
it fails, and I was very confused! Is the syntax wrong, or am I?

But if I use it like this:
image
surprisingly, it works!

So must every value in body use %{}? Is that so?
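
For reference, a minimal sketch of the two forms being compared (the URL, field, and key names are placeholders): literal strings and %{} field references can be mixed in body, and %{} is only needed when the value should come from the event.

filter {
  rest {
    request => {
      url    => "http://example.com/clean"   # placeholder URL
      method => "post"
      body   => {
        "mode" => "strict"       # plain literal; no %{} required
        "text" => "%{message}"   # %{} pulls the value from the event field
      }
    }
    json   => true
    target => "cleaned"          # placeholder target key
  }
}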

Possible to pass variables in URL

The rundown is that my API call looks something like this:
https://thisismysite/api/ipaddress/10.1.1.8

The IP address on the end changes depending on the event. All of the information is passed via the URL, not via a header or a parameter.
I tried simply using url => "https://thisismysite/api/ipaddress/%{source_ip}", but logstash did not like that.

Is this functionality not built into logstash-filter-rest, or am I misunderstanding something?
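
The annotated example elsewhere in this thread ("http://example.com?id=%{id}") suggests field references in url are supported when the whole value is one quoted string. A sketch under that assumption:

filter {
  rest {
    request => {
      # the whole url is a single quoted string; %{source_ip} is substituted per event
      url => "https://thisismysite/api/ipaddress/%{source_ip}"
    }
    json   => true
    target => "ip_info"   # placeholder target key
  }
}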

Configurable key for the response data

It would be nice to let the user specify the "key" where the response data is located. This would be optional, with a sensible default.

So a config param like response_key => "http_response" would end up putting the response body under a key named http_response.
It would look like this:
{ /* other event json */, "http_response": {response_body} }

Or if the response is json, then:
{ /* other event json */, "http_response": {"key1": "val1", "key2": "val2", ... } }
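
For what it's worth, the existing target option shown in other issues here already appears to name the key that receives the response body, which may cover this request:

rest {
  request => { url => "http://example.com" }   # placeholder URL
  json    => true
  target  => "http_response"   # the response body lands under this key
}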

Access response headers

As far as I know this is not already implemented, so forgive me if I missed it in the code. It would be great if this plugin also allowed us to keep the response headers in the event, not just the response body.

If the idea is OK but nobody is willing to tackle it, I may take a crack at it myself, though I am not so familiar with Ruby and have no environment set up.

Return only some values

Hi

I haven't been able to figure out whether there is a way to return only certain fields from the JSON response.

Let's say I query an API which returns:

  • Title
  • Summary
  • Date

How do I tell the rest plugin that I only want the Summary field returned from the external API?
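
The plugin itself does not appear to offer field selection, but a standard mutate after the rest filter can keep only the wanted field. A sketch (URL, target, and field names are placeholders):

filter {
  rest {
    request => { url => "http://api.example.com/articles/1" }   # placeholder URL
    json    => true
    target  => "api"
  }
  # copy the one wanted field, then drop the rest of the response
  mutate {
    add_field    => { "summary" => "%{[api][Summary]}" }
    remove_field => [ "api" ]
  }
}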

Not able to set right JSON payload in body

Hi,
Here is how my conf file looks:

rest {
  request => {
    url    => "https://abc.com"
    method => "post"
    headers => {
      "Authorization" => "%{access}"
      "Content-Type"  => "application/json"
    }
    params => '{
      "reportRequests" : [{
        "checkid" : "2345678",
        "dateRanges" : [{
          "endDate" : "2018-11-12",
          "startDate" : "2018-11-01"
        }]
      }]
    }'
  }
  json   => true
  target => "my_key"
}

This is the error I am getting:

fieldViolations\": [\n {\n \"description\": \"Invalid JSON payload received. Unknown name \\\"\\\": Root element must be a message.\"

The exact body works perfectly in Postman, but here it does not. It looks like something is wrong with params/body. Please suggest the right syntax, or is this feature not supported?
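
params is a flat key/value hash, so passing one JSON string there likely won't produce a nested payload. Other issues in this thread pass a body hash that the plugin serializes to JSON itself; a sketch under that assumption (whether your installed version supports body is not confirmed here):

rest {
  request => {
    url     => "https://abc.com"
    method  => "post"
    headers => {
      "Authorization" => "%{access}"
      "Content-Type"  => "application/json"
    }
    # nested config hash; the plugin dumps it to JSON for the POST body
    body => {
      "reportRequests" => [
        {
          "checkid"    => "2345678"
          "dateRanges" => [
            { "endDate" => "2018-11-12" "startDate" => "2018-11-01" }
          ]
        }
      ]
    }
  }
  json   => true
  target => "my_key"
}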

Install with Logstash 2.3.1

Sorry, but I have an issue when trying to install your plugin with Logstash 2.3.1.

bin/plugin install logstash-filter-rest
The use of bin/plugin is deprecated and will be removed in a feature release. Please use bin/logstash-plugin.
Validating logstash-filter-rest
Installing logstash-filter-rest
Plugin version conflict, aborting
ERROR: Installation Aborted, message: Bundler could not find compatible versions for gem "logstash-core":
  In snapshot (Gemfile.lock):
    logstash-core (= 2.3.1)

  In Gemfile:
    logstash-core-plugin-api (= 1.8.0) java depends on
      logstash-core (<= 2.3.1, >= 2.0.0) java

    logstash-filter-rest (>= 0) java depends on
      logstash-core (< 2.0.0, >= 1.4.0) java

    logstash-core (= 2.3.1) java

Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.

I tried to run the update but I cannot find logstash-core.

Do you have any idea?

I would like to use your plugin to fill in missing Google Maps lat/lng information in my data, and it looks like the only solution I have found so far.

I have lots of data where this information is missing from the address.

Best regards,

Daniel

Install error

Hi
First of all, thanks for your work.

I tried to install it but got an error. I am using java version "1.7.0_95" and logstash-2.2.2:

gonzalo@ubuntu:~/logstash-2.2.2$ bin/plugin install  logstash-filter-rest
Validating logstash-filter-rest
Installing logstash-filter-rest
Plugin version conflict, aborting
ERROR: Installation Aborted, message: Bundler could not find compatible versions for gem "logstash-core":
  In snapshot (Gemfile.lock):
    logstash-core (= 2.2.2)

  In Gemfile:
    logstash-input-s3 (>= 0) java depends on
      logstash-mixin-aws (>= 0) java depends on
        logstash-core (< 3.0.0, >= 2.0.0.beta2) java

Colon(s) in the request URL?

Hi there,

Here is another one! I am trying to extend our logs with geoip information, using the following config:

filter {
  ...
  rest {
    request => {
      url   => "http://10.0.0.1:8080/GeoDirectoryServer-7.2.0/v1/ipinfo/%{ip}"
      proxy => "http://192.168.10.35:3128"
    }
    json   => false
    target => "geodata"
  }
  ...
}

where the %{ip} field holds the IP address of the client. It works well for IPv4 addresses, but if a request comes in with an IPv6 address then an exception is logged in logstash:

[ip] = "2003:5e:4c43:c0a9:c885:6468:a4b4:4feb"

[2017-01-30T15:07:08,807][ERROR][logstash.pipeline        ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>#<LogStash::Json::ParserError: Unexpected character ('c' (code 99)): was expecting comma to separate ARRAY entries
 at [Source: [B@b85a270; line: 1, column: 78]>, "backtrace"=>["/usr/share/logstash/logstash-core/lib/logstash/json.rb:41:in `jruby_load'", "/usr/share/logstash/logstash-core/lib/logstash/json.rb:38:in `jruby_load'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-rest-0.5.1/lib/logstash/filters/rest.rb:22:in `to_object'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-rest-0.5.1/lib/logstash/filters/rest.rb:263:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in `multi_filter'", "org/jruby/RubyArray.java:1613:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:41:in `multi_filter'", "(eval):2631:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):2623:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):2663:in `initialize'", "org/jruby/RubyArray.java:1613:in `each'", "(eval):2654:in `initialize'", "org/jruby/RubyProc.java:281:in `call'", "(eval):1169:in `filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:295:in `filter_batch'", "org/jruby/RubyProc.java:281:in `call'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:192:in `each'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:191:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:294:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:282:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `start_workers'"]}
[2017-01-30T15:07:08,833][FATAL][logstash.runner          ] An unexpected error occurred! (same LogStash::Json::ParserError and backtrace as the ERROR entry above)

debug output (bin/logstash --debug -f ...):

15:15:19.075 [[main]>worker39] DEBUG logstash.filters.rest - Parsing event fields {:sprintf_fields=>["http://10.1.1.1:8080/GeoDirectoryServer-7.2.0/v1/ipinfo/%{ip}", {}]}
15:15:19.081 [[main]>worker39] ERROR logstash.pipeline - Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>#<LogStash::Json::ParserError: Unexpected character ('c' (code 99)): was expecting comma to separate ARRAY entries

I think the issue might be the colons in the URL (http://10.0.0.1:8080/GeoDirectoryServer-7.2.0/v1/ipinfo/2003:5e:4c43:c0a9:c885:6468:a4b4:4feb).

Of course, if I do URL encoding with the following workaround:

mutate { add_field => { "ip2" => "%{ip}" } }
mutate { gsub => [ "ip2", ":", "%3a" ] }
...
rest {
  ...
}
mutate { remove_field => [ "ip2" ] }

then it works well; I just find these additional lines awkward.

Would you consider checking why these colons trigger exceptions?

KR
Tamás

client error is thrown

Hi,
I installed the logstash-filter-rest plugin and tried to run this config:
logstash --debug --path.data=/tmp/mahsa-test -e 'input { stdin {} } filter { rest { request => { url =>
"http://www.google.se/" } target => "rest" json => false }} output {stdout { codec => rubydebug }}'

and I get this
Http error received {:code=>nil, :body=>nil, :client_error=>#<Manticore::SocketException: Network is unreachable
(connect failed)>}

but when I run the same curl command on that server ('curl http://www.google.se/') I receive a good response.
Any idea why I am getting this error? Is there a 'proxy' setting available that I can use?
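
Regarding the proxy question: another issue in this thread (the geoip one) shows a proxy key accepted directly inside request, so a sketch along those lines (the proxy address is a placeholder):

rest {
  request => {
    url   => "http://www.google.se/"
    proxy => "http://proxyhost:3128"   # placeholder proxy, as in the geoip issue
  }
  json   => false
  target => "rest"
}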

Documentation update : json configuration parameter

This is more of a documentation improvement.

It was not obvious to me that the json configuration parameter specifies that the response is expected to be in JSON format. "What else could it be?", you may ask... but rather than having users deduce it, I find the plugin easier to use when this is written down.

filter {
  rest {
    request => {
      url => "http://example.com"        # string (required, with field reference: "http://example.com?id=%{id}" or params, if defined)
      method => "post"                   # string (optional, default = "get")
      headers => {                       # hash (optional)
        "key1" => "value1"
        "key2" => "value2"
      }
      auth => {
        user => "AzureDiamond"
        password => "hunter2"
      }
      params => {                        # hash (optional, available for method => "get" and "post"; if post it will be transformed into body hash and posted as json)
        "key1" => "value1"
        "key2" => "value2"
        "key3" => "%{somefield}"         # sprintf is used implicitly
      }
    }
    json => true                         # is json the format of the target ? boolean (optional, default = true)
    target => "my_key"                   # string (mandatory, no default)
    fallback => {                        # hash describing a default in case of error
      "key1" => "value1"
      "key2" => "value2"
    }
  }
}

Option to send the entire event as JSON in request body

It would be nice if there were a config option to simply send the entire logstash event as JSON in the POST body (by using event.to_json). In my use case, the server needs most of the event data, and it's much easier to parse the POST body as one JSON object than to handle each parameter independently.
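
Until such an option exists, one possible workaround sketch: serialize the event with a ruby filter (Logstash 5+ event API) into a metadata field, then reference it in the request. This assumes the plugin passes a plain string body through unchanged, which is not confirmed here; the endpoint is a placeholder.

filter {
  ruby {
    # store the whole event as a JSON string in metadata
    code => "event.set('[@metadata][event_json]', event.to_json)"
  }
  rest {
    request => {
      url     => "http://example.com/ingest"    # placeholder endpoint
      method  => "post"
      headers => { "Content-Type" => "application/json" }
      body    => "%{[@metadata][event_json]}"   # assumes string bodies pass through
    }
    target => "rest_result"
  }
}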

Fields not being sent in parameters

The plugin is sending the word "postcode" as a parameter rather than the field's value when using the config below with logstash 2.3.4.

input {
  # Read all documents from Elasticsearch matching the given query
  elasticsearch {
    hosts => "localhost"
    index => "postcodes"
    scan => false
    size => 1
    scroll => "1m"
    #query => '{"from":0,"size":10,"query":{"match_all":{}}}'
    query => '{"query":{"range":{"id":{"gte":1,"lte":2}}}}'
    }
}

filter {
    rest {
      request => {
        url => "https://maps.googleapis.com/maps/api/distancematrix/json?"
        json => false
        method => "get"
        sprintf => true
        params => {
          "units" => "imperial"
          "origins" => "%{postcode}"
          "destinations" => "A119ZZ"
          "key" => "GOOGLEAPIKEY"
        }
      }
    }
}
output {
        stdout { codec => rubydebug }
        }
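
One thing stands out against the annotated example quoted elsewhere in this thread: json and sprintf sit inside request here, while that example shows them at the filter level, with sprintf => true enabling %{} substitution in params. A corrected sketch under that assumption (the target key is a placeholder):

filter {
  rest {
    request => {
      url    => "https://maps.googleapis.com/maps/api/distancematrix/json?"
      method => "get"
      params => {
        "units"        => "imperial"
        "origins"      => "%{postcode}"
        "destinations" => "A119ZZ"
        "key"          => "GOOGLEAPIKEY"
      }
    }
    json    => true    # filter-level option, not request-level
    sprintf => true    # filter-level; enables %{postcode} substitution
    target  => "distance_response"   # placeholder target key
  }
}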

HTTP Client Error Handling

Errors thrown from the underlying http client (when client.http() is called) are not caught and can cause the entire logstash pipeline to crash.

I've been encountering these errors as the result of socket timeouts (uncaught Manticore::SocketTimeout errors). They can also be caused by providing an invalid host in the url field.

For my purposes, it would be better to simply tag the event as failed and move on instead of stopping the whole pipeline.
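
A sketch of the kind of guard being requested, as it might look inside the plugin's filter method; the surrounding names (client, request, @logger) follow the debug logs quoted elsewhere in this thread, and the tag name is made up:

begin
  response = client.http(*request)
rescue Manticore::ManticoreException => e
  # tag the event and keep the pipeline alive instead of crashing the worker
  @logger.warn('rest filter: http client error', :error => e.message)
  event.tag('_restfilterfailure')   # hypothetical tag name
  return
end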

Basic HTTP Auth does not work

Hi!

@lucashenning, it seems that option auth has no effect and because of this Basic HTTP Auth does not work.

The snippet from my Logstash pipeline configuration:

rest {
  request => {
    url => "http://<FQDN>/"
    method => "get"
    auth => {
      user => "user"
      password => "pass"
    }
    params => {
      "v" => "%{[event_data][Ip]}"
    }
  }
  json => true
  target => "[event_data][Info]"
}

I expect the Basic HTTP Auth header in an HTTP request (Authorization: Basic dXNlcjpwYXNz) generated by logstash-filter-rest, but it is missing!

Here is the real HTTP request:

GET /?v=8.8.8.8 HTTP/1.1
Connection: Keep-Alive
Content-Length: 0
Host: <FQDN>
User-Agent: Manticore 0.6.4
Accept-Encoding: gzip,deflate
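
Until the auth option is fixed, a workaround sketch: send the Basic credentials as an explicit header. dXNlcjpwYXNz is base64("user:pass"), the value already expected above; for real credentials, generate the token with e.g. echo -n 'user:pass' | base64.

rest {
  request => {
    url    => "http://<FQDN>/"
    method => "get"
    headers => {
      # explicit Basic auth header instead of the (apparently ignored) auth option
      "Authorization" => "Basic dXNlcjpwYXNz"
    }
    params => {
      "v" => "%{[event_data][Ip]}"
    }
  }
  json   => true
  target => "[event_data][Info]"
}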

Install error - plugin version conflict

I ran the following commands to install the plugin:
/usr/share/logstash/bin/logstash-plugin install logstash-filter-rest

Below is the output. I'm using logstash v5.0.1. I'm not familiar with the development side of logstash, so I'm not sure how to fix this. Is this plugin compatible with logstash v5? It looks like it requires logstash-codec-json below 3.0.0, but the codec plugin shipped with v5 is 3.0.2.

Validating logstash-filter-rest
Installing logstash-filter-rest
Plugin version conflict, aborting
ERROR: Installation Aborted, message: Bundler could not find compatible versions for gem "logstash-codec-json":
  In snapshot (Gemfile.lock):
    logstash-codec-json (= 3.0.2)

  In Gemfile:
    logstash-output-udp (>= 0) java depends on
      logstash-codec-json (>= 0) java

    logstash-filter-rest (>= 0) java depends on
      logstash-codec-json (< 3.0.0, >= 1.6.0) java

Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.
Bundler could not find compatible versions for gem "logstash-core":
  In snapshot (Gemfile.lock):
    logstash-core (= 5.0.1)

  In Gemfile:
    logstash-core-plugin-api (>= 0) java depends on
      logstash-core (= 5.0.1) java

    logstash-filter-rest (>= 0) java depends on
      logstash-core (< 2.0.0, >= 1.4.0) java

    logstash-core (>= 0) java

Running bundle update will rebuild your snapshot from scratch, using only
the gems in your Gemfile, which may resolve the conflict.

Issues when using logstash-filter-rest and logstash-codec-netflow together

I have a config file that uses the UDP input and Netflow codec plugins. We enabled a config file with the rest plugin today, and received the following error:

 [2017-09-06T16:42:26,284][ERROR][logstash.inputs.udp      ] Exception in inputworker {"exception"=>#<RuntimeError: can't modify frozen array>, "backtrace"=>["org/jruby/RubyArray.java:2640:in `reject!'", "org/jruby/RubyArray.java:2653:in `delete_if'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-rest-0.5.4/lib/logstash/filters/rest.rb:22:in `compact'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.4.0/lib/bindata/struct.rb:168:in `each_pair'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-3.4.1/lib/logstash/codecs/netflow.rb:356:in `decode_netflow9'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.4.0/lib/bindata/array.rb:208:in `each'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.4.0/lib/bindata/array.rb:208:in `each'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-3.4.1/lib/logstash/codecs/netflow.rb:342:in `decode_netflow9'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-3.4.1/lib/logstash/codecs/netflow.rb:200:in `decode'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.4.0/lib/bindata/array.rb:208:in `each'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/bindata-2.4.0/lib/bindata/array.rb:208:in `each'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/logstash-codec-netflow-3.4.1/lib/logstash/codecs/netflow.rb:196:in `decode'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.1.1/lib/logstash/inputs/udp.rb:118:in `inputworker'", "/opt/apps/elk/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-udp-3.1.1/lib/logstash/inputs/udp.rb:89:in `udp_listener'"]}

The rest config works as expected, but the Netflow and UDP config crashed.
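
The backtrace suggests (an inference, not confirmed) that rest.rb reopens core classes with a compact that calls delete_if, which mutates its receiver; bindata then hits that method on a frozen array it owns. A non-mutating private helper inside the plugin would sidestep the clash; a sketch:

# instead of adding a mutating compact to Array/Hash globally,
# return a new object and leave the receiver untouched
def compact_hash(hash)
  hash.reject { |_key, value| value.nil? }
end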

Variable substitution for values within array or complex variables

While testing @gandalfb's modified file for issue #22, I found the following problem:

If I set:
body => { filters => [ { "text" => "hard-coded-message" "filterType" => "text" }, { "filterType" => "unique" } ] }
I get this in my POST service request body (this works fine, like your test does):
{ "filters": [ { "text": "hard-coded-message", "filterType": "text" }, { "filterType": "unique" } ] }

BUT, if I set (with variable substitution inside the array and complex object):
body => { filters => [ { "text" => "%{message}" "filterType" => "text" }, { "filterType" => "unique" } ] }
I get:
{ "filters": [ { "text": "message1" } ] }

The filterType keys (both of them) get dropped.

API seems to send initial data for each request

Using the following filter, the API seems not to refresh the token on each pass

input {
  # Read all documents from Elasticsearch matching the given query
  elasticsearch {
    hosts => "localhost"
    index => "postcodes"
    scan => false
    size => 1
    scroll => "1m"
    #query => '{"from":0,"size":10,"query":{"match_all":{}}}'
    query => '{"query":{"range":{"id":{"gte":0,"lte":10}}}}'
    }
}

filter {
  rest {
    request => {
      url    => "https://maps.googleapis.com/maps/api/distancematrix/json?"
      method => "get"
      params => {
        "units"        => "imperial"
        "origins"      => "%{postcode}"
        "destinations" => "AA11AA|BB11BB"
        "key"          => "APIKEY"
      }
    }
    json    => true
    sprintf => true
    target  => "google_response"
  }

  mutate {
    add_field    => { "json" => "%{google_response}" }
    remove_field => [ "google_response" ]
    #gsub => [ "json", "[\\]", "_" ]
  }

  json {
    source => "json"
    target => "json"
  }

  if [json][rows][0][elements][0][status] == "OK" {
    mutate {
      add_field => { "postymccode" => "%{postcode}" }
      add_field => { "origin" => "%{[json][origin_addresses][0]}" }
      add_field => { "distSDGH" => "%{[json][rows][0][elements][0][distance][text]}" }
      add_field => { "distODGH" => "%{[json][rows][0][elements][1][distance][text]}" }
      add_field => { "distSDGHmeters" => "%{[json][rows][0][elements][0][distance][value]}" }
      add_field => { "distODGHmeters" => "%{[json][rows][0][elements][1][distance][value]}" }

      add_field => { "timeSDGH" => "%{[json][rows][0][elements][0][duration][text]}" }
      add_field => { "timeODGH" => "%{[json][rows][0][elements][1][duration][text]}" }
      add_field => { "timeSDGHmins" => "%{[json][rows][0][elements][0][duration][value]}" }
      add_field => { "timeODGHmins" => "%{[json][rows][0][elements][1][duration][value]}" }

      remove_field => [ "json" ]
    }
  }
  else {
    drop {}
  }
}


output {
  elasticsearch {
    hosts => [ "localhost" ]
  }
}

Output:

Settings: Default pipeline workers: 4
Pipeline main started
{
"postcode" => "BB11BA",
"id" => 6,
"@Version" => "1",
"@timestamp" => "2016-09-21T08:19:33.688Z",
"origin" => "Bristol Cl, Blackburn BB1 1BA, UK",
"distSDGH" => "31.6 mi",
"distODGH" => "34.1 mi",
"distSDGHmeters" => 50814,
"distODGHmeters" => 54917,
"timeSDGH" => "56 mins",
"timeODGH" => "43 mins",
"timeSDGHmins" => 3386,
"timeODGHmins" => 2588
}
{
"postcode" => "BB11AD",
"id" => 4,
"@Version" => "1",
"@timestamp" => "2016-09-21T08:19:33.886Z",
"origin" => "Bristol Cl, Blackburn BB1 1BA, UK",
"distSDGH" => "31.6 mi",
"distODGH" => "34.1 mi",
"distSDGHmeters" => 50814,
"distODGHmeters" => 54917,
"timeSDGH" => "56 mins",
"timeODGH" => "43 mins",
"timeSDGHmins" => 3386,
"timeODGHmins" => 2588
}

You can see that the postcode changes on the second query, but the info sent to the API (returned as origin) is the same postcode as in the first lookup.

Option to use HTTP proxy

Hi Lucas,

I would really appreciate it if you would consider adding a feature to access the REST API through an HTTP proxy.

I intend to use your filter to query a geoip database via REST, but to avoid high load on the geoip server when ingesting millions of log lines, I have to cache my results. In our setup, the most straightforward solution would be to access the geoip server through a squid proxy, which can do the caching.

I already checked: the HTTP API (logstash-mixin-http_client) does support proxy connections (config :proxy). Could you please expose this setting in your filter config as well? I mean something like this:

 rest {
    request => {
      url => "http://example.com"
      proxyurl => "http://proxyhost:proxyport"
      ...

KR
Tamás
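
For readers landing here later: the geoip issue earlier in this thread shows a proxy key accepted directly inside request, which appears to be the form this request shipped as:

rest {
  request => {
    url   => "http://example.com"
    proxy => "http://proxyhost:3128"   # placeholder host:port
  }
}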

Double Escape Issue with Multiple Workers

Using multiple workers causes an issue where request bodies can be escaped and sent multiple times.

[2017-05-02T10:08:43,486][DEBUG][logstash.filters.rest    ] Parsed request {:request=>[:post, "http://api/endpoint/url", {:header=>{"Content-Type"=>"application/json"}, :body=>{"ip"=>"1.2.3.4"}}]}
[2017-05-02T10:08:43,486][DEBUG][logstash.filters.rest    ] Parsed request {:request=>[:post, "http://api/endpoint/url", {:header=>{"Content-Type"=>"application/json"}, :body=>"{\"ip\":\"1.2.3.4\"}"}]}
[2017-05-02T10:08:43,486][DEBUG][logstash.filters.rest    ] Fetching request {:request=>[:post, "http://api/endpoint/url", {:header=>{"Content-Type"=>"application/json"}, :body=>"{\"ip\":\"1.2.3.4\"}"}]}
[2017-05-02T10:08:43,486][DEBUG][logstash.filters.rest    ] Fetching request {:request=>[:post, "http://api/endpoint/url", {:header=>{"Content-Type"=>"application/json"}, :body=>"\"{\\\"ip\\\":\\\"1.2.3.4\\\"}\""}]}

This may be an issue with the code in rest.rb line 211:
request[2][:body] = LogStash::Json.dump(request[2][:body]) if request[2].key?(:body)

The issue is not present if I set pipeline.workers: 1 in logstash.yml.
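
The two "Parsed request" lines suggest the same request object is serialized twice, consistent with workers sharing it. A minimal guard sketch against re-dumping, using the same names as the quoted line (this papers over the symptom rather than fixing the shared state):

# only serialize the body if it has not already been dumped to a String
if request[2].key?(:body) && !request[2][:body].is_a?(String)
  request[2][:body] = LogStash::Json.dump(request[2][:body])
end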

logstash-filter-rest

Hi,

This plugin is not available for Logstash 2.0.0.
I've tried to clone this repo in order to fix logstash-filter-rest.gemspec, which enforces that logstash must be below 2.0.0, but after several tries (with different Ruby versions) I can't build it.

Thanks and Regards,
Eric
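
For reference, the install errors elsewhere in this thread quote the offending constraint as logstash-core (< 2.0.0, >= 1.4.0). A sketch of the gemspec change being attempted (the new upper bound is an assumption):

# in logstash-filter-rest.gemspec: widen the runtime constraint past 2.0.0
s.add_runtime_dependency "logstash-core", ">= 1.4.0", "< 3.0.0"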

Adding this plugin disrupts logstash metadata feature

I assign a value to a metadata field in the input and then use it to name the index in the output. The behavior changes when this plugin is added to the filter section of the config file: successful rest lookups result in %{[@metadata][sdlc]} being treated as a literal string.

input {
  jdbc {
    add_field => { "[@metadata][sdlc]" => "dev" }
...
}}

filter {
  rest {
    request => {
      url => "https://----------/%{username}"
      headers => {
        "Accept" => "application/json"
      }
    }
    target => "person_info"
    fallback => {
      "Name" => "not found"
    }
  }

  split {
    field => "person_info"
  }

  useragent {
    source => "user_agent"
    remove_field => "user_agent"
    target => "ua"
  }
}

output {
  stdout { codec => json_lines }
  elasticsearch {
    "index" => "logs_%{[@metadata][sdlc]}"
...
}}

Now, there's an index named logs_%{[@metadata][sdlc]} as well as separate ones for each level of sdlc.

BTW, fallback doesn't do either of what I expect: it produces neither a person_info with one element, Name, nor a Name field.

There are only error messages for failed lookups, which do result in the correct interpretation:

[2017-06-13T17:32:14,575][WARN ][logstash.filters.rest    ] rest response empty {:response=>"[]", :event=>2017-06-13T17:32:11.740Z %{host} %{message}}
[2017-06-13T17:32:14,575][WARN ][logstash.filters.split   ] Only String and Array types are splittable. field:person_info is of type = NilClass
