This repository was archived by the owner on Mar 21, 2020. It is now read-only.

Conversation

@rjsorensen

@rjsorensen rjsorensen commented Feb 23, 2018

We collect events with Fluentd via the syslog input; they match the following regex format:

^(?<time>[^ ]*\s*[^ ]* [^ ]*) (?<host>\S+) sourcetype=(?<sourcetype>\S+) cluster=(?<cluster>\S+)::(?<message>.*)$

We want to send cluster=clustername as an indexed field with these events.

We modify the records to add a "fields" JSON object with the record_transformer plugin:

fields '{"cluster": "${record["cluster"]}" }'

Then we configure the HTTP Event Collector plugin to include the indexed field by setting the following options:

send_fields true
fields '${record["fields"]}'

@mentalblock

I am testing this out as I would really like this feature, but I am having trouble getting a correct "fields" JSON object with the record_transformer plugin. Similar to your example, I have:

fields '{"logfile": "${record["logfile"]}"}'

but I get an error:

2018-03-06 01:31:49 +0000 [warn]: #0 failed to parse {"logfile": "${record["logfile"]}"} as json. Assuming {"logfile": "${record["logfile"]}"} is a string error_class=JSON::ParserError error="765: unexpected token at '{\"logfile\": \"${record[\"logfile\"]}\"}'"

I am using fluentd-1.1.0 pid=7 ruby="2.4.3". This may be a general Fluentd question, but I was hoping you could help so I can test this PR myself.

@rjsorensen
Author

rjsorensen commented Mar 6, 2018 via email

@mentalblock

Thanks for the quick response. I was not able to get this working on v0.12.42 either.

@mentalblock

Can you post a more comprehensive configuration so I can verify my settings?

@rjsorensen
Author

rjsorensen commented Mar 7, 2018 via email

@mentalblock

Thanks for this. It turns out that the error I am seeing does not affect the functionality. Your change works perfectly. Thank you for this.

@sandeepbhojwani

Can we get this change in soon?

@jlsalmon

+1 for this change

"host" => @placeholder_expander.expand(@host.to_s, placeholders),
"index" => @placeholder_expander.expand(@index, placeholders)
]
if @send_fields

@kevdowney kevdowney commented May 11, 2018


Wouldn't this be better as:

splunk_object = Hash[
  "time" => time.to_i,
  "source" => if @source.nil? then tag.to_s else @placeholder_expander.expand(@source, placeholders) end,
  "sourcetype" => @placeholder_expander.expand(@sourcetype.to_s, placeholders),
  "host" => @placeholder_expander.expand(@host.to_s, placeholders),
  "index" => @placeholder_expander.expand(@index, placeholders)
]

if @send_fields
  splunk_object = splunk_object.merge(Hash[
    "fields" => JSON.parse(@placeholder_expander.expand(@fields.to_s, placeholders))
  ])
end
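
For context on what this produces on the wire: with send_fields enabled, the object built here should serialize into an HEC event payload roughly like the following (values are illustrative, and the "event" body is attached elsewhere in the plugin):

{
  "time": 1519430400,
  "source": "my.tag",
  "sourcetype": "mysourcetype",
  "host": "myhost",
  "index": "main",
  "event": "the raw log message",
  "fields": { "cluster": "clustername" }
}

Splunk treats keys under "fields" as index-time fields, which is the cluster=clustername indexing asked for at the top of the thread.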
