Elasticsearch - using Logstash to parse a CSV file


I have an Elasticsearch index that I am using to index a set of documents.

These documents are in CSV format, and I am looking to parse them using Logstash, since it has powerful regular expression tools such as grok.

My problem is that I have data along the following lines:

field1,field2,field3,number@number#number@number#number@number 

In the last column I have key-value pairs of the form key@value, separated by #, and there can be any number of these.

Is there a way for me to use Logstash to parse this and store the last column as the following JSON in Elasticsearch (or some other searchable format), so that I am able to search it?

[   {"key" : number, "value" : number},   {"key" : number, "value" : number},   ... ] 

First, you can use the csv filter to parse out the last column. Then, you can use the ruby filter to write whatever custom code you need.
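As a sketch of that first step, a csv filter along these lines could pull out the columns (the column names, and "pairs" for the last field, are assumptions, since the question only shows field1,field2,field3 plus the key-value column):

filter {
    csv {
        # split each line on commas and name the resulting fields;
        # "pairs" would then hold the number@number#number@number string
        separator => ","
        columns => ["field1", "field2", "field3", "pairs"]
    }
}

The ruby-filter part, reading the raw message straight from stdin for demonstration, is shown in the config below.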

input {
    stdin {
    }
}

filter {
    ruby {
        code => '
            # split the last column into key@value pairs
            b = event["message"].split("#");
            ary = Array.new;
            for c in b
                keyvar = c.split("@")[0];
                valuevar = c.split("@")[1];
                # format each pair as a "{key : k, value : v}" string
                d = "{key : " << keyvar << ", value : " << valuevar << "}";
                ary.push(d);
            end
            event["lastcolum"] = ary;
        '
    }
}

output {
    stdout { debug => true }
}
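To try this out, you can pipe a line into the config on stdin (the config filename here is just an example):

echo "1@10#2@20" | bin/logstash -f kv_test.conf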

With this filter, when my input is

1@10#2@20

the output is:

    "message" => "1@10#2@20",   "@version" => "1", "@timestamp" => "2014-03-25t01:53:56.338z",  "lastcolum" => [     [0] "{key : 1, value : 10}",     [1] "{key : 2, value : 20}" ] 

FYI. Hope this helps you.

