nginx access log logstash config file, example 2
Log format (nginx):

log_format elk "$http_clientip | $http_x_forwarded_for | $time_local | $request | $status | $body_bytes_sent | "
               " $request_body | $content_length | $http_referer | $http_user_agent | "
               "$http_cookie | $remote_addr | $hostname | $upstream_addr | $upstream_response_time | $request_time";

Log sample:

36.110.211.42 | 10.10.130.101 | 23/Jun/2017:17:51:01 +0800 | GET /lvyou/dongjing/ HTTP/1.1 | 200 | 73181 | - | 0 | - | Mozilla/5.0 (Windows NT 6.3; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/33.0.1750.117 Safari/537.36 | JSESSION_O2O=0000F0jMLw1MnT6SFCvhcqW3oP9:19h7oe5dr; SessionID=10.10.130.101.1498210484146456; sCityCode=SZX; sCityName=%E6%B7%B1%E5%9C%B3; vac_ss_sid=4441002; vac_ss_uid=4441 | 10.10.130.100 | www1-n01 | 10.10.130.237:80 | 0.808 | 0.809

Logstash config:

input {
    file {
        type => "www1_access"
        path => ["/usr/local/elk/elklog/nginxlog/log0/www1.log", "/usr/local/elk/elklog/nginxlog/log1/www1.log"]
    }
    file {
        type => "flight1_access"
        path => ["/usr/local/elk/elklog/nginxlog/log0/flight1.log", "/usr/local/elk/elklog/nginxlog/log1/flight1.log"]
    }
    file {
        type => "m_access"
        path => ["/usr/local/elk/elklog/nginxlog/log0/m.log"]
    }
}

filter {
    # Split the raw message on " | " and name each piece after the log_format fields
    ruby {
        init => "@kname = ['http_clientip','http_x_forwarded_for','time_local','request','status','body_bytes_sent','request_body','content_length','http_referer','http_user_agent','http_cookie','remote_addr','hostname','upstream_addr','upstream_response_time','request_time']"
        code => "
            new_event = LogStash::Event.new(Hash[@kname.zip(event.get('message').split(' | '))])
            new_event.remove('@timestamp')
            event.append(new_event)
        "
    }

    # Break $request ("GET /lvyou/dongjing/ HTTP/1.1") into method / uri / verb (HTTP version)
    if [request] {
        ruby {
            init => "@kname = ['method','uri','verb']"
            code => "
                new_event = LogStash::Event.new(Hash[@kname.zip(event.get('request').split(' '))])
                new_event.remove('@timestamp')
                event.append(new_event)
            "
        }

        # Separate the path from the query string, then expand query args into url_* fields
        if [uri] {
            ruby {
                init => "@kname = ['url_path','url_args']"
                code => "
                    new_event = LogStash::Event.new(Hash[@kname.zip(event.get('uri').split('?'))])
                    new_event.remove('@timestamp')
                    event.append(new_event)
                "
            }
            kv {
                prefix       => "url_"
                source       => "url_args"
                field_split  => "& "
                remove_field => [ "url_args", "uri", "request" ]
            }
        }
    }

    # Cast numeric fields so they can be aggregated on later
    mutate {
        convert => ["body_bytes_sent", "integer", "content_length", "integer", "upstream_response_time", "float", "request_time", "float"]
    }

    # Pull the client IP and timestamp out of the raw message
    grok {
        match => [ "message", "%{IP:clientip} \| %{USER} \| %{HTTPDATE:timestamp}" ]
    }
    date {
        match  => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
        locale => "en"
    }
    geoip {
        source => "clientip"
    }
    mutate {
        remove_field => "timestamp"
        remove_field => "http_clientip"
    }
    useragent {
        source => "http_user_agent"
        target => "useragent"
    }
}

output {
    redis {
        host      => "10.10.45.200"
        data_type => "list"
        key       => "elk_frontend_access:redis"
        port      => "5379"
    }
}

Note: the field separator is a space, a vertical bar (|), and a space, i.e. the ' | ' that the ruby filter splits on.
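The core trick in the filter is the first ruby block: instead of one long grok pattern, it splits the whole message on the ' | ' separator and zips the pieces against a fixed list of field names. The standalone Ruby sketch below is my own illustration, not part of the original config; it runs outside Logstash and uses the sample line above with the user-agent and cookie fields shortened.

# Standalone sketch of the split-and-zip parsing done by the first ruby filter.
kname = ['http_clientip', 'http_x_forwarded_for', 'time_local', 'request',
         'status', 'body_bytes_sent', 'request_body', 'content_length',
         'http_referer', 'http_user_agent', 'http_cookie', 'remote_addr',
         'hostname', 'upstream_addr', 'upstream_response_time', 'request_time']

line = '36.110.211.42 | 10.10.130.101 | 23/Jun/2017:17:51:01 +0800 | ' \
       'GET /lvyou/dongjing/ HTTP/1.1 | 200 | 73181 | - | 0 | - | ' \
       'Mozilla/5.0 (Windows NT 6.3; WOW64) | sCityCode=SZX; vac_ss_uid=4441 | ' \
       '10.10.130.100 | www1-n01 | 10.10.130.237:80 | 0.808 | 0.809'

# Hash[names.zip(values)] pairs each field name with the value at the same position.
fields = Hash[kname.zip(line.split(' | '))]

puts fields['http_clientip']          # => 36.110.211.42
puts fields['request']                # => GET /lvyou/dongjing/ HTTP/1.1
puts fields['upstream_response_time'] # => 0.808 (still a String; the mutate/convert step casts it to float)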
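Also note that the output stage does not write to Elasticsearch directly: events are pushed into a Redis list, which acts as a buffer between this shipper and an indexer. The snippet below is a sketch of what the consuming side could look like; it is not part of the original post, and the Elasticsearch address and index name are placeholders to adapt.

# Indexer-side pipeline (illustrative): pop events from the same Redis list
# and index them into Elasticsearch.
input {
    redis {
        host      => "10.10.45.200"
        port      => 5379
        data_type => "list"
        key       => "elk_frontend_access:redis"
    }
}

output {
    elasticsearch {
        hosts => ["127.0.0.1:9200"]              # placeholder, point at your ES cluster
        index => "nginx-access-%{+YYYY.MM.dd}"   # example index name, not from the original post
    }
}

In this two-stage layout the shipper stays light and the Redis list absorbs traffic bursts, so the indexer can fall behind briefly without dropping events.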
Reposted from: http://scyanting.com/article/jcdcpd.html