


  1. ELK: Elasticsearch – Logstash – Kibana

  2. Welcome to Infomart
  Infomart is a media monitoring app that monitors both social and traditional media. Social media includes Twitter, Facebook, YouTube, WordPress, Tumblr, blogs, forums, and web news. Traditional media includes all the major newspapers, magazines, broadcasts, and radio from Canada, the UK, the US, and international outlets. We are using Elasticsearch to store over 500 million documents.

  3. Meetup Goals
  • Install Elasticsearch (stores the logs)
  • Install Kibana 4 (dashboard for discovering and visualizing logs)
  • Install and configure Nginx (reverse proxy in front of Kibana)
  • Install Logstash Server (processes logs before storing them)
  • Install Logstash Forwarder (ships logs from the server it is installed on to the Logstash Server)
  • Create dashboards in Kibana to visualize these logs

  4. Meetup Goals (architecture)
  Logstash Forwarders (one per monitored server) → Logstash Server → Elasticsearch → Kibana

  5. Install Elasticsearch
  • sudo add-apt-repository -y ppa:webupd8team/java
  • sudo apt-get update
  • sudo apt-get -y install oracle-java8-installer
  • wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -
  • echo 'deb http://packages.elasticsearch.org/elasticsearch/1.4/debian stable main' | sudo tee /etc/apt/sources.list.d/elasticsearch.list
  • sudo apt-get update

  6. Install Elasticsearch (Cont…)
  • sudo apt-get -y install elasticsearch=1.4.4
  • sudo service elasticsearch restart
  • sudo update-rc.d elasticsearch defaults 95 10
  • cd /usr/share/elasticsearch
  • ./bin/plugin -i elasticsearch/marvel/latest

  7. Install Elasticsearch (Cont…)
  Success: http://IPADDRESS:9200 should return a JSON response similar to the one shown on the slide.
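The success check above can be scripted. A minimal sketch, assuming an Elasticsearch 1.x node whose banner JSON includes a top-level "status" : 200 field; es_ok is a hypothetical helper, not part of the deck:

```shell
# Sketch: check that the Elasticsearch banner JSON looks healthy.
# This only greps for "status" : 200 — it does not fully parse the JSON.
es_ok() {
  printf '%s' "$1" | grep -q '"status" *: *200'
}

# Against a live node (IPADDRESS is a placeholder):
#   es_ok "$(curl -s http://IPADDRESS:9200/)" && echo "Elasticsearch is up"
```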

  8. Install Kibana 4
  • cd /opt
  • wget https://download.elasticsearch.org/kibana/kibana/kibana-4.0.2-linux-x64.tar.gz
  • tar xvf kibana-4.0.2-linux-x64.tar.gz
  • rm kibana-4.0.2-linux-x64.tar.gz
  • mv kibana-4.0.2-linux-x64/ kibana
  • vim kibana/config/kibana.yml and set: host: "localhost"
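For reference, a minimal kibana.yml sketch for Kibana 4.0.x. Only host: "localhost" comes from the slide; port and elasticsearch_url are shown with their 4.0.x defaults, assuming a single-host setup where Elasticsearch runs locally:

```yaml
# kibana/config/kibana.yml — minimal sketch for Kibana 4.0.x
# (only `host` comes from the slide; the other keys are the
# 4.0.x defaults, shown for context)
port: 5601
host: "localhost"
elasticsearch_url: "http://localhost:9200"
```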

  9. Install Kibana 4 (Cont…)
  • cd /etc/init.d
  • sudo wget https://gist.githubusercontent.com/thisismitch/8b15ac909aed214ad04a/raw/bce61d85643c2dcdfbc2728c55a41dab444dca20/kibana4
  • sudo chmod +x /etc/init.d/kibana4
  • sudo update-rc.d kibana4 defaults 96 9
  • sudo service kibana4 start

  10. Install and configure nginx (Kibana)
  • sudo apt-get install nginx apache2-utils
  • sudo htpasswd -c /etc/nginx/htpasswd.users USERNAME
  • sudo vi /etc/nginx/sites-available/default

    server {
      listen 80;
      server_name localhost;
      auth_basic "Restricted Access";
      auth_basic_user_file /etc/nginx/htpasswd.users;
      location / {
        proxy_pass http://localhost:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
      }
    }
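The vhost above can be captured in a small generator so it is easy to regenerate consistently. A simplified sketch (kibana_vhost is a hypothetical helper, and the Upgrade/Connection headers from the slide are omitted here for brevity):

```shell
# Hypothetical helper: print the nginx reverse-proxy vhost for Kibana,
# parameterized on the Kibana port (5601 by default).
kibana_vhost() {
  port="${1:-5601}"
  cat <<EOF
server {
  listen 80;
  server_name localhost;
  auth_basic "Restricted Access";
  auth_basic_user_file /etc/nginx/htpasswd.users;
  location / {
    proxy_pass http://localhost:${port};
    proxy_http_version 1.1;
    proxy_set_header Host \$host;
  }
}
EOF
}

# Usage (writes the default vhost, then validates and reloads nginx):
#   kibana_vhost | sudo tee /etc/nginx/sites-available/default
#   sudo nginx -t && sudo service nginx restart
```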

  11. Install and configure nginx (Cont…)
  Success: http://IPADDRESS:80 should prompt for the htpasswd credentials and then return the Kibana page shown on the slide.

  12. Install Logstash Server
  • echo 'deb http://packages.elasticsearch.org/logstash/1.5/debian stable main' | sudo tee /etc/apt/sources.list.d/logstash.list
  • sudo apt-get update
  • sudo apt-get install logstash
  • vim /etc/ssl/openssl.cnf and, under the [ v3_ca ] section, insert: subjectAltName = IP:IPADDRESS
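The openssl.cnf edit above, shown in place (IPADDRESS stays a placeholder for the Logstash server's IP):

```ini
[ v3_ca ]
# Embedding the server IP in the certificate lets logstash-forwarder
# verify it when connecting by IP address rather than hostname.
subjectAltName = IP:IPADDRESS
```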

  13. Install Logstash Server (Cont…)
  • cd /etc/ssl/
  • sudo openssl req -config /etc/ssl/openssl.cnf -x509 -days 3650 -batch -nodes -newkey rsa:2048 -keyout private/logstash-forwarder.key -out certs/logstash-forwarder.crt
  • vim /etc/logstash/conf.d/01-lumberjack-input.conf

    input {
      lumberjack {
        port => 5000
        type => "logs"
        ssl_certificate => "/etc/ssl/certs/logstash-forwarder.crt"
        ssl_key => "/etc/ssl/private/logstash-forwarder.key"
      }
    }

  • vim /etc/logstash/conf.d/30-lumberjack-output.conf

    output {
      elasticsearch { host => localhost }
      stdout { codec => rubydebug }
    }

  14. Install Logstash Server (Cont…)
  • cd /etc/logstash
  • sudo curl -O "http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz"
  • gunzip GeoLiteCity.dat.gz
  • vim /etc/logstash/conf.d/12-apache.conf

    filter {
      if [type] == "apache-access" {
        grok {
          match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
        date {
          match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
        }
        geoip {
          source => "clientip"
          target => "geoip"
          database => "/etc/logstash/GeoLiteCity.dat"
          add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
          add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
        }
        mutate {
          convert => [ "[geoip][coordinates]", "float" ]
        }
      }
    }

  • sudo service logstash restart
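A detail worth noting in the geoip block: the two add_field lines append longitude first, then latitude, because that is the array order Elasticsearch's geo_point type expects. The tiny helper below is purely illustrative (coords and the sample Toronto coordinates are not from the deck), and just demonstrates the ordering:

```shell
# Illustrative only: build a [longitude, latitude] pair the way the
# two add_field lines do — longitude FIRST, then latitude.
coords() {
  printf '[%s, %s]' "$1" "$2"
}

# e.g. coords -79.38 43.65  →  [-79.38, 43.65]
```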

  15. Install Logstash Forwarder
  • Copy the certificate from the Logstash server to the Logstash Forwarder client server ("/etc/ssl/certs/logstash-forwarder.crt")
  • echo 'deb http://packages.elasticsearch.org/logstashforwarder/debian stable main' | sudo tee /etc/apt/sources.list.d/logstashforwarder.list
  • wget -O - http://packages.elasticsearch.org/GPG-KEY-elasticsearch | sudo apt-key add -
  • sudo apt-get update
  • sudo apt-get install logstash-forwarder

  16. Install Logstash Forwarder (Cont…)
  • sudo vi /etc/logstash-forwarder.conf

    {
      "network": {
        "servers": [ "IPADDRESS OF LOGSTASH SERVER:5000" ],
        "timeout": 15,
        "ssl ca": "/etc/ssl/certs/logstash-forwarder.crt"
      },
      "files": [
        {
          "paths": [ "/var/log/apache2/access*.log" ],
          "fields": { "type": "apache-access" }
        }
      ]
    }

  • sudo service logstash-forwarder restart
  Success: IPADDRESS:9200/_plugin/marvel should show the indexed data
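Since logstash-forwarder reads plain JSON, it is worth validating the file before restarting the service. A minimal sketch using Python's stdlib json.tool (json_ok is a hypothetical helper, and this assumes python3 is on the PATH):

```shell
# Sketch: fail fast if a logstash-forwarder config string is not
# valid JSON, using python3 -m json.tool as the parser.
json_ok() {
  printf '%s' "$1" | python3 -m json.tool > /dev/null 2>&1
}

# Usage against the real file:
#   json_ok "$(cat /etc/logstash-forwarder.conf)" || echo "broken config"
```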

  17. Kibana 4
  The Settings, Discover, Visualize, and Dashboard tabs

  18. Kibana 4 Demo

  19. Special Thanks To • Warren Gedge • Julia Andrews • Amit • Sheldon Sawchuk • Katrina Zivanovich • Adam Hutchinson • Jose Bento • Neil

  20. Thank You saud.rehman@hotmail.com
