We covered and explained the Elastic Stack, which consists of Logstash, Elasticsearch, and Kibana. These three components handle data collection, processing, analysis, and visualization. We also covered the installation and configuration of all Elastic Stack components.
We configured Logstash to collect logs from the Linux authentication log file, process them to extract the message and timestamp fields, and either store the results in a CSV file or send them to Elasticsearch and Kibana for later analysis and visualization.
The Elastic Stack can be used for both data analytics and cyber security incident analysis, similarly to Splunk. We used the lab material from the TryHackMe Logstash: Data Processing Unit room.
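A pipeline along these lines could be sketched as follows. This is an illustration, not the room's exact configuration: the file paths, index name, and grok pattern are assumptions.

```conf
# Hypothetical Logstash pipeline: read auth.log, parse it, ship it onward
input {
  file {
    path => "/var/log/auth.log"     # Linux authentication log (assumed path)
    start_position => "beginning"
  }
}

filter {
  grok {
    # Extract the syslog timestamp, host, process, and message portions
    match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:host} %{DATA:process}(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:log_message}" }
  }
  date {
    # Use the extracted timestamp as the event time
    match => [ "timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
  }
}

output {
  # Either index into a local Elasticsearch node...
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "auth-logs"            # assumed index name
  }
  # ...or write selected fields to a CSV file
  csv {
    path => "/tmp/auth-logs.csv"    # assumed output path
    fields => [ "timestamp", "log_message" ]
  }
}
```

Both outputs can be active at once; Logstash fans each event out to every output block.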
Highlights
What is Elastic Stack?
The Elastic Stack is a collection of open source components linked together to help users take data from any source, in any format, and search, analyze, and visualize it in real time.
Elasticsearch
Elasticsearch is a full-text search and analytics engine used to store JSON-formatted documents. It is an important component used to store and analyze data and to perform correlations on it.
It is built on top of Apache Lucene and provides a scalable solution for full-text search, structured querying, and data analysis.
Elasticsearch provides a RESTful API for interacting with the data.
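As an illustration of that API, the request bodies are plain JSON. The sketch below only builds an example document and a full-text match query locally; the index name `auth-logs` and the endpoint URL are assumptions, and no live node is contacted.

```python
import json

# Hypothetical index on a default local node (Elasticsearch listens on 9200)
ES_URL = "http://localhost:9200/auth-logs"

# A JSON-formatted document of the kind Elasticsearch stores
doc = {
    "timestamp": "Mar 21 09:14:02",
    "process": "sshd",
    "log_message": "Failed password for root from 10.0.0.5",
}

# A simple full-text "match" query against the log_message field
query = {"query": {"match": {"log_message": "failed password"}}}

# These bodies would be POSTed to ES_URL + "/_doc" and ES_URL + "/_search"
print(json.dumps(doc))
print(json.dumps(query))
```

Any HTTP client (curl, Kibana's Dev Tools console, or a library) can send these bodies to the corresponding endpoints.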
Logstash
Logstash is a data processing engine that takes data from different sources, applies filters to it or normalizes it, and then sends it to a destination such as Elasticsearch, a file, or a listening port.
Kibana
Kibana is a web-based data visualization tool that works with Elasticsearch to analyze, investigate, and visualize data streams in real time. It allows users to create multiple visualizations and dashboards for better visibility.
Room Answers
Room answers can be found here.