Ever wondered how to perform advanced time series analysis on your log data with a curated time series user interface? What if you could perform week-over-week analysis of your log data, take immediate action based on real-time DevOps data, and reduce the time taken to resolve an infrastructure issue from hours to minutes? You can do all of this and much more with Kibana: describe queries, transformations, and visualizations with powerful, easy-to-learn expressions.
Kibana is a fairly intuitive platform that offers some impressive ways to visualize data. It gives you the freedom to choose how you shape your data. With its interactive visualizations, you just need to start with one question and see where it leads you.
Kibana has made a place for itself by being extremely useful to developers, business stakeholders, and managers, empowering them to quickly make sense of thousands of log lines and find the relevant information. Kibana is well suited to monitoring logs and infrastructure and to providing operational intelligence.
Since its first release, Kibana has changed the way businesses visualize data: rapidly create dashboards that pull together charts, maps, and filters to display the full picture of your data, and build customized visualization-to-visualization drill-downs that enable deeper analysis.
In this Kibana blog, we are going to help you go the extra mile with Kibana and explore some under-used visualization techniques.
You must always keep in mind the objective you are trying to achieve with your dashboard. A goal-first approach works best: start from the objective and work backward, rather than starting from the data and working forward. There will be a lot of data you could visualize, so without a clear objective there is a high chance you end up with visuals that are not useful to you at all.
The key idea is to map out the required data inputs and then ensure that your configurations are in place to collect and parse your data, so that you end up with quality data without noise.
Some of the critical questions you can answer while creating your dashboards build on the why, who, what, and how framework:
Why are you building this dashboard?
Who is going to utilize this dashboard?
What kind of data will be required to build this dashboard?
How will you make the data available to create this dashboard?
Now that you have answered the above questions, you should have a clear picture of the dashboard you are going to build. Strap in for some hands-on Kibana.
Let’s get started with some dashboarding examples!
You can download the latest version of Kibana from this link. Once you complete the Kibana deployment, navigate to your Kibana page by entering http://LOCALADDRESS:5601 in your browser; Kibana runs as a service on port 5601 by default. If you are logging in via PacketAI, the world’s first autonomous monitoring solution, log in with the required credentials.
Kibana requires an index pattern to access the Elasticsearch data that you want to explore.
The steps to create an index pattern form the fundamental part of using Kibana once you have ingested data into ELK.
An index pattern selects the data to be used and it also allows you to define the properties of the fields.
An index pattern can point to a specific index, for example, yesterday’s log data, or to all indices that contain your data. It can also point to a data stream or an index alias.
But before you begin, you need a basic understanding of how index patterns are set up:
You will now set up an index pattern for the logs shipped to our Elasticsearch cluster, so that they are ready to be used in Kibana. First, open the menu bar, then under the Management section click “Stack Management”, and then click “Index Patterns”. To set up your index pattern, click “Create index pattern”. Here, you will be required to enter the index pattern definition.
Start typing in the Index pattern field, and Kibana automatically checks for the names of Elasticsearch indices that match your input. To match multiple indices, use a wildcard (*). For instance, suppose your system creates indices for Apache data with the naming scheme “filebeat-apache-a”, “filebeat-apache-b”, and so on. An index pattern named “filebeat-apache-a” matches a single source, while “filebeat-*” matches multiple data sources. Using a wildcard is the most popular approach.
TIP: You also have the option to select multiple indices by entering multiple strings separated by commas. Make sure there is no space after the comma.
For instance, “filebeat-a,filebeat-b” matches two indices. Use a minus sign (-) to exclude an index: for example, “test*,-test3” excludes the test3 index.
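These matching rules can be sketched in a few lines of Python. This is not Kibana’s internal implementation, just an illustration of the semantics described above, using the standard library’s `fnmatch` for the wildcard; the rule that an exclusion always wins is a simplifying assumption:

```python
from fnmatch import fnmatch

def matches(pattern: str, index: str) -> bool:
    """Return True if `index` is selected by a comma-separated,
    Kibana-style index pattern. A leading minus (-) excludes."""
    selected = False
    for part in pattern.split(","):
        part = part.strip()
        if part.startswith("-"):
            if fnmatch(index, part[1:]):
                return False  # explicit exclusion wins
        elif fnmatch(index, part):
            selected = True
    return selected

print(matches("filebeat-*", "filebeat-apache-a"))  # True: wildcard match
print(matches("test*,-test3", "test3"))            # False: excluded
print(matches("test*,-test3", "test1"))            # True: included, not excluded
```

Running the pattern against a few candidate index names like this is a quick way to sanity-check a definition before typing it into Kibana.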
Click “Next step”, and then configure your timestamp. The timestamp allows Kibana to use the time associated with your data. For convenience, you can use the default @timestamp field.
TIP: Choose “I don’t want to use the Time Filter” if your index does not have time-based data or the default timestamp is not suitable for you.
CAVEAT: If you don’t set a default time field, you will be unable to use global time filters on your dashboards.
Once you have completed all the above steps, click the “Create index pattern” button. Your index pattern is now ready to be used in Kibana.
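If you prefer to script this step, Kibana also exposes a saved-objects HTTP API. The sketch below only builds the request rather than sending it; the endpoint and the required `kbn-xsrf` header are from Kibana 7.x, and the base URL is an assumption you should adjust to your deployment:

```python
import json

# Assumed Kibana base URL; adjust to your deployment
KIBANA_URL = "http://localhost:5601"

def build_create_index_pattern_request(title, time_field="@timestamp"):
    """Build a Kibana 7.x saved-objects API request that creates an
    index pattern with the given title and default time field."""
    url = f"{KIBANA_URL}/api/saved_objects/index-pattern"
    headers = {"kbn-xsrf": "true",  # required on mutating Kibana API calls
               "Content-Type": "application/json"}
    body = json.dumps({"attributes": {"title": title,
                                      "timeFieldName": time_field}})
    return url, headers, body

url, headers, body = build_create_index_pattern_request("filebeat-*")
print(url)   # http://localhost:5601/api/saved_objects/index-pattern
print(body)
```

You could then POST this with any HTTP client; scripting the call is handy when you provision the same index pattern across several environments.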
TIP: You can change the default index pattern by clicking “Stack Management”, then “Index patterns”, then the name of the pattern in the list, and finally clicking the star icon.
Once you are satisfied with the configuration, make sure the required data is flowing into Kibana. To confirm this, click the Discover option in the main menu. At times you might find no data from the index. This usually happens for one of two reasons: either the date filter does not match your available data, or the index itself is incorrectly configured.
The image below is the final snapshot of the data in the index that you configured in the earlier steps. Voila!
You can also delete an index pattern. This action removes the index pattern from the list of saved objects in Kibana.
CAVEAT: When you delete an index pattern, you will be unable to recover field formats, scripted fields, source filters, and field popularity data associated with the index pattern.
Keep in mind that deleting an index pattern does not remove any indices or documents from Elasticsearch. To delete an index pattern, click the name of the pattern you want to remove and then click the delete icon.
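Deletion can likewise be scripted through the saved-objects API. As before, this sketch only builds the request (Kibana 7.x endpoint); note that the id it takes is the saved object’s id, not the pattern’s title, and the example id shown is hypothetical:

```python
# Assumed Kibana base URL; adjust to your deployment
KIBANA_URL = "http://localhost:5601"

def build_delete_index_pattern_request(pattern_id):
    """Build a Kibana 7.x saved-objects API request that deletes the
    index pattern with the given saved-object id."""
    url = f"{KIBANA_URL}/api/saved_objects/index-pattern/{pattern_id}"
    headers = {"kbn-xsrf": "true"}  # required on mutating Kibana API calls
    return "DELETE", url, headers

# "filebeat-pattern-id" is a placeholder id for illustration only
method, url, headers = build_delete_index_pattern_request("filebeat-pattern-id")
print(method, url)
```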
Going through the dashboards, you must have already understood that there are multiple ways to visualize your data. To give you an overview, below is the set of visualizations you will encounter while building on Kibana. Some of the most frequently used are line charts, area charts, bar charts, pie charts, data tables, metrics, maps, and gauges.
When you click to create a new visualization in Kibana, the new visualization screen is shown on the left. Kibana ships with a vast array of visualizations, and this can often be overwhelming.
Remember, the trick to effective dashboards is simplicity.
That should enable you to select the correct Kibana visualization to achieve your desired dashboard. If you are unsure, we have put together a handy table that outlines the Kibana visualizations and their functions. That should give you some context before you get into the Kibana dashboard tutorial.
In this Kibana tutorial blog, we have covered all the basics you need to familiarize yourself with before building your visualizations and dashboards. In the upcoming blog, you will learn to create visualizations and dashboards, along with some new tips and tricks along the way.
This blog is the first part of the Kibana Tutorial blog series, to continue to part 2 of the tutorial click here!
PacketAI is the world’s first autonomous monitoring solution built for the modern age. Our solution has been developed after 5 years of intensive research in French and Canadian laboratories, and we are backed by leading VCs. To know more, book a free demo and get started today!
Follow @PacketSAS on Twitter or PacketAI Company page on LinkedIn to get the latest news!