Grafana is a multi-platform open-source visualization and monitoring tool that can integrate data from sources such as Prometheus, InfluxDB, Graphite, and Elasticsearch. First released in 2014, it renders data from a connected source as graphs and charts, and it can query and visualize data sources regardless of where they are stored.
WHAT IS PROMETHEUS?
Prometheus is an open-source systems monitoring and alerting toolkit. It includes a rich, multidimensional data model, a concise and powerful query language called PromQL, and an efficient embedded time-series database. The Prometheus server scrapes and stores time-series data. Most Prometheus components are written in Go, making them easy to build and deploy as static binaries.
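Prometheus pulls metrics from targets listed in its configuration file. As a minimal sketch (the job name and target address below are illustrative defaults, not taken from this article), a `prometheus.yml` might look like:

```yaml
# Minimal Prometheus scrape configuration (illustrative).
global:
  scrape_interval: 15s          # how often to scrape each target

scrape_configs:
  - job_name: 'prometheus'      # Prometheus scraping its own metrics endpoint
    static_configs:
      - targets: ['localhost:9090']
```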
The following are some of the advantages of Grafana with Prometheus as the data source:
A user can visualize time series directly in the Prometheus web UI, much as in Grafana.
Prometheus provides a functional query language, PromQL, that lets users select and aggregate time-series data in real time.
Prometheus can discover targets dynamically and automatically scrape new targets on demand. It offers a variety of service-discovery options for scrape targets, including Kubernetes.
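As a quick illustration of PromQL, queries can be issued against the Prometheus HTTP API once the server is running. The port below is the Prometheus default, and the second metric name is an assumption (it is exported by node_exporter, which this article does not cover):

```shell
# Instantaneous value of the built-in 'up' metric for every scraped target
curl 'http://localhost:9090/api/v1/query?query=up'

# Per-second CPU usage rate over the last 5 minutes
# (node_cpu_seconds_total assumes node_exporter is running)
curl 'http://localhost:9090/api/v1/query?query=rate(node_cpu_seconds_total[5m])'
```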
In this article, we will cover a step-by-step procedure for setting up Grafana (version 7) with Prometheus (version 2.17) as a data source.
PREREQUISITES
Ubuntu 18.04 server with a non-root user with sudo privileges.
Docker (installed and configured).
STEP 1: SET UP CONTAINER FOR GRAFANA-7.0
a) Create a bridged docker network using the following command.
docker network create --driver bridge mybridge
b) Pull the official image from the Docker hub and run the container using the following command:
docker pull grafana/grafana:<version number>
c) Run the container. While running the container, you can also install the required plugins in the same step.
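A sketch of such a command, assuming the 7.0.0 image tag and the clock-panel plugin as examples (both are placeholders you can swap out):

```shell
# Start Grafana in the background, attach it to the bridge network
# from step (a), publish its default HTTP port, and install plugins
# at startup via the GF_INSTALL_PLUGINS environment variable.
docker run -d \
  --name=grafana \
  --network mybridge \
  -p 3000:3000 \
  -e "GF_INSTALL_PLUGINS=grafana-clock-panel" \
  grafana/grafana:7.0.0
```

`GF_INSTALL_PLUGINS` accepts a comma-separated list, so several plugins can be installed at once. Once the container is up, Grafana is reachable at http://localhost:3000 (default credentials: admin/admin).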