Installing Prometheus Node Exporter

Posted on Saturday, January 30, 2021

I recently wrote an article where I installed Prometheus on Ubuntu 20.04 [1].

Now I want to start getting some real data into it to have something to graph! 


One good first step is to use the Prometheus Node Exporter [2] [3].


There are two good reasons to start with node exporter.  One, it will provide tons of Unix information about your server: memory usage, disk usage, kernel info, etc.   Two, you can use it to suck up other data you create yourself, via cron jobs that write Prometheus-formatted data into the correct folder.
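For reference, "Prometheus-formatted data" means the Prometheus text exposition format: optional # HELP and # TYPE comment lines followed by `name value` samples. A made-up example of a file node exporter could pick up (the metric name here is invented for illustration):

```text
# HELP my_backup_age_seconds seconds since the last backup finished
# TYPE my_backup_age_seconds gauge
my_backup_age_seconds 1800
```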


Download and setup


Using [4]

First, see what version of Ubuntu you are on


  > lsb_release -a




On this particular server I have set up a second drive mounted at /prometheus, and that is where I put Prometheus data.


Create a new user for node exporter

  > sudo useradd --no-create-home --shell /bin/false node_exporter


Head over to



Go find the latest release (click on latest)


Right-click on this guy and copy the link. In my case it's



  > wget


Untar it and install

  > tar xvf node_exporter-0.15.1.linux-amd64.tar.gz
  > sudo cp node_exporter-0.15.1.linux-amd64/node_exporter /usr/local/bin
  > sudo chown node_exporter:node_exporter /usr/local/bin/node_exporter

Create a folder to use for extra metrics

  > sudo mkdir -p /prometheus/metrics
  > sudo chown node_exporter:node_exporter /prometheus/metrics


Systemd setup


Create the systemd service file


  > sudo vi /lib/systemd/system/node_exporter.service
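The contents of the unit file did not survive here, so this is a plausible sketch rather than the exact file from the article: it runs the binary as the node_exporter user and points the textfile collector at the /prometheus/metrics folder created above (--collector.textfile.directory is the real node exporter flag for that).

```ini
[Unit]
Description=Prometheus Node Exporter
After=network.target

[Service]
User=node_exporter
Group=node_exporter
Type=simple
ExecStart=/usr/local/bin/node_exporter --collector.textfile.directory=/prometheus/metrics

[Install]
WantedBy=multi-user.target
```

After saving, run `sudo systemctl daemon-reload` so systemd picks up the new unit.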







Enable the service so it will auto-start on reboot


  > sudo systemctl enable node_exporter




  > sudo systemctl status node_exporter




Test it out


  > sudo systemctl start node_exporter
  > sudo systemctl status node_exporter




Now forward the ports and test.


  > ssh prometheus -L 9090:localhost:9090 -L 9100:localhost:9100



Wahoo data.


Now to scrape it


The data is available now, but we still need to tell Prometheus to scrape it.

So let's tweak the Prometheus settings.  (Prometheus is running on the same server.)



  > sudo vi /prometheus/prometheus.yml


Now add another job to scrape on this same server.



  - job_name: 'node_exporter'
    scrape_interval: 5s
    static_configs:
      - targets: ['localhost:9100']
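Put together, the scrape_configs section ends up looking something like this; the 'prometheus' self-scrape job is assumed from a default install and may differ in your file:

```yaml
scrape_configs:
  - job_name: 'prometheus'
    scrape_interval: 5s
    static_configs:
      - targets: ['localhost:9090']

  - job_name: 'node_exporter'
    scrape_interval: 5s
    static_configs:
      - targets: ['localhost:9100']
```

If you have promtool installed, `promtool check config <path to prometheus.yml>` will catch YAML and indentation mistakes before the restart.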



Restart Prometheus


  > sudo systemctl restart prometheus

Now let’s check Prometheus and see if it’s getting the data.



Open Targets

Looks like it’s scraping it.



Go to Graph

Search for





(click Execute)



There is data!




Custom Data

If I look at /lib/systemd/system/node_exporter.service


  > sudo vi /lib/systemd/system/node_exporter.service



I have set a flag pointing node exporter at the folder /prometheus/metrics for “extra” Prometheus-formatted data to ingest and expose at :9100/metrics

So let me make a simple Python program and create a cron job to run it.




  > sudo mkdir /prometheus/code
  > sudo vi /prometheus/code/





#!/usr/bin/env python3
# Random stuff

import os
import random
import sys

# Folder node exporter watches for extra .prom files
base_dir = "/prometheus/metrics"


#  Main
if __name__ == '__main__':

  try:
    # Test if base folder exists
    if not os.path.exists(base_dir):
      print("folder '" + base_dir + "' does not exist")
      sys.exit(1)

    prom_data  = "# HELP my_test_data_01 just some test data\n"
    prom_data += "# TYPE my_test_data_01 gauge\n"
    prom_data += "my_test_data_01 " + str(random.randint(1,100)) + "\n"

    prom_data += "# HELP my_test_data_02 just some test data\n"
    prom_data += "# TYPE my_test_data_02 gauge\n"
    prom_data += "my_test_data_02 " + str(random.randint(1,100)) + "\n"

    prom_data += "# HELP my_test_data_03 just some test data\n"
    prom_data += "# TYPE my_test_data_03 gauge\n"
    prom_data += "my_test_data_03 " + str(random.randint(1,100)) + "\n"

    # Write the metrics where the textfile collector will find them
    with open(base_dir + "/my_data.prom", 'w') as f:
      f.write(prom_data)

  except Exception as e:
    print("Exception: {0}".format(e))
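One wrinkle worth knowing about: node exporter can read a .prom file while it is being written, so a half-finished file can cause parse errors. A common pattern (sketched here with a hypothetical write_prom_atomic helper, not part of the article's script) is to write to a temp file in the same directory and rename it into place, since the rename is atomic:

```python
# Sketch: write textfile-collector data atomically.
import os
import random
import tempfile

def write_prom_atomic(base_dir):
    """Build the gauge text and atomically replace my_data.prom in base_dir."""
    value = random.randint(1, 100)
    prom_data = (
        "# HELP my_test_data_01 just some test data\n"
        "# TYPE my_test_data_01 gauge\n"
        "my_test_data_01 " + str(value) + "\n"
    )
    # The temp file must live in the SAME directory so the rename stays
    # on one filesystem (a cross-filesystem rename is not atomic).
    fd, tmp_path = tempfile.mkstemp(dir=base_dir, suffix=".tmp")
    with os.fdopen(fd, "w") as f:
        f.write(prom_data)
    final_path = os.path.join(base_dir, "my_data.prom")
    os.replace(tmp_path, final_path)  # atomic on POSIX
    return final_path
```

With this pattern node exporter only ever sees a complete file, no matter when the cron job fires.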





  > sudo chown node_exporter:node_exporter /prometheus/code/
  > sudo chown node_exporter:node_exporter /prometheus/code
  > sudo chmod u+x /prometheus/code/



Let me switch to the node_exporter user and run the python code.


  > sudo su node_exporter -s /bin/bash
  > /prometheus/code/



Now check the output


  > cat /prometheus/metrics/my_data.prom



Now let me run a curl to prove that node_exporter is picking up this extra data



  > curl -s localhost:9100/metrics  | egrep my_test_data



Wahoo, now let me set up a cron job to make up new random data every 5 minutes.
Exit out of the node_exporter user



  > sudo vi /etc/cron.d/my_fake_data


And place the following into it


*/5 * * * * node_exporter /prometheus/code/


Now the file should update every 5 minutes.
Now to go look at the data




Sweet, it is working :)






[1]        Installing Prometheus on Ubuntu 20.04
Accessed 1/2020

[2]
Accessed 1/2020

[3]        Node Exporter’s GitHub page
Accessed 1/2020

[4]        How To Install Prometheus on Ubuntu 16.04
Accessed 1/2020



