Installing Prometheus Node Exporter

Posted on Saturday, January 30, 2021

I recently wrote an article where I installed Prometheus on Ubuntu 20.04 [1].

Now I want to start getting some real data into it to have something to graph! 


One good first step is to use Prometheus Node Exporter [2] [3].


There are two good reasons to start with Node Exporter.  One, it will provide tons of Unix information about your server: memory usage, disk usage, kernel info, etc.   Two, its textfile collector can pick up extra Prometheus-formatted data that your own cron jobs drop into a designated folder.


Download and setup


Using [4]

First, see what version of Ubuntu you are on


  > lsb_release -a
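The output looks roughly like this (the exact point release will vary):

```
Distributor ID: Ubuntu
Description:    Ubuntu 20.04 LTS
Release:        20.04
Codename:       focal
```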




On this particular server I have set up a second drive mounted at /prometheus, and that is where I put Prometheus data.


Create a new user for Node Exporter

  > sudo useradd --no-create-home --shell /bin/false node_exporter


Head over to Node Exporter's GitHub releases page [3].


Go find the latest release (click on latest).


Right-click the linux-amd64 tarball and copy the link, then download it:


  > wget <link you copied>


Untar it and install

  > tar xvf node_exporter-0.15.1.linux-amd64.tar.gz
  > sudo cp node_exporter-0.15.1.linux-amd64/node_exporter /usr/local/bin
  > sudo chown node_exporter:node_exporter /usr/local/bin/node_exporter

Create a folder to use for extra metrics

  > sudo mkdir -p /prometheus/metrics
  > sudo chown node_exporter:node_exporter /prometheus/metrics


SystemD setup


Create the systemd unit file


  > sudo vi /lib/systemd/system/node_exporter.service
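The unit file content did not survive here, so this is a minimal sketch of what it likely contains, assuming the standard Node Exporter flags; the `--collector.textfile.directory` flag points at the /prometheus/metrics folder created above:

```ini
[Unit]
Description=Prometheus Node Exporter
After=network.target

[Service]
User=node_exporter
Group=node_exporter
Type=simple
ExecStart=/usr/local/bin/node_exporter --collector.textfile.directory=/prometheus/metrics

[Install]
WantedBy=multi-user.target
```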







Enable the service so it will auto start on reboot


  > sudo systemctl enable node_exporter




  > sudo systemctl status node_exporter




Test it out


  > sudo systemctl start node_exporter
  > sudo systemctl status node_exporter




Now forward the ports over SSH and test.


  > ssh prometheus -L 9090:localhost:9090 -L 9100:localhost:9100



Wahoo, data!


Now to scrape it


The data is available now, but we still need to tell Prometheus to scrape it.

So let's tweak the Prometheus settings.  (Prometheus is running on the same server.)



  > sudo vi /prometheus/prometheus.yml


Now add another scrape job for this same server.



  - job_name: 'node_exporter'
    scrape_interval: 5s
    static_configs:
      - targets: ['localhost:9100']
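In context, the scrape_configs section of prometheus.yml ends up looking something like this (the existing `prometheus` job shown here is an assumption based on the default config):

```yaml
scrape_configs:
  - job_name: 'prometheus'
    scrape_interval: 5s
    static_configs:
      - targets: ['localhost:9090']

  - job_name: 'node_exporter'
    scrape_interval: 5s
    static_configs:
      - targets: ['localhost:9100']
```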



Restart Prometheus


  > sudo systemctl restart prometheus

Now let’s check Prometheus and see if it’s getting the data.



Open the Targets page.

Looks like it’s scraping it.



Go to Graph.

Search for one of the node metrics.





(click Execute)



There is data!




Custom Data

If I look at /lib/systemd/system/node_exporter.service


  > sudo vi /lib/systemd/system/node_exporter.service



I have set the textfile collector directory to point at the folder /prometheus/metrics, so Node Exporter ingests any "extra" Prometheus-formatted data dropped there and serves it at :9100/metrics.

So let me make a simple Python program and create a cron job to run it.




  > sudo mkdir /prometheus/code
  > sudo vi /prometheus/code/





#!/usr/bin/env python3
# Write some random Prometheus-formatted test data

import sys
import os
import random

base_dir = "/prometheus/metrics"


#  Main
if __name__ == '__main__':

    # Test if base folder exists
    if not os.path.exists(base_dir):
        print("folder '" + base_dir + "' does not exist")
        sys.exit(1)

    try:
        prom_data  = "# HELP my_test_data_01 just some test data\n"
        prom_data += "# TYPE my_test_data_01 gauge\n"
        prom_data += "my_test_data_01 " + str(random.randint(1, 100)) + "\n"

        prom_data += "# HELP my_test_data_02 just some test data\n"
        prom_data += "# TYPE my_test_data_02 gauge\n"
        prom_data += "my_test_data_02 " + str(random.randint(1, 100)) + "\n"

        prom_data += "# HELP my_test_data_03 just some test data\n"
        prom_data += "# TYPE my_test_data_03 gauge\n"
        prom_data += "my_test_data_03 " + str(random.randint(1, 100)) + "\n"

        # Write the metrics file where the textfile collector will find it
        with open(base_dir + "/my_data.prom", 'w') as f:
            f.write(prom_data)
    except Exception as e:
        print("Exception: " + str(e))





  > sudo chown node_exporter:node_exporter /prometheus/code/
  > sudo chown node_exporter:node_exporter /prometheus/code
  > sudo chmod u+x /prometheus/code/



Let me switch to the node_exporter user and run the python code.


  > sudo su node_exporter -s /bin/bash
  > /prometheus/code/



Now check the output


  > cat /prometheus/metrics/my_data.prom
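The contents should look something like this (the values are random, so yours will differ):

```
# HELP my_test_data_01 just some test data
# TYPE my_test_data_01 gauge
my_test_data_01 42
# HELP my_test_data_02 just some test data
# TYPE my_test_data_02 gauge
my_test_data_02 7
# HELP my_test_data_03 just some test data
# TYPE my_test_data_03 gauge
my_test_data_03 88
```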



Now let me run a curl to prove that node_exporter is picking up this extra data



  > curl -s localhost:9100/metrics  | egrep my_test_data



Wahoo! Now let me set up a cron job to generate new random data every 5 minutes.
Exit out of the node_exporter user.



  > sudo vi /etc/cron.d/my_fake_data


And place the following into it


*/5 * * * * node_exporter /prometheus/code/


Now the file should update every 5 minutes.
Now to go look at the data.




Sweet, it is working :)






[1]        Installing Prometheus on Ubuntu 20.04
Accessed 1/2020

Accessed 1/2020

[3]        Node Exporter’s GitHub page
Accessed 1/2020

[4]        How To Install Prometheus on Ubuntu 16.04
Accessed 1/2020



