A need popped up at work for a data logger for various lab tasks. A quick look at the market failed to turn up a lab tool for data logging (cheap, easy but powerful setup, remote access); something for researchers and scientists. I decided a Raspberry Pi with some input buffering would be ideal for the task. This is my roll-your-own data logger, put together in a Saturday – showing what is possible quickly, and the potential with more development time.
- Raspberry Pi (3) – I used a 3 for its integrated WiFi and speed, but any Pi should work.
- SD card – a reliable brand, as the database will be writing frequently. I used a 16GB SanDisk Extreme.
- Some form of IO buffer or input device. I first developed with a DHT22 temperature/humidity sensor, followed by a potential divider/Zener diode digital logic buffer, and finally a Pimoroni Automation HAT for analogue voltage. I plan on creating a custom multi-channel buffered 24V logic and 0-10V analogue input card to partner with this, for industrial device testing.
Install InfluxDB and Grafana on Raspberry Pi
A while back I attempted this but ran into problems installing and configuring InfluxDB and Grafana. I had attempted to use Docker, but without success. Thankfully, .deb packages now exist in the Debian ‘Stretch’ repository for armhf (Raspberry Pi). You can either add ‘Stretch’ to your list of sources or download the packages with:
# update these URLs by looking at the website: https://packages.debian.org/sid/grafana
sudo apt-get update
sudo apt-get upgrade
wget http://ftp.us.debian.org/debian/pool/main/i/influxdb/influxdb_1.0.2+dfsg1-1_armhf.deb
sudo dpkg -i influxdb_1.0.2+dfsg1-1_armhf.deb
wget http://ftp.us.debian.org/debian/pool/main/g/grafana/grafana-data_2.6.0+dfsg-3_all.deb # grafana-data is a dependency of grafana
sudo dpkg -i grafana-data_2.6.0+dfsg-3_all.deb
sudo apt-get install -f
wget http://ftp.us.debian.org/debian/pool/main/g/grafana/grafana_2.6.0+dfsg-3_armhf.deb
sudo dpkg -i grafana_2.6.0+dfsg-3_armhf.deb
sudo apt-get install -f
Installed via the .deb package, InfluxDB creates a service and enables it at startup, so there is little more configuration to do. You do need to create the database that our script will be saving into. This can be done via the web GUI: navigate to your Pi’s IP on port 8083 (the default InfluxDB admin port – http://localhost:8083 if you’re doing this on the Pi). Using the ‘Query Templates’ dropdown, you can create a database with ‘Create Database’ – it’s fairly self-explanatory. Make one called ‘logger’ to store our sample data.
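If you prefer the command line to the web GUI, the same database can be created through InfluxDB 1.x’s HTTP query endpoint on port 8086 – a sketch assuming a default, unsecured install on the Pi itself:

```shell
# Create the 'logger' database via the InfluxDB HTTP API
curl -XPOST 'http://localhost:8086/query' \
  --data-urlencode 'q=CREATE DATABASE "logger"'
```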
Create Logging Script
The example below uses Pimoroni’s Automation HAT but can easily be adapted to a GPIO pin or a DHT22 using the commented lines. It’s worth having a read of the InfluxDB Key Concepts, but essentially I create a unique tag called ‘run’ each time the script runs. This enables a user to differentiate between test setups with no backend jiggery-pokery when we come to Grafana. The measurement name can be set via the command arguments, to enable a user to separate different testing sessions; by default it is ‘dev’. You’ll need the InfluxDB Python module for this script, which you can install like this:
sudo pip install influxdb
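Before the full script, it helps to see the shape of a single sample: the measurement is the session name, the ‘run’ tag identifies this invocation of the script, and the readings go in the fields. A minimal sketch (the reading values here are made up in place of real Automation HAT data):

```python
import datetime

# Measurement = session name; the 'run' tag is a timestamp of this invocation
session = "dev"
runNo = datetime.datetime.now().strftime("%Y%m%d%H%M")

# One sample, shaped the way influxdb-python's write_points() expects
json_body = [
    {
        "measurement": session,
        "tags": {"run": runNo},
        "fields": {
            "vsense1": 3.27,  # made-up analogue reading
            "op1": 1,         # made-up digital input state
        },
    }
]

print(json_body[0]["measurement"])  # prints "dev"
```

Every point written with the same ‘run’ tag can later be pulled out as one coherent test run in Grafana, without touching the backend.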
import sys
import time
import datetime

import automationhat
from influxdb import InfluxDBClient
#import RPi.GPIO as GPIO

# Set these variables; InfluxDB should be localhost on the Pi
host = "localhost"
port = 8086
user = "root"
password = "root"
# The database we created
dbname = "logger"
# Sample period (s)
interval = 1

# For GPIO
# channel = 14
# GPIO.setmode(GPIO.BCM)
# GPIO.setup(channel, GPIO.IN)

# Allow user to set session and runNo via args, otherwise auto-generate
if len(sys.argv) > 1:
    if len(sys.argv) < 3:
        print "Must define session and runNo!!"
        sys.exit(1)
    session = sys.argv[1]
    runNo = sys.argv[2]
else:
    session = "dev"
    now = datetime.datetime.now()
    runNo = now.strftime("%Y%m%d%H%M")

print "Session: ", session
print "runNo: ", runNo

# Create the InfluxDB object
client = InfluxDBClient(host, port, user, password, dbname)

# Run until keyboard interrupt (Ctrl+C)
try:
    while True:
        # These get a dict of the three values on each bank
        vsense = automationhat.analog.read()
        op = automationhat.input.read()
        # gpio = GPIO.input(channel)
        iso = datetime.datetime.utcnow().strftime('%Y-%m-%dT%H:%M:%SZ')
        json_body = [
            {
                "measurement": session,
                "tags": {"run": runNo},
                "time": iso,
                "fields": {
                    "op1": op['one'], "op2": op['two'], "op3": op['three'],
                    "vsense1": vsense['one'], "vsense2": vsense['two'], "vsense3": vsense['three']
                    # ,"gpio": gpio
                }
            }
        ]
        # Write JSON to InfluxDB
        client.write_points(json_body)
        # Wait for next sample
        time.sleep(interval)
except KeyboardInterrupt:
    pass
Save the script as ‘logger.py’ and run it using
python logger.py
To run the script in the background and indefinitely as an ssh user, use
nohup python logger.py &
Unlike InfluxDB, Grafana doesn’t enable its service automatically, so do this to enable it at boot and start the service now:
sudo systemctl enable grafana-server
sudo systemctl start grafana-server
Navigating to port 3000, you should be presented with the Grafana web GUI. Grafana has plenty of powerful querying tools to make lots of pretty and informative graphs. Have a read of the Getting Started, and in particular the InfluxDB section.
The first thing you will need to do is add the ‘logger’ database as a Datasource. Navigate to Datasource->Add New and fill it in as below.
After this, you need to set up a Dashboard. Click the top dropdown button, press ‘New’ and create a dashboard called ‘logger’. Dashboards consist of rows. Clicking on the green bar on the left-hand edge provides a dropdown where you can add graphs etc. – again, the Getting Started guide explains this better than I can. Below are a couple of captures of how I set up the Automation HAT.
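Under the hood, each graph panel issues an InfluxQL query against the ‘logger’ database. As a rough sketch of what a panel query looks like (the measurement ‘dev’ and the run tag value here are hypothetical examples; $timeFilter is the placeholder Grafana substitutes with the dashboard’s current time range):

```sql
SELECT "vsense1" FROM "dev" WHERE "run" = '201701011200' AND $timeFilter
```

Because each run of the logging script gets its own ‘run’ tag, a Grafana template variable over that tag lets you flick between test runs from a dropdown.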
With this short introduction you should have the framework for a basic logging device, and see the potential for real-world analytics of almost anything! I’m quite excited by the combination of a time-series database like InfluxDB and a powerful web graphing package like Grafana.