home climate V1

In 2018, a colleague mentioned to me that he had found a nice CO2 monitor with an undocumented but open USB connection. On Github, vfilimonov wrote a nice python package to decode the USB connection and show the measured data on a website. From this starting point, I experimented with further sensors and more complex data acquisition, storage and visualisation. The current state is an experimental setup running on a Raspberry Pi 3B+. Several sensors are connected to test and compare them.

Connected sensors are:

[Figure: Homeclimate sensors with annotations]
The monitoring system in the configuration used in the following. Raspbian Buster runs from an old SSD that also stores programs and databases (black cable with two USB plugs because a single port of the Pi cannot supply enough current). The white USB cable runs to the AirCO2ntrol CO2 monitor. For reliability, the whole system is connected through ethernet (yellow cable) instead of WiFi. Elements on the breadboard are, from left to right:
  • push button that triggers a safe shutdown script (in case ssh connections fail)
  • a green LED connected to serial0 to monitor the serial shell for activity (deactivated later to use the MH-Z19 sensor)
  • BMP180 pressure sensor (connected to I2C)
  • TSL2561 light sensor (also on the same I2C in parallel to the BMP180)
  • MH-Z19 CO2 sensor (on a serial bus)
  • and finally two DHT22 humidity/temperature sensors with the required resistors

Preparation

The go-to operating system for Raspberry Pis is Raspbian, which is built on Debian. The current version as of this writing is Raspbian Buster. Here, I use Raspbian Lite because the Pi should run headless, i.e. without monitor or input devices attached. I use a basic installation with two modifications:

It is recommended to install Raspbian on an HDD or SSD rather than an SD card or a normal USB stick. Writing climate metrics to a database every couple of seconds puts a significant load on the storage device. Normal SD cards or USB sticks cannot deal with this for more than a couple of weeks. During development, one SD card and two USB sticks died on me because I wrote them to death. The longest any of the USB sticks survived was three months with data logging every 30 seconds. When instead using an external HDD or SSD, watch out for the power draw. The USB ports of my Pi were not able to power a 2.5″ HDD, nor an old Samsung Evo SSD. For both to work, I had to use a USB power-data breakout cable from an old external disk or another form of external power supply.

Hardware setup and wiring

Access to GPIO pins and interfaces is restricted by default and needs to be switched on first. In the raspi-config menu under 5 “Interfacing Options”, the I2C bus and serial need to be turned on. The terminal on the serial port must be turned off but the serial port itself must remain enabled (changing these options requires a reboot). Remote access to GPIO pins is not strictly necessary but helpful for debugging over ssh.
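The same settings can also be made without the menu, which is handy for headless setups; a sketch of the relevant entries, assuming a standard Raspbian Buster layout:

    # in /boot/config.txt: enable the I2C bus and the serial hardware
    dtparam=i2c_arm=on
    enable_uart=1

    # in /boot/cmdline.txt: remove "console=serial0,115200" so that no
    # login terminal occupies the serial port

Reboot afterwards for the changes to take effect.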

To check if the sensors are connected correctly, install helper tools to check which devices are connected to the I2C bus …
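On Raspbian, these come in the i2c-tools package:

    sudo apt-get install -y i2c-tools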

… and check that all expected devices show up on the I2C bus. The BMP180 pressure sensor and the TSL2561 light sensor communicate over I2C and should both be listed now.
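i2cdetect scans the bus (bus 1 on all recent Pis) and prints a grid of detected addresses. With both sensors attached, the output should resemble the following:

    sudo i2cdetect -y 1

         0  1  2  3  4  5  6  7  8  9  a  b  c  d  e  f
    00:          -- -- -- -- -- -- -- -- -- -- -- -- --
    10: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    20: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    30: -- -- -- -- -- -- -- -- -- 39 -- -- -- -- -- --
    40: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    50: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    60: -- -- -- -- -- -- -- -- -- -- -- -- -- -- -- --
    70: -- -- -- -- -- -- -- 77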

This output shows that the TSL2561 is recognized at address 0x39 and the BMP180 at address 0x77. Which sensor uses which address is set by the manufacturer and not obvious. Check the datasheet or google.

Software installation

Get the CO2 monitor running

Access to the CO2 monitor builds on the co2meter module by vfilimonov.
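The module is on PyPI; installation should be roughly the following (the first line installs build dependencies for the hidapi bindings that co2meter uses):

    sudo apt-get install -y libusb-1.0-0-dev libudev-dev
    sudo pip3 install co2meter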

In order to read data over USB from the CO2 monitor, two rules need to be set. Create a file /etc/udev/rules.d/98-co2mon.rules with the content
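The rules match the monitor's USB IDs (04d9:a052) and relax the device permissions; a version along the lines of the co2meter README:

    KERNEL=="hidraw*", ATTRS{idVendor}=="04d9", ATTRS{idProduct}=="a052", GROUP="plugdev", MODE="0666"
    SUBSYSTEM=="usb", ATTRS{idVendor}=="04d9", ATTRS{idProduct}=="a052", GROUP="plugdev", MODE="0666"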

and reload the rules
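The standard udev commands do this without a reboot:

    sudo udevadm control --reload-rules
    sudo udevadm trigger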

Installing co2meter correctly proved difficult on my first tries and failed a couple of times. The current version of co2meter has all these issues fixed and should be easy to install. Still, it is a good idea to try it out before continuing. Inside an ipython3 console:
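A minimal check, following the co2meter README:

    import co2meter as co2

    mon = co2.CO2monitor()
    mon.info         # device information
    mon.read_data()  # one measurement: timestamp, CO2 in ppm, temperature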

If read_data returns a dictionary or pandas dataframe, co2meter works as intended. Each read command outputs the time as a datetime object, the CO2 level as an integer and the temperature as a float.

Install python modules to access further sensors

BMP180 pressure sensor

Note: The Adafruit_Python_BMP library for BMP180 and BMP085 is deprecated now but is still available on PyPI. Newer libraries (from Adafruit) exist for the BMP280, a successor of the older BMP180.
Test the BMP180 sensor with the following. It should output the current temperature in degrees Celsius and the pressure in kPa.
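A sketch using the deprecated but still functional library (the BMP085 driver handles the BMP180 as well):

    sudo pip3 install Adafruit-BMP

Then, in Python:

    import Adafruit_BMP.BMP085 as BMP085

    sensor = BMP085.BMP085()
    print('Temperature: {0:0.1f} °C'.format(sensor.read_temperature()))
    # read_pressure() returns Pa, hence the division for kPa
    print('Pressure: {0:0.2f} kPa'.format(sensor.read_pressure() / 1000.0))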

DHT22 temperature/humidity sensor

Note: The Adafruit_DHT library for DHT sensors is deprecated now but is still available on PyPI. The same functionality but with a somewhat different interface is also available in the “new” implementation for Circuitpython.
The DHT sensors can be tested with the following. It should output the current humidity and temperature for both sensors, which are connected to pins 17 and 27.
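A sketch with the deprecated Adafruit_DHT library:

    sudo pip3 install Adafruit_DHT

Then, in Python:

    import Adafruit_DHT

    for pin in (17, 27):
        # DHT reads fail regularly; read_retry retries up to 15 times
        humidity, temperature = Adafruit_DHT.read_retry(Adafruit_DHT.DHT22, pin)
        if humidity is not None:
            print('GPIO{}: {:0.1f} % humidity, {:0.1f} °C'.format(pin, humidity, temperature))
        else:
            print('GPIO{}: read failed'.format(pin))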

TSL2561 light sensor

Access to the TSL2561 sensor is also available through the Circuitpython implementation by Adafruit.
Test the TSL2561 with the following code. It should output the total brightness in Lux as well as the broadband and infrared channel readings.
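A sketch with the Circuitpython library (installing it pulls in the Blinka compatibility layer for the Pi):

    sudo pip3 install adafruit-circuitpython-tsl2561

Then, in Python:

    import board
    import busio
    import adafruit_tsl2561

    i2c = busio.I2C(board.SCL, board.SDA)
    sensor = adafruit_tsl2561.TSL2561(i2c)
    print('Lux:', sensor.lux)              # computed total brightness
    print('Broadband:', sensor.broadband)  # raw visible+IR channel
    print('Infrared:', sensor.infrared)    # raw IR channel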

MH-Z19 CO2 sensor

MH-Z19 sensors are less common because infrared CO2 measurement makes them inherently much more expensive than the other sensors. There is a nice project on Github that is also available on PyPI.
And, again, test the functionality. It works correctly if the CO2 level in ppm and the temperature in Celsius come up.
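Assuming the mh-z19 package on PyPI is the one meant; it reads the sensor over the serial port and therefore needs root:

    sudo pip3 install mh-z19
    sudo python3 -m mh_z19 --all   # prints CO2 in ppm, temperature and raw values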

Note

It may be necessary to restart the Raspberry Pi in order for all the updated settings and installations to take effect. In my case, a reboot solved the issue of not getting any reply from the sensors, or of the sensor queries returning seemingly random errors.

Install influxdb

Add the repository and install through apt-get.
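A sketch for Raspbian Buster, using the InfluxData package repository:

    wget -qO- https://repos.influxdata.com/influxdb.key | sudo apt-key add -
    echo "deb https://repos.influxdata.com/debian buster stable" | sudo tee /etc/apt/sources.list.d/influxdb.list
    sudo apt-get update
    sudo apt-get install -y influxdb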

Start the influxdb service and enable autostart on system boot.
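    sudo systemctl start influxdb
    sudo systemctl enable influxdb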

Install influxdb python module

The influxdb databases can be easily accessed from python after installing the influxdb python module.
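    sudo pip3 install influxdb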

Install Grafana

Add the repository and install through apt-get.
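Similarly for Grafana's OSS package repository; a sketch:

    wget -q -O - https://packages.grafana.com/gpg.key | sudo apt-key add -
    echo "deb https://packages.grafana.com/oss/deb stable main" | sudo tee /etc/apt/sources.list.d/grafana.list
    sudo apt-get update
    sudo apt-get install -y grafana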

Start the grafana server and configure it to start at boot up.
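    sudo systemctl start grafana-server
    sudo systemctl enable grafana-server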

Grafana runs on the default HTTP port 3000 and can be accessed with the default user “admin” and password “admin”. The admin password should be changed on first login.

Install Telegraf

Aside from the climate data, it is also a good idea to monitor the Raspberry Pi itself. Running large queries on the database, e.g. getting all data for several months, can easily overwhelm the Pi and max out the CPU for minutes or fill up the RAM. System monitoring with Telegraf helps to identify how large a query can get before the Pi crashes. Aside from that practical aspect, a large collection of Raspberry Pi stats is a nice dataset to play with. Maybe elevated temperatures captured by the climate logging also show up as elevated CPU temperatures? And how well do the two correlate?

Add the repository and install through apt-get.
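Telegraf comes from the same InfluxData repository that was already added for InfluxDB above, so installation is just:

    sudo apt-get update
    sudo apt-get install -y telegraf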

Start Telegraf and configure it to start at boot up.
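    sudo systemctl start telegraf
    sudo systemctl enable telegraf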

Logging data

Set up the databases

Use two databases to log climate data and system metrics separately. The system database will be created by Telegraf automatically, so it is only necessary to set up the climate database.

If not done already: start influx server
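… and open the InfluxDB command line client:

    sudo systemctl start influxdb
    influx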

Create a new database with human readable timestamps. Create a new user to access the data with Grafana later on. List all available databases to ensure the creation worked as intended.
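Inside the influx shell, this looks roughly like the following (user name and password are placeholders; "precision rfc3339" makes the CLI print human-readable timestamps):

    precision rfc3339
    CREATE DATABASE homeclimate
    CREATE USER grafana WITH PASSWORD 'choose-a-password'
    GRANT READ ON homeclimate TO grafana
    SHOW DATABASES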

For more complex names and special characters it might be necessary to quote the database name like CREATE DATABASE "homeclimate-raspi". If you created a database with a wrong name, DROP DATABASE databasename deletes it again.
To check if measurements are being written to a database, activate the database with USE databasename and then run a query; e.g. select * from "live logging" shows all values in the database that are labeled with the measurement string “live logging”.

Start logging services

Sensors

Reading the connected sensors and sending the data to influxdb can be done in a variety of ways. For simplicity, I use short python programs that periodically read and send data. Complementary systemd services execute these programs and ensure that they run after boot. The scripts and services can be found on Github.
As for the config files, I prefer to keep python programs and services in a handy location at ~/homeclimate/scripts and ~/homeclimate/services, respectively. The services can be linked to the required location at /etc/systemd/system.
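systemd can register unit files from an arbitrary path; a sketch with a hypothetical unit name (repeat for each service, assuming the default pi user):

    sudo systemctl link /home/pi/homeclimate/services/co2monitor.service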

Now, the logging services can be handled like any other systemd service. They can be started manually for testing …
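For example, with the hypothetical unit name from above:

    sudo systemctl start co2monitor.service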

… and enabled to automatically run after startup.
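Again with the hypothetical unit name:

    sudo systemctl enable co2monitor.service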

For testing, debugging and the like, the most often needed commands are systemctl status to check if a service runs correctly, and systemctl start, systemctl stop or systemctl restart for (re-)starting and stopping. The standard input/output and errors of the programs run by the services are logged in the systemd journal. It can be accessed with sudo journalctl -u servicename to, e.g., check for debug messages. The flag -f lets the journal update automatically on new incoming messages.
After changing the service unit files (xxx.service), it may be necessary to reload the configuration with sudo systemctl daemon-reload.

Telegraf

Telegraf has a lot of options for monitoring a machine. For details check the extensive documentation.
The config is controlled through commands or the config file /etc/telegraf/telegraf.conf. To keep all configs for this project together, I instead place the config file at ~/homeclimate/configs. The default configuration already logs the most relevant parameters. Further information can be gathered by uncommenting the respective plugins in telegraf.conf. The only strictly necessary change is to enable influxdb as an output to send the measured values to. The default database for Telegraf is “telegraf” and will be created automatically if not present.
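The output section in telegraf.conf then looks roughly like this:

    [[outputs.influxdb]]
      urls = ["http://127.0.0.1:8086"]
      database = "telegraf"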
To tell Telegraf about this configuration, I simply link my own file to the location where Telegraf expects the config file to be. Creating a backup of the original config and stopping Telegraf beforehand is always a good idea.
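A sketch of these steps, again assuming the default pi user:

    sudo systemctl stop telegraf
    sudo mv /etc/telegraf/telegraf.conf /etc/telegraf/telegraf.conf.orig
    sudo ln -s /home/pi/homeclimate/configs/telegraf.conf /etc/telegraf/telegraf.conf
    sudo systemctl start telegraf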

Telegraf should now be gathering data and sending them to the influxdb database “telegraf”. To confirm that data is indeed being written, check the database for new data series and measurements:
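In the influx shell:

    influx
    > USE telegraf
    > SHOW MEASUREMENTS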

Create Grafana dashboards

If not done already: start the Grafana webserver
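That is, the same systemctl command as above:

    sudo systemctl start grafana-server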

Grafana is then accessible over port 3000 by default on the IP or name of the Raspberry Pi.

To view the sensor data, a new dashboard is necessary (“+” symbol -> dashboard) that will then show the sensor data in various panels. First, however, it is necessary to add InfluxDB as a new data source (cog wheel -> “Data Sources” -> “Add data source”). The relevant options are to point to the correct IP address using the default port 8086 for InfluxDB. Since this is all running in an internal network and separate from the rest of my network, I did not bother setting up further security measures. In the section ‘InfluxDB Details’, the correct database, user and password have to be set to grant Grafana access. These were all set above while creating the new database in InfluxDB. To display measurements from the sensors stored in the ‘homeclimate’ database and the Telegraf monitoring in the ‘telegraf’ database, two new InfluxDB data sources have to be added in Grafana. The new data source for homeclimate can then look like this:

It is now possible to plot the data in a new dashboard in a new data panel through a query to InfluxDB. The following settings are an example to display temperature and CO2 level from the USB CO2 monitor. Since several sensors measure temperature, the data field ‘temperature’ is used several times and it is necessary to limit the query to just the CO2 sensor in the ‘FROM’ field.
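A hypothetical InfluxQL query as Grafana generates it, assuming the CO2 monitor writes to the measurement “live logging”; the exact tag to filter on (here a made-up "sensor" tag) depends on how the logging scripts label their data:

    SELECT mean("co2"), mean("temperature") FROM "live logging"
    WHERE ("sensor" = 'co2monitor') AND $timeFilter GROUP BY time($__interval)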

The colors and labelling of the left/right y-axis are a bit hidden and can be chosen by clicking on the colored line in the legend.

The scripts are set up such that they write data with the tag ‘live logging’. This is to differentiate the original, unprocessed, “live” data from processed data that will be stored in the same database for convenience. InfluxDB has the ability to automatically process data, e.g. compute averages. Grafana can also average data while plotting but this naturally requires loading all data first to then compute and display the average which can get very slow for a long time range or tightly spaced measurements. It is therefore much easier and faster to let InfluxDB do that averaging automatically and have Grafana only retrieve and plot the averages. To compute, e.g., hourly averages, InfluxDB computes the average over the last hour automatically after the full hour has passed and writes them to a new database or with new tags. (This is implemented in my setup already but not written up yet. I hope to find the time to write up how to do that later.)
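A sketch of such a continuous query in InfluxQL (the target measurement name is illustrative):

    CREATE CONTINUOUS QUERY "cq_hourly_mean" ON "homeclimate"
    BEGIN
      SELECT mean(*) INTO "hourly averages" FROM "live logging" GROUP BY time(1h)
    END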

An overview over all currently attached sensors could then look like this:

Here you can also see that the DHT sensors sometimes glitch out and return nonsensical values. Apparently, the temperature was -12° C between 7:30h and 8:00h. Right …
Wrong readings are no big deal and just have to be filtered out, a task that I haven't gotten to yet. It can probably be done in InfluxDB or even Grafana, but I prefer to not even write wrong data to the database and instead filter them out in the python scripts already. The glitches are large, so even a simple filter like ‘if the value deviates by more than 10° C from the previous one, discard it’ is good enough.
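A minimal sketch of such a filter for the logging scripts (all names hypothetical):

    def plausible(value, last_value, max_jump=10.0):
        """Reject readings that jump more than max_jump from the previous value."""
        return last_value is None or abs(value - last_value) <= max_jump

    # inside the logging loop (write_point stands in for the influxdb write):
    # if plausible(temperature, last_temperature):
    #     write_point(temperature)
    #     last_temperature = temperature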

Backup your hard work!

Now that everything is installed and running, you really don’t want to lose the system or configuration. All the hours of waiting for installations and fixing annoying bugs would be lost. Create a byte copy of the Raspberry Pi boot device (HDD, SSD, SD card, USB stick). If the device ever fails, you can simply burn a new one and get going in a couple of minutes.
The following is meant more for a small USB drive or SD card. Making a bit copy of a large external HDD requires a lot of disk space. For such a case, there are better solutions out there.

I’m using macOS, so I use the disk utility to find the right drive to back up.

Create an image using dd. Here, the USB stick is disk2. Creating the image can take a while and does not show any progress or confirmation. Just wait until it’s finished. Depending on the read speed of the SD card or stick, this can take half an hour for a 16 GB drive.
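On macOS, the whole backup can be done from the terminal as well (disk2 as identified above; the raw device rdisk2 is noticeably faster than disk2):

    diskutil list                      # confirm which device is the boot drive
    diskutil unmountDisk /dev/disk2    # unmount before reading
    sudo dd if=/dev/rdisk2 of=homeclimate_backup.img bs=1m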

The homeclimate_backup.img image can be burned to a replacement device using e.g. dd or Etcher.

Statistical analysis

Collecting and viewing these data is fun and already helps, especially when setting up warnings within Grafana: The CO2 level goes beyond 2000 ppm? Time to open the window and let some fresh air in!

Now the real fun begins with analysing these data. Measurements of 10 parameters every 30 s already give you 28800 measurements per day! Let that run for a couple of days/weeks/months and you have an enormous dataset to play with.

For example, I was wondering if and how well temperature and CO2 correlate. The idea is that over the summer a window or door is open most of the time, but in winter the CO2 gets really high because I don’t want to lose the heat.

And indeed there is a correlation in the data: In the summer, with temperatures of 24° C and above (that’s the downside of a nice apartment in the attic), the CO2 concentration is below 500 ppm almost all the time. On colder days, the CO2 is anywhere between 500 ppm and 1500 ppm but virtually never at the outdoor level of 400-450 ppm. This dataset is not ideal yet because it has only three months of data, covering roughly the beginning of August to the end of October.

This makes me wonder: how much variation is there over the course of a single day?

For temperature, it’s as expected: minimum in the early morning, maximum in the late afternoon and a few degrees of variation in between. The CO2 level is much more interesting, with a <500 ppm baseline and rising tracks throughout the night. Throughout the day, there seems to be no clear trend. Now I wonder if there are buried trends that show on Mondays to Fridays but not on the weekend …