Last updated 1/28/25
(This page is just kind of a starter. Lots more details need to get filled in. A better description is being started on my Project Notes pages here.)
The system that collects the data displayed on the front page of this web site has been under development since late 2008 (with no end in sight :-). Its main purpose is to let me see how the house is doing when I'm not there – especially when we're on vacation.
Collection/control network
There are currently about 14 data collection/control nodes on an RS-485 network running all around the house (and out to the garage) at 57 kbps over Cat5 cable. Three pairs carry unregulated ~9VDC to power the nodes; the other pair is the (half-duplex) data. Everything is just wired in parallel.
The old nodes run PIC 16F628s; the majority are now Arduino based. Some talk I2C to LM75 temperature sensors; some just sense data bits, as with the rain sensor and sump cycle sensor. Some have output bits that can control stuff. Nodes with wires that could get ground referenced (like water sensors on the basement floor) have opto-isolators between the external sensor and the local processor. The one that reads the water meter has a long and "interesting" history. Some notes here.
The master/slave protocol on the wire is a local design, since it isn't expected to talk to anything else. (Sounds like famous last words.) A frame is basically a sync byte, from address, to address, opcode, length, data, and checksum. The master polls each of the nodes (currently once/min) and the addressed node replies with its latest info. The data – both its length and its meaning – is device dependent, so the master program has a decode callback for each device, identified uniquely by its address. The protocol also allows the master to send data to the slave nodes, so it can use them to control external devices. A sketch of what a frame might look like is below.
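To make the frame layout concrete, here's a minimal perl sketch of building one. The field order is from the description above, but the sync byte value, the one-byte field widths, and the simple additive checksum are my assumptions; the real protocol may well differ.

    #!/usr/bin/perl
    # Sketch of building one frame for the 485 bus. Field order is from
    # the write-up above; the sync value, one-byte fields, and the
    # additive checksum are guesses.
    use strict;
    use warnings;

    my $SYNC = 0xA5;    # made-up sync byte value

    sub build_frame {
        my ($from, $to, $opcode, @data) = @_;
        my @bytes = ($SYNC, $from, $to, $opcode, scalar(@data), @data);
        my $sum = 0;
        $sum = ($sum + $_) & 0xFF for @bytes;    # 8-bit additive checksum (assumed)
        return pack('C*', @bytes, $sum);
    }

    # The master (address 0, say) polls node 7 with a made-up "send latest" opcode:
    my $frame = build_frame(0, 7, 0x01);
    printf "%02X " x length($frame) . "\n", unpack('C*', $frame);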
Currently four nodes use that 'control' capability: the router/modem power cycler, the sprinkler controller, the house water shutoff valve, and an inside/outside temperature/dew point display in the bedroom. The first gives the computer a chance to fix loss of connection to the internet by power cycling the modem, which occasionally gets confused. Some details here. The sprinkler controller lets the main computer control 8 channels of 24V valves that water various parts of the landscaping. The house water shutoff, controlling a hacked Water Cop, is obvious. The temp/dewpoint display in the bedroom doesn't control anything; it just displays the data sent to it.
Main collection program
I finally moved the main control perl program from its earlier home on the main PC to a 5 watt pink Pogoplug. (Trying to host a mission critical application on a Windows machine is completely unworkable. Don't get me started.) The machine hosting that job has since been upgraded to a dual core HP 'thin client' box. It's responsible for polling the slaves, deciding when to give up on them, collecting info for 5 minutes, then trying to ftp the latest to the web site. It can send email, or text messages to my phone, for serious problems like sensor failures, extended power outages, water where it shouldn't be, etc. Some details of the migration to the Pogo are here. The migration to the thin client is only documented in my Project Notes in OneNote, which I hope to publish some day.
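The overall shape of that loop, as I understand it, is sketched below. The node addresses, sub names, and bodies are stand-ins; only the once-a-minute poll, the give-up counting, and the 5 minute upload cadence come from the description above.

    #!/usr/bin/perl
    # Toy version of the main poll loop; everything here is a stand-in.
    use strict;
    use warnings;

    my @nodes      = (1 .. 14);    # roughly the node count mentioned above
    my %misses;                    # consecutive misses per node
    my $MAX_MISSES = 5;            # give-up threshold (see Alerts below)
    my @pending;                   # decoded lines queued for the web host

    sub poll_node  { my ($addr) = @_; return "raw-$addr" }         # fakes the 485 exchange
    sub decode     { my ($addr, $raw) = @_; return "$addr: $raw" } # per-device callback
    sub send_alert { warn "ALERT: $_[0]\n" }
    sub push_out   { printf "sending %d lines\n", scalar @{$_[0]}; @{$_[0]} = () }

    for my $minute (1 .. 10) {    # the real loop runs forever
        for my $addr (@nodes) {
            my $reply = poll_node($addr);
            if (defined $reply) {
                $misses{$addr} = 0;
                push @pending, decode($addr, $reply);
            } elsif (++$misses{$addr} == $MAX_MISSES) {
                send_alert("node $addr stopped responding");
            }
        }
        push_out(\@pending) if $minute % 5 == 0;    # ship every 5 minutes
        # sleep 60;    # once a minute in the real thing
    }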
Control of the sprinkler system is driven by a perl module that reads a crontab-style control file (a sketch of the idea is below). On the list for years has been using the system's knowledge of rain and temperature, plus a new soil moisture sensor and maybe insolation, to let it modulate the watering intelligently. Modulation was finally added, but it's only manually controlled. Some day. That same cron-type system now also extracts special data from the stdout logs, writes timestamps to make the logs easier to read, etc. Of course the real system cron currently controls 9 other tasks, with lots of commented-out leftovers.
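I haven't published the control file format, so the format parsed below is invented; it just illustrates the crontab-style idea of time fields followed by a zone and a run time.

    #!/usr/bin/perl
    # Sketch of a crontab-style sprinkler schedule reader. The file format
    # (minute, hour, days-of-week, zone, minutes to run) is my invention.
    use strict;
    use warnings;

    my @lines = (    # these would come from the control file
        '30 5 1,3,5  front_lawn 20',    # 5:30am Mon/Wed/Fri: front lawn, 20 min
        '0  6 2,4    back_beds  15',
    );

    my @schedule;
    for (@lines) {
        next if /^\s*(#|$)/;    # skip comments and blank lines
        my ($min, $hr, $dows, $zone, $run) = split ' ';
        push @schedule, {
            min  => $min,
            hr   => $hr,
            dows => { map { $_ => 1 } split /,/, $dows },
            zone => $zone,
            run  => $run,
        };
    }

    my (undef, $min, $hr, undef, undef, undef, $dow) = localtime;
    for my $job (@schedule) {
        if ($job->{min} == $min && $job->{hr} == $hr && $job->{dows}{$dow}) {
            print "open valve '$job->{zone}' for $job->{run} min\n";
        }
    }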
There is a little web server on the thin client, and it would be nice if I could serve the main stats page from there as a backup to the web host. No progress on that yet. It does host a day's worth of security cam pictures, tho. The web server (and sshd!) are visible on non-standard ports from the Internet; I'm using a dynamic DNS mechanism and a VPN for that.
UPS
The whole system – and especially its ability to sense and report power failures – obviously doesn't work well if the computer or the network/internet connection goes down when the power goes out. The big, old, ugly sine wave UPS/power conditioner retired from work finally died and was replaced with a current 1 kW unit (11/2020). The two 12V gel cells that used to live inside it have been replaced with two large marine deep-cycle batteries sitting next to the UPS. It's ugly, but it keeps the computer and router and modem up for maybe 3 hours of power outage.
The new UPS has a little more intelligence than the old one's contact closure on shutdown. An interesting hacked PCB provides access to its data, and a D1-mini lets it publish that via MQTT to the main system. That gives me a battery voltage monitor that should let me shut the main PC (the biggest power hog) down gracefully while there's still lots of battery left to keep the Linux machine, modem, and router up. Some day.
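On the receiving side, picking up the D1-mini's reports might look like the sketch below, using the CPAN module Net::MQTT::Simple. The broker name, topic, and low-voltage threshold are placeholders, not the real configuration.

    #!/usr/bin/perl
    # Sketch of subscribing to the UPS data the D1-mini publishes.
    # Broker, topic, and threshold are all made up.
    use strict;
    use warnings;
    use Net::MQTT::Simple;

    my $LOW_VOLTS = 23.0;    # made-up "shed the big load" threshold

    my $mqtt = Net::MQTT::Simple->new("mqtt-broker.local");    # placeholder broker

    $mqtt->subscribe("ups/battery_volts" => sub {              # placeholder topic
        my ($topic, $message) = @_;
        print "UPS battery: $message V\n";
        if ($message < $LOW_VOLTS) {
            # this is where the main PC would get its graceful shutdown request
            warn "battery getting low\n";
        }
    });

    $mqtt->run;    # blocks, dispatching incoming messages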
Interestingly, for
quite a while, the UPS batteries were protected from lack of
maintenance by – my backup sump pump! That
has a sensor and an alarm for "needs water" which beeped every couple
of months. As long as I had water and
tools out to fill that one, I always filled the UPS batteries as well.
That sump pump backup is based on a Basement Watchdog 'Big Dog' system with which I have a love/hate relationship. Its badly flawed charging system (which is a critical part of its job!) has caused me to buy unnecessary replacement deep-cycle batteries, as well as to invest a huge amount of time and energy in monitoring the battery to protect myself. It even has its own graph on the home page now! I've come to believe my unit is just faulty and needs to be repaired (by driving 45 minutes to their shop). A semipermanent improvement is a HF float charger that does a WAY better job of maintaining the battery than the terrible Basement Watchdog unit. But without the Big Dog boiling the battery, the water alarm almost never goes off, leaving the UPS batteries unprotected. The whole saga is written up in my Project Notes in OneNote. I hope to publish that some day.
File transfer
The update data sent to the web site consist of one ascii line per sensor per sample. The main perl program attempts an ftp of the latest updates every 5 minutes. If it fails, the updates are remembered (in a file), and all of them are sent when it next succeeds in connecting. In addition, the main perl program sends house pictures and a few graphs to the hosting computer.
With both my previous and my new hosting providers, there were occasional ftp failures which truncated the ever-growing file on the web host, losing all previous info. That rarely happens any more, but the append-only file does grow to an unmanageable size, requiring manual intervention. Over the years, I developed tools to warn me about that and to make intelligently trimming the file fairly painless.
That ftp mechanism has since been replaced with the much superior scp, using public key authentication.
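The remember-until-it-works part is simple; below is a sketch of it, with scp standing in for the old ftp. The spool file name and remote path are invented.

    #!/usr/bin/perl
    # Sketch of queueing update lines and retrying the transfer until it works.
    use strict;
    use warnings;

    my $QUEUE  = '/var/housedata/pending.dat';    # made-up spool file
    my $REMOTE = 'webhost:public_html/data/';     # made-up destination

    sub queue_updates {
        my (@lines) = @_;
        open my $fh, '>>', $QUEUE or die "can't append to $QUEUE: $!";
        print $fh "$_\n" for @lines;
        close $fh;
    }

    sub try_send {
        return 1 if !-s $QUEUE;    # nothing pending
        # key-authenticated scp; BatchMode avoids hanging on a password prompt
        my $rc = system('scp', '-o', 'BatchMode=yes', $QUEUE, $REMOTE);
        if ($rc == 0) {
            truncate $QUEUE, 0;    # delivered; start a fresh queue
            return 1;
        }
        return 0;    # keep the queue; the next 5 minute pass retries
    }

    queue_updates('sensor7 72.4 1738000000');    # made-up data line
    try_send() or warn "web host unreachable; will retry\n";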
Alerts
The main perl program currently can send email, and SMS via vtext.com. It alerts me if a sensor fails to respond 5 times in a row or if there's a power outage. When the water sensors are in place, finding undesired water will certainly send an alert, too. Note that this is dependent on my internet link being up! I've vaguely considered a dial-out backup, but it isn't likely.
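The SMS part is just Verizon's email-to-SMS gateway: mail sent to number@vtext.com arrives as a text message. A minimal sketch using the core Net::SMTP module, with a placeholder relay, sender, and phone number:

    #!/usr/bin/perl
    # Sketch of texting via the vtext.com email gateway.
    use strict;
    use warnings;
    use Net::SMTP;

    sub send_sms {
        my ($msg) = @_;
        my $smtp = Net::SMTP->new('smtp.example.net')    # placeholder relay
            or die "can't reach the SMTP server";
        $smtp->mail('house@example.net');                # placeholder sender
        $smtp->to('5551234567@vtext.com');               # placeholder number
        $smtp->data();
        $smtp->datasend("Subject: house alert\n\n$msg\n");
        $smtp->dataend();
        $smtp->quit;
    }

    send_sms('node 7 stopped responding');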
Ping stats to ISP
A separate perl program pings the router at my ISP (the one my modem connects to) every 5 seconds and keeps round trip time stats. This is mostly to detect whether the internet connection is OK. It could/should also ping a few other stable sites outside my ISP to get a better view. The stats from that bit of perl are also collected by the main program and sent to the web site so I can see how connectivity has been. The collection program notices when it thinks connectivity is bad – currently 40 consecutive missed pings – and sets a flag that tells the main perl program to tell one of the 485 nodes to power cycle the modem and router. There's some detail here.
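The core of that pinger could be as simple as the sketch below, built on the core Net::Ping module. The target address and flag file are placeholders; the 5 second period and 40-miss threshold are from above.

    #!/usr/bin/perl
    # Sketch of the ping/stats loop. Target and flag path are made up.
    use strict;
    use warnings;
    use Net::Ping;

    my $TARGET     = '203.0.113.1';    # placeholder for the ISP-side router
    my $MAX_MISSED = 40;               # consecutive misses before declaring trouble

    my $p = Net::Ping->new('icmp');    # icmp needs root; 'tcp' works unprivileged
    $p->hires(1);                      # sub-second round trip times

    my $missed = 0;
    while (1) {
        my ($ok, $rtt) = $p->ping($TARGET, 2);    # 2 second timeout
        if ($ok) {
            $missed = 0;
            printf "%.1f ms\n", $rtt * 1000;      # RTT stats collection goes here
        } elsif (++$missed == $MAX_MISSED) {
            # set the flag the main program reads to power cycle modem/router
            open my $fh, '>', '/tmp/net-down.flag' or die $!;
            close $fh;
        }
        sleep 5;
    }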
Web site software
I found some php software that can produce graphs with auto scaling, etc. Some more php reads the datafile and feeds the data to the graphs. The datafile-analyzing code bucketizes the one-minute data into something that makes for smoother graphs (the idea is sketched below). It's a 'work in progress'.
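The live bucketizing code is php on the web host; below is the same idea in perl, just to stay consistent with the other sketches on this page. The bucket size and sample data are made up.

    #!/usr/bin/perl
    # Averaging one-minute samples into fixed-width buckets for smoother graphs.
    use strict;
    use warnings;

    my $BUCKET = 15 * 60;    # 15 minute buckets; the real size is a tuning choice

    # stand-in input: [epoch seconds, value] pairs at one-minute resolution
    my @samples = map { [ 1700000000 + 60 * $_, 50 + 10 * sin($_ / 30) ] } 0 .. 239;

    my (%sum, %count);
    for my $s (@samples) {
        my $slot = int($s->[0] / $BUCKET) * $BUCKET;    # start of this bucket
        $sum{$slot}   += $s->[1];
        $count{$slot} += 1;
    }

    for my $slot (sort { $a <=> $b } keys %sum) {
        printf "%d %.2f\n", $slot, $sum{$slot} / $count{$slot};    # one point per bucket
    }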
As things have progressed, I'm also using the gnuplot graphing package. Despite my shared hosting company's statement that it couldn't be done, I've compiled and installed it on that host. But the graphs it makes are still all done on the Linux thin client in the basement, with the images sent to the web host.
It used to have some config options on the front page, but that let some damn Pakistani hackers in to trash the site in 2019. After cleaning up that mess, I locked it down a lot more. If I want the display different, I can ssh in and change it that way.
There's a watchdog that alerts me if things go wrong: the web host doesn't get an update for a while, file size is too big, graphs are stale, security cam pics are stale, etc. After a few years' battle to keep my watchdog process running in the background (hosting providers don't like that) I finally found I could make it cron-based, and it's been stable for years now. That watchdog can send email or - via email - text messages.
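Since it's cron-based, each watchdog run is just a batch of checks and then exits. A sketch is below; the paths, limits, and cron period are placeholders.

    #!/usr/bin/perl
    # Sketch of one cron-driven watchdog pass: staleness by mtime, runaway
    # size by -s. Everything named here is a placeholder.
    use strict;
    use warnings;

    my %checks = (
        'update file stale'   => sub { time - (((stat 'data/updates.txt')[9]) // 0) > 15 * 60 },
        'update file too big' => sub { ((-s 'data/updates.txt') || 0) > 50 * 1024 * 1024 },
        'rain graph stale'    => sub { time - (((stat 'img/rain.png')[9]) // 0) > 2 * 3600 },
    );

    for my $problem (sort keys %checks) {
        if ($checks{$problem}->()) {
            print "would email/text: $problem\n";    # the real one mails from the host
        }
    }

    # crontab entry on the web host (period is a guess):
    #   */10 * * * * perl watchdog.pl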
I succeeded (6/2019) in using wget on the web host to get a nice ascii table of rainfall as recorded by the USGS at Salt Creek here in Elmhurst. That data is taken every 5 minutes – just like mine. It's integrated with the current rainfall info from my hardware as another line on the rainfall graph. It languished on the list for many years, but it's finally in place. Sadly, I think that USGS info is about to be dropped.
I would really like to change the back end to maybe a real database and do periodic data compression/purging of old data. (I'll keep my one minute data for a while, but maybe reprocess to hourly (average + peak?) after 2 weeks or a month, and maybe daily after 6 months. I still have the original one minute data on the PC at home.) My hosting provider supports mysql, so that makes it a little easier. And maybe I can even get around the ftp failures/file truncations with transactions to the db! The Round Robin Database tool looks pretty interesting, but I don't know if it will run on either the web host or the basement thin client.
New commercial home automation
I installed some Z-wave (Plus) switches and outlets and a Hubitat C7 hub to control them (scrapping all the old X-10/Insteon stuff) at the end of 2021. As always, there were wrinkles, but it's pretty stable, and both Alexa and Google Home learned to control the lights with little fuss. Finally got a remotely controllable thermostat, and a front door lock is probably on the list. The interesting part is that (four years later) there's still no good connection between the two systems. There's no driving need, but it seems like two systems in my house should be able to talk to each other. And Google and Alexa as well.
…end…