how to find out who needs to clear out the dishwasher

We use the term “smart home” lightly these days. It has become a marketing term wrapped in fantastic stories.

Considering how readily available all sorts of sensors, actuators and personal assistants are these days, one would think that most people would start to expect more from the marketed “smart home”.

I believe the smart is to be found in the small and simple. A lot of small things can make something feel smart without it actually being smart about anything.

Actual smartness has not been achieved yet – not even by a long stretch of the word. So let’s put that aside for now and move a simple thing into the middle of this article.

Have you ever had an argument about who should clear out the dishwasher after it has finished?

We had.

So we outsourced the discussion and the decision to a third party. We made our house understand when the dishwasher starts and finishes its task. And we made it flip a coin.

There was already power consumption monitoring in place for the dishwasher. Adding a hysteresis over that monitoring yields a simple “starts running” / “stops running” state for the dishwasher.

Pictured above is said power consumption.

  • When the values enter the red area in the graph, the dishwasher is considered to be running.
  • When they leave that area, the dishwasher is considered finished/not running.

Now add a bit of coin-tossing by the computer: each time the dishwasher is detected to have started work, a message is sent out depending on the result of the coin toss.

That message is published and automatically displayed on all active displays in the house (TVs/…) and sent as a push notification to all members of the household who need to be informed of this conclusive and important decision.

In short:

Every time the dishwasher starts, everyone gets a push notification saying who is going to clear it out, based on a coin toss by a computer.

The base of all of this is a Node-RED flow that takes the power consumption MQTT messages as input and outputs back to MQTT, as well as pushing out the notifications to phones, screens and watches.

Additionally it creates a calendar entry with the start and finish times of the dishwasher run as well as the total energy consumption for that run.

Node-RED flow

The flow works like this: on the right, the message enters the flow from MQTT. The message itself contains just the value of the power consumed at this very moment, in this case by the dishwasher.

The power consumption is updated regularly, every couple of seconds. So every couple of seconds this flow runs and gets an updated value of the current power consumption.

Next a hysteresis is applied. In simple terms this means: when the value goes above an upper threshold, the dishwasher is considered to be running. When it falls below a lower threshold, it is considered finished.

When the dishwasher changes its state to “running”, the flow generates a random number between 0 and 1. This gives a 50:50 chance for either Steffi or Daniel to be the chosen one to clear out the dishwasher for this run. The result is sent out as a push notification to all phones, watches and TVs.
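
The actual implementation lives in a Node-RED function node, but the gist of the hysteresis and the coin toss fits into a few lines. Here is a minimal sketch in C++; the threshold values are made up for illustration and depend entirely on your dishwasher’s power curve:

```cpp
#include <cstdlib>
#include <string>

// Hypothetical thresholds in watts - tune these to your appliance.
// The gap between them is what makes this a hysteresis rather than a
// single trigger point that would flap on noisy readings.
const double START_THRESHOLD_W = 10.0; // above this: considered running
const double STOP_THRESHOLD_W  = 2.0;  // below this: considered finished

bool running = false;

// Called for every power reading arriving via MQTT. Returns a
// notification text when the state flips, an empty string otherwise.
std::string onPowerReading(double watts) {
    if (!running && watts > START_THRESHOLD_W) {
        running = true;
        // Coin toss: a 50:50 chance for either person.
        const char* chosen = (std::rand() % 2 == 0) ? "Steffi" : "Daniel";
        return std::string("Dishwasher started - ") + chosen +
               " clears it out this time!";
    }
    if (running && watts < STOP_THRESHOLD_W) {
        running = false;
        return "Dishwasher finished!";
    }
    return ""; // no state change, nothing to announce
}
```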

When the dishwasher finishes its run, the total energy consumption is taken and sent out as the “I am done” message. This information is also added to the calendar. Voilà.
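
Totalling the energy for that calendar entry is just a matter of integrating the power readings over time while the dishwasher is running. A sketch of that bookkeeping, assuming (hypothetically) one reading every five seconds:

```cpp
// Accumulate the energy of one dishwasher run from periodic power readings.
const double READ_INTERVAL_S = 5.0; // assumed update interval in seconds

double energyWs = 0.0; // accumulated energy in watt-seconds

// Called for every reading while the state is "running".
void onReadingWhileRunning(double watts) {
    energyWs += watts * READ_INTERVAL_S; // E += P * dt
}

// When the hysteresis reports "finished", convert for the calendar entry.
double totalKWh() {
    return energyWs / 3600.0 / 1000.0; // Ws -> Wh -> kWh
}
```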

the real smart home has a calendar!

A calendar? Why a calendar, you may ask. Oh well, there are several reasons. Think of calendars as another way to interact with the house. All sorts of things happen on a timeline. A calendar is just a visual aid for interacting with timelines.

Be it a home appliance running or motion being sensed by your home alarm system: all of that can be displayed in a calendar and thus automatically synced to every device capable of displaying that calendar.

And if you start adding entries to a calendar that the house uses to know what to do next… how about putting light on/off times into an actual calendar right on your phone, instead of the complicated browser user interfaces that many of those marketing smart homes want us to use?

Never confuse wisdom with luck.

44th Rule of Acquisition / Ferengi

can your kitchen scale do this trick? – ESP8266+Load Cell+MQTT

Ever since we changed our daily diet we have weighed everything we eat or cook. Like, everything.

We quickly found that the kitchen scales you can buy cheaply either do not offer the convenience we are looking for or regularly run out of power and need battery replacements.

As we already have all sorts of home automation in place anyway, the idea was born to integrate an ESP8266 into two of those cheap scales and – while ripping out most of their electronics – base the new scale functionality on the load cells already present in the cheap scale.

So one afternoon in January 2018 I sat down and put all the parts together:

ESP8266 + HX711 + 4 Load Cells
my notes of the wiring… this might be different for your load cells…

After the hardware portion I sat down and programmed the firmware of the ESP8266. The simple idea: it should connect to WiFi and to the house MQTT broker.

It would then send its measurements into a /raw topic as well as receive commands (tare, calibration) over a /cmd topic.
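
The actual firmware is linked at the end of this article; here is a minimal sketch of the idea. Everything specific – the pins, the topic names under the /raw and /cmd scheme, the command format and the calibration factor – is an assumption and will differ for your wiring:

```cpp
#include <ESP8266WiFi.h>
#include <PubSubClient.h>
#include "HX711.h"

const char* WIFI_SSID   = "your-ssid";      // placeholder
const char* WIFI_PASS   = "your-password";  // placeholder
const char* MQTT_BROKER = "mqtt.local";     // placeholder
const char* TOPIC_RAW   = "house/scale1/raw";
const char* TOPIC_CMD   = "house/scale1/cmd";

HX711 scale;
WiFiClient wifiClient;
PubSubClient mqtt(wifiClient);

// Handle "tare" and "cal:<factor>" commands arriving on the /cmd topic.
void onCommand(char* topic, byte* payload, unsigned int length) {
  String cmd;
  for (unsigned int i = 0; i < length; i++) cmd += (char)payload[i];
  if (cmd == "tare") scale.tare();                // zero the scale
  else if (cmd.startsWith("cal:"))                // e.g. "cal:420.5"
    scale.set_scale(cmd.substring(4).toFloat());  // set calibration factor
}

void setup() {
  scale.begin(D2, D3); // HX711 DOUT and SCK pins - depends on your wiring
  WiFi.begin(WIFI_SSID, WIFI_PASS);
  while (WiFi.status() != WL_CONNECTED) delay(250);
  mqtt.setServer(MQTT_BROKER, 1883);
  mqtt.setCallback(onCommand);
}

void loop() {
  if (!mqtt.connected()) {
    if (mqtt.connect("kitchen-scale")) mqtt.subscribe(TOPIC_CMD);
  }
  mqtt.loop();
  // Publish the averaged weight several times per second.
  char buf[16];
  dtostrf(scale.get_units(5), 0, 1, buf); // average of 5 readings, in grams
  mqtt.publish(TOPIC_RAW, buf);
  delay(250);
}
```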

Now the next step was to get the display of the measured weights sorted. The idea: write a web application that connects to the MQTT broker’s WebSocket and receives the stream of measurements. It would then add some additional logic, like a “tare” button in the web interface as well as a list of recent measurements that can be stored for later use.

the web app. I am not a web designer – help me if you can! ;-)

An additional automation: if the tare button is pressed and the weight is bigger than 10 g, the weight is automatically added to the measurements list in the web app – no matter which of the tare buttons was used, the one in the web app or the physical button on the actual scale. Very practical!

Here’s a short demo of the logic, the scale and the web app in a video:

You can grab the source code for the Arduino ESP8266 firmware as well as the source code for the web application here.

LED projector for your home automation needs

In 2017 Texas Instruments released a line of cheap industry-grade LED projectors meant to be used in production lines and the like:

DLP® LightCrafter Display 2000 is an easy-to-use, plug-and-play evaluation platform for a wide array of ultra-mobile and ultra-portable display applications in consumer, wearables, industrial, medical, and Internet of Things (IoT) markets. The evaluation module (EVM) features the DLP2000 chipset comprised of the DLP2000 .2 nHD DMD, DLPC2607 display controller and DLPA1000 PMIC/LED driver. This EVM comes equipped with a production ready optical engine and processor interface supporting 8/16/24-bit RGB parallel video interface in a small-form factor.

Texas Instruments

And of course this got picked up by the maker community – by people like MickMake, who designed an adapter PCB connecting the Raspberry Pi Zero W to the smallest projector available from TI.

After I learned about the existence of those small projectors I had to get a couple and try them for myself. There would be so many immediate and potential applications in our house.

2x DLPLDCR2000EVM with MickMake adapter and RaspberryPi Zero W

After they were delivered I did a first trial with just a breadboard and a Raspberry Pi 3.

first light!

The projector module has a native resolution of 640×360 – so not exactly high pixel density. And of course, if the image is projected larger, the screen-door effect is quite noticeable. Also it’s not the brightest of images, depending on the projection size. For the usual use cases, though, the brightness is definitely sufficient.

Downsides

  • brightness too low for large projection sizes – no daylight projection
  • low resolution is an issue for text and web content – it is less of an issue for moving pictures than you might think; video playback is well usable
  • flimsy optics with manual focus only – it works, but there is no automatic focus or the like

Upsides

  • very low powered – a 2.5 A / 5 V USB power supply is sufficient for the Pi Zero + projector at full brightness (30 lumen)
  • low brightness is not always bad – one of our specific use cases requires an image as dim as possible with fine-grained control of the brightness, which this projector has
  • the extremely small footprint/size allows integrating this device into places you would not have thought of
  • almost fully silent operation – the only moving part that makes a sound is the color wheel inside the DLP module; you have to put your ear right onto it to hear anything
  • passive cooling is sufficient – even at full brightness an added heat sink is enough to dissipate the heat generated by the LED

So what are these use cases that require such a projector, you ask?

Night status display:

For the last 20+ years I have been used to sleeping with a “night playlist” running. So far an LED TV at the lowest possible brightness was used for this. Still it was pretty bright. The projector module allows dimming down to almost “moon brightness” and also allows shifting the color balance towards the reds. This means: the perfect night projection is possible! And the power consumption is extremely low. A well-watchable red-shifted image at lowest brightness also means much lower temperatures on the projector module – it’s crazy how low-powered and cool it runs.

at 75% brightness (camera did not properly focus…)

Seasonal Window Projection:

Because the projector is small, low-powered and bright enough for back-lit projection, we tried – and succeeded with – a Halloween window projection scene last season.

outside view
inside view

It really looks funky from the outside – funky enough to have several people stop in front of the house and point fingers. All that while power consumption was really low.

House overall status projections:

When projecting information is this cheap and power-efficient, it really shines for displaying overall status information like house alarm status, general switch maps, locations of family members and so on. I’ll leave those to your imagination, as this kind of status display gives away a lot of personal information that isn’t well suited for the internet.

power consumption after the ssd swap

A week after swapping out the mechanical hard drives for SSDs, it’s time to look at what the swap means in the longer run for the power consumption of the server.

at least 15 watts less

Depending on what the server is asked to do – high or low CPU load and so on – the power consumption fluctuates, but it’s clearly visible that the averages are about 15 watts lower at all times. Great!

out with the old, in with the new – house gets ssd upgrade

A week ago I wrote about another mechanical hard drive that was about to bite the dust in our house’s elaborate set-up.

Not having time for a full day of focus, I postponed the upgrade to this Saturday – with the agreement of the family, as they suffer through the maintenance period as well.

The upgrade needed careful preparation in order to be doable in one sitting. And it was also meant to be some sort of disaster recovery drill: I would restore the house’s central Docker and service infrastructure from scratch along the way.

And this would need to happen:

  • all services, ZFS pools, Docker containers and configurations needed to be double-checked for a full backup – as this would be used to restore everything (ZFS snapshots are just the bomb for these things!)
  • the main central Docker server would have to go down
  • rip out all the hard disks
  • put in the SSDs and configure them properly
  • get a fresh Ubuntu 18.04 LTS set up and booting from ZFS on an NVMe SSD (BIOS update(s)!, disabling Secure Boot, enabling AHCI, switching to M.2 instead of SATA Express… you get the idea)
  • get the network set-up in order: upgrading from Ubuntu 16.04 to 18.04 means ifupdown networking was replaced by netplan. Hurray! Not.
  • get docker-ce and docker-compose set up and all the funky networking aligned – and figure out along the way that there are currently major issues with IPv6 in Docker
  • pull in the small number of still-needed mechanical hard disks and import their ZFS pools
  • start the Docker builds from the backup (one script \o/)
  • start the Docker containers in their required order (one script \o/)

Apart from some hardware/BIOS-related issues and the rather unexpected netplan introduction, everything went fairly well. It just takes ages to watch data being copied.

the “heartbeat” is a general term in our house for how busy everything is. It’s an artificial value calculated from sensor inputs per second, actions taken and so on – a good indication of whether there are issues. During the maintenance window (orange/red) it was not updated and was stuck at the last given value.

Bandwidth was the only real issue with this disaster recovery. All building blocks fell into place and no unplanned measures had to be taken. The house systems went partially down at around 12:30 and were back up about 10 hours later, at 22:00. Of course non-automated things like the internet kept working, and all light switches still functioned as manual push-buttons. So everything could still be done, just with a lot less convenience.

All in all there are more than 40 vital Docker-container-based services that start one after the other and interconnect to deliver the full house automation. With the added SSD performance this whole ship is much, much more responsive. And hopefully less prone to mechanical defects.

The backup and disaster preparations proved practical and worked well. Not a beat was missed (except for sensor measurements during the 10 hours of downtime) and no data was lost.

A Core i3 at 3.7 GHz with 32 GByte RAM is sufficient – and tuned for power consumption

What could be done better: it would be much more straightforward with fewer dependencies on external repositories / Docker Hub. Almost all issues that came up with containers stemmed from the fact that the maintainers had, just a day before, introduced something that kept the containers from spinning up cleanly. Bad luck. But that can be helped! There’s now a multi-page disaster recovery procedure document that will be used and updated in the future.

Oh, and what speeds am I seeing? The promised 3 GByte/s read and write speeds are real. It’s quite impressive to frequently see four-digit MByte/s values in iotop.

I almost forgot! During this exercise I was in the server room for less than 30 minutes. Instead I was at the warm and nice work-desk set-up I use in the house as much as I can – I will tell you about it in another article. Its major feature: it is (a) a standing desk and (b) has a treadmill under it. Yes. A treadmill.

You will get pictures of the set-up in that article, but since I spent more than 10 hours walking on Saturday while doing the disaster recovery, I want to give you a glimpse of what such a set-up means:

46 km while doing disaster recovery successfully.

indoor location tracking with ESP32

This project uses the same approach I took for my ESP32-based indoor location tracking system (tracking BLE signal strength). But this project comes with an actual user interface – NICE!

“Indoor positioning of a moving iBeacon, using trilateration and three ESP32 development modules. ESP32 modules report all beacons they see, to MQTT topic. Dashboard subscribes to this topic, and shows the location of beacons which are seen by all three stations.”
(https://github.com/jarkko-hautakorpi/iBeacon-indoor-positioning-demo)
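
The core trick in such a set-up is converting each station’s RSSI reading into a rough distance estimate and then intersecting the three resulting circles (the trilateration part). A sketch of the commonly used log-distance path-loss conversion – the calibration values here are typical assumptions, not values from the project above:

```cpp
#include <cmath>

// Estimate the distance (in meters) to a BLE beacon from its RSSI using
// the log-distance path-loss model. txPowerAt1m is the beacon's calibrated
// RSSI at one meter; n is the environmental attenuation factor
// (about 2 in free space, typically 2.5 to 4 indoors).
double rssiToDistance(int rssi, int txPowerAt1m = -59, double n = 2.5) {
    return std::pow(10.0, (txPowerAt1m - rssi) / (10.0 * n));
}
```

With one such distance per ESP32 station and the stations’ known positions, the beacon’s position is approximately where the three circles intersect – which is what the linked dashboard visualizes.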

Apple AirPlay for SONOS (in Docker)

We’ve got a couple of SONOS-based multi-room audio zones in our house, and with the newest generation of SONOS speakers you can get Apple AirPlay. Fancy!

But the older generations do not support Apple AirPlay due to their hardware limitations. Too bad.

So once again Docker, open source and reverse engineering come to the rescue.

AirConnect is a small but fancy tool that bridges SONOS and Chromecast to AirPlay effortlessly. Just start it and be done.

It works a treat, and all of a sudden all those SONOS zones become AirPlay devices.

There is also a nice dockerized version that I am using.

waking up to another dying hard disk – upgrade time!

At our house I am running a medium-sized operation when it comes to all the storage and in-house / home-automation needs of the family.

This is done by utilizing several products from QNAP and Synology plus a custom-built server infrastructure that does most of the heavy lifting using Docker.

This morning I woke up to an email stating that one of the mirrored drives in the machine was reporting read errors.

Since this drive is part of a larger array of spinning-rust hard disks, just replacing it would work. But given the lifetime of those drives, I am not particularly keen on more replacements in the very near future – so a more general approach seems right.

63083 lifetime hours = 2628 days = 7.2 years powered up

You can see what I mean. This drive is old. Very old. And so are its mates. In fact this is the newest of the 6 or so 1.5 TB and 1 TB drives in this array.

This redundant array is in fact still quite small and not fully used, as most storage-intensive, non-service-related disk space demands have moved to iSCSI and other means. So there is no longer a need for this many disks with this much redundancy and this little capacity. Actual current utilization is about 20% of the available 2 TB volume.

Time for an upgrade! Taking a look into the manual of the mainboard I had replaced two years ago, I found that it has dual NVMe M.2 ports – from which I can boot, according to that same manual.

So I thought: let’s start by replacing the boot drives and the /var/lib/docker portion with something fast.

To my surprise, Samsung builds 1 TB NVMe M.2 SSDs at a price I had expected to be much higher.

Nice! So let me report back when this has shipped and I can start the re-set-up of the operating system and Docker environment – which, in all fairness, should be straightforward. I will upgrade from Ubuntu 16.04 LTS to 18.04 LTS in the same step, and the only more complex things I expect are the boot-from-ZFS (on Linux) and iSCSI set-up of the machine.

If you have any tips or best practices, let me know.

I have just started catching up on what has happened to ZFS on Linux in the last two years. My decision two years ago to use Linux as the main driver OS and Ubuntu as the distribution was based upon the expectation of not having this as my hobby for years to come. And that expectation was fulfilled by Ubuntu 16.04 LTS.

small and cheap multi-sensor nodes for home automation

I have previously reported on my efforts to develop an indoor location tracking system. Back in 2017, when I started to work on this, I only planned to utilize inexpensive Espressif ESP32 SoCs to look for Bluetooth beacons.

In the meantime I figured that I could, and should, also utilize the multiple digital and analog input/output pins this specific SoC offers. And what better to utilize them with than a range of sensors that could now feed their measurements into an MQTT feed along with the Bluetooth details.

And there is a whole lot of sensors that I’ve added. On a breadboard it looks like this:

So what do we have here:

  • Motion sensor
  • Temperature sensor
  • Humidity sensor
  • Light sensor
  • Barometric pressure sensor
  • and of course an RGB LED to show a status

The software side I have already done, and after 3 weeks of extensive testing it seems to be stable. I will release it eventually later in the process.
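
Until that release, here is a minimal sketch of what one of these nodes does. The concrete sensor breakouts, pins, topic name and JSON field names are assumptions based on the parts list above, not the actual code:

```cpp
#include <WiFi.h>
#include <PubSubClient.h>
#include <DHT.h>              // temperature + humidity sensor
#include <Adafruit_BMP280.h>  // barometric pressure sensor

const int PIN_PIR = 27;   // motion sensor (hypothetical pin)
const int PIN_LDR = 34;   // light sensor on an ADC pin (hypothetical)
DHT dht(26, DHT22);       // hypothetical pin and sensor type
Adafruit_BMP280 bmp;      // connected via I2C

WiFiClient net;
PubSubClient mqtt(net);

void setup() {
  pinMode(PIN_PIR, INPUT);
  dht.begin();
  bmp.begin();
  WiFi.begin("your-ssid", "your-password"); // placeholders
  while (WiFi.status() != WL_CONNECTED) delay(250);
  mqtt.setServer("mqtt.local", 1883);       // placeholder broker
}

void loop() {
  if (!mqtt.connected()) mqtt.connect("sensor-node-1");
  mqtt.loop();

  // Publish one JSON message with all measurements to the node's feed.
  char msg[160];
  snprintf(msg, sizeof(msg),
           "{\"motion\":%d,\"temp\":%.1f,\"hum\":%.1f,"
           "\"light\":%d,\"pressure\":%.1f}",
           digitalRead(PIN_PIR), dht.readTemperature(),
           dht.readHumidity(), analogRead(PIN_LDR),
           bmp.readPressure() / 100.0f); // Pa -> hPa
  mqtt.publish("house/sensornode1/state", msg);
  delay(5000);
}
```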

I’ve also found plastic cases that will fit this amount of sensors, beyond the cases I had already bought for the ESP32 alone. For now I’ll close this article with some pictures.

The MQTT feed one of these nodes produces…

…and the Grafana dashboard I am using for this specific prototype device.