Archive for category home automation

when you’re working late: grant your eyes a rest

“Ever notice how people texting at night have that eerie blue glow?

Or wake up ready to write down the Next Great Idea, and get blinded by your computer screen?

During the day, computer screens look good—they’re designed to look like the sun. But, at 9PM, 10PM, or 3AM, you probably shouldn’t be looking at the sun.

f.lux fixes this: it makes the color of your computer’s display adapt to the time of day, warm at night and like sunlight during the day.

It’s even possible that you’re staying up too late because of your computer. You could use f.lux because it makes you sleep better, or you could just use it because it makes your computer look better.”




a new Music Service for SONOS: xenim streaming network

I am a frequent podcast live-stream listener, and as such I have come to enjoy the awesome service called xenim streaming network.


Any podcast producer can join the xsn and can then live-stream his own podcast while recording. Its CDN is based on voluntarily provided resources and is pretty rock-solid as far as my experience with it goes.

Since I am a frequent user of this – and I’ve got that gorgeous SONOS hardware scattered around my house – I thought I needed to have that service integrated into my SONOS setup.

The SONOS system knows the concept of “Music Services”. There are quite a lot of them, but xsn is missing. But SONOS is awesome and they have an API!

Unfortunately the API documentation is hidden behind an NDA wall, so that would be a no-go. What’s not hidden is what the SONOS controllers have to discuss with all the existing services. Most of the time these do not use HTTPS, so we’re free to listen to the chatter. I did just that and was able, for the sake of interoperability, to reverse engineer the SONOS SMAPI as far as necessary to make my little xsn Music Service work.
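If you want to eavesdrop on that chatter yourself, a tcpdump on a machine that can see the traffic is enough. A minimal sketch – the interface name and the assumption that the service traffic is plain HTTP on port 80 are mine, adjust to your network:

sudo tcpdump -i eth0 -A -s 0 'tcp port 80'

Browse the existing music services in a SONOS controller while this runs and the SMAPI SOAP chatter shows up in clear text.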

As usual you can get the source code, distributed freely through GitHub. If you’re not into that sort of compiling and programming, you are invited to use the service I provide free of charge. To set it up on your home SONOS just follow these simple steps:

Step 1: Start your SONOS Controller Application and find out the IP address of your SONOS.

Click on “About My Sonos System” and check the IP address written next to the “Associated ZP”.

[Screenshot]

Step 2: Add the xsn Music Service.

Open a browser window and browse to: http://<your-associated-zp-ip>:1400/customsd.htm

When you’re there – fill out the fields as below. The SID is either 255, or if you used that previously, something between 240-253. The service name is “xenim streaming network”. The Endpoint URL and Secure Endpoint URL both are

Set the Polling interval to 30 seconds. Click on the Anonymous Authentication SOAP header policy and you’re good to go. Click on “send” to finish.

[Screenshot]

Step 3: Add the new Music Service to your SONOS Controller.

Click on “Add Music Services” and click through until you see “xenim streaming network”. Add the service and you’re set!

p.s.: It’s normal that the service icon is a question mark.

Step 4: Enjoy Live Podcasts!

Source 1:
Source 2:


using the RaspberryPi to make all SONOS speakers support Apple Airplay

Airplay allows you to conveniently play music and videos over the air from your iOS or Mac OS X devices on remote speakers.

Since we just recently “migrated” almost all audio equipment in the house to SONOS multi-room audio, we were missing the convenience of just pushing a button on the iPad or iPhone to stream audio from those devices inside the household.

To retrofit the Airplay functionality there are two options I know of:

1: Get Airplay compatible hardware and connect it to a SONOS Input.

You have to get Airplay hardware (like the Airport Express/Extreme, …) and attach it physically to one of the inputs of your SONOS set-up. Typically you will need a SONOS Play:5, which has an analog input jack.


2: Set-Up a RaspberryPi with NodeJS + AirSonos as a software-only solution

You will need a stock RaspberryPi online in your home network. Of course this can run on virtually any other device or hardware that can run NodeJS. For the Pi, setting it up is a fairly straightforward process:

You start with a vanilla Raspbian Image. Update everything with:

sudo apt-get update

sudo apt-get upgrade

Then install NodeJS according to this short tutorial. To set up the AirSonos software you will need to install additional avahi packages. In particular, this was needed for my install:

sudo apt-get install git-all libavahi-compat-libdnssd-dev

You then need to get the AirSonos software:

sudo npm install airsonos -g

After some minutes of wait time and hard work by the Pi you will be able to start AirSonos.

sudo airsonos

And it’ll come up with an enumeration of all active rooms.

[Screenshot]

And on all your devices it’ll show up like this:



[Screenshot]
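If you want AirSonos to survive a reboot of the Pi, the simplest sketch is a cron @reboot entry for root (sudo crontab -e) – the binary path is an assumption, check yours with “which airsonos”:

@reboot /usr/local/bin/airsonos >> /var/log/airsonos.log 2>&1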




weave your net of things that have internet…ehm – internet of things


“The internet of things” is a buzzword used more and more. It means that things around you are connected to the (inter)network and therefore can talk to each other and, when combined, offer fantastic new opportunities.

Yeah right.

So NodeRed is a NodeJS based toolset that allows you to create so-called “flows”. Those flows determine what reacts and what happens when things happen. Fantastic, told you!
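If you want to give it a spin, NodeRed installs as a normal NodeJS package – a minimal sketch, assuming NodeJS and npm are already set up (for example on the RaspberryPi from the AirPlay post):

sudo npm install -g node-red
node-red

The flow editor then runs in your browser (by default on port 1880), and that’s where you click your flows together.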

Source 1:
Source 2:
Source 3:


Automated Picture Tank and Gallery for a photographer

Since my wife started working as a photographer on a daily basis, the daily routine of getting all the pictures off the camera after a long day filled with photo shoots quickly got boring for her.

Since we had some RaspberryPis to spare, I gave it a try and created a small script which, when the Pi gets powered on, automatically copies all contents of the attached SD card to the house’s storage server. Easy as Pi(e) – so to speak.
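The real script is a bit more elaborate, but the core idea is just a copy job at boot time – a rough sketch, where the mount point, the target share and the mail command are assumptions for illustration:

#!/bin/bash
# copy everything from the card reader to the storage server, then mail a report
SRC=/media/sdcard                 # where the SD card reader gets mounted (assumption)
DST=/mnt/storage/photo-tank       # share on the house storage server (assumption)
rsync -av "$SRC"/ "$DST"/ > /tmp/import.log 2>&1
mail -s "photo import done" user@example.com < /tmp/import.log   # assumes a configured mail command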


This has now been an automated process for a couple of weeks – she comes home, puts all the batteries on their chargers, drops the SD cards into the reader and powers on the Pi. After it has copied everything successfully the Pi sends an email with a summary report of what has been done. So far so good – everything is on our backed-up storage server then.

Now the problem was that she often does not immediately start working on the pictures. But she wants to take a closer look without the need to sit in front of a big monitor – like taking a look at her iPad in the kitchen while drinking coffee.

So what we needed was a tool that does this:

  • take a folder (the automated import folder), get all images in there and order them by day
  • display an overview per day of all pictures taken
  • allow viewing the full-sized picture if necessary
  • work on any mobile or stationary device in the household – preferably an HTML5 responsive-design gallery
  • it should be fast, because commonly over 200 pictures are taken per day
  • it should be open source because – well, open source is great – and we would probably need to tweak things a bit

Since I did not find anything near what we had in mind, I sat down this afternoon and wrote a tool myself. It’s open-sourced and available for you to play with. Here’s a short description of what it does:

It’s called GalleryServer and basically is an embedded HTTP server which takes all .jpg files from a (configurable) folder and offers you some handy tool URLs which respond with JSON data for you to work with. I’ve written a very small HTML user interface with a bit of JavaScript (using the great HTML5 KickStart) that allows you to see all available days and get a nice thumbnail overview of each day – when you click on one it opens the full-size image in a new window.

It’s pretty fast because it’s not actively resizing the images – instead it takes the thumbnail picture from the original .jpg file, which the camera placed there while storing the picture. It’s got some caching and can be run on any operating system where Mono / .NET is available – which is probably anything – even the RaspberryPi.
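That embedded preview is simply the EXIF thumbnail most cameras write into their JPEGs. GalleryServer reads it in C#, but you can inspect the very same data on the command line, for example with exiftool:

exiftool -b -ThumbnailImage IMG_0001.JPG > thumb.jpg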

Source 1: my wife’s page
Source 2: 99lime html5 kickstart boilerplate
Source 3:


personal annual reports

The report for 2012 is in! Since 2008 Jehiah Czebotar has been monitoring his daily life, and he compiles a report from that data for everyone to read. He himself says that this is a hat tip to Nicholas Felton, who releases beautiful yearly reports of statistics around his own life.

I am a fan of those nice graphics and statistics about life. They really give you insights that you wouldn’t be able to get otherwise. Especially with my own home automation and self-monitoring ambitions, quite a load of new ideas comes in from these nice graphics.

Source 1:
Source 2:


new actors to switch power on/off and measure power usage by AVM

Usually the actors that allow you to switch power on/off and measure power usage use the 434 MHz or 868 MHz wireless bands to communicate with their base station. Now the German manufacturer AVM came up with a solution that allows you to switch on/off (with an actual button on the device itself, and wirelessly!) and to measure the power consumption of the devices connected to it.

As unspectacular as it looks, the features are quite spectacular:


  • switch up to 2300 watts / 10 amperes
  • use different predefined settings to switch on/off, or even use Google Calendar to tell it when to switch
  • measure the energy consumption of connected devices
  • it uses the European DECT standard to communicate with a Fritz!Box base station (which is a requirement)

For around 50 Euro it’s quite an investment, but maybe I’ll give it a shot – especially the measurement functionality sounds great. Since I do not have one yet I don’t know anything about how to access it through third-party software (h.a.c.s.?).

Source 1:
Source 2:
Source 3:


my home is my castle – CastleOS: the home automation operating system

And once again some smart people put their heads together and came up with something that will revolutionize your world. Well, it’s ‘just’ home automation, but it does look very, very promising – especially the human-machine interface through speech recognition. First of all let’s start with a short introductory video:

“CastleOS is an integrated software suite for controlling the automation equipment in your home – an operating system for your castle, if you will. The first piece of the suite is what we call the “Core Service” – it acts as the central controller for the whole system. This runs on any relatively recent Windows computer (or more specifically, the computer that has an Insteon PLM or USB stick plugged in to it), and creates a network connection to both your home automation devices, and the second piece of the integrated suite – the remote access apps like the HTML5 app, Kinect voice control app, and future Android/iOS apps.” (from the CastleOS page)

So it’s said to be an all-in-one system that controls power outlets and devices through its core service and offers the option to add Kinect-based speech recognition to say things like “Computer, Lights!”.

Unfortunately it comes with quite high and hard requirements when it comes to the hardware it’s compatible with. A Kinect possibly exists in your household, but I doubt that you have the Insteon hardware to control your devices with.

That seems to be the main problem of all current home automation solutions – you just have to have the according hardware to use them. It’s not really possible to use anything and everything in a standardized way. Maybe it’s time to have a “home plug’n’play” specification set up for all hard- and software vendors to follow?

Source 1:


h.a.c.s. html5 user interface re-implemented

Slow is the right word to describe my HTML and JavaScript learning-by-doing progress right now. I have chosen the h.a.c.s. user interface as a suitable project to learn HTML and JavaScript up to a point where I can start to write usable websites with it. The h.a.c.s. UI seemed to be a good choice because at the moment it’s only used by my family, and they are a bunch of battle-proven beta testers.

So first a small video to get an idea what I am implementing right now:

So all you can see is SVG- and HTML-rendered stuff – made with the help of awesome JavaScript libraries, such as:

  • jQuery
    • for the basic javascript coverage
  • Raphaël
    • to draw SVG in a human-controllable way
  • JustGage
    • to draw those nice gauges
  • OdoMeter
    • an animated HTML5 canvas odometer

I plan to add a lot more – like support for swiping gestures. So this will be – just like h.a.c.s. – a continuous project. Since I switched to OS X entirely at home, I use the great Coda2 to write and debug the code. It helps a lot to have a two-browser set-up, because for some reason I still don’t feel that comfortable with the WebKit Web Inspector.


Another great feature of Coda2 is the AirPreview – which means it will preview your current page in the editor on an iOS device running DietCoda – oh how I love those automations.

So I reached the first goal I set for myself for the user interface: it does the things the old UI did, and in addition it’s maintainable. I am still struggling with JavaScript here and there – mainly because the debugging and tracing is oh-so-difficult (or I am too slow at understanding it).

If you got any recommendation for a javascript editor that can handle multiple includes and debugging (step-by-step, …) and good tracing for events please comment!

Source 1: jQuery
Source 2: Raphaël
Source 3: JustGage
Source 4: OdoMeter


putting h.a.c.s. (or other) sensory data into a motion based webcam image

I am using some Raspberry Pis to monitor the areas around the house – mainly because it’s awesome to see how many animals are roaming around in your garden throughout the day. On the Pi I am using the current Debian image and motion to interface with a USB webcam.

Now I wanted to include sensor data in the webcam images – like the current temperature. The nice thing about h.a.c.s. is that it can deliver every sensor’s data as nice, easy-to-use JSON. The only challenge now is to get the number into motion.

First of all I need to get the URL together where I can access sensor data for the right sensor. In this case it’s the sensor called “Schuppen” – an outdoor sensor measuring the current temperature around the house.

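With my h.a.c.s. server answering under the hostname hacs, the request looks like this (it’s the same URL the little script at the end of this post uses):

curl -s 'http://hacs/data/sensor?name=Schuppen&type=temperature&lastentry=true'

The response is a small JSON array – the current temperature sits at [0][1], which is exactly what the jsawk expression further down picks out.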

Now there is an easy way to ‘feed’ data into a running motion instance: motion offers a control port and allows you to set the text_left and text_right properties. Doing a simple GET request there allows us to set the text to – in this example – “remote-controlled-text”.

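On the shell that is just a curl call against the control port – the same pattern the script below uses:

curl -s 'http://localhost:8080/0/config/set?text_left=remote-controlled-text'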

So – that’s how the text is set – now how to get the temperature value, and just that, out of the JSON response of h.a.c.s.? Easy – use jsawk!


With all that a very small shell script is quickly hacked:


If you want to copy that into your editor, here’s the code:

# fetch the latest temperature of the “Schuppen” sensor from h.a.c.s. and extract the value
TEMPERATURE=`curl -s 'http://hacs/data/sensor?name=Schuppen&type=temperature&lastentry=true' | jsawk 'return this[0][1]'`
# push it into motion's left text overlay via the control port
curl -s 'http://localhost:8080/0/config/set?text_left='$TEMPERATURE

Localhost port 8080 is the address and port of the motion control server.

To have the webcam updated regularly, I added it to crontab and from now on the current temperature is in every webcam image – hurray!
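The crontab entry is nothing fancy – a sketch that runs the script above once a minute (the script path is an assumption):

* * * * * /home/pi/hacs-webcam-text.sh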

Source 1: motion
Source 2: jsawk


being there, without being there: Good Night Lamp

Isn’t technology great when it brings families closer together, even when they are a thousand miles apart?

Home automation does not only mean flipping switches and adding sensors in every imaginable way. It also means creativity. And being creative with the functionalities at hand is really what makes home automation so interesting.

It’s those creative ways that add usefulness to the nerdy home automation switches and sensors. It’s what adds practicality.

Good Night Lamp is such a creative solution that makes use of home automation hardware and the internet. To understand the concept, watch a video:

“The Good Night Lamp is a family of connected lamps that lets you communicate the act of coming back home to your loved ones, remotely.”

Well, I don’t know if it really needs specialized hardware like those Good Night Lamp products. But certainly, if you have some sensors and the ability to flip switches, it is fairly easy to come up with workflows and things that should happen when the circumstances are right. In fact I do not believe in highly specialized products like a single-purpose lamp. But I do believe that, if those lamps are connected to a network and if you can access them through some sort of API, those types of products will pave the way to a connected world we so far only know from science fiction.

Another good solution to this is the long-promised IP-capable light bulb. Engineers have been using the “light bulb with an IP address” as an example for IPv6 for years now. And it seems that the time has come when we really want to assign an IP address to every light bulb in our home.

LIFX is a good starting concept, and in a couple of months there will be more manufacturers offering networked light bulb solutions.


Source 1:
Source 2:


ELV MAX! Cube C# Library – control your cube!

I was asked if it would be possible to get the ELV MAX! Cube interfacing functionality outside of h.a.c.s. – maybe as a library. Sure! That is possible. And to speed up things I give you the ELV MAX! Cube C# Library called: MAXSharp

It’s a plain and simple library without many dependencies – in fact there’s only some threading and the FastSerializer. Since I am using this library with h.a.c.s. as well, I did not remove the serializer implementation.

There’s a small demo program included which is called MAXSharpExample. The library itself contains the abstractions necessary to get information from the ELV MAX! Cube. It does not contain functionality to control the cube – if you want to add that, feel free: it’s all open-sourced and I would love to see pull requests!

The architecture is based upon polling – I know events would make for a cleaner design, but for various reasons I am using queues in h.a.c.s., and therefore MAXSharp does as well. The example application spins up the ELV MAX! interfacing / handling thread, and as soon as you’re connected you can access all house-related information and get diff-events from the cube.

Any comment is appreciated!

Source 1: State of Reverse Engineering
Source 2:


if this then that – simple recipes for home automation

Workflows are important – and with a lot of switching possibilities and even more sensors that measure things, it becomes important to be able to implement workflows behind all that hardware.

It’s nice to be able to switch a light on and off when you want to. But isn’t it even better to have some sort of workflow behind all sorts of triggers? Think of the possibilities!

If this then that is a service to help you define very simple workflows:

Want an example?

It knows a lot of ‘this’ and a lot of ‘that’. So give it a try or even better, add your own home automation software as ‘this’ and ‘that’ 🙂

Source 1:


extending the house storage

In times when mobile phone cameras produce pictures of 2 MByte each and decent DSLR cameras produce pictures of more than 20 MByte each – not to speak of the various sensors around the house – the question of how all of this is going to be stored is an interesting one.

Prices for mass storage have been dropping for years and the sizes of hard disks are getting bigger and bigger. 3 Tbyte drives are fairly cheap now. Cheap enough to consider serious redundancy even for home use.

Having that home automation hobby, and having very specific needs when it comes to home entertainment or even watching TV (we don’t watch live TV…), we have a relatively huge demand for storage space. We are already storing over 10 Tbyte of data, fully encrypted, redundant and backed up.

Our file server infrastructure grew with the needs over the years.

It started way back in 2003 when I set-up the first fileserver for my apartment back then. It was a fairly huge 19 inch case with 5 hard disks (100 Gbyte each). This machine was filled in 2005 and needed replacement.

We were in IDE land back then. Because the system hardware died on me due to a power surge, all the disks and a new mainboard were seated in a new case with room for a lot of disks.

One interesting detail might be that I consistently used Windows Server for that purpose.

The machine never was just a fileserver. It was an SMTP, IMAP, NNTP and media server all the time. That led to a growing demand for CPU and memory resources. It started with an 800 MHz AMD Athlon (which died quickly), and for the next years to come I used a 2.8 GHz Intel Pentium 4. Everything started with Windows Server 2003 – bought in the Microsoft Store when I was a Microsoft employee.

Diskspace demand kept growing, and in 2009 a new case, new mainboard + memory and new disks were due.

Since 2009 a Core 2 Quad Q9550 with 2.8 GHz and 16 GByte of memory has been the heart of our fileserver. Since we’re frequently live-transcoding video streams to feed iPads and iPhones around the house, that machine has plenty of grunt to meet the demand. We can have 2 iPhones and 2 iPads playing 720p content without stutters. Back in 2009 we also switched to a mixed IDE and SATA setup, as you can see in the picture:

Plenty of room when the new case arrived – it was getting crowded just 2 years later in 2011. Every seat was taken – which means 13 disks are in that case and 1 attached through USB.

That adds up to more than 16 Tbyte of raw storage. In 2011 we also upgraded to Windows Server 2008. We never lost a bit with that operating system, not under the heaviest load and even through serious hardware malfunctions. A lot of those 13 disks died throughout the years: almost one every 2 months was replaced – most of them through extended warranties – and of course we always have a spare ready to take their place. Only once did I have to rush to a store to get a replacement drive, when two disks failed shortly after each other. That’s why there’s that 2 Tbyte drive in the 1.5 Tbyte compound…

So it’s getting full again. Since that case isn’t really holding more disks, and replacing them is getting harder because of the tight fit, the idea was born not to add a bigger case but to just add a NAS/SAN which holds between 6 and 8 disks at once, comes with its own redundancy management and exports one big iSCSI volume.

That said, a network card was added to the fileserver and a QNAP TS-859 Pro+ 8-bay appliance was bought. This is a shiny black device which uses less power than an additional case with extra CPU and memory would have used, and after calculating through a number of combinations it’s even the cheapest solution for an 8-drive set-up.

After some intensive testing it seems that the iSCSI approach is the most robust one. Since I am just done with testing the appliance the next step is to buy drives. So stay tuned!

Source 1:


ELV MAX! Cube and the Solar-log 500 – state of the reverse engineering and h.a.c.s.

It’s been some weeks since I wrote a status update on the ELV MAX! cube protocol reverse engineering and its integration into my own home automation project, h.a.c.s.

So first of all I want to give a short overview over what has been achieved so far:

  • I wrote a C# library, highly influenced by a PHP implementation from the domotica forum, which allows you to continuously get status information from the ELV MAX! cube with the current (1.3.6) firmware. It is tested so far with a fairly big set-up for the ELV MAX! cube (see below)
  • I was able to integrate that library into my own home automation project h.a.c.s. – there the ELV MAX! cube is just another device, alongside an EzControl XS1 and a Solar-Log 500. The cube is monitored using my library, and diff-sets as well as status information are stored automatically with the h.a.c.s. built-in mechanisms. In fact you can access, for example, the window shutter contact information just like you would with any other door contact on the EzControl XS1.
  • You can use events coming from the ELV MAX! cube to create new events – how about switching off/on devices when opening/closing windows?
  • Every bit of information from all integrated sensor-monitoring and actor-handling devices comes together in h.a.c.s.

I started the reverse engineering with just one shutter contact and one thermostat. After all my tests were successful I went for the big package and ordered some more sensors. This is how the setup is currently configured:

ELV MAX! set-up

I’ve learned a lot of interesting things about the ELV MAX! cube hardware and software. One is that you need to be ready for surprises. The documentation of the cube tells you the following:

Did you spot the funny fact? 50 devices – we’re well below that limit. 10 rooms – holy big mansion batman! We’re well over that. How is that possible? Well take it as a fact – you can create more than 10 rooms. And that is very handy. I’ve created 13 rooms and there are probably more to come because those shutter contacts are quite cheap and can be used for various other home automation sensory games. The tool to set-up and pair those sensors just came up with a notice that said “Oh well, you want to create more than 10 rooms? If you’re sure that you want that we allow you to, but hey, don’t blame us!”. Cool move ELV! – As of now I haven’t found any downside of having more than 10 rooms.

All my efforts started with firmware version 1.3.5. This firmware seemed to have some severe memory leaks – just by retrieving the current configuration information every 10 seconds, the device would stop communicating after more than 48 hours. Only a reboot could revive it – sometimes amnesia set in, which led to a round trip through the house for me.

With some changes in the library (like keeping the connection open as long as possible) and the new firmware version 1.3.6, the cube was way more cooperative and hasn’t crashed for about a month now (with 10-second update intervals).

So what does my library do? It is designed to run in its own thread. When it’s started it opens a connection to the cube and retrieves the current status and configuration information. That information is stored in an object called “House”. This house consists of multiple rooms – and those rooms are filled with window shutter contacts and thermostats. All information related to those different instances is stored along with them. The integration into h.a.c.s. allows the library to generate sensor and actor events (like when a temperature changes or a window opens/closes) which are passed back to h.a.c.s. and handled in the big event loop there.

With all that ELV MAX! cube data I wanted to plug in a quite nice tool that I am using on the iPhone and the iPad. It’s called “Moni4home” and it allows you to control the EzControl XS1 directly. Because it only accesses the EzControl XS1, I used h.a.c.s. to “inject” additional sensor data into the standard EzControl XS1 data. So basically the data flow is like this: the iPad app accesses h.a.c.s., which acts as a proxy. h.a.c.s. retrieves the EzControl XS1 sensor and actor data and injects additional virtual sensors like those from the ELV MAX! cube. h.a.c.s. then sends that beefed-up data to the iPad app. Voilà!

After the successful integration of the ELV MAX! cube I’ve started to work on the next bit of networking home automation equipment in my house – a solar panel data logger called “Solar-Log 500”. This device monitors two solar power inverters and stores the sensory data.

Solar-Log 500 built-in statistics page

“Funny” story first: this device has the same problem as the ELV MAX! cube. When you start to poll it every 10 seconds (or less) it just stops operating after about 20 hours. Bear in mind: in the case of the Solar-Log I just HTTP-GET a page that looks like this in the browser:

And by doing so every 10 seconds the device stops working. I am using the current firmware – so one workaround for that issue is to schedule a reboot of the Solar-Log at a time when there is no sun and therefore nothing to log or monitor.

Besides that it’s a fairly easy process: get that information, log it. Done.

that’s what the console output of h.a.c.s. looks like with all sensors and devices active (Mozilla + Wilma are the two aquaria :-))

So there you have it – h.a.c.s. interfacing with three different devices and roughly 100 sensors and actors over 434 MHz/868 MHz, wireless and wired network. There’s still more to come!

A lot of people seem to dive into home automation these days. Apparently Andreas is also at the point of starting his own home automation project. Good to know that he also is using the EzControl XS1 and in the future maybe even the ELV MAX! cube. Party on Andreas!

Source 1: ELV MAX! cube progress
Source 2: Reverse Engineering the ELV MAX! cube protocol
Source 3: ELV MAX! cube – home automation for the heating
Source 4:
Source 5: h.a.c.s. sourcecode
Source 6:
Source 7:


generate C# classes from JSON data

It’s a common use case: you’ve got some JSON-formatted data and you want to interface with it using your favourite programming language, C#. You can write the appropriate classes yourself, or you could use the fabulous json2csharp helper page.

Source 1:
Source 2:
Source 3:


openHAB – home automation bus

It surely isn’t just me noticing it: this home automation / smart home thing gains more momentum every week. Now there’s a Java-based home automation bus initiative taking care of the software standardization side. Quite interesting. And besides all that, they had some fantastic ideas about how a user interface for those things should look. Like for example how you would interact with your house when planning when things power on and off: use Google Calendar! This is just plain genius!

“The open Home Automation Bus (openHAB) project aims at providing a universal integration platform for all things around home automation. It is a pure Java solution, fully based on OSGi. The Equinox OSGi runtime and Jetty as a web server build the core foundation of the runtime.

It is designed to be absolutely vendor-neutral as well as hardware/protocol-agnostic. openHAB brings together different bus systems, hardware devices and interface protocols by dedicated bindings. These bindings send and receive commands and status updates on the openHAB event bus. This concept allows designing user interfaces with a unique look&feel, but with the possibility to operate devices based on a big number of different technologies. Besides the user interfaces, it also brings the power of automation logics across different system boundaries.”

I especially like the idea of that calendar integration – sending scripts through an appointment is a great idea – having some sort of scripting language is another one. A little bit on the marketing side is the option to chat with your house through XMPP / Jabber… that might take the idea a little bit too far out – but who would want to blame them? Fantastic stuff!

Source 1:
Source 2:


home automation example: domotica

For several years now I have been interested in this home automation thing – I’ve even got a little bit of my own home automation going. But with websites like domotica you can get an idea of what is achievable and how it might look for the people actually using it every day.

Source 1:


reverse-engineering the ELV MAX! Cube protocol

I had a couple of hours to tinker with my ELV MAX! Cube and there is some progress with the protocol reverse engineering.

Of course the domotica forum is helping out with some information the guys over there have found, but in addition to their very helpful findings I want it integrated into h.a.c.s. – and along with that I want a way to spot possible protocol changes quickly and easily in the future.

So yesterday I partied on the ‘first contact’ – today I am a bit deeper into the protocol itself:

Here are some explanations to the picture:

When a TCP connection to the cube is opened you can immediately read from it – the cube is throwing information at you. There’s always a character at the beginning of each line which marks the type and beginning of the message.
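You can watch that initial burst of messages with nothing more than netcat – a quick sketch, assuming 192.168.1.50 is your cube’s address (62910 is the TCP port my cube listens on):

nc 192.168.1.50 62910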

There seem to be these types of messages in the first package of information:

  • H – Header maybe?
    •  it contains the serial number of your cube, the RF address, the firmware version and several other things like time information
  • M – Metadata?
    • this seems to be some kind of global metadata list, containing the rooms with their IDs (it’s the %) in the screenshot). Furthermore it contains the serial numbers and names of the devices in that room – at the moment there’s just a window-state-sensor in that first room called “Fensterkontakt 1”
  • C – Configuration?
    • since there are multiple C messages these seem to contain detailed configuration data specific to a device in the MAX! network. Each device seems to be addressed by an RF address and its serial number.
    • the first C message in the screenshot is associated to the cube itself
    • the second C message is associated to the window-state-sensor – you can clearly see in there the room id “%)” and the serial of the window-state-sensor.
  • L – live status?
    • this message seems to contain room status information. In our case there is only the room with id “%)”. When the window-state-sensor changes state the last byte changes value – interesting, eh?

On the coding side I’ve got several things set-up in my little debug tool. I’ve wrapped those message types into various classes to handle them more easily later on in h.a.c.s.. Furthermore I used a little decompiler-wisdom to extract some more information from the included ELV MAX! cube software.

Thanks to § 69e of the German UrhG (German copyright law) I am allowed to decompile the included software in order to achieve interoperability (and only that). That’s exactly what I would like to achieve: interoperability. And for the record: besides that I also filed a support request to ELV in which I asked them whether I could get access to a presumably existing documentation of that protocol.

While waiting on that documentation I am using JD-GUI as a decompiler user interface for java – since the software of the cube is written in java.

There are many interesting things in there, but it’s a slow process to get hold of everything necessary. Some very nice things are already showing up. Like when you want to know if there’s a cube (or more) in the network: you just need to send a multicast IP packet containing a characteristic signature, and all the cubes in your network will try to connect back to you with some basic information – nice, isn’t it? Or what about that AES encryption/decryption that seems to be built into the cube? Yes, that’s right! It seems to be possible to send commands to either encrypt or decrypt according to AES. Thoughtfully these commands are marked with ‘e’ and ‘d’. Or that if you send “l:” as a command with CR+LF at the end you get a device listing with all stats… and so on.
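That last one is easy to try from a shell – again assuming 192.168.1.50 is the cube’s address; the -q 2 keeps the connection open for a moment so the answer gets printed:

printf 'l:\r\n' | nc -q 2 192.168.1.50 62910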

Some open questions to EQ-3/ELV for the end of this article:

  • Why this strange protocol? Why all the work on both sides? Would an HTTP server implementation with a RESTful service really have been that much more difficult?
  • Base64 encoded data? The 90s called, they want their 8th bit back.
  • Why that complex local webserver approach when you could have done everything in a Java app anyway?

That’s it for today, I just pushed a feature to the Git repository which allows you to run whatever command you like on your cube with the debugging tool:

Enjoy! 🙂

Source 1:
Source 2:
Source 3:
Source 4:
Source 5:


ELV Max! Cube – home automation for the heating

For several years now I have been building my own home automation tools by putting together existing hardware and self-written software. As the central software core of my home automation system I use h.a.c.s. – the “home automation control server” – which I put up as open source software on GitHub.

Throughout the years I was able to embed a lot of daily tasks and measurements in one place which can be accessed through a simple web page. It currently looks like this:

You can find some articles on this blog about h.a.c.s. if you want to know more about it.

As of today I can control and measure the states of switches, windows, doors, temperature and humidity and power consumption. Scenarios like “when this door opens, switch on that light” are easy things to do with h.a.c.s.

Now “Winter’s coming!”. And therefore I want to take control of the heating of each and every room in the house. I want to set a goal for a temperature and I want the heating to fire up or cool down with that goal. And of course I want to monitor manual changes of each and every radiator in the house.

Then last week I stumbled upon a piece of kit called “ELV MAX! Cube”. It’s a white cube (as the name implies) which offers a USB port from which it is powered and an RJ-45 Ethernet port which connects the cube to the home network.

The cube itself does not draw much power, and it can easily be powered by the router’s USB port. It allows you to connect peripherals using 868 MHz RF. Those peripherals can be: window state sensors (closed/open) and thermostats to control the radiators (and a switch, but, well… hopefully not necessary).

It comes with its own user interface – a Java application that connects to the device and allows you to configure it. Quite nice – it runs on Windows and Mac. You can use a cloud service to control the device over the internet, but I have no intention of trying that out right now.

My plan is to extend h.a.c.s. to get information from the cube and handle it, and in the end even control the cube by setting temperatures and controlling the outcome of those changes.

As of now there are some efforts to decode the quite interesting protocol the cube is talking. You communicate with the cube over TCP (my cube listens on port 62910).

Currently I am building a small debug application which allows me to experiment with the output of the cube faster than plain telnet would. And with this I had first contact tonight:

As always all my efforts can be seen in the hacs repository.

Source 1:
Source 2:
Source 3:
Source 4:
