History Lessons: Trial of the Major War Criminals before the International Military Tribunal in Nuremberg

History was one of my favourite classes at school – I liked it so much that I even took one of my final A-level examinations in history. I like to know how things happened and what people took away from them.

Being a German, there's a lot of history in the last 100 years to guide that interest. As you can imagine, the darkest parts of those 100 years are the First and the Second World War. Thankfully my generation never had to suffer through such a terrifying time.

So for the equally interested reader of this article I have good news. In the age of the internet we get access to documents that were previously hard or expensive (or both) to obtain. Like the original documents of the so-called Nuremberg Trial – the Trial of the Major War Criminals before the International Military Tribunal in Nuremberg.

You can get them in English – over 16,000 pages packed into 42 PDF files – or in the official German translation on Zeno.org.

That will keep me reading for a while – but there's even more. As scanning projects progress, more and more original sources are becoming available to everyone.

Source 1: http://www.loc.gov/rr/frd/Military_Law/NT_major-war-criminals.html
Source 2: http://avalon.law.yale.edu/subject_menus/imt.asp
Source 3: http://nuremberg.law.harvard.edu/php/docs_swi.php?DI=1&text=overview
Source 4: http://www.zeno.org/Geschichte/M/Der%20N%FCrnberger%20Proze%DF

ELV MAX! Cube and the Solar-Log 500 – state of the reverse engineering and h.a.c.s.

It's been some weeks since I wrote a status update on the ELV MAX! cube protocol reverse engineering and its integration into my own home automation project called h.a.c.s.

So first of all I want to give a short overview over what has been achieved so far:

  • I wrote a C# library, heavily influenced by a PHP implementation from the domotica forum, which allows you to continuously get status information from the ELV MAX! cube with the current (1.3.6) firmware. So far it has been tested with a fairly big ELV MAX! cube set-up (see below).
  • I was able to integrate that library into my own home automation project called h.a.c.s. – there the ELV MAX! cube is just another device, alongside an EzControl XS1 and a Solar-Log 500. The cube is monitored using my library, and diff-sets as well as status information are stored automatically with the h.a.c.s. built-in mechanisms. In fact you can access, for example, the window shutter contact information just like any other door contact on the EzControl XS1.
  • You can use events coming from the ELV MAX! cube to create new events – how about switching devices off/on when windows are opened/closed?
  • Every bit of information from all integrated sensor-monitoring and actor-handling devices comes together in h.a.c.s.

I started the reverse engineering with just one shutter contact and one thermostat. After all my tests were successful I went for the big package and ordered some more sensors. This is how the setup is currently configured:

ELV MAX! set-up

I've learned a lot of interesting things about the ELV MAX! cube hardware and software. One is that you need to be ready for surprises. The documentation of the cube tells you the following:

Did you spot the funny fact? 50 devices – we're well below that limit. 10 rooms – holy big mansion, Batman! We're well over that. How is that possible? Well, take it as a fact – you can create more than 10 rooms. And that is very handy. I've created 13 rooms and there are probably more to come, because those shutter contacts are quite cheap and can be used for various other home automation sensory games. The tool to set up and pair those sensors just came up with a notice that said: "Oh well, you want to create more than 10 rooms? If you're sure that you want that we'll allow you to, but hey, don't blame us!" Cool move, ELV! As of now I haven't found any downside to having more than 10 rooms.

All my efforts started with firmware version 1.3.5. This firmware seemed to have some severe memory leaks – just by retrieving the current configuration information every 10 seconds, the device would stop communicating after more than 48 hours. Only a reboot could revive it – and sometimes amnesia set in, which meant a round trip through the house for me.

With some changes in the library (like keeping the connection open as long as possible) and the new firmware version 1.3.6, the cube has been far more cooperative and hasn't crashed for about a month now (with a 10-second update interval).

So what does my library do? It is designed to run in its own thread. When it is started it opens a connection to the cube and retrieves the current status and configuration information. That information is stored in an object called "House". The house consists of multiple rooms, and those rooms are filled with window shutter contacts and thermostats. All information related to those instances is stored along with them. The integration into h.a.c.s. allows the library to generate sensor and actor events (like when a temperature changes or a window opens/closes) which are passed back to h.a.c.s. and handled in the big event loop there.
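To give a rough idea of the shape of that object model, here is a minimal C# sketch – the type and property names are illustrative, not the actual ones from the library:

    using System.Collections.Generic;

    // Minimal sketch of the object model described above – names are illustrative.
    public enum DeviceMode { Auto, Manual /* ... */ }

    public class ShutterContact
    {
        public string SerialNumber { get; set; }
        public bool IsOpen { get; set; }
        public bool LowBattery { get; set; }
    }

    public class Thermostat
    {
        public string SerialNumber { get; set; }
        public double MeasuredTemperature { get; set; }
        public double TargetTemperature { get; set; }
        public DeviceMode Mode { get; set; }
        public bool LowBattery { get; set; }
    }

    public class Room
    {
        public int RoomId { get; set; }
        public string Name { get; set; }
        public List<ShutterContact> ShutterContacts { get; set; } = new List<ShutterContact>();
        public List<Thermostat> Thermostats { get; set; } = new List<Thermostat>();
    }

    public class House
    {
        public List<Room> Rooms { get; set; } = new List<Room>();
    }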

With all that ELV MAX! cube data available I wanted to plug it into a quite nice tool that I am using on the iPhone and iPad. It's called "Moni4home" and it allows you to control the EzControl XS1 directly. Because it only accesses the EzControl XS1, I used h.a.c.s. to "inject" additional sensor data into the standard EzControl XS1 data. So basically the data flow is like this: the iPad app accesses h.a.c.s., which acts as a proxy. h.a.c.s. retrieves the EzControl XS1 sensor and actor data and injects additional virtual sensors like those from the ELV MAX! cube. h.a.c.s. then sends that beefed-up data to the iPad app. Voilà!
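The "injection" itself is conceptually tiny. Here is a sketch of the idea, reusing the House types from the sketch above – the sensor shape and the method name are made up and not the real h.a.c.s. or XS1 API:

    using System.Collections.Generic;

    // Made-up sensor shape – just enough to illustrate the proxy idea.
    public class VirtualSensor
    {
        public string Name;
        public string Type;
        public double Value;
    }

    public static class Xs1ProxySketch
    {
        // Take the sensor list retrieved from the EzControl XS1 and append virtual
        // sensors derived from the ELV MAX! cube before handing it to the iPad app.
        public static List<VirtualSensor> Inject(List<VirtualSensor> xs1Sensors, House house)
        {
            var combined = new List<VirtualSensor>(xs1Sensors);
            foreach (var room in house.Rooms)
                foreach (var contact in room.ShutterContacts)
                    combined.Add(new VirtualSensor
                    {
                        Name = room.Name + " " + contact.SerialNumber,
                        Type = "door contact", // looks like a regular XS1 door contact to the app
                        Value = contact.IsOpen ? 1 : 0
                    });
            return combined;
        }
    }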

After the successful integration of the ELV MAX! cube I started to work on the next bit of networked home automation equipment in my house – a solar panel data logger called "Solar-Log 500". This device monitors two solar power inverters and stores the sensor data.

Solar-Log 500 built-in statistics page

"Funny" story first: this device has the same problem as the ELV MAX! cube. When you start to poll it every 10 seconds (or less) it just stops operating after about 20 hours. Bear in mind: in the case of the Solar-Log I just HTTP-GET a page that looks like this in the browser:

And by doing that every 10 seconds the device stops working. I am using the current firmware, so one workaround for the issue is to schedule a reboot of the Solar-Log at a time when there is no sun and therefore nothing to log or monitor.

Besides that it's a fairly easy process: get that information, log it. Done.
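In case you want to reproduce that, here is a minimal polling sketch – the status page path is a placeholder, point it at whatever page your Solar-Log serves. And keep the interval conservative: as described above, the device eventually stops responding when hammered every 10 seconds.

    using System;
    using System.Net;
    using System.Threading;

    public static class SolarLogPoller
    {
        // HTTP-GET the status page in a fixed interval and hand the raw text to a logger.
        public static void Poll(string baseUrl, TimeSpan interval, Action<string> log)
        {
            using (var client = new WebClient())
            {
                while (true)
                {
                    try
                    {
                        // placeholder path – use the page shown above
                        log(client.DownloadString(baseUrl + "/statuspage.html"));
                    }
                    catch (WebException ex)
                    {
                        log("Solar-Log not reachable: " + ex.Message);
                    }
                    Thread.Sleep(interval);
                }
            }
        }
    }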

This is how the console output of h.a.c.s. looks with all sensors and devices active (Mozilla and Wilma are the two aquaria :-))

So there you have it – h.a.c.s. interfacing with three different devices and roughly 100 sensors and actors over 434 MHz/868 MHz RF, wireless and wired networks. There's still more to come!

A lot of people seem to be diving into home automation these days. Apparently Andreas is also at the point of starting his own home automation project. Good to know that he is also using the EzControl XS1 and in the future maybe even the ELV MAX! cube. Party on, Andreas!

Source 1: ELV MAX! cube progress
Source 2: Reverse Engineering the ELV MAX! cube protocol
Source 3: ELV MAX! cube – home automation for the heating
Source 4: http://www.solar-log.com/de/produkte-loesungen/solar-log-500/uebersicht.html
Source 5: h.a.c.s. sourcecode
Source 6: http://monitor4home.com/Beschreibung.html
Source 7: http://www.aheil.de/2012/11/06/hack-the-planet-architectural-draft/

second Tokyo Trip 2012 – Rakuten Technology Conference 2012

This October I had the pleasure to fly to Tokyo for the second time in 2012.

The development unit of Rakuten Japan was hosting the 7th Rakuten Technology Conference in Rakuten Tower 1 in Tokyo.

The schedule was packed, with up to 6 tracks in parallel and a lot of interesting topics ranging from research to grass-roots development.


Source 1: http://tech.rakuten.co.jp/rtc2012/
Source 2: Recorded Lectures

open source audio codecs getting better

Some weeks ago I heard about a new audio codec which is being developed as open source – very similar to Vorbis, the previous open-source approach to audio codecs.

This time it seems they've got some standardization into play, so it might be more successful than Vorbis was.

“Opus is a totally open, royalty-free, highly versatile audio codec. Opus is unmatched for interactive speech and music transmission over the Internet, but also intended for storage and streaming applications. It is standardized by the Internet Engineering Task Force (IETF) as RFC 6716 which incorporated technology from Skype’s SILK codec and Xiph.Org’s CELT codec.”

Source 1: http://www.opus-codec.org/
Source 2: http://auphonic.com/blog/2012/09/26/opus-revolutionary-open-audio-codec-podcasts-and-internet-audio/
Source 3: http://tools.ietf.org/html/rfc6716

Learn to code

Knowing how to deal with those personal computers is getting more important by the day. Not everybody needs to know how to write code – but since writing code and making those machines do what you want them to do isn't as hard as it used to be, it's worth a try!

For anyone on a mission to learn to code, this page is probably very interesting:

Source 1: http://www.codecademy.com/#!/exercises/0

generate C# classes from JSON data

It's a common use case: you've got some JSON-formatted data and you want to interface with it using your favourite programming language, C#. You can write the appropriate classes yourself, or you could use the fabulous json2csharp helper page.
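To illustrate: for a made-up JSON snippet like {"name":"Living Room","temperature":21.5,"windowOpen":false}, json2csharp spits out a class along these lines, which you can then feed to Json.NET (Source 3):

    using Newtonsoft.Json;

    // Generated-style class for the made-up JSON snippet above.
    public class RoomStatus
    {
        public string name { get; set; }
        public double temperature { get; set; }
        public bool windowOpen { get; set; }
    }

    public static class JsonExample
    {
        public static RoomStatus Parse(string json)
        {
            // Json.NET turns the JSON text into an instance of the generated class.
            return JsonConvert.DeserializeObject<RoomStatus>(json);
        }
    }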

Source 1: http://json2csharp.com/
Source 2: http://jsonclassgenerator.codeplex.com/
Source 3: http://json.codeplex.com/

a font for number people

OpenType is a font format which I personally may have underestimated in the past. Well, you know – fonts and stuff. It all seemed not too interesting, up until now. That changed dramatically when a font came to my attention which can be used for various purposes and, as a font, does not follow the usual numbers-and-characters scheme. But what can it be used for, if not to type numbers and characters?

Well. What about typing graphs?

Everything in the above image is generated by a font… like in your word processor (if it uses that font).

“Designed by Travis Kochel, FF Chartwell is a fantastic typeface for creating simple graphs. Driven by the frustration of creating graphs within design applications (primarily Adobe Creative Suite) and inspired by typefaces such as FF Beowolf and FF PicLig, Travis saw an opportunity to take advantage of OpenType technology to simplify the process.

Using OpenType features, simple strings of numbers are automatically transformed into charts. The visualized data remains editable, allowing for hassle-free updates and styling.”

Source 1: https://www.fontfont.com/how-to-use-ff-chartwell

baking with the PI

Do you know what happens inside your computer between pushing the power button and typing in your log-in information? No? You should – at least from the software side. Not that it is necessary in order to use a computer, but it helps to understand what this wonderful machine does and why.

For those teaching and learning purposes the Raspberry Pi is a perfect device. It’s cheap and now there is a course you can take online which shows you – starting from the very beginning – how to get the device up and running and how to make it do what you like. And that’s without installing an operating system. You are about to write your very own.

“This website is here to guide you through the process of developing very basic operating systems on the Raspberry Pi! This website is aimed at people aged 16 and upwards, although younger readers may still find some of it accessible, particularly with assistance. More lessons may be added to this course in time.”

Source: http://www.cl.cam.ac.uk/freshers/raspberrypi/tutorials/os/

Developers in Hamburg – the Developer Conference 2012 in Hamburg

Friday of last week started very, very early. I headed to Nuremberg to catch the flight to Hamburg. It's astonishing how cheap those flights are these days: the flight to Hamburg (50 minutes in the air) was only 10 Euros more expensive than the train ride back (4 hours on rails)…

In any case it was a nicely short flight and, just like that, I was standing in front of the Otto Versand headquarters in Hamburg… It was time for the…

… Developer Conference Hamburg 2012.

It was my first DevCon-HH, so I can't draw any comparisons to last year. The venue – right at Otto – was very well set up and everything was very comfortable, with short walks between coffee and lecture chair. Unfortunately two of the lecture halls could only be reached through the main hall. Once or twice this meant that talks in the smaller halls had already ended and the crowds streamed through the main hall towards the coffee while the audience in the main hall was still trying to listen. Here it is, explained with a picture: on the right the big main hall, on the left a smaller lecture hall. I was standing right in the door frame while taking the photo.

For me it started with two very good and interesting talks. The keynote of the first day is, as it should be, now available on Slideshare:

Overall the quality of the talks was very high. I found the mix of hard and soft topics around software development more than well done, and I will certainly try to come back next year.


 

Source 1: http://www.developer-conference-hh.de/

Congratulations Mr. Heil!

I am very pleased to congratulate a good man and ex-colleague on his accomplished mission: he handed in his PhD thesis with the subject (in German) "Anwendungsentwicklung für intelligente Umgebungen im Web Engineering" – "Application development for intelligent environments in web engineering":

“This book describes a holistic approach to develop complex software systems based on the WebComposition process model. It shows how to integrate soft- and hardware components in a cost efficient and effective way using  Web technologies and the Semantic Web. The WebComposition Concurrency System, a formal language to predict system dependencies and conflicts, allows efficient planing and monitoring of the development and operation process of the overall system.”

I had the pleasure to work with Andreas on several occasions. One that I remember with the strongest feelings is a 2 1/2 day around-the-clock hack-a-thon at Microsoft Research. We got it working back then!

For the last 8 years I have constantly been trying to get him interested in, and convinced to work on, things directly or remotely connected to some of the stuff I do – but up until now luck wasn't on my side. Maybe someday :-)

My sincere compliments to him on achieving this goal. Congratulations!

Source 1: http://www.aheil.de/books/
Source 2: http://blog.aheil.de
Source 3: http://www.schrankmonster.de/category/familyandfriends/aheil-de/

openHAB – home automation bus

Maybe it's just me, but this home automation / smart home thing seems to gain more momentum every week. Now there's a Java-based home automation bus initiative taking care of the software standardization side. Quite interesting. And besides all that, they had some fantastic ideas about what a user interface for those things should look like – for example, how you would interact with your house when planning when things power on and off. Use Google Calendar! This is just plain genius!

“The open Home Automation Bus (openHAB) project aims at providing a universal integration platform for all things around home automation. It is a pure Java solution, fully based on OSGi. The Equinox OSGi runtime and Jetty as a web server build the core foundation of the runtime.

It is designed to be absolutely vendor-neutral as well as hardware/protocol-agnostic. openHAB brings together different bus systems, hardware devices and interface protocols by dedicated bindings. These bindings send and receive commands and status updates on the openHAB event bus. This concept allows designing user interfaces with a unique look&feel, but with the possibility to operate devices based on a big number of different technologies. Besides the user interfaces, it also brings the power of automation logics across different system boundaries.”

I especially like the idea of that calendar integration – sending scripts through an appointment is a great idea, and having some sort of scripting language is another one. A little bit on the marketing side is the option to chat with your house through XMPP / Jabber… that might take the idea a little too far – but who would blame them? Fantastic stuff!

Source 1: http://www.openhab.org/
Source 2: http://kaikreuzer.blogspot.de/2012/08/openhab-1.html

ELV MAX! Cube progress

I've just pushed a commit to the repository which finalizes my current effort of getting data out of the ELV MAX! Cube. With this source code you should be able to get the following information out of your ELV MAX! Cube (a short usage sketch follows the list):

  • a list of all configured rooms
  • a list of all devices in those rooms
  • Thermostats and ShutterContacts come with all their flags (like battery status, open/closed, mode (auto, manual, …))
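From the calling side that boils down to something like the following sketch. The type and member names here are purely illustrative – the real API lives in the repository (Source 1):

    // Illustrative usage only – see the repository for the actual class and member names.
    var cube = new MaxCube("192.168.1.22"); // address is an example
    House house = cube.GetCurrentState();

    foreach (var room in house.Rooms)
    {
        Console.WriteLine(room.Name);
        foreach (var thermostat in room.Thermostats)
            Console.WriteLine("  thermostat {0}: mode {1}, low battery: {2}",
                thermostat.SerialNumber, thermostat.Mode, thermostat.LowBattery);
        foreach (var contact in room.ShutterContacts)
            Console.WriteLine("  shutter contact {0}: open = {1}",
                contact.SerialNumber, contact.IsOpen);
    }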

That brings me one step closer to the integration of the ELV MAX! Cube into h.a.c.s. – next weekend, probably :-)

P.S.: I've already ordered more thermostat and shutter contact sensors.

Source 1: https://github.com/bietiekay/hacs

home automation example: domotica

For several years now I have been interested in this home automation thing – I even got a little bit of my own home automation going. But with websites like domotica you can get an idea of what is achievable and how it might look for the people actually using it every day.

Source 1: http://www.hekkers.net/domotica/Default.aspx

reverse-engineering the ELV MAX! Cube protocol

I had a couple of hours to tinker with my ELV MAX! Cube and there is some progress with the protocol reverse engineering.

Of course there is the domotica forum helping out with information the guys over there have found, but in addition to their very helpful findings I want it integrated into h.a.c.s. – and along with that I want a way to spot future protocol changes quickly and easily.

So yesterday I partied on the ‘first contact’ – today I am a bit deeper into the protocol itself:

Here are some explanations of the screenshot:

When a TCP connection to the cube is opened you can immediately read from it – the cube throws information at you. There is always a character at the beginning of each line which marks the type and start of the message.

There seem to be these types of messages in the first batch of information (a small parsing sketch follows the list):

  • H – Header maybe?
    •  it contains the serial number of your cube, the RF address, the firmware version and several other things like time information
  • M – Metadata?
    • this seems to be some kind of global metadata list, containing the rooms with their IDs (that's the "%)" in the screenshot). Furthermore it contains the serial numbers and names of the devices in each room – at the moment there's just a window-state-sensor in that first room, called "Fensterkontakt 1"
  • C – Configuration?
    • since there are multiple C messages, these seem to contain detailed configuration data specific to a device in the MAX! network. Each device seems to be addressed by an RF address and its serial number.
    • the first C message in the screenshot is associated with the cube itself
    • the second C message is associated with the window-state-sensor – you can clearly see the room id "%)" and the serial of the window-state-sensor in there.
  • L – live status?
    • this message seems to contain room status information. In our case there is only the room with id “%)”. When the window-state-sensor changes state the last byte changes value – interesting, eh?
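A small C# sketch of how that line-based dispatch can look – the address is an example, the handling is reduced to printing, but the type characters are exactly the ones listed above:

    using System;
    using System.IO;
    using System.Net.Sockets;

    class CubeDumpSketch
    {
        static void Main()
        {
            // 62910 is the TCP port my cube listens on; the address is an example.
            using (var client = new TcpClient("192.168.1.22", 62910))
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    if (line.Length == 0) continue;
                    switch (line[0]) // the first character marks the message type
                    {
                        case 'H': Console.WriteLine("Header:        " + line); break;
                        case 'M': Console.WriteLine("Metadata:      " + line); break;
                        case 'C': Console.WriteLine("Configuration: " + line); break;
                        case 'L': Console.WriteLine("Live status:   " + line); break;
                        default:  Console.WriteLine("Unknown:       " + line); break;
                    }
                }
            }
        }
    }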

On the coding side I've got several things set up in my little debug tool. I've wrapped those message types into various classes to handle them more easily later on in h.a.c.s. Furthermore I used a little decompiler wisdom to extract some more information from the included ELV MAX! cube software.

Thanks to § 69e of the German UrhG (German copyright law) I am allowed to decompile the included software in order to achieve interoperability (and only that). That's exactly what I would like to achieve: interoperability. And for the record: besides that I also filed a support request with ELV asking whether I could get access to a presumably existing documentation of that protocol.

While waiting for that documentation I am using JD-GUI as a decompiler user interface for Java – since the software of the cube is written in Java.

There are many interesting things in there, but it's a slow process to get hold of everything necessary. Some very nice things are already showing up. For example, if you want to know whether there is a cube (or more) in the network you just send a multicast IP packet containing a characteristic signature, and all the cubes in your network will try to connect back to you with some basic information – nice, isn't it? Or what about the AES encryption/decryption that seems to be built into the cube? Yes, that's right! It seems to be possible to send commands to either encrypt or decrypt with AES. Conveniently, these commands are marked with 'e' and 'd'. Or the fact that if you send "l:" as a command with CR+LF at the end you get a device listing with all stats… and so on.
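Sending such a command is just a matter of writing the characters plus CR+LF to the open TCP connection and reading the reply. A minimal sketch (again, the address is an example and 62910 is the port my cube listens on):

    using System;
    using System.IO;
    using System.Net.Sockets;
    using System.Text;

    class SendCommandSketch
    {
        static void Main()
        {
            using (var client = new TcpClient("192.168.1.22", 62910))
            using (var stream = client.GetStream())
            using (var reader = new StreamReader(stream, Encoding.ASCII))
            {
                // Note: the cube sends its initial H/M/C/L burst right after connecting,
                // so a real client reads and parses that first.
                byte[] command = Encoding.ASCII.GetBytes("l:\r\n"); // "l:" terminated with CR+LF
                stream.Write(command, 0, command.Length);

                Console.WriteLine(reader.ReadLine()); // eventually: the "L:" reply with the device stats
            }
        }
    }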

Some open question to EQ-3/ELV for the end of this article:

  • Why this strange protocol? Why all the work on both sides? Would an HTTP server implementation with a RESTful service really have been that much more difficult?
  • Base64 encoded data? The 90s called, they want their 8th bit back.
  • Why that complex local webserver approach when you could have done everything in the Java app anyway?

That’s it for today, I just pushed a feature to the Git repository which allows you to run whatever command you like on your cube with the debugging tool:

Enjoy! :-)

Source 1: http://www.domoticaforum.eu/viewtopic.php?f=66&t=6654&sid=f8f912914163cb44d447cfa3de44d63d
Source 2: http://en.wikipedia.org/wiki/Decompiler
Source 3: http://www.gesetze-im-internet.de/urhg/__69e.html
Source 4: http://java.decompiler.free.fr/?q=jdgui
Source 5: http://en.wikipedia.org/wiki/Advanced_Encryption_Standard

a file is a file is a file.

Ever wondered how software finds out that a file named "filename" is a PDF, a JPEG or a movie? There are thousands, probably hundreds of thousands, of file formats out there. Some of them are used many times a day without us even noticing. We just move an image from A to B, not caring about what constitutes an image file and what makes a JPEG different from a PNG image.
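The usual answer is "magic bytes": most formats carry a characteristic signature in their first few bytes, and software sniffs those instead of trusting the file name. A tiny illustrative sketch, checking only a handful of well-known signatures:

    using System.IO;
    using System.Linq;

    public static class MagicSniffer
    {
        // Identify a file by its leading bytes instead of its name.
        // Real detectors (like the Unix 'file' tool) know far more signatures than these.
        public static string Identify(string path)
        {
            var head = new byte[8];
            using (var fs = File.OpenRead(path))
                fs.Read(head, 0, head.Length);

            if (head.Take(4).SequenceEqual(new byte[] { 0x25, 0x50, 0x44, 0x46 })) return "PDF";                       // "%PDF"
            if (head.Take(3).SequenceEqual(new byte[] { 0xFF, 0xD8, 0xFF }))       return "JPEG";
            if (head.Take(4).SequenceEqual(new byte[] { 0x89, 0x50, 0x4E, 0x47 })) return "PNG";
            if (head.Take(2).SequenceEqual(new byte[] { 0x50, 0x4B }))             return "ZIP (also JAR, DOCX, ...)"; // "PK"
            if (head.Take(2).SequenceEqual(new byte[] { 0x4D, 0x5A }))             return "Windows executable";        // "MZ"
            return "unknown";
        }
    }

Which is exactly why a file like the one below can be several things at once: not every format insists that its signature sits at offset 0.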

Now for pure academic reasons there is one file that is many (no, not borg). It’s a file that is:

“CorkaMIX.exe is simultaneously a valid: * Windows Portable Executable binary * Adobe Reader PDF document * Oracle Java JAR (a CLASS inside a ZIP)/Python script * HTML page

It serves no purpose, except proving that files format not starting at offset 0 are a bad idea. Many files (known as polyglot) already combines various langages in one file, however it’s most of the time at source level, not binary level.”

Source 1: http://code.google.com/p/corkami/downloads/detail?name=CorkaMIX.zip

ELV Max! Cube – home automation for the heating

For several years now I have been building my own home automation tools by putting together existing hardware and self-written software. As the central software core of my home automation system I use h.a.c.s. – "home automation control server" – which I have put up as open-source software on GitHub.

Throughout the years I have been able to embed a lot of daily tasks and measurements in one place, which can be accessed through a simple web page. It currently looks like this:

You can find some articles on this blog about h.a.c.s. if you want to know more about it.

As of today I can control and measure the states of switches, windows and doors as well as temperature, humidity and power consumption. Scenarios like "when this door opens, switch on that light" are easy things to do with h.a.c.s.

Now "Winter's coming!" – and therefore I want to take control of the heating of each and every room in the house. I want to set a target temperature and have the heating fire up or cool down to reach it. And of course I want to monitor manual changes to each and every radiator in the house.

Last week I stumbled upon a piece of kit called "ELV MAX! Cube". It's a white cube (as the name implies) which offers a USB port from which it is powered and an RJ-45 Ethernet port which connects the cube to the home network.

The cube itself does not draw much power and can easily be powered by the router's USB port. It allows you to connect peripherals using 868 MHz RF. Those peripherals can be: window state sensors (closed/open) and thermostats to control the radiators (and a switch, but, well… hopefully not necessary).

It comes with its own user interface – a Java application that connects to the device and allows you to configure it. Quite nice – it runs on Windows and Mac. You can use a cloud service to control the device over the internet, but I have no intention of trying that out right now.

My plan is to extend h.a.c.s. to get information from the cube and handle it, and in the end even control the cube by setting temperatures and monitoring the outcome of those changes.

As of now there are some efforts to decode the quite interesting protocol the cube speaks. You communicate with the cube over TCP (my cube listens on port 62910).

Currently I am building a small debug application which allows me to experiment with the output of the cube faster than plain telnet would. And with it I made first contact tonight:

As always all my efforts can be seen in the hacs repository.

Source 1: https://github.com/bietiekay/hacs
Source 2: http://www.schrankmonster.de/?s=hacs
Source 3: http://www.elv.de/max-cube-lan-gateway.html
Source 4: http://www.domoticaforum.eu/viewtopic.php?f=66&t=6654

das Keyboard

It occurred to me that I stopped working with a decent keyboard when I moved completely to Macs at home. I was using the keyboards the machines came with, and Apple does not always deliver the best possible keyboard for the money.

So I tried to go back to my trusty IBM PS/2 Model M last week, only to find out that the actively powered USB-to-PS/2 adapter I had has somehow got lost. A passive one just doesn't cut it – the keyboard does not work with it at all.

I remembered that in 2006 I wrote about a back-then-new keyboard that resembled the fantastic Model M. Voilà! They have even kept working on their keyboards since 2006 and improved them :-)

A little bit more than 6 years after writing first about the product I got me a “das Keyboard Ultimate S EU”.

First verdict: It is awesome!

It's expensive, that's true. But it just feels right typing on it. I can see myself writing a lot of stuff for longer periods on this keyboard :-)

Source 1: http://www.schrankmonster.de/2005/05/22/real-elite-keyboard
Source 2: http://www.schrankmonster.de/2005/08/17/teh-keyboard-for-teh-coders/

 

a javascript / html live-preview editor in your browser

This whole web development thing is getting somewhere. Take a look at this great implementation of an HTML / JavaScript editor with built-in live preview. It has syntax highlighting and, best of all, it runs directly in your browser. You don't have to install anything.

Some more information directly from the readme file:

"JS Bin is a webapp specifically designed to help JavaScript and CSS folk test snippets of code, within some context, and debug the code collaboratively.

JS Bin allows you to edit and test JavaScript and HTML (reloading the URL also maintains the state of your code – new tabs doesn’t). Once you’re happy you can save, and send the URL to a peer for review or help. They can then make further changes saving anew if required.

The original idea spawned from a conversation with another developer in trying to help him debug an Ajax issue. The original aim was to build it using Google’s app engine, but in the end, it was John Resig‘s Learning app that inspired me to build the whole solution in JavaScript with liberal dashes of jQuery and a tiny bit of LAMP for the saving process.

Version 1 of JS Bin took me the best part of 4 hours to develop, but version 2, this version, has been rewritten from the ground up and is completely open source.”

Source 1: http://jsbin.com/#source
Source 2: http://jsbin.tumblr.com/
Source 3: https://github.com/remy/jsbin

a delicious raspberry pi

Just a couple of days ago – after a waiting time of more than half a year – my personal Raspberry Pi board arrived. Fantastic!

It’s small. Oh yes, it’s very very small.

What is the Raspberry Pi you may ask:

“The Raspberry Pi is a credit-card sized computer that plugs into your TV and a keyboard. It’s a capable little PC which can be used for many of the things that your desktop PC does, like spreadsheets, word-processing and games. It also plays high-definition video. We want to see it being used by kids all over the world to learn programming.”

For under 40 Euros you get a wide choice of I/O interfaces like USB, Ethernet, HDMI, audio and general-purpose I/O pins you can play with if you're into hardware hacking. This small card runs a fully-blown Linux, and because it has a dedicated graphics core that can hardware-decode and -encode 1080p H.264 it's definitely a good choice for a home media center (yes, XBMC runs on it).

It draws so little power that you could run it from solar panels. It's all open and open-source, and I will use it for a couple of things in the household – like a cheap AirPlay node, or a more intelligent sensor node for home automation. This thing seriously rocks – finally a device to play with that has reasonable horsepower.

Source 1: http://www.raspberrypi.org
Source 2: http://www.raspbmc.com

Adventures in e-Commerce and technology

Oh dear. I just realized that I never really announced or talked about the fact that I changed my employer and moved to an (old) new place.

Yes, that's right, I am not with sones anymore. Since January 1st I have been the CTO of Rakuten Germany. When I signed the contract the company was called Tradoria – one of the first big projects I had the opportunity to work on was the so-called brand change.

A humongous Japan-based company called Rakuten bought Tradoria in the middle of 2011, and after half a year it was time to switch the brand.

As you can imagine, these have been busy weeks since January 1st. I had to digest a lot of existing technology and products. I met and got to know a lot of interesting people – first and foremost a great team of developers that went through almost all imaginable pains and parties to come up with a marketplace and shop system that is a perfect base for take-off.

A short word on the business model of Rakuten – if you're a merchant you've got to love it: think of Rakuten as a full-service provider for merchants and customers. As a Rakuten merchant you get all the frontend and backend bliss to present and manage your products and orders. Rakuten takes care of all the nasty bits and pieces like hosting, development, telephone orders, invoicing and payment. The only thing that you as a Rakuten merchant need to do is to put in great products, gather orders and send out packages. Since Rakuten isn't selling products on its own, it won't be competing with its merchants like other marketplace providers do these days.

On top of that, Rakuten cares for the merchant and the customer. Just a week after that successful brand change I attended (and spoke at) the Tradoria Live! 2012. That's basically the merchant get-together. This year over 500 people attended this one-day conference. Think of it as a hands-on conference with features, plans and summaries of the last year and the upcoming one – every merchant is invited to come and talk in person to the people who work hard every day to make the marketplace and shop system better.


Just 24 hours after standing on that stage I found myself here:

東京

Yep. That's Tokyo (東京). After a very long flight we had the chance to go on an all-embracing Tokyo tour before the meetings and talks started for our team. It was an awesome and exhausting week – just about 120 hours later I was back in Germany – I must have slept for two days :-)

Back in Germany I had a lot of stuff to learn and work through. We had already moved to a wonderful house near Bamberg – we were very lucky to find it. It's actually ridiculously huge for a couple and two cats, but we love it. Imagine the contrast: moving from an apartment next to a four-lane city street to the countryside, just a 15-minute drive away from work, with philosophical quietness all around.

Now, after about half a year, I am well into the process. I have met a lot of high-profile techies and things seem to be picking up speed in terms of teamwork in Germany and with all the other countries. It's a bliss to work for a group of companies that are going through a lot of transitions while transforming from start-ups into an enterprise.

Ready for a family picture? Ready. Steady. Go!

That’s all Rakuten – that’s all on one mission: Shopping is entertainment! Empower the merchants!

Besides all that I even started to learn Japanese. ただいま :-)

first conference for about a year: Berlin Buzzwords 2012

It's been a while since I attended a technology conference. But that's going to change. This week I attended the Berlin Buzzwords 2012 conference in Berlin.

Search.Store.Scale is the headline under which this awesome conference takes place, and after a very slow start there were a lot of great talks about current technologies regarding databases, data processing and storage – from great overviews to some very in-depth talks, like the one called "Searching Japanese with Lucene and Solr". Since I am currently in the process of getting to know the Japanese language better, this talk in particular offered interesting insights into how to handle Japanese. Very impressive, and a bit frightening how complicated language processing can be.


automation to the people: download YouTube videos automatically

You know the situation: you have just stumbled upon a great and informative YouTube channel. It's full of videos you would like to watch, but to do that you need internet access – and of course that internet access needs to be fast enough to cope with the video quality you would like to watch.

If only it would be possible to download a video from YouTube, store it locally and watch it whenever you got the time. Maybe you want to take that video with you on that great, internetless self-awareness trip…

Now there are a lot of tools that allow you to download YouTube clips manually. I used BYTubeD for that purpose. It is a nice and easy-to-use Firefox add-on which can be started whenever a YouTube video appears on a page.

After you've started BYTubeD you can select which of the videos on the page you would like to download and in what quality.

All this works very well if you only want to download something once every while. Problems come up if you want to download regular postings…

I've subscribed to several – to me – very interesting YouTube channels. These get updated almost every day. The only option for keeping up with them was to take the time, surf YouTube and use BYTubeD to download manually whenever there was anything new. That was a waste of time for me, so I automated it.

I wrote a small tool I call "YouTubeFeast" – because it allows you to feast on YouTube… yeah, I know. This tool is designed to run in the background on a Linux or Windows machine and scan for new videos at configurable intervals. If it finds new videos it downloads them, in the quality you configured, to a folder you configured. It couldn't be easier.

It’s open-source (GPLv2) and I’ve made it publicly available on GitHub. You can even find a pre-compiled binary version there which is ready-to-run.

The configuration file “YouTubeFeast.configuration” is a plain and simple text file. Use your favourite text editor and obey some simple rules:

  • any line beginning with # is a comment
  • any line not beginning with a # is a download-job
  • any download job consists of the following tab-separated parameters:
    • the URL of the video page / channel homepage / overview
    • the desired quality (360p, 720p, 1080p)
    • the path to store the videos
    • the interval (in hours) to check for new stuff
  • don't forget: a tab separates the parameters (take a look at the example configuration file in the repository, or at the hypothetical example below)
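To make that concrete, a hypothetical configuration could look like this – the channel URL and the path are placeholders, and the gaps between the parameters are tab characters:

    # check this channel every 24 hours and fetch new videos in 720p
    http://www.youtube.com/user/SomeChannel    720p    /home/user/videos/SomeChannel    24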

After configuring, the only thing you need to do is start YouTubeFeast. It will then go through all the jobs and download video files – as soon as it comes across an already downloaded file it stops that specific job.

That's all there is to it. If you have any comments or suggestions for improvement, please let me know.

Source 1: https://github.com/bietiekay/YouTubeFeast
Source 2: Download YouTubeFeast-March2013

downloading the whole Jamendo catalog

Yesterday @simcup wrote on Twitter that he is currently downloading the whole Jamendo catalog of Creative Commons music.

Although I already knew Jamendo, it had never occurred to me to download their whole catalog. Since I am a fan of choice, I immediately thought about how I could download the catalog too. Since the only clue on how to achieve that was a cryptic URI-like text, it suddenly sounded like a great idea to write a universal tool and release it as open source. This tool should allow users to download the whole catalog and keep their local Jamendo mirror in sync with the server, so whenever new artists, albums or tracks are added the user does not need to download everything again.

So the only thing I had as a starting point was that cryptic URI pointing me to something I had never heard of called Rhythmbox. It turns out that this is a GNOME music player application which has Jamendo integration. After some clueless poking around I decided to take a look at the source of Rhythmbox, especially the Jamendo module.

This module is written in Python and is quite clean to read. Just by looking at the first lines I came across the interesting fact that there is an almost daily updated XML dump of the Jamendo catalog available from Jamendo. Hurray! Since Jamendo wants developers to interact with the platform, they have put documentation online which allows anyone to write tools to stream and download tracks. After all the clues I found, I finally ended up on this page.

So there are the catalog download, track stream and torrent URIs necessary to download the catalog. Now the only thing that is needed is a tool which parses the XML and creates a nice folder structure for us.

(screenshot: the generated folder structure)

Parsing XML in C# (my preferred programming language) is easy. Basically you can use a tool called xsd.exe and let it first generate the XSD from the XML and then ready-to-use C# classes from that XSD.

(screenshot: generating the XSD and the C# classes with xsd.exe)

After doing all that, actually reading the whole catalog into a usable form breaks down to just three lines of code:

(screenshot: the three lines that parse the catalog)
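The gist of those three lines looks like this – JamendoCatalog stands in for whatever root class xsd.exe generated from the dump, and the file names are examples:

    using System.IO;
    using System.Xml.Serialization;

    // step 1 (command line): xsd.exe catalog.xml           -> generates catalog.xsd
    // step 2 (command line): xsd.exe catalog.xsd /classes  -> generates catalog.cs with the C# classes
    // step 3: deserialize the whole dump into those generated classes
    var serializer = new XmlSerializer(typeof(JamendoCatalog)); // generated root class (name is an example)
    using (var stream = File.OpenRead("catalog.xml"))
    {
        var catalog = (JamendoCatalog)serializer.Deserialize(stream);
    }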

Isn't it great how modern frameworks take away the complexity of such tasks? At this point I had already parsed the whole catalog into my tool and had only written three lines of code – the rest was generated automatically for me. Best of all, this also works on non-Windows operating systems when you use Mono.

When the XML data is parsed and available in a nice data structure it's easy to iterate through all artists, all albums and all tracks and then download the actual MP3 or OGG files. And that's basically what my tool does: it takes the XML, parses it, and downloads. Before downloading it checks whether the track already exists and only downloads those added since the last run.

Additionally, since I am deeply involved in the development of the GraphDB graph database at sones, I want to make use of the Jamendo data and the graph structure inherent in it. Since the directory structure my tool generates is only one way of looking at the data, it's quite interesting to demonstrate the capabilities of GraphDB based on that data.

The idea behind the graph representation of the data is that you could start from almost any starting point imaginable – no matter whether you start from a single track and drill up into genres and artists, or start at a location and drill down to tracks.

So what the downloader does in terms of GraphDB integration is output a GraphQL script which can be imported into an instance of GraphDB.

The source code of my tool is available on GitHub and released under the BSD license – feel free to play with it and to contribute.

Source 1: http://www.jamendo.com
Source 2: https://github.com/bietiekay/JAMENDOwnloader

Boogie Board

Two weeks ago I read an article about a "replacement for paper notes" product called "Boogie Board". The company behind the product claims to replace paper, with the bold slogan "say goodbye to paper".

Well, what is it? Basically it's a liquid crystal display without the logic to address specific pixels. So think of it like taking the liquid crystal part and leaving out all the transistors and logic needed to actually display something. Then add a pen, or even your fingernail, and you can "write" on that display – what happens is that the crystals get pushed aside and the background of the "display" shines through. This background is white, so when you write on the Boogie Board everything is white on black…

(photos of the Boogie Board)

The only button on the tablet is named “erase” – and that’s what the button does: the whole display flashes two times, one white, and then black and everything is back to where we started. You cannot save. You just press erase and start over. It’s truly a replacement for post-it-notes…

Of course there's a battery inside, and it's said to last for tens of thousands of erases. You cannot change the battery when it's empty, but on the other hand this gadget costs less than 30 Euros and it does look like you could break it open and try to exchange the battery yourself. Since the battery isn't needed to display anything, I don't think I will run out of juice just yet.

Source: http://www.improvelectronics.com