open source audio codecs getting better

A few weeks ago I heard about a new audio codec that is being developed as open source – much like Vorbis, the previous open-source approach to audio codecs.

This time there is some standardization in play, so it might become more successful than Vorbis was.

“Opus is a totally open, royalty-free, highly versatile audio codec. Opus is unmatched for interactive speech and music transmission over the Internet, but is also intended for storage and streaming applications. It is standardized by the Internet Engineering Task Force (IETF) as RFC 6716 which incorporated technology from Skype’s SILK codec and Xiph.Org’s CELT codec.”

Source 1: http://www.opus-codec.org/
Source 2: http://auphonic.com/blog/2012/09/26/opus-revolutionary-open-audio-codec-podcasts-and-internet-audio/
Source 3: http://tools.ietf.org/html/rfc6716

Learn to code

Knowing how to deal with these personal computers is getting more important by the day. Not everybody needs to know how to write code – but since writing code and making those machines do what you want them to do isn’t as hard as it used to be, it’s worth a try!

If you are on a mission to learn to code, this page is probably a very good starting point:

Source 1: http://www.codecademy.com/#!/exercises/0

generate C# classes from JSON data

It’s a common use case: you’ve got some JSON formatted data and you want to interface with it using your favourite programming language, C#. You can write the appropriate classes yourself, or you can use the fabulous json2csharp helper page.
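The workflow is quickly sketched (the JSON document and class below are made-up examples; the deserializing is done with Json.NET, see Source 3): paste a sample document into json2csharp, copy the generated classes into your project and let the serializer do the rest.

    using System;
    using Newtonsoft.Json; // Json.NET

    // the kind of class json2csharp generates from a sample document
    public class BlogInfo
    {
        public string name { get; set; }
        public int posts { get; set; }
    }

    public class Program
    {
        public static void Main()
        {
            string json = "{\"name\":\"schrankmonster\",\"posts\":42}";
            BlogInfo info = JsonConvert.DeserializeObject<BlogInfo>(json);
            Console.WriteLine(info.name + " has " + info.posts + " posts");
        }
    }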

Source 1: http://json2csharp.com/
Source 2: http://jsonclassgenerator.codeplex.com/
Source 3: http://json.codeplex.com/

a font for number people

OpenType is a font format which I personally might have underestimated in the past. Well, you know – fonts and stuff. It all seemed not too interesting up until now. That changed dramatically when a font came to my attention that can be used for various purposes and does not follow the usual numbers-and-characters scheme. But what can a font be used for, if not to type numbers and characters?

Well. What about typing graphs?

Everything in the above image is generated by a font… just like in your word processor (if it uses that font).

“Designed by Travis Kochel, FF Chartwell is a fantastic typeface for creating simple graphs. Driven by the frustration of creating graphs within design applications (primarily Adobe Creative Suite) and inspired by typefaces such as FF Beowolf and FF PicLig, Travis saw an opportunity to take advantage of OpenType technology to simplify the process.

Using OpenType features, simple strings of numbers are automatically transformed into charts. The visualized data remains editable, allowing for hassle-free updates and styling.”

Source 1: https://www.fontfont.com/how-to-use-ff-chartwell

baking with the Pi

Do you know what happens inside your computer between pushing the power button and typing in your log-in information? No? You should – at least on the software side. Not because that is necessary to use a computer, but in order to understand what this wonderful machine does and why.

For those teaching and learning purposes the Raspberry Pi is a perfect device. It’s cheap, and now there is a course you can take online which shows you – starting from the very beginning – how to get the device up and running and how to make it do what you like. And that without installing an operating system: you are going to write your very own.

“This website is here to guide you through the process of developing very basic operating systems on the Raspberry Pi! This website is aimed at people aged 16 and upwards, although younger readers may still find some of it accessible, particularly with assistance. More lessons may be added to this course in time.”

Source: http://www.cl.cam.ac.uk/freshers/raspberrypi/tutorials/os/

developers in Hamburg – the Developer Conference 2012 in Hamburg

Friday of last week started very, very early. It was off to Nuremberg to catch the flight to Hamburg. Astonishing how cheap flights are these days: the flight to Hamburg (50 minutes in the air) was only 10 Euros more expensive than the train ride back (4 hours on rails)…

In any case it was a nice, short flight, and in no time I was standing in front of the Otto Versand headquarters in Hamburg… It was time for the…

… Developer Conference Hamburg 2012.

It was my first DevCon-HH, so I cannot draw any comparisons to last year. The rooms – right at Otto’s campus – were very well set up in any case, everything very comfortable, with short ways between coffee and lecture seat. Unfortunately, two of the lecture halls could only be reached through the main hall. Once or twice this led to talks in the smaller halls already being finished and the crowds streaming through the main hall towards the coffee while the audience in the main hall was still trying to listen. Explained with a picture: on the right the big main hall, on the left a smaller lecture hall. I was standing right in the door frame while taking the photo.

For me it started with two very good and interesting talks. The keynote of the first day is meanwhile – as it should be – also available on Slideshare:

Overall, the quality of the talks was very high. I found the mix of hard and soft topics around software development more than well balanced, and I will certainly try to come again next year.


Source 1: http://www.developer-conference-hh.de/

Congratulations Mr. Heil!

I am very pleased to congratulate a good man and ex-colleague on his accomplished mission: he handed in his PhD thesis with the subject (German): “Anwendungsentwicklung für intelligente Umgebungen im Web Engineering“:

“This book describes a holistic approach to develop complex software systems based on the WebComposition process model. It shows how to integrate soft- and hardware components in a cost efficient and effective way using Web technologies and the Semantic Web. The WebComposition Concurrency System, a formal language to predict system dependencies and conflicts, allows efficient planning and monitoring of the development and operation process of the overall system.”

I had the pleasure of working with Andreas on several occasions. The one I remember with the strongest feelings is a 2½-day around-the-clock hack-a-thon at Microsoft Research. We got it working back then!

For the last 8 years I have constantly been trying to get him interested in working on things directly or remotely connected to some of the stuff I do – but up until now luck wasn’t on my side. Maybe someday :-)

My sincere compliments to him on achieving this goal. Congratulations!

Source 1: http://www.aheil.de/books/
Source 2: http://blog.aheil.de
Source 3: http://www.schrankmonster.de/category/familyandfriends/aheil-de/

openHAB – home automation bus

Maybe it’s just me, but this home automation / smart home thing seems to gain more momentum every week. Now there’s a Java-based home automation bus initiative taking care of the software standardization side. Quite interesting. And besides all that, they had some fantastic ideas about what a user interface for those things should look like. For example, how would you interact with your house when planning when things power on and off? Use Google Calendar! That is just plain genius!

“The open Home Automation Bus (openHAB) project aims at providing a universal integration platform for all things around home automation. It is a pure Java solution, fully based on OSGi. The Equinox OSGi runtime and Jetty as a web server build the core foundation of the runtime.

It is designed to be absolutely vendor-neutral as well as hardware/protocol-agnostic. openHAB brings together different bus systems, hardware devices and interface protocols by dedicated bindings. These bindings send and receive commands and status updates on the openHAB event bus. This concept allows designing user interfaces with a unique look&feel, but with the possibility to operate devices based on a big number of different technologies. Besides the user interfaces, it also brings the power of automation logics across different system boundaries.”

I especially like that calendar integration – sending scripts through an appointment is a great idea, and having some sort of scripting language is another one. A little bit on the marketing side is the option to chat with your house through XMPP / Jabber… that might take the idea a little too far out – but who would want to blame them? Fantastic stuff!

Source 1: http://www.openhab.org/
Source 2: http://kaikreuzer.blogspot.de/2012/08/openhab-1.html

ELV MAX! Cube progress

I’ve just pushed a commit to the repository which finalizes my current effort in getting data out of the ELV MAX! Cube. With this source code you should be able to get the following information out of your ELV MAX! Cube (a short usage sketch follows the list):

  • a list of all configured rooms
  • a list of all devices in those rooms
  • Thermostats and ShutterContacts come with all their flags (like battery status, open/closed state, mode (auto, manual, …))
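A consumer of that data could look roughly like this (purely illustrative – these names do not match the actual h.a.c.s. API):

    // hypothetical names, for illustration only
    var cube = new MaxCube("192.168.1.10", 62910); // the cube's TCP endpoint
    cube.Connect();

    foreach (var room in cube.Rooms)
        foreach (var device in room.Devices)
            Console.WriteLine("{0} / {1}: battery low = {2}, mode = {3}",
                room.Name, device.Name, device.BatteryLow, device.Mode);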

That brings me one step closer to the integration of the ELV MAX! Cube into h.a.c.s. – next weekend probably :-)

P.S.: I’ve already ordered more thermostat and shutter contact sensors.

Source 1: https://github.com/bietiekay/hacs

home automation example: domotica

For several years now I have been interested in this home automation thing – I even got a little bit of my own home automation going. With websites like Domotica you can get an idea of what is achievable and what it might look like for the people actually using it every day.

Source 1: http://www.hekkers.net/domotica/Default.aspx

reverse-engineering the ELV MAX! Cube protocol

I had a couple of hours to tinker with my ELV MAX! Cube and there is some progress with the protocol reverse engineering.

Of course there is the Domotica forum helping out with information the guys over there have found, but in addition to their very helpful findings I want the cube integrated into h.a.c.s. – and along with that I want a way to find possible protocol changes quickly and easily in the future.

So yesterday I celebrated the ‘first contact’ – today I am a bit deeper into the protocol itself:

Here are some explanations of the picture:

When a TCP connection to the cube is opened you can immediately read from it – the cube starts throwing information at you. The first character of each line marks the type and beginning of the message.

There seem to be these types of messages in the first package of information:

  • H – Header maybe?
    •  it contains the serial number of your cube, the RF address, the firmware version and several other things like time information
  • M – Metadata?
    • this seems to be some kind of global metadata list containing the rooms with their IDs (that’s the “%)” in the screenshot). Furthermore it contains the serial numbers and names of the devices in each room – at the moment there’s just a window-state-sensor in that first room, called “Fensterkontakt 1”
  • C – Configuration?
    • since there are multiple C messages these seem to contain detailed configuration data specific to a device in the MAX! network. Each device seems to be addressed by an RF address and its serial number.
    • the first C message in the screenshot is associated with the cube itself
    • the second C message is associated with the window-state-sensor – you can clearly see the room id “%)” and the serial of the window-state-sensor in there.
  • L – Live status?
    • this message seems to contain room status information. In our case there is only the room with id “%)”. When the window-state-sensor changes state the last byte changes value – interesting, eh?

On the coding side I’ve got several things set up in my little debug tool. I’ve wrapped those message types into various classes to handle them more easily later on in h.a.c.s. Furthermore I used a little decompiler wisdom to extract some more information from the included ELV MAX! Cube software.
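The wrapping idea itself is simple – a minimal sketch (these are not the actual h.a.c.s. classes):

    // dispatch on the first character of each line the cube sends
    abstract class CubeMessage
    {
        public string Raw { get; private set; }
        protected CubeMessage(string raw) { Raw = raw; }

        public static CubeMessage Parse(string line)
        {
            switch (line[0])
            {
                case 'H': return new HeaderMessage(line);        // serial, RF address, firmware, time
                case 'M': return new MetadataMessage(line);      // rooms and devices
                case 'C': return new ConfigurationMessage(line); // per-device configuration
                case 'L': return new LiveStatusMessage(line);    // room/device status
                default:  return new UnknownMessage(line);
            }
        }
    }

    class HeaderMessage : CubeMessage { public HeaderMessage(string raw) : base(raw) { } }
    class MetadataMessage : CubeMessage { public MetadataMessage(string raw) : base(raw) { } }
    class ConfigurationMessage : CubeMessage { public ConfigurationMessage(string raw) : base(raw) { } }
    class LiveStatusMessage : CubeMessage { public LiveStatusMessage(string raw) : base(raw) { } }
    class UnknownMessage : CubeMessage { public UnknownMessage(string raw) : base(raw) { } }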

Thanks to § 69e of the German UrhG (German copyright law) I am allowed to decompile the included software in order to achieve interoperability (and only that). That’s exactly what I would like to achieve: interoperability. And for the record: besides that, I also filed a support request with ELV asking whether I could get access to a presumably existing documentation of that protocol.

While waiting for that documentation I am using JD-GUI as a decompiler front-end for Java – since the software of the cube is written in Java.

There are many interesting things in there, but it’s a slow process to get hold of everything necessary. Some very nice things are already showing up. For example, when you want to know if there’s a cube (or more) in the network you just need to send a multicast IP packet containing a characteristic signature, and all the cubes in your network will try to connect back to you with some basic information – nice, isn’t it? Or what about the AES encryption/decryption that seems to be built into the cube? Yes, that’s right! It seems to be possible to send commands to either encrypt or decrypt data with AES. Fittingly, these commands are marked with ‘e’ and ‘d’. Or that if you send “l:” as a command with CR+LF at the end you get a device listing with all stats… and so on.

Some open questions for EQ-3/ELV at the end of this article:

  • Why this strange protocol? Why all the work on both sides? Would an HTTP server with a RESTful service really have been that much more difficult?
  • Base64 encoded data? The 90s called, they want their 8th bit back.
  • Why that complex local webserver approach when you could have done everything in a Java app anyway?

That’s it for today. I just pushed a feature to the Git repository which allows you to run whatever command you like on your cube using the debugging tool:

Enjoy! :-)

Source 1: http://www.domoticaforum.eu/viewtopic.php?f=66&t=6654&sid=f8f912914163cb44d447cfa3de44d63d
Source 2: http://en.wikipedia.org/wiki/Decompiler
Source 3: http://www.gesetze-im-internet.de/urhg/__69e.html
Source 4: http://java.decompiler.free.fr/?q=jdgui
Source 5: http://en.wikipedia.org/wiki/Advanced_Encryption_Standard

a file is a file is a file.

Ever wondered how software finds out that a file named “filename” is a PDF, a JPEG or a movie? There are thousands, probably hundreds of thousands, of file formats out there. Some of them are used many times a day without us even noticing. We’re just moving an image from A to B, not caring about what constitutes an image file and what makes a JPEG different from a PNG image.
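The usual answer is a characteristic signature (“magic bytes”) at the very beginning of the file. A simplified sketch of such a check (the signatures are the well-known ones; the detection logic is illustrative):

    using System;
    using System.IO;
    using System.Linq;

    class FileSniffer
    {
        static bool StartsWith(byte[] data, byte[] magic)
        {
            return data.Take(magic.Length).SequenceEqual(magic);
        }

        static string Sniff(string path)
        {
            byte[] head = new byte[8]; // the first few bytes are enough here
            using (FileStream fs = File.OpenRead(path))
                fs.Read(head, 0, head.Length);

            if (StartsWith(head, new byte[] { 0xFF, 0xD8, 0xFF })) return "JPEG";
            if (StartsWith(head, new byte[] { 0x89, 0x50, 0x4E, 0x47 })) return "PNG";
            if (StartsWith(head, new byte[] { 0x25, 0x50, 0x44, 0x46 })) return "PDF"; // "%PDF"
            if (StartsWith(head, new byte[] { 0x50, 0x4B })) return "ZIP"; // also JAR, DOCX, ...
            return "unknown";
        }

        static void Main(string[] args)
        {
            Console.WriteLine(Sniff(args[0]));
        }
    }

Polyglot files like the one below work exactly because not every format insists on its signature sitting at offset 0 – a PDF header may appear further into the file, and ZIP readers locate their directory at the end.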

Now for pure academic reasons there is one file that is many (no, not borg). It’s a file that is:

“CorkaMIX.exe is simultaneously a valid:

  • Windows Portable Executable binary
  • Adobe Reader PDF document
  • Oracle Java JAR (a CLASS inside a ZIP) / Python script
  • HTML page

It serves no purpose, except proving that file formats not starting at offset 0 are a bad idea. Many files (known as polyglots) already combine various languages in one file, however it’s most of the time at source level, not binary level.”

Source 1: http://code.google.com/p/corkami/downloads/detail?name=CorkaMIX.zip

ELV Max! Cube – home automation for the heating

For several years now I have been building my own home automation tools by putting together existing hardware and self-written software. As the central software core of my home automation system I use h.a.c.s. – “home automation control server” – which I put up as open source software on GitHub.

Throughout the years I was able to embed a lot of daily tasks and measurements in one place, which can be accessed through a simple web page. It currently looks like this:

You can find some articles on this blog about h.a.c.s. if you want to know more about it.

As of today I can control and measure the states of switches, windows and doors as well as temperature, humidity and power consumption. Scenarios like “when this door opens, switch on that light” are easy to do with h.a.c.s.

Now, “Winter’s coming!”, and therefore I want to take control of the heating in each and every room of the house. I want to set a target temperature and have the heating fire up or cool down towards that target. And of course I want to monitor manual changes to each and every radiator in the house.

Last week I stumbled upon a piece of kit called “ELV MAX! Cube”. It’s a white cube (as the name implies) which offers a USB port from which it is powered and an RJ-45 ethernet port which connects the cube to the home network.

The cube itself does not draw much power and can easily be powered by the router’s USB port. It allows you to connect peripherals using 868 MHz RF. Those peripherals can be window state sensors (open/closed) and thermostats to control the radiators (and a switch, but, well… hopefully not necessary).

It comes with its own user interface – a Java application that connects to the device and lets you configure it. Quite nice – it runs on Windows and Mac. You could use a cloud service to control the device over the internet, but I have no intention of trying that out right now.

My plan is to extend h.a.c.s. to get information from the cube and handle it, and in the end even control the cube by setting temperatures and monitoring the outcome of those changes.

As of now there are some efforts to decode the quite interesting protocol the cube speaks. You communicate with the cube over TCP (my cube listens on port 62910).

Currently I am building a small debug application which allows me to experiment with the output of the cube faster than plain telnet would allow. And with this I made first contact tonight:
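In essence that first contact is nothing more than opening a TCP connection and dumping whatever the cube volunteers – a minimal sketch (the cube’s IP address is an example):

    using System;
    using System.IO;
    using System.Net.Sockets;

    class FirstContact
    {
        static void Main()
        {
            using (var client = new TcpClient("192.168.1.10", 62910))
            using (var reader = new StreamReader(client.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    Console.WriteLine(line); // the cube starts talking right away
            }
        }
    }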

As always all my efforts can be seen in the hacs repository.

Source 1: https://github.com/bietiekay/hacs
Source 2: http://www.schrankmonster.de/?s=hacs
Source 3: http://www.elv.de/max-cube-lan-gateway.html
Source 4: http://www.domoticaforum.eu/viewtopic.php?f=66&t=6654

das Keyboard

It occurred to me that I stopped working with a decent keyboard when I moved completely to Macs at home. I was using the keyboards the machines came with, and Apple does not always deliver the best possible keyboard for the money.

So last week I tried to return to my trusty IBM PS/2 Model M, and I had to find out that the actively powered USB-to-PS/2 adapter I had has somehow got lost. A passive one just doesn’t cut it – the keyboard does not work at all with it.

I remembered that back in 2005 I wrote about a back-then-new keyboard that resembled the fantastic Model M. Voilà! They have even kept working on their keyboards since then and improved them :-)

A little more than 6 years after first writing about the product I got myself a “das Keyboard Ultimate S EU”.

First verdict: It is awesome!

It’s expensive, that’s true. But it just feels right typing on it. I can see myself writing a lot of stuff for long periods on that keyboard :-)

Source 1: http://www.schrankmonster.de/2005/05/22/real-elite-keyboard
Source 2: http://www.schrankmonster.de/2005/08/17/teh-keyboard-for-teh-coders/

a javascript / html live-preview editor in your browser

This whole web development thing is getting somewhere. Take a look at this great implementation of an HTML / JavaScript editor with built-in live preview. It’s got syntax highlighting and all, and best of all: it runs directly in your browser. You don’t have to install anything.

Some more information directly from the readme file:

“JS Bin is a webapp specifically designed to help JavaScript and CSS folk test snippets of code, within some context, and debug the code collaboratively.

JS Bin allows you to edit and test JavaScript and HTML (reloading the URL also maintains the state of your code – new tabs doesn’t). Once you’re happy you can save, and send the URL to a peer for review or help. They can then make further changes saving anew if required.

The original idea spawned from a conversation with another developer in trying to help him debug an Ajax issue. The original aim was to build it using Google’s app engine, but in the end, it was John Resig‘s Learning app that inspired me to build the whole solution in JavaScript with liberal dashes of jQuery and a tiny bit of LAMP for the saving process.

Version 1 of JS Bin took me the best part of 4 hours to develop, but version 2, this version, has been rewritten from the ground up and is completely open source.”

Source 1: http://jsbin.com/#source
Source 2: http://jsbin.tumblr.com/
Source 3: https://github.com/remy/jsbin

a delicious raspberry pi

Just a couple of days ago – after a waiting time of more than half a year – my personal Raspberry Pi board arrived. Fantastic!

It’s small. Oh yes, it’s very very small.

What is the Raspberry Pi you may ask:

“The Raspberry Pi is a credit-card sized computer that plugs into your TV and a keyboard. It’s a capable little PC which can be used for many of the things that your desktop PC does, like spreadsheets, word-processing and games. It also plays high-definition video. We want to see it being used by kids all over the world to learn programming.”

For under 40 Euros you get a huge choice of I/O interfaces like USB, Ethernet, HDMI, audio and multi-purpose I/O pins you can play with if you’re into hardware hacking. This small card runs a fully blown Linux, and because it has a dedicated graphics core that can hardware-decode and -encode 1080p H.264, it’s definitely a good choice for a home media center (yes, XBMC runs on it).

It draws so little power that you could use solar panels to power it. It’s all open sourced, and I will use it for a couple of things in the household – like a cheap AirPlay node, or a more intelligent sensor node for home automation. This thing seriously rocks – finally a device to play with that has reasonable horsepower.

Source 1: http://www.raspberrypi.org
Source 2: http://www.raspbmc.com

Adventures in e-Commerce and technology

Oh dear. I just realized that I never really announced or talked about the fact that I changed my employer and moved to an (old) new place.

Yes, that’s right: I am not with sones anymore. Since January 1st I am the CTO of Rakuten Germany. When I signed the contract the company was called Tradoria – so one of the first big projects I had the opportunity to work on was the so-called brand change.

A humongous Japan-based company called Rakuten bought Tradoria in the middle of 2011, and after half a year it was time to switch the brand.

As you can imagine, the weeks since January 1st have been busy. I had to digest a lot of existing technology and products. I met and got to know a lot of interesting people – first and foremost a great team of developers that went through almost all imaginable pains and parties to come up with a marketplace and shop system that is a perfect base for take-off.

A short word on the business model of Rakuten – if you’re a merchant you gotta love it: think of Rakuten as a full service provider for merchant and customer. As a Rakuten merchant you get all the frontend and backend bliss to present and manage your products and orders. Rakuten takes care of all the nasty bits and pieces like hosting, development, telephone orders, invoicing and payment. The only thing that you as a Rakuten merchant need to do is put in great products, gather orders and send out packages. Since Rakuten isn’t selling products on its own, it won’t be competing with the merchants like other marketplace providers do these days.

On top of that, Rakuten cares for the merchant and the customer. Just a week after that successful brand change I attended (and spoke at) the Tradoria Live! 2012. That’s basically the merchant get-together. This year over 500 people attended this one-day conference. Think of it as a hands-on conference with features, plans and summaries of the last year and the upcoming one – every merchant is invited to come and talk in person to the people who work hard every day to make the marketplace and shop system better.

click on it to see it big

Just 24 hours after standing on that stage I found myself here:

東京

Yep. That’s Tokyo (東京). After a very long flight we had the chance to take an all-embracing Tokyo tour before the meetings and talks started for our team. It was an awesome and exhausting week – just about 120 hours later I was back in Germany – I must have slept for two days :-)

Back in Germany I had a lot of stuff to learn and work through. We had already moved to a wonderful house near Bamberg – finding it was pretty much pure luck. It’s actually ridiculously huge for a couple and two cats, but we love it. Imagine the contrast: moving from an apartment next to a four-lane city street to the countryside, just a 15-minute drive away from work, with philosophical quietness all around.

Now, after about half a year, I am well into the process. I have met a lot of high-profile techies, and things seem to pick up speed in regard to teamplay within Germany and with all the other countries. It’s a bliss to work for a group of companies that go through a lot of transitions while transforming from start-ups into an enterprise.

Ready for a family picture? Ready. Steady. Go!

That’s all Rakuten – that’s all on one mission: Shopping is entertainment! Empower the merchants!

Besides all that, I even started to learn Japanese. ただいま :-)

first conference in about a year: Berlin Buzzwords 2012

It’s been a while since I last attended a technology conference. But that is going to change: this week I attended the Berlin Buzzwords 2012 conference in Berlin.

Search.Store.Scale is the headline under which this awesome conference takes place, and after a very slow start there were a lot of great talks about current technologies regarding databases, data processing and storage – from great overviews to some very in-depth talks… like the one called “Searching Japanese with Lucene and Solr”. Since I am currently in the process of getting to know the Japanese language better, this talk in particular had interesting insights into how to handle Japanese. Very impressive, and a bit frightening, how complicated language processing can be.

And out comes something like this:

automation to the people: download YouTube videos automatically

You know the situation: you have just stumbled upon a great and informative YouTube channel. It’s full of videos you would like to watch, but to do that you need internet access at all times. And of course that internet access needs to be fast enough to cope with the video quality you would like to watch.

If only it were possible to download a video from YouTube, store it locally and watch it whenever you’ve got the time. Maybe you want to take that video with you on that great, internetless self-awareness trip…

Now, there are a lot of tools that allow you to download YouTube clips manually. I used BYTubeD for that purpose. It is a nice and easy-to-use Firefox add-on which can be started whenever a YouTube video appears on any page.

After you’ve started BYTubeD you can select which of the videos on the page you would like to download and what quality you would like to get.

All this works very well if you only want to download something once in a while. Problems come up if you want to download regular postings…

I’ve subscribed to several – to me – very interesting YouTube channels. These get updated almost every day. The only way to keep up with them was to take the time, surf YouTube and use BYTubeD to manually download anything new. That was a waste of time, so I automated it.

I wrote a small tool I call “YouTubeFeast” – because it allows you to feast on YouTube… yeah, I know. This tool is designed to run in the background on a Linux or Windows machine and scan at configurable intervals for new videos. If it finds new videos it downloads them, in the quality you pre-configured, to a folder you configured. It couldn’t be easier.

It’s open source (GPLv2) and I’ve made it publicly available on GitHub. You can even find a pre-compiled, ready-to-run binary version there.

The configuration file “YouTubeFeast.configuration” is a plain and simple text file. Use your favourite text editor and obey some simple rules:

  • any line beginning with # is a comment
  • any line not beginning with a # is a download-job
  • any download job consists of the following tab-separated parameters:
    • the URL of the video page / channel homepage / overview
    • the desired quality (360p, 720p, 1080p)
    • the path to store the videos
    • the interval (in hours) to check for new stuff
  • don’t forget: a tabulator separates the parameters (take a look at the example configuration file, or at the illustrative line below…)
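An illustrative download job could look like this (the URL and path are made-up examples; the whitespace between the fields has to be a single tabulator):

    # URL	quality	target folder	interval in hours
    http://www.youtube.com/user/SomeChannel	720p	/home/me/videos/SomeChannel	24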

After configuring, the only thing you need to do is start YouTubeFeast. It will then go through all the jobs and download video files – as soon as it comes across an already downloaded file it stops that specific job.

That’s all there is to it. If you have any comments or suggestions for improvement, please let me know.

Source 1: https://github.com/bietiekay/YouTubeFeast
Source 2: Download YouTubeFeast-March2013

downloading the whole Jamendo catalog

Yesterday @simcup wrote on Twitter that he is currently downloading the whole Jamendo catalog of Creative Commons music.

Although I already knew Jamendo, it never occurred to me to download their whole catalog. Since I am a fan of choice, I immediately thought about how I could download the catalog too. Since the only clue was a cryptic URI-like text on how to achieve that, it suddenly sounded like a great idea to write a universal tool and release it as open source. This tool should allow users to download the whole catalog and keep their local Jamendo mirror in sync with the server. So when new artists, albums or tracks are added, the user does not need to download everything again.

So the only thing I had as a starting point was that cryptic URI pointing me to something I had never heard of, called Rhythmbox. Turns out that this is a GNOME music player application which has Jamendo integration. After some clueless poking around I decided to take a look at the source of Rhythmbox, especially the Jamendo module.

This module is written in Python and quite clean to read. Just by looking at the first lines I came across the interesting fact that there is an almost daily updated XML dump of the Jamendo catalog available from Jamendo. Hurray! Since Jamendo wants developers to interact with the platform, they put documentation online which allows anyone to write tools to stream and download tracks. After all the clues I found, I finally ended up on this page.

So there are the catalog download, track stream and torrent URIs necessary to download the catalog. Now the only thing needed is a tool which parses the XML and creates a nice folder structure for us.

[screenshot: the resulting folder structure]

Parsing XML in C# (my preferred programming language) is easy. Basically you can use a tool called XSD.exe and let it first generate the XSD from the XML and then ready-to-use C# classes from that XSD.

[screenshot: generating the XSD and the C# classes with XSD.exe]

After doing all that, actually reading the whole catalog into a usable form breaks down to just three lines of code:

[screenshot: parsing the XML catalog]
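In essence those three lines are the standard XmlSerializer pattern (file and class names here are illustrative – the actual root class name depends on what XSD.exe generated):

    // beforehand, on the command line:
    //   xsd.exe catalog.xml          (generates catalog.xsd)
    //   xsd.exe catalog.xsd /classes (generates catalog.cs)
    // needs: using System.IO; using System.Xml.Serialization;
    using (FileStream stream = File.OpenRead("catalog.xml"))
    {
        XmlSerializer serializer = new XmlSerializer(typeof(JamendoCatalog));
        JamendoCatalog catalog = (JamendoCatalog)serializer.Deserialize(stream);
    }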

Isn’t it great how modern frameworks take away the complexity of such tasks? At this point I had already parsed the whole catalog in my tool and had only written three lines of code. The rest was generated automatically for me. Best of all – this also works on non-Windows operating systems when you use Mono.

When the XML data is parsed and available in a nice data structure, it’s easy to iterate through all artists, all albums and all tracks and then download the actual MP3 or OGG files. And that’s basically what my tool does: it takes the XML, parses it and downloads. Before downloading it checks whether a track already exists and only downloads those added since the last run.

Additionally, since I am deeply involved in the development of the GraphDB graph database at sones, I want to make use of the Jamendo data and the graph structure it contains. The directory structure my tool generates is only one way to look at the data, so it’s quite interesting to demonstrate the capabilities of GraphDB based on that data.

The idea behind the graph representation of the data is that you can start from almost any starting point imaginable – no matter if you start from a single track and drill up into genre and artists, or if you start at a location and drill down to tracks.

So what the downloader does in terms of GraphDB integration is output a GraphQL script which can be imported into an instance of GraphDB.

The source code of my tool is available on GitHub and released under the BSD license – feel free to play with it and contribute.

Source 1: http://www.jamendo.com
Source 2: https://github.com/bietiekay/JAMENDOwnloader

Boogie Board

Two weeks ago I read an article about a “replacement for paper notes” product called “Boogie Board”. The company behind the product claims to replace paper, with the bold slogan “say goodbye to paper”.

Well, what is it? Basically it’s a liquid crystal display without the logic to address specific pixels. So think of it as taking the liquid crystal part and leaving out all the transistors and logic needed to actually display something. Then add a pen, or even your fingernail, and you can “write” on that display – what happens is that the crystals get pushed aside and the background of the “display” shines through. This background is white, so when you write on the Boogie Board everything appears white on black…

[photos: the Boogie Board]

The only button on the tablet is named “erase” – and that’s exactly what it does: the whole display flashes twice, once white, then black, and everything is back to where we started. You cannot save anything. You just press erase and start over. It’s truly a replacement for post-it notes…

Of course there’s a battery inside, and it’s said to last for tens of thousands of erases. You cannot change the battery when it’s empty, but on the other hand this gadget costs less than 30 Euros, and it does look like you could crack it open and try your best to exchange the battery yourself. Since the battery isn’t needed to display anything, I don’t think I will run out of juice anytime soon.

Source: http://www.improvelectronics.com

the last flight of Space Shuttle Endeavour

This coming Friday the Space Shuttle Endeavour is scheduled to lift off for the last time – and a Space Shuttle for the second-to-last time. You want to be there for that :-)

Luckily, I have just discovered the gentlemen (and ladies?) of SpaceLiveCast. Apparently they have been doing live streams of various spaceflight events for quite a while now.

P.S.: If I had one wish, it would be for the site to offer a video podcast feed… (is help needed?)

Source 1: http://spacelivecast.de/
Source 2: http://www.raumfahrer.net
Source 3: http://spacelivecast.de/2011/04/29-04-ab-1900-uhr-sts-134-letzter-endeavour-flug/

configuring the nano editor to suit my needs…

Configuring your favourite editor on OS X (or Linux, or anywhere else) is important – since nano is my editor of choice, I wanted to use its syntax highlighting capabilities. Easy as pie, as it turned out:

I started with a .nanorc file from this guy and modified it to recognize some of my frequent file types (like .cs files).
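A syntax definition in a .nanorc follows a simple pattern – here is a minimal, purely illustrative entry for .cs files (keywords and colors are just examples):

    ## C# files
    syntax "csharp" "\.cs$"
    color brightblue "\<(class|interface|namespace|using|public|private|static|void|var)\>"
    color brightyellow ""(\\.|[^"])*""
    color green "//.*"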

You can download my nanorc.tar – just extract it and put it into your user home directory.

Source 1: http://talk.maemo.org/showthread.php?t=68421
Source 2: http://www.nano-editor.org/dist/v2.2/nano.html#Nanorc-Files
Source 3: nanorc.tar

type to win

In the twilight of Flash it’s nice to see HTML5 and JavaScript stepping up to bring small, fun games to our browsers.

“Z-Type was specifically created for Mozilla’s Game On. I immediately wanted to participate in the competition when I first heard of it, but the deadline seemed so far away that I didn’t bother to begin working on a game back then. Fast forward to this tweet announcing that the deadline was only one week away – it took me by surprise. I still hadn’t even begun working on anything. The thought of just submitting my earlier game Biolab Disaster crossed my mind but was immediately dismissed again.”

Great sound, great graphics and in the higher levels quite difficult.

Source 1: http://www.phoboslab.org/ztype/

Photosynth now mobile…

It’s been some years since the former Microsoft Research project went public and Microsoft started offering its great Photosynth service to the public.

I’ve been using the Microsoft panoramic and Photosynth tools for years now, and I dare say they are the best tools one can get to create fast, easy and high-quality panoramic images.

There is photosynth.net to store all those panoramic pictures like this one from 2008:

The Photosynth technology itself builds on several other interesting technologies, like SeaDragon, which allows high-quality image zooming at current internet connection speeds.

This awesome technology is now available on the iPhone (3GS and upwards), and it’s better than all the other panoramic tools I’ve used on a phone.

the process of taking the images
after the pictures are taken, additional stitching is needed
after the stitching has completed, a fairly impressive panoramic image is the result

Source 1: Photosynth articles from the past
Source 2: Photosynth in Wikipedia
Source 3: Photosynth on iPhone App Store

Achievement Unlocked: Scaring the hell out of people

Oh boy, it seems that Apple just screwed up big time when it comes to data privacy. Apparently, every time someone attaches an iOS device like the iPhone to a PC or Mac and a backup runs, this backup includes the location data of that iPhone from the last several months. Impressive logging on the one hand, and a shame that they did not talk about it publicly upfront on the other.

There’s a great tool available on GitHub which uses OpenStreetMap to visualize the logged data – it creates a quite impressive graphical representation of where I was during the last 6 months…

Source 1: http://petewarden.github.com/iPhoneTracker/

the off-site backup

Somehow, even at home, the amount of data keeps growing and growing – at an ever increasing pace… Every couple of years I completely replace the hard disk / storage solution in our household – which means an investment every time, but also makes sure that the data does not fall victim to some unfavourable mechanical, chemical or magnetic effect… So roughly every two years everything gets copied over once. Last time that took a good week, but well, that’s just how it is…

For various reasons our household needs quite a lot of storage space – partly because my wife is a photographer, but as the “throw-nothing-away” type I surely contribute a good share myself…

The master of all our hard disks (no joke, the computers here basically only have hard disks so they can boot) has always been a single machine, which likewise gets completely replaced every couple of years. This machine currently manages 12 to 15 hard disks of various sizes – the main work is currently done by three separate (grown) RAID-5 volumes…

By the way: no, I cannot/will not run RAID-6 there without either using Linux (which is not an option for several reasons) or using a hardware controller, which, after extensive experience across all kinds of hardware RAID controllers, is not an option either.

Underneath the whole disk stack sits a standard PC running Windows Server 2008 – partly because I still had a license lying around, and partly because in over 10 years of file server experience I have never lost a single byte under Windows. In addition, I have a huge pile of software which is Windows-only and basically has to run all the time to be of any use (mail server buffer, news server mirror, music and video streaming server, media library, video recorder, …).

TrueCrypt then grabs these three big RAID volumes and reliably encrypts and decrypts away – in effect there is not a byte of data in this household that is not encrypted. Good for us.

Now, a RAID does not prevent the unfavourable effects mentioned above from striking anyway, leaving you with one or more failures to deal with. Normally you replace the defective disk, resync the RAID and everything keeps working without any data loss. But that is not a backup. It is only a first line of defence against possible defects.

True to the following short piece of music:

RAID ist kein Backup (“RAID is not a backup”)

… a RAID is not a backup. At my place, backups are handled by a collection of scripts which create full backups and differential backups at fixed intervals. Out of that comes a pile of 1 GByte files, which are then laboriously (and, thanks to working QoS, unnoticeably) moved off-site via rsync. Because of the sheer amount of data the full backups simply take forever; they can easily be sped up by physically carrying the backup to the server on an external hard disk… the differential backups usually finish quite quickly. Storage space on the internet keeps getting cheaper, so we always have a good off-site backup of our data…

For Windows, besides the usual Cygwin ports of rsync, there is also a good GUI version called DeltaCopy. The thing copies reliably, and even if the DSL router reboots or hangs, it resumes the copying on its own as soon as the network is available again.

For DeltaCopy to have somewhere to drop its data, an rsync server is of course required on the other end. Configuring one is not particularly complicated – basically you only have to install rsync and adapt the rsyncd.conf file. On top of that you have to create a configuration file in which the user accounts are listed following the pattern “username:password” – and that’s about it. Rsync is very robust and, above all, well suited for low bandwidths: if only a few bytes of a file have changed, only the changed bytes need to be transferred.
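A minimal remote side could look like this (module name, path and account are made-up examples):

    # /etc/rsyncd.conf – one module the DeltaCopy client can write to
    [backup]
        path = /srv/offsite-backup
        read only = false
        auth users = backupuser
        secrets file = /etc/rsyncd.secrets

    # /etc/rsyncd.secrets – one "username:password" entry per line
    # backupuser:secretpassword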

Source 1: http://www.speichergurke.de
Source 2: http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
Source 3: http://de.wikipedia.org/wiki/Rsync

Shairport – someone reversed an AirPort Express

Low-latency network audio has been a dream for years (see articles from 2005 and 2008), and with AirPlay it’s finally here.

I have been using Apple’s AirPlay technology for several years now… after it got implemented in iOS, it’s just fantastic to have the option of playing whatever sound source I want, loud and clear, in any room I want…

Okay, it’s not quite as sophisticated as the Sonos solution regarding the control of multiple music sources in multiple rooms, but it gets the job done in an apartment.

So back to the topic: Apple integrated the AirPlay technology into their wireless base station “AirPort Express”. Basically AirPlay is a piece of software which receives an encrypted audio stream over the network and outputs the stream to the SPDIF or audio jack.

Back in 2005 there already was an emulator of this protocol called “Fairport”, but Apple decided to encrypt the AirPlay traffic. This led to the problem that the encryption key was unknown, because it’s baked into the AirPort Express firmware. And this is where the good news starts:

“My girlfriend moved house, and her Airport Express no longer made it with her wireless access point. I figured it’d be easy to find an ApEx emulator – there are several open source apps out there to play to them. However, I was disappointed to find that Apple used a public-key crypto scheme, and there’s a private key hiding inside the ApEx. So I took it apart (I still have scars from opening the glued case!), dumped the ROM, and reverse engineered the keys out of it.”

So to keep things short: someone got an AirPort Express, dumped the firmware, extracted the AirPlay encryption keys and wrote an emulator of the AirPlay protocol which uses the key. Voilà!

ShairPort is available as source code on the author’s site, and obviously it’s unclear whether Apple will react by changing the encryption key in the future. But for the time being it works as advertised:

I took one of my computers and followed the instructions to update Perl, install MacPorts and then run ShairPort. When ShairPort runs, it does not look as appealing as you might expect:

Notably, it uses IPv6 for the communication between iTunes and ShairPort… Oh, I almost forgot to show how it looks in iTunes:

On another side note: It works on Linux, Windows and Mac OS X :-)

Source 1: Apple AirPlay
Source 2: Sonos
Source 3: Apple AirPort Express
Source 4: ShairPort

modifying the OS X terminal to make it more usable…

Using OS X for daily work is getting easier every day. And most of the time I am working in Terminal.app.

So there are some configuration changes necessary to make it even more usable…

  1. Edit /etc/bashrc and add some alias and color definitions
    1. alias ll="ls -hfG"
    2. alias la="ls -ahfG"
    3. export LSCOLORS=fxfxcxdxbxegedabagacad
  2. custom color schemes can be defined using the lscolors tool
  3. install screen (using MacPorts, for example) and set up a ~/.screenrc
    1. Download a sample .screenrc
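For reference, a minimal ~/.screenrc in the spirit of that sample could look like this (purely illustrative):

    # no splash screen, bigger scrollback, always-visible status line
    startup_message off
    defscrollback 10000
    hardstatus alwayslastline "%-w%n %t%+w %=%c"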

Source 1: http://geoff.greer.fm/lscolors/
Source 2: http://www.macports.org/
Source 3: ScreenRC.tar

hacs is getting the first UI elements

I’ve worked on my little holiday project for a while now, and it’s making great progress. Since logging has been working for almost two weeks now, I have some data that should be visualized. One main goal of the project is to have a great UI for browsing the sensor data.

So almost two weeks into the project I’ve started to learn JavaScript :-)

The logging server now includes an internal HTTP server which already serves some pages and RESTful services. One of those services is the sensor data service, which can be asked to output JSON formatted sensor data. If you feed that data through jQuery and the flot jQuery plugin, you’ll get something like this:

jQuery and flot based hacs UI
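The serving side boils down to something like this minimal sketch (not the actual h.a.c.s. code – flot simply expects a series as an array of [timestamp, value] pairs, with timestamps in milliseconds):

    using System;
    using System.Net;
    using System.Text;

    class SensorDataService
    {
        static void Main()
        {
            var listener = new HttpListener();
            listener.Prefixes.Add("http://*:8080/sensordata/"); // example endpoint
            listener.Start();

            while (true)
            {
                HttpListenerContext context = listener.GetContext();
                string json = "[[1293840000000, 21.5], [1293843600000, 21.7]]"; // sample data
                byte[] buffer = Encoding.UTF8.GetBytes(json);
                context.Response.ContentType = "application/json";
                context.Response.OutputStream.Write(buffer, 0, buffer.Length);
                context.Response.Close();
            }
        }
    }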

Source: http://github.com/bietiekay/hacs

h.a.c.s. milestone 0 – in need of a backup tool

This EzControl XS1 device is a complex thing, and currently I am playing with more than 10 sensors and more than 10 actuators. Poking around with such a device will most certainly lead to a situation where the configuration gets lost (for example after a power-down of more than 30 minutes).

[screenshot: the configured sensors]

Therefore I was in need of a backup and restore tool. Because there isn’t one, I had to write it myself. Here it is:

I can haz backup tool

My tool is available as open source as part of the h.a.c.s. toolkit here. Enjoy!

Source 1: https://github.com/bietiekay/hacs/wiki/H.a.c.s.-toolkit
Source 2: http://github.com/bietiekay/hacs/