Blogroll: Nerdcore NC-Sources OPML

A couple of days ago the well-known and much-read author of the Nerdcore weblog created a page he calls NC-Sources, which lists all the sources he follows in his RSS reader. As you can imagine, this is pure gold for anyone who wants interesting links to all-nerd pages.

Unfortunately NC-Sources is only available as a web page that lists each site's name and RSS feed URL. You cannot import it into your RSS reader to use it for your own informational needs.

Here I am to the rescue. I took all the URLs from that NC-Sources page, which resulted in a file listing the page URL and the RSS feed URL on alternating lines. A short trip to the command line and a bit of awk filtered just the RSS feed URLs into a new file, which I then fed into an OPML generator.
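
For the curious, the awk step is a one-liner. The file names here are made up for illustration, and I'm assuming the page URL comes first in each pair:

  # keep every second line – with the page URL on odd lines, those are the RSS feed URLs
  awk 'NR % 2 == 0' sources.txt > feeds.txt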

So now you can download the OPML file to import it into your own RSS reader. Get it here.

Source 1: NC-Sources
Source 2: NC-Sources OPML File
Source 3: OPMLBuilder

Raspberry Pi gets a camera

The first signs of the upcoming camera board for the Raspberry Pi are showing. During the Electronica 2012 trade fair, RS showed the board to the public for the first time.

For an add-on that is going to cost just 25 Euro, the specification is quite impressive. The OmniVision OV5647 is used as the image sensor – its bigger brother is used in the iPhone 4. OmniVision says:

“The OV5647 is OmniVision’s first 5-megapixel CMOS image sensor built on proprietary 1.4-micron OmniBSI™ backside illumination pixel architecture. OmniBSI enables the OV5647 to deliver 5-megapixel photography and high frame rate 720p/60 high-definition (HD) video capture in an industry standard camera module size of 8.5 x 8.5 x ≤5 mm, making it an ideal solution for the main stream mobile phone market.

The superior pixel performance of the OV5647 enables 720p and 1080p HD video at 30 fps with complete user control over formatting and output data transfer. Additionally, the 720p/60 HD video is captured in full field of view (FOV) with 2 x 2 binning to double the sensitivity and improve SNR. The post binning re-sampling filter helps minimize spatial and aliasing artifacts to provide superior image quality.

OmniBSI technology offers significant performance benefits over front-side illumination technology, such as increased sensitivity per unit area, improved quantum efficiency, reduced crosstalk and photo response non-uniformity, which all contribute to significant improvements in image quality and color reproduction. Additionally, OmniVision CMOS image sensors use proprietary sensor technology to improve image quality by reducing or eliminating common lighting/electrical sources of image contamination, such as fixed pattern noise and smearing to produce a clean, fully stable color image.

The low power OV5647 supports a digital video parallel port or high-speed two-lane MIPI interface, and provides full frame, windowed or binned 10-bit images in RAW RGB format. It offers all required automatic image control functions, including automatic exposure control, automatic white balance, automatic band filter, automatic 50/60 Hz luminance detection, and automatic black level calibration.”

That sensor delivers RAW RGB imagery to the Raspberry Pi through the onboard camera connector interface:

This actually is a 14 MPixel test board and not the final 5 MPixel one…

And the part that impressed me the most: that 5-megapixel sensor delivers its raw data stream and it gets H.264-compressed directly on the GPU of the Raspberry Pi. 1080p at 30 frames per second without noticeable CPU load – how does that sound? Not bad for a 50 Euro setup!

Source 1: First Demo
Source 2: OmniVision OV5647 Color CMOS QSXGA Image Sensor

a delicious raspberry pi

Just a couple of days ago – after a waiting time of more than half a year – my personal raspberry pi board arrived. Fantastic!

It’s small. Oh yes, it’s very very small.

What is the Raspberry Pi, you may ask:

“The Raspberry Pi is a credit-card sized computer that plugs into your TV and a keyboard. It’s a capable little PC which can be used for many of the things that your desktop PC does, like spreadsheets, word-processing and games. It also plays high-definition video. We want to see it being used by kids all over the world to learn programming.”

For under 40 Euro you get a huge choice of I/O interfaces like USB, Ethernet, HDMI, audio and general-purpose I/O pins you can play with if you're into hardware hacking. This small card runs a full-blown Linux, and because it has a dedicated graphics core that can hardware-decode and encode 1080p H.264, it's definitely a good choice for a home media center (yes, XBMC runs on it).

It draws so little power that you could run it off solar panels. It's all open-sourced, and I will use it for a couple of things around the house – like a cheap AirPlay node, or a smarter sensor node for home automation. This thing seriously rocks – finally a device to play with that has reasonable horsepower.

Source 1: http://www.raspberrypi.org
Source 2: http://www.raspbmc.com

configuring the nano editor to my needs…

Configuring your favourite editor on OS X (or Linux, or anywhere else) is important – and since nano is my editor of choice I wanted to use its syntax-highlighting capabilities. Easy as pie, as it turned out:

I started with a .nanorc file from this guy and modified it to recognize some of my frequent file types (like .cs files).
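
For reference, such a rule boils down to one syntax line plus a few color lines. The patterns below are only a rough sketch, not the exact ones from my nanorc.tar:

  ## ~/.nanorc – simplified C# highlighting rule (sketch)
  syntax "csharp" "\.cs$"
  color brightyellow "\<(using|namespace|class|public|private|static|void|int|string|var)\>"
  color green ""(\\.|[^"])*""
  color cyan "//.*"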

You can download my nanorc.tar – just extract it and put it into your user home directory.

Source 1: http://talk.maemo.org/showthread.php?t=68421
Source 2: http://www.nano-editor.org/dist/v2.2/nano.html#Nanorc-Files
Source 3: nanorc.tar

the off-site backup

Somehow, even at home, there is always more and more data – and it grows at an ever-increasing pace… Every few years I replace our household's hard-disk/storage setup completely. That does mean an investment every time, but it also makes sure the data doesn't fall victim to some unfavourable mechanical, chemical or magnetic effect… So roughly every two years everything gets copied over once. Last time that took a good week, but well, that's just how it is…

For various reasons we need quite a lot of storage space even for a single household – partly, I suppose, because my wife is a photographer – but as the "never throw anything away" type I surely contribute a good share of it too…

Master of all our hard disks (no joke, the computers in our house basically only have hard disks so they can boot) has always been a single machine, which likewise gets replaced completely every few years. At the moment this machine manages between 12 and 15 hard disks of various sizes – the main work is currently done by three separate (grown over time) RAID-5 volumes…

By the way: no, I can't/won't run RAID-6 on it without either using Linux (which is not an option for several reasons) or using a hardware controller, which is ruled out after plenty of bad experiences with all sorts of hardware RAID controllers.

Underneath the whole stack of disks sits a standard PC running Windows Server 2008 – partly because I still had a licence lying around, and partly because in more than ten years of running file servers I have never lost a single byte under Windows. On top of that I have a huge pile of software that is Windows-only and more or less has to run all the time to be of any use (mail server buffer, news server mirror, music and video streaming server, media library, video recorder, …).

TrueCrypt then takes those three big RAID volumes and reliably encrypts and decrypts away – in effect there is not a single byte of data in this household that isn't encrypted. Good for us.

A RAID, however, does not prevent the unfavourable effects mentioned above from happening, and sooner or later you have one or more failed disks to deal with. Normally you swap the defective disk, resync the RAID and everything keeps working without any data loss. But that is not a backup. It is merely a first line of defence against possible failures.

True to the following short piece of music:

RAID ist kein Backup ("RAID is not a backup")

… a RAID is simply not a backup. In my case backups are handled by a collection of scripts that create full backups and differential backups at fixed intervals. That produces a pile of 1 GByte files, which are then laboriously (and, thanks to working QoS, unnoticeably) shipped out of the house via rsync. Because of the sheer amount of data the full backups take forever, but they can easily be sped up by physically carrying the backup to the remote server on an external hard disk… the differential backups usually go through pretty quickly. Storage on the internet keeps getting cheaper, too, so we always have a good off-site backup of our data…
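
On a Unix-like machine the off-site push itself is a one-liner; the paths, module name and host below are made up for illustration, and on Windows DeltaCopy does essentially the same thing through its GUI:

  # push the 1 GByte backup chunks to the off-site rsync server,
  # resumable (--partial) and throttled so QoS has an easy job (--bwlimit in KB/s)
  rsync -av --partial --bwlimit=512 /backup/chunks/ backupuser@offsite.example.org::backup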

For Windows there is, besides the usual Cygwin ports of rsync, a nice GUI version called DeltaCopy. It copies reliably, and even if the DSL router reboots or hangs it picks up the transfer on its own as soon as the network is back.

For DeltaCopy to have somewhere to drop its data, the remote end of course needs an rsync server. Configuring one is not particularly complicated – basically you just install rsync and adjust the rsyncd.conf file. In addition you create a secrets file that lists the user accounts in the form "username:password" – and that's about it. Rsync is very robust and, above all, well suited to low bandwidth: if only a few bytes of a file have changed, only those changed bytes need to be transferred.
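
A minimal rsyncd.conf for such a setup could look roughly like this – module name, path and user are placeholders, not my actual configuration:

  # /etc/rsyncd.conf
  uid = nobody
  gid = nobody
  use chroot = yes

  [backup]
      path = /srv/offsite-backup
      read only = false
      auth users = backupuser
      secrets file = /etc/rsyncd.secrets

  # /etc/rsyncd.secrets – one "username:password" entry per line
  # backupuser:changeme
  # (chmod 600 the secrets file, otherwise rsync refuses to use it)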

Source 1: http://www.speichergurke.de
Source 2: http://www.aboutmyip.com/AboutMyXApp/DeltaCopy.jsp
Source 3: http://de.wikipedia.org/wiki/Rsync

Shairport – someone reversed an AirPort Express

Low-latency network audio has been a dream for years (see articles from 2005 and 2008), and with AirPlay it's finally here.

I have been using Apple's AirPlay technology for several years now… after it was implemented in iOS it's just fantastic to have the option to play whatever sound source I want, loud and clear, in any room I want…

Okay, it's not quite as sophisticated as the Sonos solution when it comes to controlling multiple music sources in multiple rooms, but it gets the job done in an apartment.

So back to the topic: Apple integrated the AirPlay technology into their wireless base station, the AirPort Express. Basically AirPlay is a piece of software that receives an encrypted audio stream over the network and outputs the stream to the S/PDIF or analog audio jack.

Back in 2005 there already was an emulator of this protocol called "Fairport", but Apple decided to encrypt the AirPlay traffic. This led to the problem that the encryption key was unknown, because it's baked into the AirPort Express firmware. And this is where the good news starts:

“My girlfriend moved house, and her Airport Express no longer made it with her wireless access point. I figured it’d be easy to find an ApEx emulator – there are several open source apps out there to play to them. However, I was disappointed to find that Apple used a public-key crypto scheme, and there’s a private key hiding inside the ApEx. So I took it apart (I still have scars from opening the glued case!), dumped the ROM, and reverse engineered the keys out of it.”

So to keep things short: someone got an AirPort Express, dumped the firmware, extracted the AirPlay encryption key and wrote an emulator of the AirPlay protocol that uses that key. Voilà!

ShairPort is available as source code on the author's site, and it's obviously unclear whether Apple will react by changing the encryption key in the future. But for the time being it works as advertised:

I took one of my computers and followed the instructions to update Perl, install MacPorts and then run ShairPort. When ShairPort is actually running it doesn't look as appealing as you might expect:
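
Roughly, these were the steps on the Mac – package and module names are from memory, so treat this as a sketch rather than an exact recipe:

  # audio output library used by ShairPort's hairtunes component
  sudo port install libao
  # Perl modules the script complains about when missing (names as I remember them)
  sudo cpan Crypt::OpenSSL::RSA IO::Socket::INET6
  # start the emulator – it announces itself via Bonjour and shows up in iTunes
  perl shairport.pl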

Notably, it uses IPv6 to communicate between iTunes and ShairPort… Oh, I almost forgot to show how it looks in iTunes:

On another side note: It works on Linux, Windows and Mac OS X :-)

Source 1: Apple AirPlay
Source 2: Sonos
Source 3: Apple AirPort Express
Source 4: ShairPort

modifying OS X terminal to make it more usable…

Using OS X for the daily work is getting easier every day, and most of the time I am working in Terminal.app.

So there are some configuration changes necessary to make it even more usable…

  1. Edit /etc/bashrc and add some alias and color definitions (see the combined snippet below)
    1. alias ll="ls -hfG"
    2. alias la="ls -ahfG"
    3. export LSCOLORS=fxfxcxdxbxegedabagacad
  2. custom color schemes can be defined using the lscolors tool
  3. install screen (using MacPorts for example) and set up a ~/.screenrc
    1. Download a sample .screenrc
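
Put together, the additions from step 1 go verbatim at the end of /etc/bashrc; the ~/.screenrc lines are only a bare-bones sketch, not the sample file from the download below:

  # /etc/bashrc – aliases and colors
  alias ll="ls -hfG"
  alias la="ls -ahfG"
  export LSCOLORS=fxfxcxdxbxegedabagacad

  # ~/.screenrc – minimal starting point
  startup_message off
  defscrollback 10000
  hardstatus alwayslastline "%H %w"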

Source 1: http://geoff.greer.fm/lscolors/
Source 2: http://www.macports.org/
Source 3: ScreenRC.tar

great SIP Softphone for Linux and Windows

Thank goodness I can uninstall X-Lite! At sones we are using a SIP-based telephony solution, and therefore a SIP softphone application is sometimes needed along with the obligatory hardware SIP telephones. Until today the only half-working software I knew for that task was X-Lite. But a colleague told me today that there is a better application which not only looks better but also works better than X-Lite.

It's called "Ekiga" and it's a GTK-based open source application which runs on Windows and Linux. It looks clean and therefore nice, and it works great.

A special tip from me: abort the welcome wizard, because the only thing it does is register you with Ekiga's own services.

Source: http://ekiga.org/

Mono 2.8 released!

Hurray! Finally the 2.8 version of Mono – the platform-independent open source .NET framework – is available as of today. I finally don't have to recompile the trunk every now and then to get my bits running :)

The Major Highlights according to the release notes are:

  • C# 4.0
  • Defaults to the 4.0 profile.
  • New Garbage Collection engine
  • New Frameworks:
    • Parallel Framework
    • System.XAML
  • Threadpool exception behavior has changed to match .NET 2.0
    • potentially a breaking change for a lot of Mono-only software
    • See information below in the "Runtime" section.
  • New Microsoft open sourced frameworks bundled:
    • System.Dynamic
    • Managed Extensibility Framework
    • ASP.NET MVC 2
    • System.Data.Services.Client (OData client framework)
  • Performance
    • Large performance improvements
    • LLVM support has graduated to stable
      • Use mono-llvm command to run your server loads with the LLVM backend
  • Preview of the Generational Garbage Collector
  • Version 2.0 of the embedding API
  • WCF Routing
  • .NET 4.0’s CodeContracts
  • Removed the 1.1 profile and various deprecated libraries.
  • OpenBSD support integrated
  • ASP.NET 4.0
  • Mono no longer depends on GLIB

Oh – they even linked my benchmark article.

Source: http://www.mono-project.com/Release_Notes_Mono_2.8

How to strip those TFS source control references from Visual Studio solutions

Every once in a while you download some code, fire up your Visual Studio and find out that this particular solution was once associated with a Team Foundation Server you don't know or don't have a login for. Like when you download source code from CodePlex and get that "Please type in your username+password for this CodePlex Team Foundation Server" prompt.

Or maybe you're working on your company's Team Foundation Server and you want to put some code out in the public. You surely want to get rid of those Team Foundation Server bindings.

There's a fairly complicated way to do this in Visual Studio, but since I managed to produce unforeseen side effects with it, I do not recommend it.

So what I did was look into the files a Visual Studio solution and project consist of. And I found that only a few files actually hold that association information. As you can see in the picture below, there are several files sitting next to the .sln and .csproj files – like the .vssscc and .vspscc files. Even inside the .csproj and .sln files there are hints that lead to the Team Foundation Server – so obviously, besides removing some files, a tool would also have to edit some files to remove the TFS association.

[Screenshot: strip-files]

So I wrote such a tool, and I am going to release its source code just beneath this article. Have fun with it. It compiles with Visual Studio and even Mono's xbuild – actually I wrote it with MonoDevelop on Linux ;) Multi-platform galore! Who would have thought of that in the founding days of the .NET platform?

[Screenshot: StripTeamFoundationServerInformation – Main.cs – MonoDevelop]

So this is easy: the small tool runs on the command line and takes one parameter, the path to a folder you want to traverse to remove all Team Foundation Server associations. Normally I run the tool on a check-out folder, so that all associations in that folder and all its subfolders are removed.
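
The actual tool is written in C#, but the gist of what it does can be sketched in a few lines of shell. The file patterns match the description above, while the sed expressions are my approximation and not taken from the tool's source:

  # remove the source-control binding files next to the .sln/.csproj files
  find . -name "*.vssscc" -delete
  find . -name "*.vspscc" -delete
  # strip the TFS GlobalSection from every solution file (GNU sed, in-place)
  find . -name "*.sln" -exec sed -i '/GlobalSection(TeamFoundationVersionControl)/,/EndGlobalSection/d' {} \;
  # strip the Scc* properties from every project file
  find . -name "*.csproj" -exec sed -i '/<Scc/d' {} \;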

So if you want to have this cool tool you just have to click here: Sourcecode Download