intuitive shell command explanations

Whether you want to or have to, you end up using shells – command line interfaces. And that usually means Stack Overflow / Google sessions, or hours of studying man-pages.

But there’s a better way to view and understand these man-pages. There’s explainshell.com. Here is an example of what it can do:

As you can see, it does not just take one command and show you the meaning/function of each parameter. It takes complex, structured commands and unfolds them nicely onto a web page – even the harder examples:
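For instance, paste a pipeline like this one into explainshell.com and it will break the whole thing down flag by flag (this is just a made-up example command):

  find /var/log -name '*.log' -mtime +30 -print0 | xargs -0 gzip -9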

bringing the thinclient back

I had a problem to solve: I wanted to have the exact same session and screen available across different workplaces/locations simultaneously – from looking at the same screen from a different floor, to having the option to just walk over to the lab desk, solder some circuits together and find the very same documents already opened on the screens over there.

One option was to use a tablet or notebook and carry it around. But that would not cover the need to have the screen content displayed on several screens simultaneously.

Also, I did not want to rely on the computing power of a notebook or tablet alone. Of course those get more powerful over time, but each step up would mean purchasing a new device.

Then, in a move of desperation, I remembered the “old days” when thin clients used to be the new kid in town. And then I tried something:

I had just recently moved all of the house server infrastructure over to Linux and Docker. So what would keep me from utilizing the computing power of that one beefy server in the basement to host all of my desktop needs?

It turns out: nothing, really. Docker is well prepared to host desktop environments. With a bit of tweaking and TigerVNC’s Xvnc I was able to pre-configure the most current Ubuntu to start my preferred MATE desktop environment in a container and expose it through VNC.

If you want to replicate this, I would recommend this repository as a starting point.
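To give a rough idea of the end result, here is a minimal sketch – not the exact image from that repository; the image name and password variable are placeholders, 5901 is the usual Xvnc display port:

  # run a containerized desktop and expose its Xvnc display port
  docker run -d --name desktop -p 5901:5901 -e VNC_PASSWORD=changeme ubuntu-mate-vnc
  # then point any VNC client at <server-ip>:5901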

Even better, I found that the Raspberry Pi single-board computers come with a free, pre-licensed and hardware-accelerated version of RealVNC.

So I took one of those Raspberry Pis, booted up Raspbian Desktop Lite and connected to the Docker container’s VNC port. It all worked just like that.

this is the Raspberry Pi client with the windowed Docker container VNC session

The screenshot above holds an additional piece of information for you: I wanted sound! Video playback works smoothly up to a certain size of the moving picture – after all, those Raspberry Pis only come with sub-Gbit/s wired networking. But to get sound working I had to take some additional steps.

First, on the Raspberry Pi that should output the sound to the speakers, you need to install and set up PulseAudio + paprefs. Once it is configured to accept audio over the network, you can configure the clients to send their audio to it.
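On Raspbian that boils down to something like this (a sketch – instead of ticking the checkbox in paprefs you can also load the TCP module by hand):

  # on the Raspberry Pi that drives the speakers
  sudo apt-get install pulseaudio paprefs
  # in paprefs: enable "network access to local sound devices",
  # which is roughly equivalent to:
  pactl load-module module-native-protocol-tcp auth-anonymous=1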

In the Docker container, a simple command then redirects all audio to the network:

pax11publish -e -S thinclient

Just replace “thinclient” with the IP or hostname of your Raspberry Pi. After a restart, Chrome started to play audio across the network through the speakers of the thin client.

Now all my screens have one of those Raspberry Pis attached to them, and with Docker I can run as many desktop environments in parallel as I wish. And because VNC does not care how many connections are made to one session, I can have all workplaces across the house connected to the same session, seeing the same content at the same time.

And yes: the UI and overall feel is silky smooth. And since VNC adapts to the available bandwidth to some extent by changing the image quality, the VNC sessions are very much usable even across the internet. Given that there is only one port for video and one port for audio, it is even possible to tunnel those sessions to anywhere you might need them.
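A minimal tunnelling sketch, assuming the usual ports (5901 for VNC display :1, 4713 for the PulseAudio native protocol – adjust both to your set-up):

  ssh -N -L 5901:localhost:5901 -L 4713:localhost:4713 user@home-server
  # then connect the local VNC client to localhost:5901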

waking up to another dying hard disk – upgrade time!

At our house I am running a medium-sized operation when it comes to all the storage and in-house / home-automation needs of the family.

This is done by utilizing several products from QNAP and Synology, plus a custom-built server infrastructure that does most of the heavy lifting using Docker.

This morning I woke up to an e-mail stating that one of the mirrored drives in the machine is reporting read errors.

Since this drive is part of a larger array of spinning-rust hard disks, just replacing it would work. But given the age of those drives, I am not particularly interested in doing more replacements in the very near future, so a more general approach seems right.

63083 lifetime hours = 2628 days = 7.2 years powered up

You can see what I mean. This drive is old. Very old. And so are its mates. Actually, this is the newest drive among the 6 or so other 1.5 TB and 1 TB drives in this array.

This redundant array is in fact still quite small and not fully used, as most storage-intensive, non-service-related disk space demands have moved to iSCSI and other means. So it is no longer the case that this many disks, with this much redundancy and this little disk space, are needed. Actual current space utilization is about 20% of the available 2 TB volume.

Time for an upgrade! Taking a look into the manual of the mainboard I had replaced two years ago, I found that it has dual NVMe M.2 ports – from which I can boot, according to that same manual.

So I thought: let’s start by replacing the boot drives and the /var/lib/docker portion with something fast.

To my surprise, Samsung is building 1 TB NVMe M.2 SSDs at a price I expected to be much higher.

Nice! So let me report back once this has shipped and I can start the re-set-up of the operating system and Docker environment – which, in all fairness, should be straightforward. I will upgrade from Ubuntu 16.04 LTS to 18.04 LTS in the same step, and the only more complex things I expect are the boot-from-ZFS (on Linux) and iSCSI set-up of the machine.
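For the pool layout I currently have something like this in mind – just a sketch of the data side, not the full boot-from-ZFS procedure, and the device names are assumptions:

  # mirror the two NVMe drives and carve out a dataset for Docker
  zpool create -o ashift=12 tank mirror /dev/nvme0n1 /dev/nvme1n1
  zfs create -o mountpoint=/var/lib/docker tank/docker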

If you have any tips or best practices, let me know.

I have just started catching up on what has happened to ZFS on Linux in the last two years. My initial decision two years ago to use Linux as the main driver OS and Ubuntu as the distribution was based on the expectation of not having this as my hobby for the coming years. And that expectation was fulfilled by Ubuntu 16.04 LTS.

Join me implementing a neural network to improve accuracy of an OpenSource indoor location tracking system

To all techies reading this:

GIST: I am looking for interested hackers who want to help me implement a neural network that improves the accuracy of Bluetooth Low Energy based indoor location tracking.

Longer version:

I am currently applying the last finishing touches to a house-wide Bluetooth Low Energy based location tracking system. (All of which will be open-sourced.)

The system consists of 10+ ESP32 Arduino-compatible WiFi/Bluetooth systems-on-a-chip – at least one per room of the house.

These modules are very low-powered and have one task: they scan for BLE advertisements and send the MAC and manufacturer data + the RSSI (signal strength) over WiFi into specific MQTT topics.
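To give you an idea of what that looks like on the wire, here is an illustration – the broker hostname, topic layout and payload are assumptions, not necessarily the exact format the system uses:

  # watch all BLE readings coming in
  mosquitto_sub -h mqtt-broker -t 'house/ble/#' -v
  # example line: house/ble/livingroom {"mac":"c4:7c:8d:aa:bb:cc","rssi":-67}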

There is currently a server component that takes this data and calculates a probable location for each Bluetooth Low Energy device it sees (like the Apple Watch I am wearing…). It currently uses a calibration phase to level in on a minimum accuracy, and then simple calculation matrices to identify the most probable location.

This is all very nice, but since I got interested in neural networks and AI development – and I think many others might be as well – I am asking interested parties to join the effort.

I do have an existing set-up as well as gigabytes of log data.

I know about previous work like “Indoor location tracking system using neural network based on bluetooth”.

Now, I am totally new to the overall concepts and tooling, and I am starting to play with TensorFlow right now.

If you want to join, let me know by commenting!

Source: http://ieeexplore.ieee.org/document/7754772/

use case #3 – sonos auto bookmarker for audiobooks and podcasts

So you have been listening to this audiobook for a while now; it’s quite long but really thrilling. In fact, it is too long for you to get through in one sitting. So you pause it and eventually listen to it on multiple devices.

We’ve got Sonos in our house and we’re using it extensively. Nice thing, all that connected goodness. It just falls short on some smart features – like remembering where you paused and resuming a long audiobook at the exact position you stopped at the last time. Every time you play a different title it resets the play position and does not remember where you were.

With some simple steps, the house will know the state of all the players it has – not only Sonos, but maybe also your VCR or media center (later use case coming up!).

Put the pieces together and you get this:

Whenever a title has been playing for longer than 10 minutes and it is paused or stopped, the smart house will remember who played what, where, and the position it was stopped at.

Whenever that person then resumes playback, the house knows where to seek to. It will resume playback, on any supported system, at that exact position.
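As a rough illustration of the idea – the topic and payload here are purely hypothetical, not the format the bookmarker actually uses – such a bookmark could be published to the house broker like this:

  mosquitto_pub -h mqtt-broker -t 'house/sonos/livingroom/bookmark' \
    -m '{"title":"some-audiobook","position_seconds":5423}'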

Makes listening to these things just so much easier.

Bonus points for a mobile app that does the same thing but just on your phone. Park the car, go into the house, audiobook will continue playback, just now in the house instead of the car. The data is there, why not make use of it?

p.s.: a big part of that I open-sourced years ago: https://github.com/bietiekay/sonos-auto-bookmarker

How to weigh your cat! – the IoT version

This is Leela. She is a 7-year-old lilac-white British Shorthair cat that lives with us. Leela had a sister who used to live with us as well, but she developed a heart condition and passed away last year. Having witnessed how quickly such conditions develop and progress, we thought we could do something to monitor Leela’s health a bit, just to have some sort of early warning if something changes.

Kid in a Candystore

As the Internet of Things is becoming a real thing these days, I felt like a kid in a candy store when I discovered that there are a couple of really, really cheap options to get a small PCB with input/output connectors onto my house WiFi network.

One of the main actors of this story is the so-called ESP8266: a very small and affordable system-on-a-chip that can run small pieces of code and connect itself to a wireless network. Even better, it comes with several inputs that can be used to do all sorts of wonderful things.

And so it happened that we needed to know the weight of our cat. She seemed to be getting a bit chubby over time, and having a reference weight would help to get her back in shape. If you have ever tried to weigh a cat, you know that it is much easier said than done.

The alternative was quickly brought up: build a WiFi-connected scale to weigh her litter box every time she uses it. And since I had recently bought an ESP8266 evaluation board, I just had to figure out how to build a scale. Looking around the house, I found a broken human scale (electronics fried). Maybe it could be salvaged as a parts donor?

A day later I had done all the reading: there is a thing called a “load cell”. Those load cells can be bought in different shapes and sizes, and – when connected to a small ADC – they deliver, well, a weight value.

I cracked the human scale open to see what was broken. Luckily, it turned out that only the electronics were completely fried – the load cells were good to go.

Look at this load cell:

Hardware

That brought the parts list of this project down to:

  • an ESP8266 – an Adafruit Huzzah in my case
  • an HX711 ADC board to amplify and prepare the signal from the load cells
  • a human scale with just enough space in the original case to fit the new electronics into and connect everything.

The HX711 board was the only thing I had to order hardware-wise. It was delivered the next day, and then it was a matter of soldering things together and throwing in a small Arduino IDE sketch.

My soldering and wiring skills are really sub-par, but it worked from the get-go. I was able to set up a small Arduino sketch and get measurements from the load cells that seemed reasonable.

Now the hardware was all done – almost too easy. The software would be the important part now. In order to create something flexible, I needed to make an important decision: how would the scale tell the world about its findings?

Software

Two basic options: PULL or PUSH?

Pull would mean that the ESP8266 offers a web service, or at least a web server, that exposes the measurements in one way or another. It would mean that a client has to poll for a new number at regular intervals.

Push would mean that the ESP8266 connects to a server somewhere, and whenever a meaningful measurement has been made it sends it out to that server. With this option there would be another decision to make: which technology to use to push the data out.

Now, a bit of history: at that time I was just about to re-implement the whole-house home automation system I had been using for the last 6 years with some more modern/interoperable technologies. For that project I had made the decision to have all events (actuators and sensors), as well as some additional information, channeled into MQTT topics.

Let’s refer to Wikipedia on this:

“MQTT (formerly MQ Telemetry Transport) is an ISO standard (ISO/IEC PRF 20922) publish-subscribe-based “lightweight” messaging protocol for use on top of the TCP/IP protocol. It is designed for connections with remote locations where a “small code footprint” is required or the network bandwidth is limited. The publish-subscribe messaging pattern requires a message broker. The broker is responsible for distributing messages to interested clients based on the topic of a message. Andy Stanford-Clark and Arlen Nipper of Cirrus Link Solutions authored the first version of the protocol in 1999.”

Something built for oil pipelines can’t be wrong for your house – can it?

So MQTT uses the notion of a “topic” to sub-address different entities within its network. Think of a topic as a simple address like “house/litterbox/weight”, and under that topic MQTT lets you publish a value as well.
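With a Mosquitto broker this is literally a one-liner from the shell (the broker hostname “mqtt-broker” and the sample value are assumptions):

  # publish a value into the topic, and watch it from another terminal
  mosquitto_pub -h mqtt-broker -t 'house/litterbox/weight' -m '4200'
  mosquitto_sub -h mqtt-broker -t 'house/litterbox/weight' -v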

The alternative to MQTT would have been something like WebSockets to push events out to clients. The decision for the home automation went towards MQTT, and so far it seems to have been the right call. More and more products and projects are also focusing on MQTT as their main message transport.

For the home automation I had already set up a demo MQTT broker in the house – and so naturally the first call for the litterbox project was to utilize that.

The folks at Adafruit provide an MQTT library for their hardware, and within minutes the scale started to send its measurements into the “house/litterbox/weight” topic of the house MQTT broker.

Some tweaking and hacking later, the scale was put together and the actual litterbox set on top.

Since Adafruit also offers a platform to send MQTT messages to and create neat little dashboards with, I have set up a little demo dashboard that shows a selection of data being pushed from the house MQTT broker to the Adafruit.io MQTT broker.

These are the raw values which are sent into the weight topic:

You can access it here: https://io.adafruit.com/bietiekay/stappenbach

So the implementation done and used now is very simple. On start-up, the ESP8266 initializes and resets the weight to 0. It then takes frequent weight measurements at the rate configured in the source code. Those measurements are monitored for certain criteria: if there is a sudden increase, it is assumed that “the cat entered the litterbox”. The weight is then monitored and averaged over time. When there is a sudden drop of the weight below a threshold, that last “high” measurement is taken as the actual cat weight and sent out to a /weight topic on MQTT. The regular measurements are sent separately to another configurable MQTT topic.

You can grab the very ugly source code of the Arduino sketch here: litterbox_sourcecode

And of course, with a bit of logic, this would be the calculated weight topic:

Of course it is not enough to just send data into MQTT topics and be done with it. You also want things like logging and data storage. Eventually we also wanted to get some sort of notification when states change or a measurement has been taken.

MQTT, the cloud and self-hosted

Since MQTT enables a lot of scenarios for implementing such actions, I am going to touch on just the two we are using in our house.

  1. We wanted to get a push notification to our phones whenever a weight measurement is taken – essentially whenever the cat has done something in the litterbox. The easiest solution: set up a recipe on If This Then That (IFTTT) and use Pushover to send out push notifications to whatever device we want.
  2. To log and monitor the data in some sort of dashboard, the easiest solution seemed to be Adafruit’s offering. For hosting inside our house, a combination of InfluxDB to store, Telegraf to gather and insert into InfluxDB, and Chronograf to render nice graphs was the best choice (a minimal logging sketch follows below this list).
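Just to illustrate how low the entry barrier is, the crudest possible self-hosted logger is a one-liner – a sketch, far simpler than the Telegraf/InfluxDB/Chronograf set-up, with the broker hostname again being an assumption:

  mosquitto_sub -h mqtt-broker -t 'house/litterbox/#' -v \
    | while read line; do echo "$(date -Is) $line"; done >> litterbox.log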

Most of the above can be done in the cloud (as in: outside the house, with MQTT being the channel out) or inside the house with everything self-hosted. Some additional articles on this blog will cover these topics later.

There is lots of opportunity to add more logic, but as far as our experiments and requirements go we are happy with the results so far – we now regularly get a weight reading plus the added information of how often the cat is using her litterbox. Especially for some medical conditions this is quite interesting and important information to have.

in case of emergency: spoof your MAC address


There have been several occasions in the past years where I had to quickly change the MAC address of my computer in order to get proper network connectivity. Be it a corporate network that does not allow me to use my notebook in the guest WiFi because the original MAC address is “known”, or any other possible reason you can come up with…

Now, this is relatively easy on Mac OS X – you can do it with just one line in the shell. But now there’s an app for that. It’s called Spoof:


“I made this because changing your MAC address in OS X is harder than it should be. The Wi-Fi card needs to be manually disassociated from any connected networks in order for the change to apply correctly – super annoying! Doing this manually each time is tedious and lame.

Instead, just run spoof and change your MAC address in one command. Now for Linux, too!”

Source: https://github.com/feross/spoof
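For reference, the manual way and the spoof way look roughly like this (a sketch – the interface name en0 and the spoof sub-command are taken from its README and may differ on your machine):

  # the classic one-liner on OS X
  sudo ifconfig en0 ether aa:bb:cc:dd:ee:ff
  # or, with spoof installed via npm (npm install -g spoof)
  sudo spoof randomize en0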

Debugging Linux: Latency heatmaps

“Odd patterns of I/O latency can be hidden by line graphs and summary statistics, and revealed by histograms and heat maps. In my previous post I showed my Linux iosnoop tool, which can trace block device I/O along with timestamps and latency. This information can be visualized, revealing any odd patterns.”


Source: http://www.brendangregg.com/blog/2014-07-23/linux-iosnoop-latency-heat-maps.html
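If you want to try this yourself, a rough sketch would be the following (assuming Brendan Gregg’s perf-tools repository; see the linked article for generating the actual heat maps from the trace output):

  git clone https://github.com/brendangregg/perf-tools
  cd perf-tools
  sudo ./iosnoop -ts > out.iosnoop   # trace block I/O incl. timestamps and latency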

Do you need an alternative shell for your terminal?

“Commands have been a big part of computing ever since the 1970’s.  Their power comes from their simplicity.  Just type a word or two to do what you want.  The time has come to bring this power together with the usability and convenience of modern interfaces.”

“Xiki is open and flexible.  It’s open source, and brings together tools, languages, shells, and text editors, rather than competing with them.  Open formats and languages are the best thing for the tech ecosystem.  HTML and JSON made the web what it is today.  And the web arguably made everything else. 

Xiki strives to be the simplest possible way (and ways) to create interactive interfaces.  This means a text in and text out interface.  Since everything is text, almost nothing is against the rules when you’re creating an interface in Xiki.  Xiki stands for “expanding wiki”, and is inspired by the wiki philosophy of fully editable text, with simple syntaxes (like “>” for a heading, and “-” for a bullet).  Xiki extends wiki ideas to user interface in general.”

Source: https://www.kickstarter.com/projects/xiki/xiki-the-command-revolution

Nitrous – full IDE in your browser – with Collaboration!

“Nitrous is a backend development platform which helps software developers save time by cutting out the repetitive parts of creating development environments and automating them.

Once you create your first development environment, there are many features which will make development easier.”


So what you’re getting is:

  • a virtual machine operated for you and set up with a single click
  • a full-featured IDE in your browser
  • code collaboration by inviting others to edit your project
  • a debugging environment in which you can test-run and work with your code

Here are some screenshots to give you a feel for it:

Source: https://www.nitrous.io/