A reminder of the recently celebrated 42nd mission anniversary of the still active, still data-transmitting Voyager 2 spacecraft.
NASA’s Voyager 2 is the second spacecraft to enter interstellar space. On Dec. 10, 2018, the spacecraft joined its twin—Voyager 1—as the only human-made objects to enter the space between the stars. (Voyager 2 Homepage)
And here is what makes this such an astonishing achievement.
Think of this: You are flying at more than 34,000 miles per hour. You are more than 18.5 billion kilometers away from Earth (a one-way trip from Earth to you takes more than 16 hours at light speed). And on top of that, you are still able to send data back to Earth at 159 bytes per second.
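The light-delay figure roughly checks out. A quick back-of-the-envelope calculation, assuming a distance of about 18.5 billion kilometers (roughly 11.5 billion miles) and c ≈ 299,792 km/s:

```shell
# One-way light-travel time from Earth to Voyager 2,
# assuming ~18.5 billion km distance and c = 299,792 km/s
awk 'BEGIN { printf "%.1f hours\n", 18.5e9 / 299792 / 3600 }'
# → 17.1 hours
```

Which matches the “more than 16 hours” figure above.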
Microsoft has recently been releasing a lot of tools and assets for developers and terminal monkeys.
This is good. Very nice of them.
The recent release of a font specifically for terminal and code editing use seems worth a mention here:
Cascadia Code was announced this past May at Microsoft’s Build event. It is the latest monospaced font shipped from Microsoft and provides a fresh experience for command line experiences and code editors. Cascadia Code was developed hand-in-hand with the new Windows Terminal application. This font is most recommended to be used with terminal applications and text editors such as Visual Studio and Visual Studio Code. (Cascadia Code announcement)
What’s most interesting about this: it’s got code ligatures. Recently a lot of development-focused fonts have shown up, incorporating special glyphs for development-specific character combinations:
Cascadia Code supports programming ligatures! Programming ligatures are most useful when writing code, as they create new glyphs by combining characters. This helps make code more readable and user-friendly for some people. (Cascadia Code announcement)
If you, like me, like looking into new and emerging tools and technologies, you might also take a look at WireGuard.
WireGuard® is an extremely simple yet fast and modern VPN that utilizes state-of-the-art cryptography. It aims to be faster, simpler, leaner, and more useful than IPsec, while avoiding the massive headache. It intends to be considerably more performant than OpenVPN. WireGuard is designed as a general purpose VPN for running on embedded interfaces and super computers alike, fit for many different circumstances. Initially released for the Linux kernel, it is now cross-platform (Windows, macOS, BSD, iOS, Android) and widely deployable. It is currently under heavy development, but already it might be regarded as the most secure, easiest to use, and simplest VPN solution in the industry. (bold statement from the WireGuard website)
To get started with WireGuard on Linux and iOS I’ve used Graham Stevens’ very nice tutorial: WireGuard Setup Guide for iOS.
This guide will walk you through how to set up WireGuard in a way that all your client’s outgoing traffic will be routed via another machine (server). This is ideal for situations where you don’t trust the local network (public or coffee shop wifi) and wish to encrypt all your traffic to a server you trust, before routing it to the Internet. (WireGuard Setup Guide for iOS)
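For orientation, a client configuration for such a full-tunnel setup looks roughly like the sketch below. All keys, addresses and the endpoint are placeholders I made up for illustration; the guide walks you through generating the real values with wg genkey / wg pubkey.

```ini
[Interface]
# The client's own private key (placeholder)
PrivateKey = <client-private-key>
# The client's address inside the tunnel network
Address = 10.0.0.2/24
DNS = 1.1.1.1

[Peer]
# The server's public key (placeholder)
PublicKey = <server-public-key>
# 0.0.0.0/0 and ::/0 route ALL outgoing traffic through the tunnel
AllowedIPs = 0.0.0.0/0, ::/0
Endpoint = <server-ip>:51820
```

The AllowedIPs line is what makes this the “route everything through the server” variant described above.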
The 50th Day of the Season of Bureaucracy. The Fluxday marking the approach of the Season of The Aftermath. (Aftermath (season))
I’ve written on this topic before here. And as developers venture more into these generative algorithms it’s all the more fun to see even the intermediate results.
Oskar Stålberg writes about his little experiments and bigger libraries on Twitter. The above short demonstration was created by him.
Especially worth a look is the library he made available on GitHub: mxgmn/WaveFunctionCollapse.
Some more context, of questionable helpfulness:
In quantum mechanics, wave function collapse occurs when a wave function—initially in a superposition of several eigenstates—reduces to a single eigenstate due to interaction with the external world. This interaction is called an “observation”. It is the essence of a measurement in quantum mechanics which connects the wave function with classical observables like position and momentum. Collapse is one of two processes by which quantum systems evolve in time; the other is the continuous evolution via the Schrödinger equation. Collapse is a black box for a thermodynamically irreversible interaction with a classical environment. Calculations of quantum decoherence predict apparent wave function collapse when a superposition forms between the quantum system’s states and the environment’s states. Significantly, the combined wave function of the system and environment continues to obey the Schrödinger equation. (Wikipedia: Wave function collapse)
Right. Well. Told you. Here are some nice graphics of this applied to calm you:
If you, like me, every once in a while need to type the same thing again and again, it might get as tiresome for you as it did for me.
A specific example: I very frequently need to have the current date available to be entered.
Be it because I need to name a file correctly, prepending it with the current date, or because I need to refer to a specific date in a text I am currently typing.
The common scheme for dates I am using is YYYY-MM-DD. The 24th of September 2019 becomes 2019-09-24.
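On Linux or macOS the same pattern is a shell one-liner, handy for scripts and filenames:

```shell
# Print today's date as YYYY-MM-DD (ISO 8601)
date +%Y-%m-%d

# Example: prepend the date to a new file's name
touch "$(date +%Y-%m-%d)-notes.txt"
```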
When I am on Windows I use a small utility called “TyperTask” to have a system-wide shortcut available that enters the current date at the press of a button.
As you can see in the screenshot above, by simply adding to / editing the TXT file you can specify new shortcuts. In the above case ALT+D or CTRL+SHIFT+D will generate my desired date pattern.
Still working on these…
Still lots of errors and challenges with positioning and the casing. It works electrically and in software. It does not yet fit into a case.
It’s supposed to accommodate these sensors:
- barometric pressure
- PIR motion
- light intensity
- Bluetooth scan / BLE connectivity
- WiFi scan / WiFi connectivity
And an RGB LED as output. All powered via USB and an ESP32.
As iOS 13 has introduced a system-wide dark mode into my workflow, I had a good reason to give the CSS of this website a little spin.
Depending on your system settings this website now supports Dark Mode.
Episode 006: “Monitoring Release Pipeline” is finished and will soon be available for download and time-sovereign listening.
This time Andreas talks with me about:
- Traefik: https://traefik.io/
- Drive-in mailboxes: https://twitter.com/aheil/status/1173…
- Automated Konbini: https://www.forbes.com/sites/akikokat…
- New Packstationen (parcel lockers): https://www.golem.de/news/deutsche-po…
- Arduino aquarium feeder: https://www.schrankmonster.de/2019/09…
- Louis Rossmann Macbook Repairs: https://www.youtube.com/user/rossmann…
- Make book: https://www.dpunkt.de/buecher/12488/9…
- Icinga: www.icinga.com
- SONOS Auto Bookmarker: https://github.com/bietiekay/sonos-au…
- TOTP – Time based one-time password: https://en.wikipedia.org/wiki/Time-ba…
- 1Password: https://1password.com/
- Enpass: https://www.enpass.io/
- ODroid GO: https://wiki.odroid.com/odroid_go/odr…
- ODroid Go: https://www.hardkernel.com/shop/odroi…
Do you always slack off on your computer and worry about getting busted? Not anymore, because Daytripper is here to save the day! (daytripper)
Booting a computer does not happen extremely often in most use-cases, yet it’s a field that has not seen as much optimization and development as others have.
If you own a modern age phone it’s very likely that it will store the photos you take in a wonderful format called HEIC – or “High Efficiency Image File Format (HEIF)”.
So HEIC does not quite fit in everywhere yet. But you can make it fit on Linux.
ImageMagick and current GIMP installations apparently still don’t come pre-compiled with HEIF support. But you can install a tool to easily convert an HEIC image into a JPG file on the command line:
apt install libheif-examples
and then the tool heif-convert is your friend.
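A typical invocation takes the source image and a target filename; with a small loop you can batch-convert a whole folder (the filenames here are just examples):

```shell
# Convert a single image
heif-convert IMG_0001.heic IMG_0001.jpg

# Batch-convert every .heic file in the current directory
for f in *.heic; do
  heif-convert "$f" "${f%.heic}.jpg"
done
```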
You might have asked yourself why some phones charge up faster than others. Maybe the same phone charges at different speeds when you’re using a different cable or power supply. It might even not charge at all.
There is some very complicated trickery in place to make those cables and power supplies do things in combination with active devices like phones. Much of this is implemented by standards like “Quick Charge”:
Quick Charge is a technology found in Qualcomm SoCs, used in devices such as mobile phones, for managing power delivered over USB. It offers more power and thus charges batteries in devices faster than standard USB rates allow. Quick Charge 2 onwards is primarily used for wall adaptors, but it is also implemented in car chargers and powerbanks (for both input and output power delivery). (Wikipedia: Quick Charge)
So in a nutshell: if you are able to speak the Quick Charge protocol and have the right cable and power supply, you can get anything between 3.6V and 20V out of such a combination, just by telling the power supply to do so.
This is great for maker projects in need of more power. There are, however, lots of things to consider and be cautious about.
“Speaking” the protocol just got easier though. You can take this open source library and “power up your project”:
The above-mentioned usage code will give you 12V output from the power supply. Of course you can also do…:
Be aware that your project needs to handle the (higher) voltage. It’s really not something you should just try blindly. But you knew that.
More on Quick Charge also here.
NTT DoCoMo might not have been the first to release feature phones with actual emoji characters to be used in text messaging. But their set of original emojis is just oh-so-beautiful to look at.
With the help of Monica Dinculescu we now can enjoy these emojis on our modern era computing machines.
You can either download the font for free directly from Monica’s page or use her SVG code to make further use of the great emojis.
If you go for the SVG link you will get an overview like the one at the start of this post. If you want to work further with the raw vector data (SVG) in there, you can use this simple trick:
Step 1: locate the emoji you want in the code of the page. Maybe by utilizing the developer tools of your browser.
Step 2: Copy that specific element that you want to your clipboard / into a new text document.
Step 3: add the proper header tag before the element you’ve copied.
<?xml version="1.0" encoding="utf-8"?>
Step 4: Save the contents now as a file with the .svg ending. You can now open it up in any SVG compatible editor, like Inkscape.
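Steps 3 and 4 can also be done in one go on the command line, assuming you saved the copied element to a file called emoji-element.txt (a name I picked just for this example):

```shell
# Prepend the XML header and write a standalone .svg file
{
  echo '<?xml version="1.0" encoding="utf-8"?>'
  cat emoji-element.txt
} > emoji.svg
```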
We’ve got several quite big fish tanks in our house, mainly inhabited by freshwater turtles.
These turtles need to be fed every once in a while. And while this is not an issue normally it’s an issue if you leave the house for travel for an extended period of time.
Of course there are humans checking on everything in the house regularly, but as much as can be automated should and will be automated in our household. So the requirement was to have the turtle feeding automated.
To achieve this it would be necessary to dispense a fixed amount of turtle food into the tanks on a schedule, with some checks in the background (like water quality and such).
It’s been quite a hassle to come up with a plan for how the hardware should look and work. Ultimately I’ve settled on retrofitting an off-the-shelf fish pond feeder to become controllable through MQTT.
The pond feeder I’ve found and used is this one:
It’s not really worth linking to a specific product detail page as this sort of feeder is available under hundreds of different names. It always looks the same and is priced right around the same.
If you want to build this yourself, you want one that looks like the above. I’ve bought 3 of them and they all seem to come out of the same factory somewhere in China.
Anyway. If you’ve got one you can easily open it up and start modifying it.
The functional principle of the feeder is rather simple:
- turn the feeder wheel
- take the micro-switch status into account – when it’s pressed down, the wheel must be pushing against it
- turn it until the micro-switch is not pressed anymore
- turn some more until it’s pressed again
Simple. Since the switch status is not known after a power loss / reboot, a calibration run is necessary (even with the factory electronics) every time it boots up.
After opening the feeder I’ve cut the two cables going to the motor as well as the micro-switch cables. I’ve added a 4-pin JST-XH connector to both ends, so I can restore the original state if desired.
These are all the parts needed:
I am using a Wemos D1 Mini and a couple of additional components apart from the prototype board:
A PN2222 NPN transistor, a 1N4007 rectifier diode and a 220 Ohm resistor.
I’ve connected everything according to this schematic I’ve drawn with Fritzing:
I’ve then prototyped away and put everything on the PCB. Of course with very limited soldering skill:
As you can see the JST-XH connector on Motor+Switch can now be connected easily to the PCB with all the parts.
Make sure you check polarity and that you did correctly hook up the motor and switch.
When done correctly, the PCB (I’ve used a 40mm x 60mm prototype PCB) and all cables will fit into the case. There’s plenty of room and I’ve put it to the side. I’ve also directly connected a USB cable to the USB port of the Wemos D1 Mini. As long as you supply at least 1A it will all work.
Since the Wemos D1 Mini sports an ESP8266 and is well supported by Arduino it was clear to me to use Arduino IDE for the software portion of this project.
Of course everything, from schematics to the sourcecode is available as open source.
To get everything running you need to modify the .ino file in the src folder like so:
What you need to configure:
- the output pins you have chosen – D1+D2 are pre-configured
- WiFi SSID + PASS
- MQTT Server (IP(+Username+PW))
- MQTT Topic prefix
Commands can be sent through MQTT to the /feed topic.
There are overall two MQTT topics:
This topic holds the current state of the feeder. It will show a number starting from 0. When the feeder is ready it will be 0. While it is feeding it will be 1 or higher – counting down for every successful turn done. There is a safety cut-off for the motor: if the motor is active longer than configured in the MaximumMotorRuntime variable, it will shut off by itself and set the state to -1.
This topic acts as the command topic to start / control the feeding process. If you want to start the process you would send the number of turns you want to happen. So 1 to 5 seems reasonable. The feeder will show the progress in the /state topic. You can update the amount any time to shorten / lengthen the process. On the very first feed request after initial power-up / reboot the feeder will do a calibration run. This is to make sure that all the wheels are in the right position to work flawlessly.
So if you want to make it start feeding 3 times:
mosquitto_pub -t house/stappenbach/feeder/feeder-00F3B839/feed -m 3
And if you want to see the state of the feeder:
mosquitto_sub -v -t house/stappenbach/feeder/feeder-00F3B839/state
All in all, three of these are going to be running in our household, and feeding is going to be controlled either by Alexa voice commands or through Node-RED automation.
I am still working on it – but it is coming together nicely. During the next vacation our fish tanks are going to be well fed.
TIL that I can do something which I assumed everybody could do: I can make myself hear a roaring thunder sound by flexing a muscle I did not even know about until now.
It’s quite interesting. The muscle is named “Tensor tympani” and it’s here:
The tensor tympani acts to dampen the noise produced by chewing. When tensed, the muscle pulls the malleus medially, tensing the tympanic membrane and damping vibration in the ear ossicles and thereby reducing the perceived amplitude of sounds. (https://en.wikipedia.org/wiki/Tensor_tympani_muscle#Voluntary_control)
So the eye has an Iris to control how much light makes it in. The ear has this muscle to dampen too loud sounds. And apparently not everyone is able to willingly control it. Bummer!
Contracting muscles produce vibration and sound. Slow twitch fibers produce 10 to 30 contractions per second (equivalent to 10 to 30 Hz sound frequency). Fast twitch fibers produce 30 to 70 contractions per second (equivalent to 30 to 70 Hz sound frequency). The vibration can be witnessed and felt by highly tensing one’s muscles, as when making a firm fist. The sound can be heard by pressing a highly tensed muscle against the ear, again a firm fist is a good example. The sound is usually described as a rumbling sound. (https://en.wikipedia.org/wiki/Tensor_tympani_muscle)
Some individuals can voluntarily produce this rumbling sound by contracting the tensor tympani muscle of the middle ear. The rumbling sound can also be heard when the neck or jaw muscles are highly tensed as when yawning deeply. This phenomenon has been known since (at least) 1884.
Interesting theories are now starting in my head. As I am very sensitive to chewing noises of all sorts – whether produced by myself or by others – this could be an explanation as to why.
Now excuse me, I need to flex this muscle and make the thunder roar!
Augmented Reality – AR – has been getting buzz here and there for almost 20 years now. Even with hardware becoming more powerful and optics becoming cheaper and more efficient, it is still far from widely used and available.
Many refer to some one-trick-pony feature in location-based games like “Pokemon Go” as “AR”. Actually useful cases of AR exist, but they are not feasible with current hardware generations.
Nevertheless a team in California has taken out the scissors and keyboards and made HoloKit:
HoloKit features super sharp optics quality and a 76-degree diagonal field of view. Pairing with a smartphone, HoloKit can perform an inside out tracking function, which uses the changing perspective on the outside world to note changes in its own position. HoloKit merges the real and the virtual in a smart way. While you see through the real world, virtual objects are blended into it. Powered by the accurate gyro and camera on smart phones, HoloKit solidly places virtual objects onto your table or floor, as if they were physically there without physical markers. These virtual objects will stay in the same place even if you walk away, just like real physical objects. (https://holokit.io/)
HoloKit is different from screen-based AR experience like Tango. You can directly see through the headset and view the real world as is, and in the meantime the virtual objects are projected on top of the real world, as opposed to viewing both the real and the virtual through a smartphone camera.
Browsers can do many things. It’s probably your main window into the vast internet. Lots of things need visualization. And if you want to know how it’s done, maybe do one yourself, then…
And to further learn what it’s all about, go to Amelia Wattenberger’s blog and take a stroll:
So, you want to create amazing data visualizations on the web and you keep hearing about D3.js. But what is D3.js, and how can you learn it? Let’s start with the question: What is D3? (An Introduction to D3.js)
While it might seem like D3.js is an all-encompassing framework, it’s really just a collection of small modules. Here are all of the modules: each is visualized as a circle – larger circles are modules with larger file sizes.
The demon core was a spherical 6.2-kilogram (14 lb) subcritical mass of plutonium, 89 millimetres (3.5 in) in diameter, that was involved in two criticality accidents, on August 21, 1945 and May 21, 1946. (Wikipedia: Demon core)
Now you can have fun without the death-risk in the comfort of your home.
Meet the party-core:
If you’re interested in this topic I can recommend a book:
Atomic Accidents: A History of Nuclear Meltdowns and Disasters: From the Ozark Mountains to Fukushima
A “delightfully astute” and “entertaining” history of the mishaps and meltdowns that have marked the path of scientific progress (Kirkus Reviews, starred review).
Augmented Reality needs proper 3D geometry and the ability to sense the environment to interact with it. At some point I would expect tools to show up that allow us to do some of this ourselves.
Seems like we’re one step closer. Ubiquity6 is reaching out to give interested users early access:
We’re giving early access to our 3D mapping tools for creators and artists! If you’re interested in trying it out sign up for early access here: https://ubiquity6.typeform.com/to/bmpbkB (Ubiquity6 on Twitter)
Of course. I applied. And I’ve just started testing.
Have you ever wanted full control over your communication tool? #SnapOnAir #BlaspBerry v2. A true QWERTY computer KB. @Raspberry_Pi zero W. @Quectel 3G cellular chip. #Lora RFM95 chip. All open source. (pwav robot on Twitter)
There’s a full twitter thread here. More pictures, more information.
And there’s a GitHub repository with some schematics, configurations and so on…
I am having a hard time learning Japanese, and reading/writing the kanji especially.
Having to write Japanese city names frequently (for example when doing searches), I remember the spoken version of a name but do not quite remember the kanji version yet. Also I do not want to switch keyboard languages back and forth.
Especially on macOS and iOS there is a nice way around this. With the built-in “Text Replacement” feature of your Mac or iPhone/iPad you can easily mass-import a mapping between the romanized version of a word and the Japanese kanji version of that word.
While typing you will then be presented with recommended text replacements – effectively the kanji of what you’ve just tried to write.
Unfortunately I do not know a way how to mass-import these text-replacements on iOS.
But if you own a macOS computer and you have it synced over iCloud with your mobile phone or tablet you will likely be able to open the text replacement pane in your system settings and import this plist file into it. Simply drag the file (after unzipping the ZIP file) into the text replacement window.
Download the Tokyo-Text-Replacement.zip file, extract it (by double-clicking) and drag the .plist file into the Text Replacement window.
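For reference, such a text-replacement file is a plain XML property list. A minimal example with a single romaji-to-kanji mapping might look like this (the phrase/shortcut key names follow what a System Preferences export produces, but treat the exact structure as an assumption and check it against the downloaded file):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<array>
    <dict>
        <key>phrase</key>
        <string>東京</string>
        <key>shortcut</key>
        <string>toukyou</string>
    </dict>
</array>
</plist>
```

Here “toukyou” is the romanized shortcut you type and 東京 is the kanji replacement that gets suggested.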
To derive your own files, you can find the raw data – a list of all designated ken and ward names in Tokyo – here:
In Nodes you write programs by connecting “blocks” of code. Each node – as we refer to them – is a self contained piece of functionality like loading a file, rendering a 3D geometry or tracking the position of the mouse. The source code can be as big or as tiny as you like. We’ve seen some of ours ranging from 5 lines of code to the thousands. Conceptual/functional separation is usually more important. (Nodes.io)
*(not to be confused with node.js)
It’s been a year since Zenvent posted this:
A Hackintosh (a portmanteau of “Hack” and “Macintosh”) is a computer that runs macOS on a device not authorized by Apple, or one that no longer receives official software updates. (https://en.wikipedia.org/wiki/Hackintosh)
The 27th Day of the Season of Bureaucracy: The Day of the Sloth, Holy Day of Slothage. Kick back. Hang around. Grow Moss. (Sloth-day)
This time it’s about:
- Scanner Pro on iOS – https://apps.apple.com/us/app/scanner-pro/id333710667
- Scanbot on iOS – https://scanbot.io/en/index.html
- Subscription models for software and services
- RING camera and surveillance system – https://de-de.ring.com/
- Canary Indoor Camera – https://canary.is/
- Surveillance Station – https://www.synology.com/en-global/surveillance
- Ring has more than 400 police “partnerships” – https://arstechnica.com/tech-policy/2019/08/ring-has-more-than-400-police-partnerships-company-finally-says/
- Jumbo Privacy – https://blog.jumboprivacy.com/ – App Store: https://apps.apple.com/us/app/jumbo-privacy/id1454039975?ls=1
- Tim Berners-Lee’s project “Solid”: https://solid.mit.edu/ – https://en.wikipedia.org/wiki/Solid_(web_decentralization_project) – https://solid.inrupt.com/how-it-works
- Ubuntu – https://ubuntu.com/
- Throw-Away Remote VNC Linux Desktop in a Docker container – https://www.schrankmonster.de/2019/08/27/a-throw-away-linux-desktop-in-a-container/
- Virtual Network Computing – https://en.wikipedia.org/wiki/Virtual_Network_Computing
- Stephen Wolfram – https://blog.stephenwolfram.com/
- Speed of Light in Medium – https://en.wikipedia.org/wiki/Speed_of_light