Open Source GUI Tool For OpenCV And Deep Learning

AI and deep learning for computer vision projects have come to the masses. This can be attributed partly to community projects that help ease the pain for newbies. [Abhishek] contributes one such project, called Monk AI, which comes with a GUI for transfer learning.

Monk AI is essentially a wrapper for computer vision and deep learning experiments. Written in Python, it lets users fine-tune deep neural networks using transfer learning. Out of the box it supports Keras and PyTorch, and with a few lines of code you can get started with your very first AI experiment.
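To give a flavor of just how few lines that is, here is a minimal sketch of a Monk transfer-learning experiment, patterned on the project's quickstart examples. The exact import path and argument names may vary between releases, so check the repo docs:

```python
# A minimal Monk AI transfer-learning experiment (sketch).
# Module and argument names follow the project's quickstart examples
# and may differ between releases; the import path depends on how
# Monk was installed.
from keras_prototype import prototype   # or: from pytorch_prototype import prototype

gtf = prototype(verbose=1)
gtf.Prototype("cats-vs-dogs", "experiment-1")   # project / experiment names

# One call sets up the data, a pretrained backbone, and sane defaults.
gtf.Default(dataset_path="./dataset/train/",    # folders of images, one per class
            model_name="resnet50",              # pretrained backbone to fine-tune
            freeze_base_network=True,           # train only the new head
            num_epochs=5)

gtf.Train()                                     # runs training and logs results
```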

[Abhishek] also has an object detection wrapper (GitHub) with some useful examples, as well as a Monk GUI (GitHub) tool that looks similar to the tools available in commercial packages for running training and inference experiments.

The documentation is a work in progress, though it seems like an excellent concept to build on. We need more tools like these to help more people get started with deep learning. Hardware such as the Nvidia Jetson Nano and Google Coral is affordable and facilitates learning and experimentation.

LoRa Tutorials For The DIY Masses

LoRa is the go-to tech for low-power, long-range wireless sensor networks. Designing with off-the-shelf modules can be a boon or a bane depending on the documentation and support. Luckily, [Renzo] has prepared a set of tutorials to get you started.
In his seven-part series of write-ups, [Renzo] starts by connecting the E32 module from AliExpress to an Arduino as well as an ESP8266 to demonstrate essential communications. He then discusses the configuration options and the library he created to make life a bit easier. Following that is a series of posts discussing transmission types as well as power-saving methods, including sleep modes and wake-on-radio.
The information will be extremely handy for anyone starting out with the SX1276/SX1278 wireless modules, which are relatively inexpensive compared to more standardized development kits. We love the abundance of Fritzing diagrams, Arduino code, and the helper library, and hope someone will build on it. You can get the library from GitHub for your tinkering pleasure.
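[Renzo]'s library targets Arduino, but since the E32 is just a UART device in transparent mode, the core idea fits in a few lines of Python as well. Here is a sketch using pyserial and a USB-UART adapter, assuming the module's M0/M1 pins are tied low (normal mode) and the factory settings of 9600 8N1; adjust the port name for your setup:

```python
# Minimal transparent-mode send/receive with an EByte E32 over a
# USB-UART adapter: a Python sketch of the same idea [Renzo]'s
# Arduino library wraps. Assumes M0 and M1 tied low (normal mode)
# and factory settings (9600 8N1).
import serial

port = serial.Serial("/dev/ttyUSB0", 9600, timeout=2)  # adjust for your setup

# Transmit: in transparent mode, anything written to the UART goes over the air.
port.write(b"hello over LoRa\n")

# Receive: any node on the same channel/address prints what it hears.
line = port.readline()
if line:
    print("received:", line.decode(errors="replace"))
port.close()
```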
If you are looking for ideas for this newly discovered skill, have a look at LoRa Enabled Mailbox as well as Electric Fence Monitoring with The Things Network for a bit of IoT action.

Soviet Soyuz Clock Teardown

We love spacecraft and we definitely love teardowns, especially if they are of vintage devices. [Ken Shirriff] writes about taking apart the digital clock module from the Soviet Soyuz series of spacecraft, and there are a lot of interesting bits to the device. After all, it has been into space.

The Soyuz series of spacecraft made their maiden voyage in 1966, and are still flying today. The clock in question comes from somewhere in the middle, around 1996. On the outside it seems like any other spaceship gizmo: the digital clock keeps local time and offers a stopwatch and an alarm function. The guts are much more interesting, with no fewer than ten PCBs sandwiched inside the small enclosure.

The system consists of dual-layer boards with a mix of SMD and through-hole components, interconnected by a series of wires bunched and packed into a wiring harness. The pictures show a very clever way of setting up the stack, and the system is serviceable by design, since the bunch opens up like a book. This gives access to the unique-looking components, which include 14-pin flat-pack chips, large ceramic multicoil inductors, green resistors, and orange rectangular diodes.

There are isolated PSU boards, control boards, clock circuitry, some glue logic to put things together, and LED displays with driver circuits. [Ken Shirriff] dives into the clocking circuit and the various parts involved along with a comparison with US technology. There is a lot of interesting detail in these boards, and it may be a source of inspiration for some.

If you are looking for more spaceborne tech, have a look at the one that stowed away on the International Space Station.

Thanks for the tip, [Thorsten Eggert]!


Finding Pre-Trained AI In A Modelzoo Using Python

Training a machine learning model is not a task for mere mortals, as it takes a lot of time and computing power. Fortunately, there are pre-trained models out there that one can use, and [Max Bridgland] decided it would be a good idea to write a Python module to find and view such models from the command line.

For the uninitiated, Modelzoo is a place where you can find open source deep learning code and pre-trained models. [Max] taps into the (undocumented) API, allowing a user to find and view models directly. When you run the utility, it goes online and retrieves the categories and then the details of the available models. From there, the user can select a model and the application will simply open the corresponding GitHub repository. It sounds simple, but it has a lot of value, since the code is designed to be extended so that users working on such projects may automate the downloading part as well.
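Since the API is undocumented, here is only a sketch of the general pattern the tool follows: fetch categories, list models, and open the chosen repo. The endpoint and field names below are placeholders, not the actual routes [Max]'s code uses:

```python
# Sketch of the general pattern [Max]'s tool follows: query an API for
# categories and models, then open the chosen model's GitHub repo.
# The real Modelzoo API is undocumented; the base URL and field names
# below are hypothetical placeholders.
import webbrowser
import requests

BASE = "https://example-modelzoo-api.dev"   # hypothetical base URL

categories = requests.get(f"{BASE}/categories", timeout=10).json()
print("categories:", categories)

models = requests.get(f"{BASE}/models",
                      params={"category": categories[0]}, timeout=10).json()
for i, m in enumerate(models):
    print(i, m["name"])                     # assumed field name

choice = models[int(input("pick a model: "))]
webbrowser.open(choice["github_url"])       # assumed field name
```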

We have seen projects that use machine learning to detect humans, and with AI trending, community tools such as this one help beginners get started even faster.

An Apartment-Hunting AI

Finding a good apartment is a lot of work: searching websites for available places and then cross-referencing them against a list of desired characteristics. This can take hours, days, or even months, but in a world where cars drive themselves, it is possible to use machine learning in your hunt.

[veesot] lives in a city between Europe and Asia and was looking for a new home. His goal was to create a model that could use historical data not only to suggest whether an advertised price was right, but also to recommend waiting by predicting price drops in the future. The data set includes parameters such as “area”, “district”, and “number of balconies”, which he used to try to determine an optimal property to view.

There is a lot that [veesot] describes in his post, including cleaning the data by removing flats that are too small or too large. This essentially creates a training data set that will allow the machine learning system to generate usable output. [veesot] also added parameters such as district, which relates to geographical location, the age of the building, and even the materials used in construction.
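For illustration, the cleaning step might look something like the following pandas sketch; the column names and area thresholds here are made up, not taken from [veesot]'s data set:

```python
# A sketch of the kind of cleaning [veesot] describes: drop implausibly
# small or large flats before training. Column names and thresholds are
# illustrative.
import pandas as pd

df = pd.read_csv("listings.csv")

# Keep only flats within a sane area range (square meters).
df = df[(df["area"] >= 15) & (df["area"] <= 250)]

# Encode categorical features such as district and wall material.
df = pd.get_dummies(df, columns=["district", "material"])
print(df.shape)
```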

There is also an interesting bit about analyzing the variables and determining cross-correlation, which ultimately leads to the obvious conclusions that the central/older districts have older apartments and the newer ones are larger. It makes for a few cool graphs, but the code can certainly come in handy when dealing with similar data sets. The last part of the write-up discusses applying linear regression and then testing its accuracy. Interpreting the trained model and the values of its coefficients produces interesting results.
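That regression-and-scoring step maps naturally onto scikit-learn; here is a sketch of the last part, continuing from the cleaning example above (again with illustrative feature names):

```python
# Fitting and sanity-checking a price model with scikit-learn, mirroring
# the write-up's linear regression step. Continues from the cleaning
# sketch above (df); column names are illustrative.
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

X = df.drop(columns=["price"])
y = df["price"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = LinearRegression().fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))

# The coefficients show how each feature moves the predicted price.
for name, coef in zip(X.columns, model.coef_):
    print(f"{name}: {coef:+.1f}")
```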


Millennium Falcon HID: Get Unity To Talk To Teensy

Here’s one that proves a hardware project can go beyond blinking LEDs and dumping massive chunks of data onto a serial console. Those practices are fine for some, but [dimtass] has found a more elegant hack for a more civilized age: his 3D Millennium Falcon model gets orientation data from an IMU acting as a USB HID device.

The hardware involved is an MPU6050 6-axis sensor interfaced with a Teensy 3.2 board. [dimtass] documents his approach to calibrating the IMU, going a bit further by using a Python script to generate offsets. We’ve advocated using Jupyter notebooks in the past, and this is a good example: Jupyter plots the data and visualizes the effect of the offsets in a second pass.
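The gist of that calibration is simple: with the sensor held still, the average reading is the bias to subtract. Here is a sketch of the idea, assuming the raw MPU6050 samples were logged from the Teensy to a CSV file (the file and column names are illustrative, not [dimtass]'s actual ones):

```python
# Offset calibration sketch: average stationary readings to find the bias.
# Assumes raw MPU6050 samples were logged to a CSV with illustrative
# column names: ax, ay, az, gx, gy, gz.
import pandas as pd

raw = pd.read_csv("imu_samples.csv")

# Gyro axes should read zero at rest; accel should read (0, 0, 1 g).
offsets = raw.mean()
offsets["az"] -= 16384        # 1 g in LSB at the MPU6050's +/-2 g scale

print(offsets)                # subtract these in the firmware

# Second pass: applying the offsets should leave near-zero means.
calibrated = raw - offsets
print(calibrated.mean())
```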

When in action, the Teensy reads IMU data and sends it over a USB RAW HID interface. For the uninitiated, HID transfers are more reliable than USB CDC transfers (virtual serial port) because they use smaller data chunks per event/transaction and usually don’t require special drivers. On the computer side, [dimtass] has written a small application that gets the IMU values over RAW HID and then provides them to the visualization application.

A 3D Millennium Falcon model is rendered in Unity, the popular game development engine. Even though Unity has an API, this particular approach is more OS-specific, using a shared-memory technique: the HID application writes to a file (/tmp/hid-shared-buffer) which is then read by Unity to make orientation changes to the rendered model.
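The bridge application's job can be sketched in a few lines of Python using the hidapi bindings. The VID/PID pair below is the stock Teensy RAW HID one (check your build), and the report layout is a placeholder rather than [dimtass]'s actual format:

```python
# Sketch of the computer-side bridge: read raw HID reports from the
# Teensy and write the latest values where Unity can pick them up.
# The report layout (three little-endian floats) is a placeholder.
import struct
import hid   # pip install hidapi

dev = hid.device()
dev.open(0x16C0, 0x0486)                    # Teensy RAW HID default VID/PID

while True:
    report = bytes(dev.read(64))            # one 64-byte RAW HID packet
    # Assume the first 12 bytes are roll, pitch, yaw as floats.
    roll, pitch, yaw = struct.unpack_from("<fff", report, 0)
    with open("/tmp/hid-shared-buffer", "w") as f:
        f.write(f"{roll} {pitch} {yaw}\n")  # Unity polls this file
```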

[dimtass] provides lots of details on the tools used to bring his project to life, and it can be a great starting point for more projects that need to interface sensors with a visualization system. We have seen ways to turn a person’s head into a joystick, and if you need a deeper dive into Unity, look no further.


From An Eye To An Eye: Human Muscles As A Joystick

The interface between humans and machines is a constantly evolving field. Sure, the computer mouse was a game-changer, but time moves on. We are now looking at interacting with machines via soft HMIs for personal applications. A research team led by the University of California San Diego has presented a paper demonstrating a soft lens interfaced with the human eye.

The lens itself is a pair of electroactive elastomer films that encapsulate a small quantity of saltwater. These films constitute the muscle and are controlled by an external source of electrical pulses. The signals come from electrodes placed around the subject's eye that detect muscle movement. Actions such as blinking are converted to zoom-in/zoom-out activity designed to mimic human squinting.

The suggested potential applications are visual prostheses, adjustable glasses, VR, and even soft robot eyes. Yes, we are heading from whirring robots to squishy ones, but it also means that people with disabilities can get a second chance. This approach is non-invasive, as opposed to brain implants.

[via Phys.org]

[Thanks for the tip, Qes]
