Finding Pre-Trained AI In A Modelzoo Using Python

Training a machine learning model is not a task for mere mortals, as it takes a lot of time and computing power. Fortunately, there are pre-trained models out there that one can use, and [Max Bridgland] decided it would be a good idea to write a Python module to find and view such models from the command line.

For the uninitiated, Modelzoo is a place where you can find open source deep learning code and pre-trained models. [Max] taps into the (undocumented) API and allows a user to find and view models directly. When you run the utility, it goes online, retrieves the categories, and then fetches details of the available models. From there, the user can select a model and the application simply opens the corresponding GitHub repository. It sounds simple, but it has a lot of value, since the code is designed to be extensible so that users working on such projects can automate the downloading step as well.
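The browse-then-open workflow described above can be sketched in a few lines. Since the ModelZoo API is undocumented, the snippet below stubs the API response with hard-coded data; `fetch_models()`, `repo_url()`, and `open_repo()` are hypothetical names, not [Max]'s actual functions.

```python
# A minimal sketch of a ModelZoo-style model browser. The real tool talks to
# an undocumented API; FAKE_API_RESPONSE is a hard-coded stand-in for it.
import webbrowser

FAKE_API_RESPONSE = {
    "computer-vision": [
        {"name": "mask-rcnn", "github": "https://github.com/matterport/Mask_RCNN"},
        {"name": "yolo-v3", "github": "https://github.com/pjreddie/darknet"},
    ],
}

def fetch_models(category):
    """Return the models listed under a category (stubbed, no network)."""
    return FAKE_API_RESPONSE.get(category, [])

def repo_url(category, model_name):
    """Look up the GitHub repository URL for a chosen model."""
    for model in fetch_models(category):
        if model["name"] == model_name:
            return model["github"]
    raise KeyError(model_name)

def open_repo(category, model_name):
    """Open the chosen model's GitHub repository in the default browser."""
    webbrowser.open(repo_url(category, model_name))
```

An extension hook for automated downloading would slot in naturally next to `open_repo()`, cloning the repository instead of opening it.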

We have seen projects with machine learning used to detect humans, and with AI trending, community tools such as this one help beginners get started even faster.

An Apartment-Hunting AI

Finding a good apartment is a lot of work that includes searching websites for available places and then cross-referencing them against a list of desired characteristics. This can take hours, days, or even months, but in a world where cars drive themselves, it is possible to use machine learning in your hunt.

[veesot] lives in a city between Europe and Asia and was looking for a new home. His goal was to create a model that could use historical data not only to suggest whether an advertised price was right, but also to recommend waiting by predicting a price decrease in the future. The data set includes parameters such as area, district, and number of balconies, which he used to try to determine an optimal property to view.

There is a lot that [veesot] describes in his post, which includes cleaning the data by removing flats that are too small or too large. This essentially creates a training data set for the machine learning system that allows it to generate usable output. [veesot] also added parameters such as district, which relates to the geographical location, the age of the building, and even the materials used in its construction.
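The outlier-removal step can be illustrated with a few lines of Python. The listings and area thresholds below are made up for the sketch; [veesot]'s actual cut-offs and schema will differ.

```python
# A sketch of filtering out implausibly small or large flats before training.
# MIN_AREA/MAX_AREA are hypothetical cut-offs, not [veesot]'s actual values.
MIN_AREA, MAX_AREA = 20, 300  # square meters

flats = [
    {"area": 12,  "district": "Central", "balconies": 0},
    {"area": 48,  "district": "North",   "balconies": 1},
    {"area": 75,  "district": "Central", "balconies": 2},
    {"area": 950, "district": "South",   "balconies": 4},  # likely a data-entry error
]

def clean(listings, lo=MIN_AREA, hi=MAX_AREA):
    """Drop listings whose area falls outside the plausible range."""
    return [f for f in listings if lo <= f["area"] <= hi]

training_set = clean(flats)
print(len(training_set))  # the 12 m^2 and 950 m^2 outliers are dropped
```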

There is also an interesting bit about analyzing the variables and determining their cross-correlation, which ultimately leads to the obvious conclusions that the central, older districts have older apartments and the newer ones are larger. It makes for a few cool graphs, and the code can certainly come in handy when dealing with similar data sets. The last part of the write-up discusses applying linear regression and then testing its accuracy. Interpreting the values of the trained model's coefficients produces interesting results.
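The regression-and-coefficients idea can be shown with a one-feature sketch: fit price against area by ordinary least squares and read off the slope as a "price per square meter" coefficient. [veesot]'s model uses many features; the closed-form fit and synthetic numbers below are only an illustration.

```python
# One-feature linear regression via closed-form ordinary least squares.
# The data is synthetic: prices are generated as 0.05 * area + 0.1.
def fit_line(xs, ys):
    """Fit y = slope*x + intercept by least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

areas = [30, 45, 60, 80]         # square meters
prices = [1.6, 2.35, 3.1, 4.1]   # millions; exactly 0.05 * area + 0.1
slope, intercept = fit_line(areas, prices)
print(round(slope, 3), round(intercept, 3))  # recovers 0.05 and 0.1
```

Interpreting the fitted slope is what the write-up's last section does at scale: each coefficient says how much the predicted price moves per unit change in that feature.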

Continue reading “An Apartment-Hunting AI”

BeagleBone Deep Learning Video Demo

BeagleBoard often gets eclipsed by Raspberry Pi. Where the Pi focuses on ease of use, the BeagleBone generally has more power for hardcore applications. With machine learning all the rage now, BeagleBoard has the BeagleBone AI, a board with features aimed specifically at machine learning. A recent video (see below) shows a demo of TIDL (the Texas Instruments Deep Learning library). The video includes an example of streaming video to a browser and using predefined learning models to identify things picked up by a webcam.

The CPU onboard is the TI Sitara AM5729. That's a dual-core Arm Cortex-A15 running at 1.5 GHz. There are also two C66x floating-point DSPs and two dual-core Arm Cortex-M4 coprocessors. Still need more? You get four embedded vision engines, two dual-core real-time units, a 2D graphics accelerator, a 3D graphics accelerator, and a subsystem for video encoding/decoding and cryptography.

Continue reading “BeagleBone Deep Learning Video Demo”

AI Makes Hyperbolic Brain Hats A Reality

It isn’t often that the world of Hackaday intersects with the world of crafting, which is perhaps a shame because many of the skills and techniques of the two have significant overlap. Crochet for instance has rarely featured here, but that is about to change with [Janelle Shane]’s HAT3000 neural network trained to produce crochet hat patterns.

Taking the GPT-2 neural network trained on Internet text and further training it on a stack of crochet hat patterns, she was able to generate AI-designed hats which her friends on the Ravelry yarn forum then set about crocheting into real hats. It's a follow-up to a previous knitting-based project, and instead of producing the hats you might expect, the network goes off into flights of fancy. Some are visibly hat-like, while others turn into avant-garde creations that defy any attempt to match them to real heads. A whole genre of hyperbolic progressions of crochet rows produces hats with organic folds that begin to resemble brains, taxing both the stamina of the person doing the crochet and their supply of yarn.

Perhaps most amusingly, the neural network retains the ability to produce text, but when it does so it now inevitably steers the subject back to crochet hats. A Harry Potter sentence spawns a passage of what she aptly describes as “terrible crochet-themed erotica”, and such is the influence of the crochet patterns that this purple prose can even include enough crochet instructions to be crochetable. It would be fascinating to see whether a similar model trained on G-code from Thingiverse would produce printable designs; what would an AI make of Benchy, for example?

We’ve been entertained by [Janelle]’s AI work before, both naming tomato varieties, and creating pie recipes.

Thanks [Laura] for the tip.

Machine Learning With Microcontrollers Hack Chat

Join us on Wednesday, September 11 at noon Pacific for the Machine Learning with Microcontrollers Hack Chat with Limor “Ladyada” Fried and Phillip Torrone from Adafruit!

We’ve gotten to the point where a $35 Raspberry Pi can be a reasonable alternative to a traditional desktop or laptop, and microcontrollers in the Arduino ecosystem are getting powerful enough to handle some remarkably demanding computational jobs. But there’s still one area where microcontrollers seem to be lagging a bit: machine learning. Sure, there are purpose-built edge-computing SBCs, but wouldn’t it be great to be able to run AI models on versatile and ubiquitous MCUs that you can pick up for a couple of bucks?

We’re moving in that direction, and our friends at Adafruit Industries want to stop by the Hack Chat and tell us all about what they’re working on. In addition to Ladyada and PT, we’ll be joined by Meghna Natraj, Daniel Situnayake, and Pete Warden, all from the Google TensorFlow team. If you’ve got any interest in edge computing on small form-factor computers, you won’t want to miss this chat. Join us, ask your questions about TensorFlow Lite and TensorFlow Lite for Microcontrollers, and see what’s possible in machine learning way out on the edge.

Our Hack Chats are live community events in the Hackaday.io Hack Chat group messaging. This week we’ll be sitting down on Wednesday, September 11 at 12:00 PM Pacific time. If time zones have got you down, we have a handy time zone converter.

Click that speech bubble to the right, and you’ll be taken directly to the Hack Chat group on Hackaday.io. You don’t have to wait until Wednesday; join whenever you want and you can see what the community is talking about.

Largest Chip Ever Holds 1.2 Trillion Transistors

We get it, press releases are full of hyperbole. Cerebras recently announced they’ve built the largest chip ever. The chip has 400,000 cores and contains 1.2 trillion transistors on a die over 46,000 square millimeters in area. That’s roughly the same as a square about 8.5 inches on each side. But honestly, the WSE (Wafer Scale Engine) is just most of a wafer left uncut. Typically a wafer holds many copies of a device and gets split into individual dies.
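The back-of-the-envelope size claim is easy to check: treat the quoted die area as a square and convert to inches.

```python
# Sanity-checking the press-release figure: the equivalent square for a
# die "over 46,000 square mm" in area.
import math

area_mm2 = 46_000                 # the area quoted above
side_mm = math.sqrt(area_mm2)     # side length of an equivalent square
side_in = side_mm / 25.4          # millimeters to inches
print(round(side_in, 2))          # ~8.44 inches, i.e. "about 8.5 inches"
```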

According to the company, the WSE is 56 times larger than the largest GPU on the market. The chip boasts 18 gigabytes of memory spread across the massive die. The problem isn’t making such a beast, although a normal wafer is allowed to have a certain number of bad spots. The real problems come from things such as interconnection and thermal management.

Continue reading “Largest Chip Ever Holds 1.2 Trillion Transistors”

Brain-Computer Interfaces: Separating Fact From Fiction On Musk’s Brain Implant Claims

When it comes to something as futuristic-sounding as brain-computer interfaces (BCI), our collective minds tend to zip straight to scenes from countless movies, comics, and other works of science-fiction (including more dystopian scenarios). Our mind’s eye fills with everything from the Borg and neural interfaces of Star Trek, to the neural recording devices with parent-controlled blocking features from Black Mirror, and of course the enslavement of the human race by machines in The Matrix.

And now there’s this Elon Musk guy, proclaiming that he’ll be wiring up people’s brains to computers starting next year, as part of another of his companies: Neuralink. Here the promises and imaginings are truly straight from the realm of sci-fi, ranging from ‘reading and writing’ the brain, to curing brain diseases, to merging human minds with artificial intelligence. How much of this is just investor-speak? Please join us as we take a look at BCIs, neuroprosthetics, and what we can expect of these technologies in the coming years.

Continue reading “Brain-Computer Interfaces: Separating Fact From Fiction On Musk’s Brain Implant Claims”