Chess AI, Old School

People have been interested in chess-playing computers since before there were any chess-playing computers. In a 1950 paper, [Claude Shannon] defined two major chess-playing strategies, and apparently practical chess programs still use the techniques he outlined. If you’ve ever wondered how to make a computer play chess, [FreeCodeCamp] has an interesting post that walks you through building a chess engine step by step.

The code is in JavaScript, but the approach struck us as old school. Still, it is interesting to watch the code evolve from random moves, to a slightly smarter strategy, to deeper searching. Because it is in JavaScript, you can follow along in your browser and find out when the program gets smart enough to beat you. The final version is even on GitHub.
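
The heart of an engine like this is a minimax search over candidate moves with a simple material-count evaluation. The tutorial builds that up in JavaScript; here is a rough Python sketch of the same idea, where the Board class and its helper methods are hypothetical stand-ins rather than the FreeCodeCamp code:

```python
# Rough sketch of minimax with a material-count evaluation, the core idea the
# tutorial works up to. Board and its methods are hypothetical, for illustration only.
PIECE_VALUES = {'p': 1, 'n': 3, 'b': 3, 'r': 5, 'q': 9, 'k': 0}

def evaluate(board):
    """Positive scores favor white: a simple sum of piece values."""
    score = 0
    for piece in board.pieces():          # hypothetical: yields objects with .symbol and .color
        value = PIECE_VALUES[piece.symbol.lower()]
        score += value if piece.color == 'white' else -value
    return score

def minimax(board, depth, maximizing):
    if depth == 0 or board.game_over():   # hypothetical helper
        return evaluate(board), None
    best_move = None
    best_score = float('-inf') if maximizing else float('inf')
    for move in board.legal_moves():      # hypothetical helper
        board.push(move)                  # make the move, search deeper, then undo it
        score, _ = minimax(board, depth - 1, not maximizing)
        board.pop()
        if (maximizing and score > best_score) or (not maximizing and score < best_score):
            best_score, best_move = score, move
    return best_score, best_move
```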

Ten Minute TensorFlow Speech Recognition

Like a lot of people, we’ve been pretty interested in TensorFlow, the Google neural network software. If you want to experiment with using it for speech recognition, you’ll want to check out [Silicon Valley Data Science’s] GitHub repository, which promises a fast setup for a speech recognition demo. It even covers which packages you need to install depending on whether or not you are using a CUDA GPU to accelerate processing.

Another interesting thing is the use of TensorBoard to visualize the resulting neural network. This tool serves up a page in your browser that lets you see what’s really going on inside the network. There’s also speech data in the repository, so it is practically a one-stop shop for getting started. If you haven’t seen TensorBoard in action, you might enjoy the video from Google, below.
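
If you just want to see TensorBoard draw a graph before digging into the speech demo, a few lines will do it. This is a minimal sketch assuming the TensorFlow 1.x API the repository was written against; it is not code from the repository itself:

```python
# Minimal sketch: build a trivial graph and log it so TensorBoard can display it.
# Assumes the TensorFlow 1.x API; not code from the speech recognition repo.
import tensorflow as tf

a = tf.constant(3.0, name="a")
b = tf.constant(4.0, name="b")
total = tf.add(a, b, name="total")

with tf.Session() as sess:
    writer = tf.summary.FileWriter("./logs", sess.graph)   # writes the graph for TensorBoard
    print(sess.run(total))                                  # 7.0
    writer.close()

# Then run `tensorboard --logdir ./logs` and open the reported URL in a browser.
```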

Google Machine Learning Made Simple(r)

If you’ve looked at machine learning, you may have noticed that a lot of the examples are interesting but hard to follow. That’s why [Jostmey] created Naked Tensor, a bare-minimum example of using TensorFlow. The example is simple: straight-line fits on some data points. One version does the fit in series, one in parallel, and another handles an 8-million-point dataset. All the code is in Python.
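
To give a flavor of what bare minimum looks like, here is a sketch of a straight-line fit by gradient descent in the same spirit. It assumes the TensorFlow 1.x API, is not [Jostmey]’s exact code, and uses made-up data points:

```python
# Sketch of a straight-line fit, y = m*x + b, by gradient descent.
# Not Naked Tensor's exact code; TensorFlow 1.x API and made-up points assumed.
import tensorflow as tf

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.1, 2.9, 5.2, 7.1]          # roughly on y = 2x + 1

m = tf.Variable(0.0)
b = tf.Variable(0.0)
loss = tf.add_n([(m * x + b - y) ** 2 for x, y in zip(xs, ys)])   # sum of squared errors
train = tf.train.GradientDescentOptimizer(0.01).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        sess.run(train)
    print(sess.run([m, b]))        # should land near [2, 1]
```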

If you haven’t run into it yet, TensorFlow is an open source library from Google. To quote from its website:

TensorFlow is an open source software library for numerical computation using data flow graphs. Nodes in the graph represent mathematical operations, while the graph edges represent the multidimensional data arrays (tensors) communicated between them. The flexible architecture allows you to deploy computation to one or more CPUs or GPUs in a desktop, server, or mobile device with a single API. TensorFlow was originally developed by researchers and engineers working on the Google Brain Team within Google’s Machine Intelligence research organization for the purposes of conducting machine learning and deep neural networks research, but the system is general enough to be applicable in a wide variety of other domains as well.

Project Zero Finds A Graphic Zero Day

After finding the infamous Heartbleed vulnerability along with a variety of other zero days, Google decided to form a full-time team dedicated to finding similar vulnerabilities. That team, dubbed Project Zero, just disclosed a new set of vulnerabilities, and this one’s particularly graphic: a group of flaws in the Nvidia drivers for Windows.

Most of the vulnerabilities found were due to poor programming practices: blindly writing to user-provided pointers, incorrect bounds checking, and other simple mistakes that were quickly fixed by Nvidia. As the author put it, Nvidia’s “drivers contained a lot of code which probably shouldn’t be in the kernel, and most of the bugs discovered were very basic mistakes.”

When even our mice aren’t safe, it may seem that a secure system is unattainable. However, there is light at the end of the tunnel. While the bugs found showed that Nvidia has a lot of work to do, their response to Google was “quick and positive.” Most bugs were fixed well ahead of the deadline, and Google reports that Nvidia has been finding some bugs on their own. It also appears that Nvidia is working on re-architecting their kernel drivers for security. This isn’t the first time we’ve heard from Google’s Project Zero, and in all honesty, it probably won’t be the last.

Neural Network Does Your Homework

[Will Forfang] found an app that lets you take a picture of a math equation with a phone and ask for a solution. However, the app wouldn’t read handwritten equations, so [Will] decided to see how hard that would be to do with a neural network.

The results are pretty impressive (you can also see the video below). [Will] used his own handwriting on a chalkboard and trained the network on that. He went even further, adding heuristics to identify fraction bars and infer the grouping from their relative sizes.
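
As a purely hypothetical illustration of that kind of heuristic (the real project’s code and data structures are different), ordering detected fraction bars by width is one way to recover the nesting:

```python
# Hypothetical illustration only: infer fraction nesting from bar width.
# The idea is that a wider horizontal bar sits higher in the expression tree,
# so sorting detected bars by width gives an outermost-to-innermost order.
def order_fraction_bars(bars):
    """bars: list of (width, x, y) tuples for detected horizontal strokes."""
    return sorted(bars, key=lambda bar: bar[0], reverse=True)

# Three made-up bars: the 120-pixel-wide bar is treated as the outermost fraction.
print(order_fraction_bars([(40, 10, 120), (120, 5, 80), (70, 30, 60)]))
```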

Does This Demo Remind You of Mario Kart? It Should!

Here’s a slick-looking demo written in assembly by [Yianni Kostaris]: VGA output from an otherwise stock ATmega2560 at 16 MHz, with no external chips involved. If you’re getting some Super Mario Kart vibes from how it looks, there’s a good reason for that. The demo implements a form of the Super Nintendo’s Mode 7 graphics, which allowed a background to be efficiently texture-mapped, rotated, and scaled for a 3D effect. It was used in racing games (such as Super Mario Kart) but also in many others. A video of the demo is embedded below.

[Yianni] posted the original demo a year earlier, but just recently added detailed technical information on how it was all accomplished. The AVR outputs VGA signals directly, resulting in 100×120 resolution with 256 colors, zipping along at 60 fps. The AVR itself is not modified or overclocked in any way; it runs at an entirely normal 16 MHz and spends 93% of its time handling interrupts. Despite sharing the details of how this is done, [Yianni] hasn’t released any code, but he told us this demo is an offshoot of another project that is still in progress. It’s worth staying tuned because it’s clear [Yianni] knows his stuff.
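
The math behind the effect is compact: for each scanline below the horizon, you work out how far away that strip of ground is, then walk across the row sampling a flat texture at a rotated, scaled coordinate. Here is a rough Python sketch of that mapping; the camera parameters, resolution, and texture are made up, and this is not [Yianni]’s assembly:

```python
# Rough sketch of the Mode 7 floor mapping: screen pixel -> rotated/scaled texture
# coordinate. All parameters are made up for illustration; not [Yianni]'s code.
import math

TEX_W, TEX_H = 64, 64
texture = [[(x // 8 + y // 8) % 2 for x in range(TEX_W)] for y in range(TEX_H)]  # checkerboard

def render(width=100, height=120, angle=0.3, cam_x=32.0, cam_y=32.0,
           cam_height=16.0, focal=60.0, horizon=20):
    fwd = (math.cos(angle), math.sin(angle))        # camera's forward direction on the ground
    right = (-math.sin(angle), math.cos(angle))     # direction across the screen
    frame = []
    for sy in range(horizon + 1, height):
        dist = focal * cam_height / (sy - horizon)  # ground distance for this scanline
        row = []
        for sx in range(width):
            lateral = (sx - width / 2) * dist / focal   # spread widens with distance
            tx = cam_x + fwd[0] * dist + right[0] * lateral
            ty = cam_y + fwd[1] * dist + right[1] * lateral
            row.append(texture[int(ty) % TEX_H][int(tx) % TEX_W])  # wrap the texture
        frame.append(row)
    return frame

print(len(render()), "scanlines of 'ground' rendered")
```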

LTSpice for Radio Amateurs (and Others)

We don’t think [VK4FFAB] did himself a favor by calling his seven-part LTSpice tutorial LTSpice for Radio Amateurs. Sure, the posts do focus on radio-frequency analysis, but these days lots of people who aren’t hams are involved in radio work.

Either way, if you are interested in simulating RF amplifiers and filters, you ought to check these posts out. Of course, the first few cover simple things like voltage dividers, just to get your feet wet. The final part even works through a double-balanced mixer with some transformers, so there’s quite a range of material.
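
If you are following along with the early posts, it never hurts to have a sanity check to compare against the simulator. Here is a trivial sketch of the unloaded divider formula; the component values are made up rather than taken from the tutorial:

```python
# Unloaded resistive divider: Vout = Vin * R2 / (R1 + R2).
# Handy as a sanity check against an LTSpice .op simulation; values are made up.
def divider_vout(vin, r1, r2):
    return vin * r2 / (r1 + r2)

print(divider_vout(12.0, 10e3, 4.7e3))   # about 3.84 V from a 10 k / 4.7 k divider
```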
