We keep seeing more and more TensorFlow neural network projects. We also keep seeing more and more things running in the browser. You don’t have to be Mr. Spock to see this one coming. TensorFire runs neural networks in the browser and claims that WebGL allows it to run as quickly as it would on the user’s desktop computer. The main page is a demo that stylizes images, but if you want more detail you’ll probably want to visit the project page instead. You might also enjoy the video from one of the creators, [Kevin Kwok], below.
TensorFire has two parts: a low-level language for writing massively parallel WebGL shaders that operate on 4D tensors, and a high-level library for importing models from Keras or TensorFlow. The authors claim it will work on any GPU and, in some cases, will actually be faster than running native TensorFlow.
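The core trick behind this kind of WebGL compute is packing tensor data into 2D GPU textures so a fragment shader can touch every element in parallel. As a rough illustration of the idea (this is not TensorFire's actual code or layout, just a common packing scheme), here is how a 4D tensor might be flattened into a 2D RGBA texture, with the channel axis grouped four-at-a-time into the red/green/blue/alpha components of each texel:

```python
import numpy as np

def pack_tensor_to_rgba(t):
    """Pack a 4D tensor (N, H, W, C) into a 2D RGBA 'texture'.

    Channels are grouped four-at-a-time into the RGBA components of
    each texel, and the N * ceil(C/4) channel-groups are tiled along
    the texture's height. A real WebGL backend would upload an array
    like this with gl.texImage2D and sample it from a shader.
    """
    n, h, w, c = t.shape
    groups = -(-c // 4)                           # ceil(C / 4)
    padded = np.zeros((n, h, w, groups * 4), dtype=t.dtype)
    padded[..., :c] = t                           # zero-pad C up to a multiple of 4
    tex = padded.reshape(n, h, w, groups, 4)      # split channels into RGBA groups
    tex = tex.transpose(0, 3, 1, 2, 4)            # (N, groups, H, W, 4)
    return tex.reshape(n * groups * h, w, 4)      # stack the bands vertically

# A batch of two 3x3 images with 6 channels needs 2 RGBA groups each:
tensor = np.arange(2 * 3 * 3 * 6, dtype=np.float32).reshape(2, 3, 3, 6)
texture = pack_tensor_to_rgba(tensor)
print(texture.shape)  # (12, 3, 4): 2 images * 2 groups * 3 rows tall
```

Once the data lives in a texture, a fragment shader invocation per texel gives you the massive parallelism the authors are leaning on.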
This is a logical progression of using WebGL to do browser-based parallel processing, which we’ve covered before. The work has been done by a group of recent MIT graduates who applied for (and received) an AI Grant for their work. We wonder if some enterprising Hackaday readers might not get some similar financing (be aware, you have to apply by the end of August).
Thanks [Patrick] for the tip.