Tangible Programming Brings Code Into The Real World

We love the idea of [Amos]’s Tangible Programming project. It reminds us of those great old RadioShack electronics labs, where circuitry concepts took on a physical aspect that made them far easier to digest than the abstractions in an engineering textbook.

MIT Scratch teaches many programming concepts in an easy-to-understand visual way. Fundamentally, though, people are tactile creatures, and being able to literally feel and see the code laid out in front of them could be groundbreaking for many young learners — especially those who favor physical touch and interaction, such as learners with ADHD or Asperger’s.

The boards are color-coded and communicate via an I2C bus. Each board’s logic and communication is handled by an ATtiny or ATmega. The current state of processing is shown through LEDs or even an OLED display, and numbers are input either through thumbwheel switches or jumpers.
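To illustrate the idea (a minimal simulation only — the register layout, addresses, and block names below are our assumptions, not details of [Amos]’s actual design), a host could enumerate the blocks over the bus and rebuild the program as a token stream:

```python
# Hypothetical sketch: simulate a host reading program blocks over an
# I2C-style bus. Each block answers register reads with its token type
# and a numeric payload (e.g. from its thumbwheel switches).

REG_TYPE = 0x00   # assumed register: what kind of block this is
REG_VALUE = 0x01  # assumed register: the block's numeric value

class SimBlock:
    """Stand-in for one ATtiny-based block sitting on the bus."""
    def __init__(self, kind, value=0):
        self.regs = {REG_TYPE: kind, REG_VALUE: value}

    def read(self, reg):
        return self.regs[reg]

# A toy bus: address -> block, addresses assigned left to right.
bus = {
    0x10: SimBlock("number", 3),
    0x11: SimBlock("op_add"),
    0x12: SimBlock("number", 4),
}

def enumerate_program(bus):
    """Poll each address in order and rebuild the token stream."""
    return [(b.read(REG_TYPE), b.read(REG_VALUE))
            for addr, b in sorted(bus.items())]

tokens = enumerate_program(bus)
# tokens == [("number", 3), ("op_add", 0), ("number", 4)]
```

On real hardware the polling would go through the microcontroller’s TWI/I2C peripheral rather than a Python dictionary, but the enumerate-then-interpret structure is the same.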

The code concepts will, of course, be simple and focused due to the physical nature of the blocks: integer arithmetic, simple loops, and if/else conditionals. Quite a lot can be built around this, and it could be a natural diving board into the aforementioned Scratch and eventually an easy-to-learn language like Python.
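The three concepts above are enough for a tiny evaluator. Here is a sketch (entirely hypothetical — not [Amos]’s firmware) of how a token stream built from such blocks might be run, with a single accumulator, a repeat loop, and an if/else branch:

```python
# Hypothetical evaluator for a flat "block program": each token is
# (op, arg). Supports integer arithmetic, a repeat loop, and if/else.

def run(program, acc=0):
    """Evaluate a block program against an accumulator and return it."""
    for op, arg in program:
        if op == "set":
            acc = arg
        elif op == "add":
            acc += arg
        elif op == "mul":
            acc *= arg
        elif op == "repeat":          # arg = (count, sub-program)
            count, body = arg
            for _ in range(count):
                acc = run(body, acc)
        elif op == "if_positive":     # arg = (then-program, else-program)
            then_prog, else_prog = arg
            acc = run(then_prog if acc > 0 else else_prog, acc)
    return acc

# "set 1, then repeat 3 times: add 2"  ->  1 + 3*2 = 7
result = run([("set", 1), ("repeat", (3, [("add", 2)]))])
```

Nesting sub-programs inside `repeat` and `if_positive` mirrors how physical loop and conditional blocks would enclose a run of child blocks.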

Read more from this series: tangible programming, mit scratch, attiny, atmega, learning

12 thoughts on “Tangible Programming Brings Code Into The Real World”

  1. I dunno how useful this will be. The physicality limits it to simple concepts and processes.

    Rather than actual circuitry in the blocks and a plug-in bus (which is itself an advanced concept that distracts from the basics), I believe the best implementation would be some sort of physical gridded board with input and output points, on which students place physical blocks representing logic gates, decision branches, blocks containing common functionality, etc. It would look like a physical representation of a UML diagram. Blocks would be symbolically connected with wires. Either through the plug-ins to the grid, or by optical scanning, the physical diagram would be converted to software.

    At its heart, programming is pretty abstract. If someone can’t quickly make the leap from such a physical representation to an on-screen logic diagram (and then finally to written code), I don’t think they are likely to get much out of this.

    1. Very basic, and let’s face it, very expensive implementation for what it does. It will join a long list of other ‘plug pieces together and make things!’ products lying in clearance bins in short order, assuming it makes it that far.

          1. There was bitching about AR/VR support being dropped in phoneland recently. Printing out a bunch of hex cubes, with the associated function represented by a QR code, and processing them as elements of a programming language… Nah, too complex. Never mind.

  2. Oooh, I bet Brainfuck is easy to implement as a physical programming sort of thing. However, each device should have a knob for how many times you want to repeat that symbol, otherwise you’ll run out of pieces soon… eh, no. Never mind.
