It’s nothing short of amazing what trained dancers can do with their bodies, and a real shame that visually impaired people can’t enjoy the experience of, say, ballet. For this year’s Hackaday Prize, [Shi Yun] is working on a way for visually impaired people to experience dance performances via haptic feedback on a special device.
This platform, which is called Kinetic Soul, uses the PoseNet computer vision model to track a dancer’s movements. PoseNet detects the dancer’s joints and creates a point map to determine what body parts are moving where, and at what speed. The system then translates and transmits the movements to the 32 pins on the device’s surface, creating a touchable picture of what’s going on. Each 3D-printed pin is controlled with a solenoid, all of which are driven by a single Arduino.
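To give a feel for that pipeline, here is a minimal, hypothetical sketch of how detected joints might be mapped onto a 32-pin surface. The grid shape (4 × 8), function names, and confidence threshold are all illustrative assumptions, not details from the project; PoseNet itself simply outputs per-joint keypoints with a confidence score.

```python
# Hypothetical sketch (not the project's actual code): mapping PoseNet-style
# keypoints onto a 32-pin haptic grid. Each keypoint is (x, y, score) with
# x and y normalized to [0, 1) across the camera frame.

PIN_ROWS, PIN_COLS = 4, 8  # assumed 4 x 8 layout = 32 pins

def map_keypoints_to_pins(keypoints, min_score=0.5):
    """Return the set of (row, col) pins to raise for one video frame.

    Keypoints below min_score are ignored, mirroring the confidence
    filtering a PoseNet-based system would typically apply.
    """
    pins = set()
    for x, y, score in keypoints:
        if score < min_score:
            continue  # joint not detected reliably in this frame
        col = min(int(x * PIN_COLS), PIN_COLS - 1)  # clamp the x = 1.0 edge
        row = min(int(y * PIN_ROWS), PIN_ROWS - 1)
        pins.add((row, col))
    return pins
```

With this scheme, a confidently detected hand near the top-right of the frame, say `(0.9, 0.1, 0.95)`, would raise the pin at row 0, column 7, while a low-confidence keypoint would be skipped entirely. The real device would then pulse the corresponding solenoids from the Arduino.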
We think it’s interesting that Kinetic Soul can speak to the user in two different languages. The first is more about the overall flow of a dance, and the second delves into the deconstructed details. Both methods allow for dances to be enjoyed in real time, or via video recording. So how does one deconstruct dance? [Shi Yun] turned to Laban Movement Analysis, which breaks up human locomotion into four broad categories: the body in relation to itself, the effort expended to move, the shapes assumed, and the space used.
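As a toy illustration of what “deconstructing” movement can look like, consider Laban’s effort category, which (among other factors) distinguishes sudden from sustained motion. The function below is a guess at one crude way a system could make that call from joint speeds; the threshold and naming are invented for this sketch and are not from Kinetic Soul.

```python
# Toy illustration only: Laban Movement Analysis treats "time" as one effort
# factor, contrasting sudden bursts with sustained motion. A pose-tracking
# system could approximate this from per-joint speeds over a short window.

def effort_time_factor(speeds, sudden_threshold=0.5):
    """Classify a movement window as 'sudden' or 'sustained'.

    speeds: joint speeds (arbitrary units) sampled over the window.
    sudden_threshold: an assumed cutoff; a real system would tune this.
    """
    return "sudden" if max(speeds) > sudden_threshold else "sustained"
```

A slow arm sweep with speeds like `[0.1, 0.2, 0.15]` would read as sustained, while a sharp flick peaking at `0.9` would read as sudden, and the two could be rendered as different pin rhythms.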
[Shi Yun] has been user-testing their ideas at dance workshops for the visually impaired throughout the entire process — this is how they arrived at having two haptic languages instead of one. They plan to continue getting input as they work to fortify the prototype, improve the touch experience, and refine the haptic languages. Check out the brief demonstration video after the break.
Yes indeed, dance is a majestic way of expressing all kinds of things. Think you have no use for interpretive dance? Think again — it can help you understand protein synthesis in an amusing way.