Flip-dot displays are grand, especially this one, which boasts 74,088 pixels! I once heard the hardware compared to e-ink. That’s actually a pretty good description: both use pixels that are white on one side and black on the other, both depend on a coil to change state, and both only draw power when flipping those bits.
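That pay-only-for-flips property means a driver just needs to diff the incoming frame against the current one and pulse coils for the changed dots. Here's a minimal sketch of that idea (our own illustration, not code from this build; function and variable names are invented):

```python
def delta_update(prev, new):
    """Return the (x, y, state) coil pulses needed to go from frame
    `prev` to frame `new`.

    Pixels already showing the right side cost nothing: the magnetized
    disc holds its position with the coil completely unpowered.
    """
    pulses = []
    for y, (prev_row, new_row) in enumerate(zip(prev, new)):
        for x, (old, bit) in enumerate(zip(prev_row, new_row)):
            if old != bit:  # only bits that actually flip draw current
                pulses.append((x, y, bit))
    return pulses

prev = [[0, 0], [1, 1]]
new  = [[0, 1], [1, 0]]
print(delta_update(prev, new))  # → [(1, 0, 1), (1, 1, 0)]
```

A static image, no matter how large, consumes no power at all; only motion costs energy (and makes that wonderful clatter).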
What’s remarkable about this is the size of the installation. It occupied a huge curving wall on the ooVoo booth at CES 2015. We wanted to hear more about the hardware, so we reached out to them, and they didn’t disappoint. The ooVoo crew made time for a conference call which included [Pat Murray], who coordinated the build effort. That’s right, they built this thing; we had assumed it was a rental. [Matt Farrell] recounts that during conception, the team asked themselves how a mobile HD video chat company could show off display technology when juxtaposed with cutting-edge 4K and 8K displays. We think the flip-dot was the perfect tack: I know I spent more time looking at this than at the televisions.
Join us after the break for the skinny on how it was built, including pictures of the back side of the installation and video clips that you have to hear to believe.
Continue reading “The Giant Flip-Dot Display at CES”
[Ryan Lloyd], [Sandeep Dhull], and [Ruben D’Sa] wrote in to share a robotics project they have been keeping busy with lately. The three University of Minnesota students are using a Kinect sensor to remotely control a robotic arm, but it’s not as simple as it sounds.
Using OpenNI alongside PrimeSense, the team started out by doing some simple skeleton tracking before working with their robotic arm. The arm has five degrees of freedom, making the task of controlling it a bit tricky. The robot has quite a few joints to play with, so the trio not only tracks shoulder, elbow, and wrist movements, but they also monitor the status of the user’s hand to actuate the robot’s gripper.
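Mapping tracked skeleton joints onto servo commands mostly comes down to geometry: the angle at each human joint can be computed from the 3D positions of the joints on either side of it. A minimal sketch of that computation, assuming the tracker hands back (x, y, z) positions (this is our illustration, not the team's code):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (radians) formed by points a-b-c, e.g. the
    elbow angle given shoulder, elbow, and wrist positions."""
    ba = [a[i] - b[i] for i in range(3)]
    bc = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(ba, bc))
    norm = math.dist(a, b) * math.dist(c, b)
    # Clamp for floating-point safety before acos
    return math.acos(max(-1.0, min(1.0, dot / norm)))

# A right-angle pose: wrist directly "above" the elbow
shoulder, elbow, wrist = (0, 0, 0), (0.3, 0, 0), (0.3, 0.25, 0)
print(math.degrees(joint_angle(shoulder, elbow, wrist)))  # → 90.0
```

The resulting angle can be scaled into the servo's pulse range; the gripper is simpler still, a threshold on how open the tracked hand appears.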
When everything was said and done, the results were pretty impressive as you can see in the video below, but the team definitely sees room for improvement. Using inverse kinematics, they plan on filtering out some of the joint tracking inaccuracies that occur when the shoulders are moved in a certain way. They also plan on using a robotic arm with even more degrees of freedom to see just how well their software can perform.
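Inverse kinematics works the problem from the other end: instead of copying each human joint angle directly, you solve for the arm's joint angles that put the gripper at a desired point, which lets noisy individual joints be smoothed out. For intuition, here's the textbook closed-form solution for a 2-link planar arm, far simpler than a 5-DOF solver but the same idea (our own example, hypothetical link lengths):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a 2-link planar arm: returns (shoulder, elbow)
    angles in radians that place the end effector at (x, y).
    Uses the elbow-up/down convention with a positive elbow angle."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_elbow) > 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

s, e = two_link_ik(1.0, 1.0, 1.0, 1.0)
# Verify with forward kinematics: the tip lands back on the target
tip_x = math.cos(s) + math.cos(s + e)
tip_y = math.sin(s) + math.sin(s + e)
print(round(tip_x, 6), round(tip_y, 6))  # → 1.0 1.0
```

With more degrees of freedom there are infinitely many solutions per target, which is exactly the slack a solver can use to reject implausible shoulder readings.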
Be sure to check out their site to see more details and videos.
Continue reading “Advanced robotic arm control using Kinect”