This LEGO synth made by [Rare Beasts] had us grinning from ear to ear.
It combines elements from LEGO Mindstorms with regular blocks to make music with color. A different music sample is assigned to each of five colors: red, blue, green, yellow, and white. The blocks are attached to spokes coming off of a wheel driven by a Mindstorms EV3. As the wheel turns, the blocks pass in front of a fixed color sensor that reads each color and plays the corresponding sample. The samples are different lengths, so changing the speed of the wheel makes for some interesting musical effects.
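The color-to-sample lookup at the heart of the build can be sketched in a few lines of Python. This is a hypothetical sketch, not [Rare Beasts]’s actual code; the sample filenames are made up for illustration:

```python
# Illustrative sketch: map each detected block color to an audio sample.
# Sample filenames are invented; a gap between blocks reads as no color,
# which plays nothing -- the "rest" mentioned above.
SAMPLES = {
    "red": "kick.wav",
    "blue": "snare.wav",
    "green": "bass.wav",
    "yellow": "chord.wav",
    "white": "vocal.wav",
}

def next_sample(color):
    """Return the sample file for a detected color, or None for a rest."""
    return SAMPLES.get(color)
```

Because the lookup returns `None` for anything that isn’t one of the five colors, pulling a block off the wheel automatically inserts a rest.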
As you’ll see in the short video after the break, [Rare Beasts] starts the wheel moving slowly to demonstrate the system. Since the whole thing is made of LEGO, the blocks are totally modular. Removing a few of them here and there inserts rests into the music, which makes the result that much more complex.
LEGO is quite versatile, and that extends beyond playtime. It can be used to automate laboratory tasks, braid rope, or even simulate a nuclear reactor. What amazing creations have you made with it? Let us know in the comments.
Continue reading “LEGO Looper Makes Modular Music”
[Myrijam Stoetzer] and her friend [Paul Foltin], 14- and 15-year-old kids from Duisburg, Germany, are working on an eye-movement-controlled wheelchair. They were inspired by the Eyewriter Project, which we’ve been following for a long time. The EyeWriter was built for Tony Quan, a.k.a. Tempt1, by his friends. In 2003, Tempt1 was diagnosed with the degenerative nerve disorder ALS; he is now fully paralyzed except for his eyes, but has been able to use the EyeWriter to continue his art.
This is their first big leap up from Lego Mindstorms. The eye tracker consists of a safety glasses frame, a regular webcam, and IR SMD LEDs. They removed the IR blocking filter from the webcam to make it work in all lighting conditions. The image processing is handled by an Odroid U3 – a compact, low-cost ARM quad-core SBC capable of running Ubuntu, Android, and other Linux distributions. They initially tried a Raspberry Pi, which managed just about 3 fps, compared to 13–15 fps from the Odroid. The code is written in Python and uses the OpenCV libraries; they are learning Python as they go. An Arduino controls the motors via an H-bridge, and is also used to calibrate the eye tracker: potentiometers connected to its analog ports allow adjusting the tracker to individual requirements.
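The core of the image processing is finding the pupil in each IR-lit frame. A rough sketch of the idea, not the team’s code (their version uses OpenCV on the live webcam stream), is to take the centroid of the darkest pixels:

```python
import numpy as np

# Hedged sketch: locate the pupil as the centroid of the darkest pixels
# in a grayscale IR frame. The threshold value is illustrative; a real
# tracker would use OpenCV thresholding and contour filtering instead.
def pupil_center(gray, threshold=40):
    """gray: 2-D uint8 array. Returns (x, y) of the dark-pixel centroid,
    or None if no pixel is dark enough (e.g. a blink)."""
    ys, xs = np.nonzero(gray < threshold)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())
```

Returning `None` on an all-bright frame gives the rest of the pipeline a natural way to ignore blinks.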
The webcam video stream is filtered to obtain the pupil position, which is compared to four presets for forward, reverse, left, and right. The presets can be adjusted using the potentiometers. An enable switch, manually activated at present, ensures the wheelchair moves only when commanded. Their plan is to later replace this switch with tongue activation or maybe cheek-muscle twitch detection.
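The preset comparison plus enable switch can be sketched like this. The preset offsets and tolerance are invented numbers (in the real build they come from the potentiometer calibration), and this is not the team’s code:

```python
# Hypothetical sketch: match the pupil's offset from a calibrated center
# against four preset positions, gated by the manual enable switch.
# Offsets and tolerance are illustrative pixel values.
PRESETS = {
    "forward": (0, -20),
    "reverse": (0, 20),
    "left": (-20, 0),
    "right": (20, 0),
}

def command(dx, dy, enabled, tolerance=8):
    """(dx, dy): pupil offset from center. Returns a drive command,
    or 'stop' when no preset matches or the switch is off."""
    if not enabled:
        return "stop"
    for name, (px, py) in PRESETS.items():
        if abs(dx - px) <= tolerance and abs(dy - py) <= tolerance:
            return name
    return "stop"
```

Defaulting to `"stop"` whenever the gaze is between presets or the switch is open is the safe failure mode for a powered wheelchair.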
First tests were on a small mockup robotic platform. After winning a local competition, they bought a second-hand wheelchair and started all over again. This time they tried the Raspberry Pi 2 Model B, which managed about 8–9 fps. Not as fast as the Odroid, but at half the cost it seemed like a workable solution, since their aim is to make the build as cheap as possible. They would appreciate any help improving the performance – maybe by optimising their code or utilising all four cores more efficiently. For the bigger wheelchair, they used recycled car windshield-wiper motors and some relays to switch them. They also used a 3D printer to make an enclosure for the camera and wheels to help turn the wheelchair. Further details are available on [Myrijam]’s blog. They documented their build (German, pdf) and have their sights set on the German National Science Fair. The team is working on an English translation of the documentation and will soon release all design files and source code under a CC BY-NC license.
Developing film at home is a nearly forgotten art these days, but there are still a few very dedicated people who put the time and study into the craft. [Jan] is one of the exceptional ones. He’s developing 35mm film with Lego (Dutch, Google translate).
For the build, [Jan] is using the Lego RCX 1.0, the first gen of the Lego Mindstorms, released in the late 90s. According to eBay, this is a significantly cheaper option for programmable Lego. The mechanics of the Lego film developer consisted of multiple tanks of chemicals. The film was loaded on a reel, suspended from a Lego gantry, and dunked into each tank for a specific amount of time.
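The dunk-and-wait sequence is essentially a timed schedule, which makes it a natural fit for a programmable brick. A minimal sketch, assuming typical black-and-white process steps and times rather than [Jan]’s actual recipe:

```python
# Illustrative schedule only -- real times depend on the film and
# chemistry. Each entry is (tank, seconds the reel stays submerged).
SCHEDULE = [
    ("developer", 480),
    ("stop bath", 30),
    ("fixer", 300),
    ("wash", 600),
]

def total_time(schedule):
    """Total processing time in seconds for one reel."""
    return sum(seconds for _, seconds in schedule)
```

On the RCX, the equivalent loop would lower the gantry, wait out each entry’s timer, then lift and move to the next tank.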
A second revision of the hardware (translate) was designed, with the film loaded into a rotating cylinder. A series of chemicals would then be pumped into this unit with the hope of reducing the amount of chemicals required. This system was eventually built using the wiper fluid pump from a car. Apparently, the system worked well, judging from the pictures developed with this system. Whether it was easy or efficient is another matter entirely.
You can check out a video of the first revision of the Lego film developing system below.
Thanks [Andrew] for sending this in.
Continue reading “Developing Film With Lego”
Remember in the late 90s and early 2000s when everything had blue LEDs in it? Blinding blue LEDs that lit up a dark room like a Christmas tree? The researchers who made them possible just won a Nobel Prize. There’s a good /r/askscience thread on why the work was so important. The TL;DR is that it’s tough to put a p-type layer on gallium nitride.
Have a Segway and you’re a member of the 501st? Here’s your Halloween costume. It’s a model of the Aratech 74-Z speeder bike, most famously seen careening into the side of trees on the forest moon of Endor.
[Andrew] needed something to do and machined an iPhone 5 out of a block of aluminum. Here’s the video of icon labels being engraved. The machine is a Denford Triac with a six station auto tool changer. He’s running Mach3, and according to him everything – including the correct tooling – cost far too much money.
Another [Andrew] was working the LEGO booth at Maker Faire New York and has finally gotten his LEGO Mindstorms Minecraft Creeper build written up. Yes, it’s probably smarter than your average Minecraft Creeper, and this one also blows up. He also had a physical version of the classic video game from 1979, Lunar Lander. Both are extremely awesome builds, and a great way to attract kids of all ages to a booth.
[Wilfred] was testing a titanium 3D printer at work and was looking for something to print. The skull ‘n wrenches was a suitable candidate, and the results are fantastic. From [Wilfred]: “Just out of the printer the logo looks amazing because it isn’t oxidized yet (inside the printer is an Argon atmosphere) Then the logo moves to an oven to anneal the stress made by the laser. But then it gets brown and ugly. After sandblasting we get a lovely bluish color as you can see in the last picture.”
The folks at Lulzbot/Aleph Objects are experimenting with their yet-to-be-released printer, codenamed ‘Begonia’. They’re 2D printing, strangely enough, and for only using a standard Bic pen, the results look great.
Everyone is going crazy over the ESP8266 UART to WiFi module. There’s another module that came up on Seeed recently, the EMW3162. It’s an ARM Cortex M3 with plenty of Flash, has 802.11 b/g/n, and it’s $8.50 USD. Out of stock, of course.
Learning with visuals can be very helpful. Learning with models made from NXT Mindstorms is just plain awesome, as [Rdsprm] demonstrates with this LEGO NXT 3-point bend tester that he built to introduce freshmen to flexural deflection and material properties. Specifically, it calculates Young’s modulus using the applied force of a spring and the beam’s deflection. [Rdsprm] provides a thorough explanation in the About section of the YouTube video linked above, but the reddit comments are definitely a value-add.
[Rdsprm] built this from the Mindstorms Education base set (9797) and the Education resource set (9648). Each specimen endures a five-test battery and should produce the same result each time. The motor in the foreground sets the test span of the beam, and the second motor pulls the spring down using a gearbox and chain.
This method of deflection testing is unconventional, as [Rdsprm] explains. Usually, the beam is loaded incrementally, with deflection measured at each loading state. Here, the beam is loaded continuously. Vertical deflection is measured with a light sensor that reads a bar code scale on the beam as it passes by. The spring position is calculated and used to determine the applied force.
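The underlying formula is the standard one for a centrally loaded, simply supported beam: deflection d = F·L³ / (48·E·I), so Young’s modulus is E = F·L³ / (48·d·I), with I = b·h³/12 for a rectangular cross-section. A worked sketch (not [Rdsprm]’s code), in SI units:

```python
# Compute Young's modulus from a 3-point bend measurement.
# force (N), span (m), deflection (m), width and height of the
# rectangular beam cross-section (m).
def youngs_modulus(force, span, deflection, width, height):
    I = width * height**3 / 12.0          # second moment of area (m^4)
    return force * span**3 / (48.0 * deflection * I)
```

For example, a 10 mm × 5 mm plastic beam on a 200 mm span that deflects 8 mm under 10 N works out to E = 2 GPa, a plausible value for ABS-like plastic.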
[Rdsprm] analysed the fluctuation in GNU Octave and has graphs of the light sensor readings and force-deflection. No beams to bend with your Mindstorms? You could make this Ruzzle player instead.
Continue reading “I am NXT 3-Point Bend Tester. Please Insert Girder.”
The Linux4nano project has been working to port the Linux kernel to the iPod Nano and other iPods in general. Although the iPodLinux project has had luck with some older iPods, newer models protect firmware updates with encryption. One way they plan to run code on the device is through a vulnerability in the notes program: it causes the processor to jump to a specific address and execute arbitrary code. To take advantage of this, they first need to figure out where their injected code ends up in memory. Currently, they are testing every memory location by painstakingly loading in a bogus note and recording its effect. Each note takes about a minute to test, and they have tens of thousands of addresses to check across several devices.
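The brute-force search amounts to generating one test note per candidate address and logging which one produces a visible effect. A hypothetical sketch of that enumeration (the note format here is made up, not the project’s actual payload):

```python
# Illustrative only: enumerate candidate addresses to probe, one note
# per boot cycle. Step of 4 assumes word-aligned jump targets.
def candidate_notes(start, end, step=4):
    """Yield (address, note_text) pairs to try."""
    for addr in range(start, end, step):
        yield addr, f"jump-test @ 0x{addr:08x}"
```

At roughly a minute per note, even a modest address window shows why a Lego robot doing the button-pressing is the sane approach.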
Continue reading “Lego iPod hacking robot”