LED Bulb Reviews, Evaluations and Teardowns


[ElectronUpdate] has posted many great reviews of commercial LED bulbs that one can purchase to replace standard E26 incandescent light bulbs. In his reviews he evaluates the light-emitting performance and does a thorough teardown to understand the circuit technologies used. For the light emission evaluation he uses a light meter and some homemade graph paper to plot the lumens at different angles. Flicker is easily evaluated using a solar panel from a discarded solar path light connected to his oscilloscope: any flicker will show up quite nicely and can be measured. Of course a Kill A Watt meter makes an appearance in most reviews to read watts and power factor.

Recently [ElectronUpdate] wanted to understand the meaning of CRI, which is advertised on many of these commercial LED packages. CRI stands for Color Rendering Index and describes how faithfully colors appear under a light source compared to a natural reference source. After doing some research he found that a CRI over 80 is probably good for LED lighting. The next dilemma was how to measure CRI without expensive scientific equipment. He found a website that we have featured before with free software and instructions on how to build a spectrometer. The web instructions include building a meter box from paper, but he found it was much more reliable if built out of wood. We’ll let you follow [ElectronUpdate’s] recommended build if you like, but you’ll need a few items which he does detail.

After a short calibration procedure the final rig will measure the power spectral line densities of your light source. [ElectronUpdate] is promising more details on how the colorful measurement data can be related to CRI ratings, but you can get a jump on the details at Full Spectrum Solutions. We also recommend you browse through all of [ElectronUpdate’s] LED bulb reviews on YouTube if the progressing performance and innards of LED bulbs fascinate you as much as they do us.

Perfect Jump Shots with OpenCV and Processing


[ElectricSlim] likes taking “Jump Shots” – photographs where the subject is captured in midair. He’s created a novel method to catch the perfect moment with OpenCV and Processing. Anyone who has tried jump shot photography can tell you how frustrating it is. Even with an experienced photographer at the shutter, shots are as likely to miss that perfect moment as they are to catch it. This is even harder when you’re trying to take jump shots solo. Wireless shutter releases can work, but unless you have a DSLR, shutter lag can cause you to miss the mark.

[ElectricSlim] decided to put his programming skills to work on the problem. He wrote a Processing sketch using the OpenCV library. The sketch has a relatively simple logic path: “IF a face is detected within a bounding box AND the face is dropping in height THEN snap a picture.” The system isn’t perfect; a person must be looking directly at the camera for the face to be detected. However, it’s good enough to take some great shots. The software is also repeatable enough to make animations of various jump shots, as seen in [ElectricSlim’s] video.
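
The original sketch is written in Processing, which we haven’t reproduced here, but the same decision logic is easy to illustrate in Python with OpenCV’s bundled Haar cascade face detector. The loop below is a minimal sketch of the idea only; the bounding-box coordinates, thresholds, and file names are our own assumptions, not [ElectricSlim’s] code.

```python
import cv2

# Minimal sketch of the jump-shot trigger logic (not the original Processing code).
# Assumes a default webcam and OpenCV's bundled frontal-face Haar cascade.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cam = cv2.VideoCapture(0)

last_y = None
shot = 0
while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    if len(faces) > 0:
        x, y, w, h = faces[0]                 # track the first detected face
        inside = 100 < x < 400 and y < 300    # crude "bounding box" check (made-up values)
        # In image coordinates y grows downward, so a falling face means y increasing.
        dropping = last_y is not None and y > last_y + 5
        if inside and dropping:
            cv2.imwrite(f"jump_{shot:03d}.jpg", frame)   # "snap a picture"
            shot += 1
        last_y = y
    else:
        last_y = None
```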

We think this would be a great starting point for a trigger system. Use a webcam to determine when to shoot a picture. When the conditions pass, a trigger could be sent to a DSLR, resulting in a much higher quality frame than what most webcams can produce.


Monitoring a sick bird using the Raspberry Pi


[Jorge Rancé] was nursing a sick bird back to health. He found it on the street with a broken leg, which required a mini plaster cast for it to heal correctly. But he felt bad leaving the house for long periods, so he grabbed some simple hardware and put his mind at ease by building an Internet-connected bird monitoring system. It’s really just an excuse to play around with his Raspberry Pi, but who can blame him?

A webcam adds video monitoring using the Linux software called “motion” to stream the video. This is the same package we use with our cats when we travel; it provides a continuous live stream but can also save recordings whenever motion is detected. He added a USB temperature sensor and attached a water level sensor to the GPIO header. These are automatically harvested — along with a still image from the webcam — and tweeted once per hour using a bash script. He just needs to work out automatic food and water dispensing and he never needs to return home! Bird seed shouldn’t be any harder to dish out than fish food, right?
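
The glue in [Jorge’s] build is a bash script run once an hour; for readers who would rather see the shape of that loop in Python, here is a minimal sketch. The GPIO pin number, the temperature-sensor handling, and the tweet command are assumptions for illustration, not details from his writeup.

```python
import subprocess
import time
import RPi.GPIO as GPIO

WATER_PIN = 17                      # assumed GPIO pin for the water level sensor

GPIO.setmode(GPIO.BCM)
GPIO.setup(WATER_PIN, GPIO.IN)

def read_temperature():
    # Placeholder: how the USB temperature sensor is read depends on the model;
    # many present themselves as a serial or HID device with a vendor query tool.
    return "22.5"

while True:
    water_ok = GPIO.input(WATER_PIN)                          # high/low from the level sensor
    temp = read_temperature()
    subprocess.run(["fswebcam", "-r", "640x480", "bird.jpg"]) # grab a still from the webcam
    status = f"Temp: {temp} C, water {'OK' if water_ok else 'LOW'}"
    # The original posts via the Twitter API from bash; a command-line client
    # could be called here instead of this print().
    print(status)
    time.sleep(3600)                                          # once per hour
```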

24-hour hackathon project adds object-based automation to hackerspace


[Jeremy Blum], [Jason Wright], and [Sam Sinensky] combined forces for twenty-four hours to automate how the entertainment and lighting works at their hackerspace. They commandeered the whiteboard and used an already present webcam as part of their project. You can see the black tokens which can be moved around the blue tape outline to actuate the controls.

MATLAB is fed an image from the webcam which monitors the space. Frames are received once every second and parsed for changes in the tokens. Small black squares either skip to the next music track or toggle play/pause: simply move them off of their designated spots and the image processing does the rest. The same goes for the volume slider. We think the huge token for the lights is there to ensure that the camera can sense a change even in a darkened room.
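
The team’s implementation is in MATLAB, which we haven’t seen, but the core of polling a token spot once per second can be sketched in Python/OpenCV along these lines. The region coordinates, darkness threshold, and trigger action are invented for illustration.

```python
import time
import cv2

# Token spots on the whiteboard, as (x, y, w, h) regions in the webcam frame.
# These coordinates are made up; the real ones depend on camera placement.
SPOTS = {"play_pause": (100, 200, 40, 40), "next_track": (200, 200, 40, 40)}

cam = cv2.VideoCapture(0)
occupied = {name: True for name in SPOTS}     # assume every token starts on its spot

while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for name, (x, y, w, h) in SPOTS.items():
        roi = gray[y:y + h, x:x + w]
        dark = roi.mean() < 80                # a black token makes the region dark
        if occupied[name] and not dark:       # token was just moved off its spot
            print(f"trigger: {name}")         # here the real system would poke the media player
        occupied[name] = dark
    time.sleep(1)                             # frames are polled once per second
```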

If image processing isn’t your thing you can still control your audio entertainment with a frickin’ laser.


Embedded solution for uploading webcam pictures to the cloud


We have friends watch the cats when we go out of town, but we always leave a server running with a webcam (motion-activated using the Linux “motion” software) so we can check in on them ourselves. This project may inspire a change. It leverages the features of a Carambola2 to capture images and upload them to Dropbox.

In the picture above the green PCB is a development board for the tiny yellow PCB, which is the actual Carambola2. It is soldered onto the dev board using the same technique as those HC-05 Bluetooth modules. That shielded board includes a Qualcomm SoC running Linux and a WiFi radio. The dev board feeds it power and allows it to connect to the USB webcam.

There’s a bit of command line kung-fu to get everything running but it shouldn’t be out of reach for beginners. Linux veterans will know that taking snapshots from a webcam at regular intervals is a simple task. Uploading to a secure cloud storage site is not. A Bash script handles the heavy lifting. It’s using the Dropbox Application API so this will not violate their TOS and you don’t have to figure out your own method of authenticating from the command line.
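
As a rough illustration of what that upload step involves, here is a minimal Python sketch against Dropbox’s current HTTP upload endpoint. The access token, file paths, and fswebcam capture are our own assumptions, and the project’s actual Bash script (written against the API available at the time) will differ.

```python
import json
import subprocess
import requests

ACCESS_TOKEN = "YOUR_DROPBOX_TOKEN"   # an app token from the Dropbox developer console

# Grab a still from the webcam; fswebcam is one common Linux tool for this.
subprocess.run(["fswebcam", "-r", "640x480", "/tmp/snap.jpg"], check=True)

# Upload the snapshot using Dropbox's HTTP API (files/upload endpoint).
with open("/tmp/snap.jpg", "rb") as f:
    resp = requests.post(
        "https://content.dropboxapi.com/2/files/upload",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Dropbox-API-Arg": json.dumps({"path": "/webcam/snap.jpg", "mode": "overwrite"}),
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
resp.raise_for_status()
```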

3D scanner made in a day


The LVL1 Hackerspace held a hackathon back in June and this is one of the projects that was created in that 24-hour period. It’s a 3D scanner made from leftover parts. The image gives you an idea of the math used in the image processing. It shows the angular relations between the laser diode, the subject being scanned, and the webcam doing the scanning.

The webcam is of rather low quality, and one way to quickly improve the output would be to replace it with a better one. But because the rules said they had to use only materials from the parts bin, it worked out just fine. The other issue that came into play was that there were no LCD monitors available for use in the project. Because of that they decided to make the device controllable over the network. On the right you can see a power supply taped to the top of a car computer. It connects to the laser (pulled out of a barcode scanner, which produces a line of red light) and the turntable. A Python script does all of the image processing, assembling each slice of the scan into both an animated GIF and an OBJ file.
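
The script itself isn’t reproduced in the writeup, but the triangulation math in the diagram boils down to intersecting the camera ray for a given pixel with the plane of the laser line. Here is a minimal Python sketch of that step under made-up geometry; the baseline, laser angle, and focal length are assumptions you would normally get from calibrating the actual rig.

```python
import math
import numpy as np
import cv2

# Hypothetical rig geometry (the real LVL1 numbers aren't published here).
BASELINE = 0.10                      # meters between camera and line laser
LASER_ANGLE = math.radians(20)       # laser tilt toward the camera axis
FOCAL_PX = 800                       # camera focal length in pixels

def laser_column(row):
    """Pixel x position of the brightest (laser) pixel in one image row."""
    return int(np.argmax(row))

def depth_from_pixel(px_offset):
    """Intersect camera ray x = z*px/f with the laser plane x = -b + z*tan(theta)."""
    # Note: this blows up if the laser line sits exactly on the camera axis angle.
    return BASELINE / (math.tan(LASER_ANGLE) - px_offset / FOCAL_PX)

frame = cv2.imread("slice.png", cv2.IMREAD_GRAYSCALE)   # one frame per turntable step
h, w = frame.shape
cx = w / 2.0
points = []
for y in range(h):
    px = laser_column(frame[y])
    z = depth_from_pixel(px - cx)
    points.append((y, z))     # later combined with the turntable angle into 3D points
```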

[Thanks Nathan]

Autonomous helicopter works like a Wii remote


[Jack Crossfire] took one of those inexpensive indoor helicopters and made it autonomous. He didn’t replace the hardware used for the helicopter, but augmented it and patched into the remote control to make a base station.

The position feedback is provided in much the same way that the Wii remote is used as a pointing device. On the gaming console there is a bar that goes under the TV with two IR LEDs in it. This is monitored by an IR camera in the Wii remote and used to calculate where you’re pointing the thing. [Jack’s] auto-pilot system uses two Logitech webcams with IR filters over the sensors. You can see them mounted on the horizontal bar in the cutout above. The helicopter itself has an IR LED added to it that is always on. The base station follows this beacon by moving the cameras with a pair of servo motors, calculating position and using it when sending commands to the remote control’s PCB.
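
At its core, each camera only has to find the bright spot left by the IR LED and nudge its servo to keep that spot centered. A minimal single-camera sketch of that idea in Python/OpenCV might look like the following; the threshold, gain, and servo interface are all assumptions, and the real build does this with two cameras and combines their angles into a 3D position for the autopilot.

```python
import cv2
import numpy as np

# Sketch of following an IR beacon with one camera (the real rig uses two
# IR-filtered Logitech webcams on servo-driven mounts).
cam = cv2.VideoCapture(0)            # IR-filtered webcam: the LED shows up as a bright spot
pan_angle = 90.0                     # hypothetical servo position, in degrees

while True:
    ok, frame = cam.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 220, 255, cv2.THRESH_BINARY)   # keep only the bright LED
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        continue                                   # beacon not visible in this frame
    cx = xs.mean()                                 # centroid of the bright blob
    error = cx - frame.shape[1] / 2                # pixels off the image center
    pan_angle += -0.02 * error                     # proportional nudge toward the beacon
    # servo.write(pan_angle)  # whatever servo interface the base station exposes
```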

Don’t miss the demo video of the rig after the break.
