Customizing STLs For Off-Brand Devices

[Rob Clarke] needed a mount for his off-brand action camera, but it’s not exactly the kind of thing with a bustling accessory market. To make matters worse, it turns out the camera is so low-key that he couldn’t find a 3D printable mount for it either. Luckily, a check with his calipers confirmed his camera is just about the same size as an old GoPro Hero 3, so all he had to do was modify an existing design to fit his needs.

As anyone who’s worked with STL files will tell you, they are a pain to modify. An STL is essentially a completed solid model, and not really meant to be fiddled around with. It’s a bit like trying to take an edited image and get back to the layers that were used to create it in Photoshop or GIMP. The final output has been “flattened”, so that granular control is lost.

That being said, [Rob] got rather lucky in this case. He found a GoPro mount that was about 90% there; he just needed to adjust the depth and change the positioning of the holes on the side. He loaded the STL into SketchUp, deleted the two sides, and replaced them with new surfaces. This gave him a clean slate to add the appropriate openings for his camera’s USB port and microSD card. To adjust the depth of the mount, he simply stretched the model out on the Z axis.
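[Rob] did his stretch in SketchUp, but the same Z-axis stretch can also be done programmatically, since binary STL is a simple enough format to rewrite directly. A minimal sketch of the idea (the scale factor here is a placeholder, and normals are left alone since most slicers recompute them anyway):

```python
# Binary STL layout: 80-byte header, uint32 triangle count, then 50 bytes
# per triangle (normal + three vertices as 12 little-endian floats, plus
# a 2-byte attribute word). Stretching on Z just means scaling the z
# component of every vertex.
import struct

TRI = struct.Struct("<12fH")  # one 50-byte triangle record

def scale_stl_z(data: bytes, factor: float) -> bytes:
    """Return a copy of a binary STL with all vertex Z coordinates scaled."""
    (count,) = struct.unpack_from("<I", data, 80)
    out = bytearray(data[:84])                    # header + count unchanged
    for i in range(count):
        vals = list(TRI.unpack_from(data, 84 + i * TRI.size))
        for z in (5, 8, 11):                      # z of vertices 1..3
            vals[z] *= factor
        out += TRI.pack(*vals)
    return bytes(out)
```

For bigger surgery like repositioning the side holes, a mesh editor is still the right tool; this only covers the uniform stretch.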

[Rob] even went ahead and released his modified STLs as a remix of the original case he found on Thingiverse for anyone else who has the same camera. That’s what we love to see.

If you’re interested in learning more about using SketchUp for designing 3D printed parts, check out this excellent guide by our very own [Brian Benchoff].


Photographing Starman From A Million Miles Away

Love it or loathe it, launching a sports car into space is a hell of a spectacle, and did a great job of focusing the spotlight on the Falcon Heavy rocket. This led [Rogelio] to wonder – would it be possible to snap a photo of Starman from Earth?

[Rogelio] isn’t new to the astrophotography game, possessing a capable twin-telescope rig with star tracking capabilities and chilled CCDs for reducing noise in low-light conditions. Identifying the location of the Tesla Roadster was made easier thanks to NASA JPL tracking the object and providing ephemeris data.

Imaging the Roadster took some commitment – from [Rogelio]’s chosen shooting location, it would only be visible between 3 AM and 5:30 AM. Initial attempts were unsuccessful, but having stayed up all night, giving up wasn’t an option. A return visit days later was similarly fruitless, scuppered by cloud cover.

It was only after significant analysis that the problem became clear – when calculating the ephemeris of the object on NASA’s website, [Rogelio] had used the standard coordinates instead of those of the actual imaging location. This introduced enough error that they were looking at the wrong patch of sky. Thanks to the wide field of view of the telescopes, however, after further analysis Starman was captured – not just in stills, but in video!
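The scale of that coordinate error is easy to sanity-check: an observer on the surface is displaced from the geocenter by up to an Earth radius, which shifts a nearby object against the background stars by roughly baseline/distance radians. Using the headline figure of about a million miles (~1.6 million km, our assumption for illustration):

```python
# Back-of-the-envelope parallax estimate: how far off the predicted
# position can be if the ephemeris is computed for the geocenter rather
# than the actual observing site.
import math

EARTH_RADIUS_KM = 6371.0

def parallax_arcmin(distance_km: float, baseline_km: float = EARTH_RADIUS_KM) -> float:
    """Small-angle parallax shift, in arcminutes."""
    return math.degrees(baseline_km / distance_km) * 60.0

# At ~1.6 million km, the geocentric-vs-topocentric error is over ten
# arcminutes -- easily enough to push a faint dot out of a narrow field.
error = parallax_arcmin(1.6e6)
```

A dozen-odd arcminutes is comparable to a long-focal-length telescope’s entire field of view, which is why using the right observer location in the ephemeris request matters so much.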

[Rogelio]’s work is a great example of practical astronomy, and if you’re keen to get involved, why not consider building your own star tracking rig? Video after the break.

[Thanks to arnonymous for the tip! If that’s a nickname and not just a request to be anonymous but misspelled.]


Photograph Of Single Atom Captured With A Plain Old Camera

The Engineering and Physical Sciences Research Council awarded a remarkable photograph its overall prize in science photography. The subject of the photograph? A single atom visible to the naked eye. Well, perhaps not exactly the naked eye, but without a microscope. In the picture above, the atom is that pale blue dot between the two needle-like structures.

You probably learned in school that you couldn’t see a single atom, and that’s usually true. But [David Nadlinger] from the University of Oxford trapped a positively charged strontium atom in an ion trap and then irradiated it with a blue-violet laser. The atom absorbs and re-emits the light, which a camera can pick up, creating a one-of-a-kind photograph. The camera was a Canon 5D Mk II with a 50mm f/1.8 lens — a nice camera, but nothing too exotic.

The ion trap keeps the single atom balanced between two small needle points about 2 millimeters apart. [Nadlinger] did some math that convinced him the photograph could be possible and made it a reality on a Sunday afternoon. The pale dot isn’t especially spectacular by itself, but when you realize that it is the visual effect of a single atom, it is mind-blowing. Turns out, the lab has taken some similar photographs in the past. They don’t remember who took it, but they have a picture of nine trapped calcium-43 ions, which you can see below. The ions are 10 microns apart and at an effective temperature of 0.001 kelvin.
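A rough photon budget hints at why the shot works at all: a strongly driven ion scatters photons at roughly half its transition linewidth, and an ordinary lens catches the fraction that falls within its aperture. The linewidth, lens-to-ion distance, and exposure below are our illustrative assumptions, not figures from [Nadlinger]’s actual setup:

```python
# Order-of-magnitude estimate of photons a camera lens collects from a
# single laser-driven ion.
import math

# Natural linewidth of the Sr+ 422 nm cooling transition, roughly 2*pi * 20 MHz
GAMMA = 2 * math.pi * 20e6

def collected_photons(exposure_s: float, lens_diameter_m: float, distance_m: float) -> float:
    """Photons collected from one saturated ion during an exposure."""
    scatter_rate = GAMMA / 2                   # photons/s at saturation
    r = lens_diameter_m / 2
    fraction = r**2 / (4 * distance_m**2)      # lens solid angle over 4*pi
    return scatter_rate * exposure_s * fraction

# A 50 mm f/1.8 lens (~28 mm aperture) at ~20 cm with a few-second
# exposure collects on the order of 10^5 photons -- plenty for a faint
# but real dot on the sensor.
photons = collected_photons(3.0, 0.028, 0.20)
```

The atom isn’t being resolved, of course; it’s just a point source bright enough to register, the same way a distant star does.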

Other winning photographs included patterns on a soap bubble, an EEG headset in use, and microbubbles used to deliver drugs. There’s also an underwater robot, a machine for molecular beam epitaxy that looks like a James Bond villain’s torture device, and lattices made with selective laser melting 3D printing.

If you want to look at atoms from the comfort of your own home, maybe you should build an STM. You might even try NIST’s improved atom probe while you are at it. Just remember you can’t trust atoms. They make up everything.

Photo credit: David Nadlinger

Pipes, Tees, And Gears Result In Smooth Video Shots

It’s depressingly easy to make bad videos, but it only takes a little care to turn that around. After ample lighting and decent audio — and not shooting in portrait — perhaps the biggest improvements come from stabilizing the camera while it’s moving. Giving your viewers motion sickness is bad form, after all, and to smooth out those beauty shots, a camera slider can be a big help.

Not all camera sliders are built alike, though, and we must admit to being baffled while first watching [Rulof Maker]’s build of a smooth, synchronized pan and slide camera rig. We just couldn’t figure out how those gears were going to be put to use, but as the video below progresses, it becomes clear that this is an adjustable pantograph rig, and that [Rulof]’s eBay gears are intended to link the two sets of pantograph arms together. The arms are formed from threaded pipe and tee fittings with bearings pressed into them, which is a pretty clever construction technique that seems highly dependent on having the good fortune to find bearings with an interference fit into the threads. But still, [Rulof] makes it work, and with a little epoxy and a fair amount of finagling, he ends up with a complex linkage that yields the desired effects. And bonus points for being able to configure the motion with small adjustments to the camera bracket pivot points.

We saw a similar pantograph slider a few months back. That one was 3D-printed and linked with timing belts, but the principles are the same and the shots from both look great.


Plastic Model Emulates The First Untethered Spacewalk

Here’s something really wonderful. [Dave Akerman] wrote up the results of his attempt to use a high-altitude balloon to try to re-create a famous image of NASA’s Bruce McCandless floating freely in space with the Earth in the background. [Dave] did this in celebration of the 34th anniversary of the first untethered spacewalk, even going so far as to launch on the same day as the original event in 1984. He had excellent results, with plenty of video and images recorded by his payload.

’80s “Astronaut with MMU” model kit.

Adhering to the actual day of the spacewalk wasn’t the only hurdle [Dave] jumped to make this happen. He tracked down an old and rare “Astronaut with MMU” (Manned Maneuvering Unit) plastic model kit made by Revell USA and proceeded to build it and arrange for it to remain in view of the cameras. Raspberry Pi Zero Ws with cameras, LoRa hardware, action cameras, and a u-blox GPS unit all make an appearance in the balloon’s payload.

Sadly, [Bruce McCandless] passed away in late 2017, but this project is a wonderful reminder of that first untethered spacewalk. Details on the build and the payload, as well as the tracking system, are covered here on [Dave]’s blog. Videos of the launch and the inevitable balloon burst are embedded below, but more is available in the summary write-up.


Neural Network Zaps You To Take Better Photographs

It’s ridiculously easy to take a bad photograph. Your brain is a far better Photoshop than Photoshop, and the amount of editing it does on the scenes your eyes capture often results in marked and disappointing differences between what you saw and what you shot.

Taking your brain out of the photography loop is the goal of [Peter Buczkowski]’s “prosthetic photographer.” The idea is to use a neural network to constantly analyze a scene until maximal aesthetic value is achieved, at which point the user involuntarily takes the photograph.

But the human-computer interface is the interesting bit — the device uses a transcutaneous electrical nerve stimulator (TENS) wired to electrodes in the handgrip to involuntarily contract the user’s finger muscles and squeeze the trigger. (Editor’s Note: This project is about as sci-fi as it gets — the computer brain is pulling the strings of the meat puppet. Whoah.)

Meanwhile, back in reality, it’s not too strange a project. A Raspberry Pi watches the scene through a Pi Cam and uses a TensorFlow neural net trained against a set of high-quality photos to determine when to trip the shutter. The video below shows it in action, and [Peter]’s blog has some of the photos taken with it.
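The decision of when to fire can be boiled down to very little logic: score each frame, then pulse the TENS unit once the score clears a threshold and stops improving. This is a sketch of that idea only — the function name, threshold, and “stopped improving” heuristic are our assumptions, not [Peter]’s actual code:

```python
def should_fire(scores, threshold=0.8):
    """Fire once the latest aesthetic score clears the threshold and is no longer rising."""
    if len(scores) < 2 or scores[-1] < threshold:
        return False
    return scores[-1] <= scores[-2]    # the scene has peaked -- shoot now
```

In the real device, each score would come from the TensorFlow model evaluating a Pi Cam frame, and a True result would trigger the TENS pulse through the handgrip electrodes.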

We’re not sure this is exactly the next “must have” camera accessory, and it probably won’t help with snapshots and selfies, but it’s an interesting take on the human-device interface. And if you’re thinking about the possibilities of a neural net inside your camera to prompt you when to take a picture, you might want to check out our primer on TensorFlow to get started.


Large Format… Videography?

Large format photography gives a special quality to the images it produces, due to the differences in depth of field and resolution between it and its more modern handheld equivalents. Projecting an image the size of a dinner plate rather than a postage stamp has a few drawbacks when it comes to digital photography, though: sensor manufacturers do not make consumer products at that size.

[Zev Hoover] has created a large format digital camera, and is using it not only for still images but for video. And it’s an interesting device, for the way he’s translated a huge large-format image onto a relatively small sensor in a modern SLR. He’s projecting the image from the large-format lens and bellows onto a screen made from an artist’s palette, a conveniently available piece of bright white plastic, and capturing that image with his SLR mounted beneath the large-format lens assembly. This would normally cause a perspective distortion, but to correct that he’s mounted his SLR lens at an offset.

He does point out that since less light reaches the camera, the ISO setting has to be raised to compensate, but once that has been taken into account it performs satisfactorily. The result is a camera that allows something rather unusual: Victorian-style large-format images brought to life as video. He demonstrates it in the video below, complete with friends in suitably old-fashioned steampunk attire.
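The ISO correction is simple exposure arithmetic: if the projection screen only passes a fraction of the light, sensitivity has to rise by the inverse of that fraction, i.e. log2(1/fraction) stops. The transmission figure in the example is made up, since the video doesn’t quantify the loss:

```python
# Exposure compensation for a rear-projection relay that loses light.
import math

def stops_lost(light_fraction: float) -> float:
    """Stops of exposure lost when only this fraction of light reaches the sensor."""
    return -math.log2(light_fraction)

def compensated_iso(base_iso: float, light_fraction: float) -> float:
    """ISO needed to keep the same shutter speed and aperture."""
    return base_iso / light_fraction

# If the palette screen passed a quarter of the light, ISO 100 would need
# to become ISO 400 -- a two-stop penalty.
```

The same arithmetic explains why the rig is happiest outdoors or under strong lighting, where a couple of stops of ISO headroom is easy to spare.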
