Hacking The Wooly Mammoth

In case you can’t get enough Jurassic Park movies, you can look forward to a biotech company’s plans to hybridize endangered Asian elephants with long-extinct wooly mammoths using gene splicing and other exotic techniques.

Expect a long movie: the team hopes to have calves after six years, and we don’t think a theme park is in the making. The claim is that mammoth traits will help the elephants reclaim the tundra, but we can’t help but think it is just an excuse to reanimate an extinct animal. If you read popular press reports, there is some question as to whether the ecological mission claimed by the company is realistic. However, we can’t deny it would be cool to bring an animal back from extinction — sort of.

We aren’t DNA wizards, so we only partially understand what’s being proposed. Apparently, skin cells from a modern elephant will serve as a base to accept extracted mammoth DNA. This might seem far-fetched, but it turns out the mammoth lived much more recently than we usually think. When they died in their natural deep-freeze environment, they were often well preserved.

Once the gene splicing is set up, a surrogate elephant will carry the embryo to term. The hope is that the improved breed would be able to further interbreed with natural species, although with the gestation and maturity times of elephants, this will take a very long time to bear fruit.

So how do you feel about it? Will we face a movie-level disaster? Will we get some lab curiosity creatures? Will it save the tundra? Let us know what you think in the comments.

DNA manipulation has gone from moon-shot-level tech to readily accessible in a very short amount of time. In particular, CRISPR changes everything, and it is both exciting and scary what it puts in the hands of nearly anyone.

Harp Uses Frikin’ Lasers

We aren’t sure if you really need lasers to build [HoPE]’s laser harp. It is little more than some photocells and an Arduino that generates tones based on their signals. Still, you need to excite the photocells somehow, and lasers are cheap enough these days.

Mechanically, the device is a pretty large wooden structure. There are six lasers aligned to six light sensors. Each sensor is read by an analog input pin on an Arduino armed with a music-generation shield. We’ve seen plenty of these in the past, but the simplicity of this one is engaging.
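
If you are curious how little logic there really is, here is a rough sketch of the beam-break idea in Python. To be clear, this is not [HoPE]’s firmware (the real build runs on an Arduino with a music shield), and the threshold, note numbers, and the read_sensor()/send_note() helpers are placeholders for whatever hardware you have on hand:

# Sketch of the laser harp's beam-break logic (illustrative only; the real
# project runs on an Arduino and a music-generation shield).

NOTES = [60, 62, 64, 65, 67, 69]  # one MIDI note per laser beam (placeholder scale)
THRESHOLD = 300                   # ADC counts; a blocked beam reads below this

def read_sensor(channel):
    """Placeholder for an analog read of one photocell (0-1023)."""
    raise NotImplementedError

def send_note(note, on):
    """Placeholder for telling the synth to start or stop a note."""
    raise NotImplementedError

def scan(previous):
    """Poll all six beams and fire note on/off events on state changes."""
    current = []
    for channel, note in enumerate(NOTES):
        blocked = read_sensor(channel) < THRESHOLD  # a hand in the beam path
        if blocked and not previous[channel]:
            send_note(note, True)    # beam just broken: note on
        elif not blocked and previous[channel]:
            send_note(note, False)   # beam restored: note off
        current.append(blocked)
    return current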


3D Objects Without Scanning

There are many scanners — both commercial and homemade — that can take a variety of scans or images of a 3D object and convert them into something like a 3D-printable file. When the process works, it works well, but the results can be finicky at best and often require a lot of manual tuning. According to [Samuel Garbett], you might as well just draw your own model using Blender. He shows you how using a Red Bull can, which, granted, isn’t exactly the most complicated thing ever, but it isn’t the simplest either.

He does take one photo of the can, so there is a camera involved at some point. He also takes measurements using calipers, something you probably already have lying around.

Since it is just a can, there aren’t as many required pictures or measurements as there would be for, say, a starship model. Once you have the measurements, of course, you could use the tool of your choice, and since we aren’t very adept with Blender, we might have used something we find easier, like FreeCAD or OpenSCAD. However, Blender has a lot of power, so we suspect making the jump from can to the USS Enterprise might be more realistic for a Blender user.
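
[Samuel Garbett] does everything in Blender’s GUI, but if scripting is more your speed, the same starting point is only a few lines of Blender’s Python API. This is just a sketch to run from Blender’s own Python console, and the dimensions below are rough stand-ins for whatever your calipers actually report:

# Turning two caliper measurements into a starting mesh with Blender's bpy API.
# Run inside Blender; the dimensions are placeholder values, not gospel.
import bpy

can_diameter = 0.053   # meters, from the calipers (approximate placeholder)
can_height = 0.134     # meters (approximate placeholder)

# Add a cylinder with the measured radius and height, resting on the origin.
bpy.ops.mesh.primitive_cylinder_add(
    vertices=64,                      # enough segments for a smooth-looking can
    radius=can_diameter / 2,
    depth=can_height,
    location=(0, 0, can_height / 2),
)
can = bpy.context.active_object
can.name = "Can"

# A subdivision surface modifier smooths the silhouette before further edits.
mod = can.modifiers.new(name="Subdivision", type='SUBSURF')
mod.levels = 2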

Besides, it is good to see how other tools work and we were surprised that Blender could be relatively simple to use. Every time we see [Jared’s] channel, we think we should learn more about Blender. But if you have your heart set on a real scanner, there are plenty of open source designs you can print.

Ray Casting 101 Makes Things Simple

[SSZCZEP] had a tough time understanding ray casting to create 3D-like scenes from a 2D map. So once he figured it out, he wrote a tutorial he hopes will be more accessible for those who may be struggling themselves.

If you’ve ever played Wolfenstein 3D, you’ll have seen the technique, although it crops up all over the place. The tutorial borrows an animated graphic from [Lucas Vieira] that really shows off how it works in a simplified way. The explanation is pretty simple: from a point of view — that is, a camera or the eyeball of a player — you draw rays out until they strike something. The distance and angle tell you how to render the scene. Instead of a camera, you can also figure out how a ray of light will fall from a light source.
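
If you want to see just how little code the core idea takes, here is a bare-bones ray caster in Python. It is not code from [SSZCZEP]’s tutorial, the map, player position, and screen width are invented for the example, and it skips niceties like fish-eye correction:

# Minimal Wolfenstein-style ray caster: march one ray per screen column through
# a 2D grid map and use the hit distance to pick a wall height.
import math

MAP = [
    "########",
    "#......#",
    "#..##..#",
    "#......#",
    "########",
]
PLAYER_X, PLAYER_Y, PLAYER_ANGLE = 2.5, 2.5, 0.0   # map units, facing +x
FOV = math.pi / 3                                  # 60-degree field of view
SCREEN_WIDTH, MAX_DEPTH, STEP = 80, 16.0, 0.02

def cast_column(column):
    """March a ray for one screen column; return the distance to the first wall."""
    ray_angle = PLAYER_ANGLE - FOV / 2 + FOV * column / SCREEN_WIDTH
    dx, dy = math.cos(ray_angle), math.sin(ray_angle)
    distance = 0.0
    while distance < MAX_DEPTH:
        distance += STEP
        if MAP[int(PLAYER_Y + dy * distance)][int(PLAYER_X + dx * distance)] == "#":
            break                                  # the ray hit a wall cell
    return distance

# Closer walls get drawn taller: the classic fake-3D effect.
wall_heights = [int(SCREEN_WIDTH / (cast_column(c) + 0.001)) for c in range(SCREEN_WIDTH)]
print(wall_heights)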

There is a bit of math, but also some cool interactive demos to drive home the points. Do Demos 3 and 4 remind anyone else of an obscure vector graphics video game from the 1970s? Most of the tutorial is pretty brute force, calculating points that you can know ahead of time won’t be useful. But if you stick with it, there are some concessions to optimization and pointers to more information.

Overall, a lot of good info and cool demos if this is your sort of thing. While it might not be the speediest, you can do ray tracing on our old friend the Arduino. Or, if you prefer, Excel.

GNU Radio Decodes Voyager Data

To mark the 44th anniversary of the launch of Voyager 1, [Daniel] decided to use GNU Radio to decode Voyager data. The data isn’t live, but a recording from the Green Bank Telescope. The 16 GB file is in GUPPI format, which stores raw IQ samples.

The file contains 64 frequency channels of just under 3 MHz each. The signal of interest is in one channel, so it is easy to just throw away the rest of the data.
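
We haven’t reproduced [Daniel]’s exact flowgraph, but the “keep one narrow slice, throw away the rest” step looks roughly like this in GNU Radio’s Python API. It assumes the channel of interest has already been pulled out of the GUPPI file as interleaved complex64 samples, and the file names, sample rate, and frequency offset are placeholders:

# Sketch only: isolate a narrow band around the carrier from one ~3 MHz channel.
# Assumes voyager_channel.iq already holds complex64 samples for that channel.
from gnuradio import gr, blocks, filter as gr_filter

SAMP_RATE = 2.92e6      # channel width, just under 3 MHz (assumed)
OFFSET_HZ = 100e3       # where the carrier sits within the channel (assumed)
DECIMATION = 64         # keep only a narrow slice around the carrier

class ExtractCarrier(gr.top_block):
    def __init__(self):
        gr.top_block.__init__(self, "Extract carrier")
        src = blocks.file_source(gr.sizeof_gr_complex, "voyager_channel.iq", False)
        # One block shifts the carrier to baseband, low-pass filters, and decimates.
        taps = gr_filter.firdes.low_pass(1.0, SAMP_RATE,
                                         SAMP_RATE / (2 * DECIMATION), 5e3)
        xlate = gr_filter.freq_xlating_fir_filter_ccf(DECIMATION, taps,
                                                      OFFSET_HZ, SAMP_RATE)
        sink = blocks.file_sink(gr.sizeof_gr_complex, "carrier_baseband.iq")
        self.connect(src, xlate, sink)

if __name__ == "__main__":
    ExtractCarrier().run()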


Historical Hackers: The Hacker Of Cragside, Circa 1870

Imagine visiting a home that was off the grid, using hydroelectric power to run lights, a dishwasher, a vacuum cleaner, and a washing machine. There’s a system for watering the plants and an intercom between rooms. Not really a big deal, right? This is the twenty-first century, after all.

Image: Armstrong with a 7-inch gun of his design, from an 1887 edition of Illustrated London News

But then imagine you’ve exited your time machine to find this house not in the present day, but in the year 1870. Suddenly things become quite a bit more impressive, and it is all thanks to a British electrical hacker named William Armstrong who built a house known as Cragside. Even if you’ve never been to Northumberland, Cragside might look familiar. It’s appeared in several TV shows, but — perhaps most notably — played the part of Lockwood Manor in the movie Jurassic World: Fallen Kingdom.

Armstrong was a lawyer by training but dabbled in science, including hydraulics and electricity — a hot topic in the early 1800s. He finally abandoned his law practice to form W. G. Armstrong and Company, known for producing Armstrong guns, which were breech-loading artillery pieces ranging from 2.5-inch bores up to 7 inches. By 1859, he had been knighted and become the principal supplier of armaments to both the Army and the Navy.



Brain Interface Uses Tiny Needles

We often look at news out of the research community and think, “we could build that.” But the latest brain-machine interface from an international team including the Georgia Institute of Technology actually scares us. It uses an array of tiny needles that penetrate the skin but remain too small for your nerves to detect. Right. We assume they need to be sterile, but either way, we don’t really want to build a pin grid array to attach to your brain.

It seems the soft device is comfortable, and since it is very lightweight, it doesn’t suffer from noise if the user blinks or otherwise moves. Looking at the picture of the electrodes, they look awfully pointy, but we assume that’s magnified quite a few times, since the post mentions they are not visible to the naked eye.
