Sending Open-Source Satellites To Space

An anonymous reader tipped us about two Argentinian satellites (satellite one, satellite two) that were sent to space in 2013. What is interesting about them? They are both based on commercial off-the-shelf (COTS) components, and the team released the framework & flight computer software for their main platform (named cubesat, GitHub link). Gunter's Space Page not only impresses us by showing the quantity of small/amateur satellites sent to space each month, but also lets us know that the hardware source files for CubeBug 1/2 are meant to be released. In the meantime we can only gather that they're using a Texas Instruments TMS570 running FreeRTOS. Nevertheless, the two different web pages (in Spanish and English) offer us a very interesting glimpse of what it takes to send an electronic project to space and how it later behaves.

You may also be interested in checking out ArduSat, a successful Kickstarter campaign aimed at sending Arduino experiments into space.

Using The Raspberry Pi To See Like A Bee

The Raspberry Pi board camera has a twin brother known as the NoIR camera, a camera without an infrared blocking filter that allows anyone to take some shots of scenes illuminated with ‘invisible’ IR light, investigate the health of plants, and some other cool stuff. The sensor in this camera isn’t just sensitive to IR light – it goes the other way as well, allowing some investigations into the UV spectrum, and showing us what bees and other insects see.

The only problem with examining the UV spectrum with a small camera is that, relatively speaking, the camera is far more sensitive to visible and IR light than it is to UV. To peer into this strange world, [Oliver] needed a UV pass filter, a filter that only allows UV light through.

By placing the filter between the still life and the camera, [Oliver] was able to shine a deep UV light source and capture the image of a flower in UV. The image above and to the right isn’t what the camera picked up, though – bees cannot see red, so the green channel was shifted to the red, the blue channel to the green, and the UV image was placed where the blue channel once was.
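
The channel shuffle itself is only a few lines of code if you want to try it on your own captures. Below is a minimal Python sketch of that false-colour remap, assuming you already have an aligned visible-light shot and a separate UV-filtered shot of the same scene; the file names and the use of OpenCV are our own illustration, not details from [Oliver]'s setup.

```python
# Minimal sketch of the "bee vision" false-colour remap described above.
# Assumes two aligned captures of the same scene: an ordinary colour frame
# and one taken through a UV-pass filter (loaded as a single grey channel).
# File names and the choice of OpenCV are illustrative, not from the project.
import cv2

visible = cv2.imread("flower_visible.png")               # OpenCV loads as BGR
uv = cv2.imread("flower_uv.png", cv2.IMREAD_GRAYSCALE)   # UV-filtered capture

blue, green, red = cv2.split(visible)

# Bees can't see red, so drop it and shift everything down one channel:
# new red <- old green, new green <- old blue, new blue <- UV capture.
bee_view = cv2.merge([
    uv,     # blue channel now holds the UV image
    blue,   # green channel now holds the old blue
    green,  # red channel now holds the old green
])

cv2.imwrite("flower_bee_view.png", bee_view)
```

The red data is simply discarded, and the bee's three sensitivities (green, blue, and UV) land on the display's red, green, and blue channels, which is exactly the shift described above.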

Continue reading “Using The Raspberry Pi To See Like A Bee”

Hackaday Space: Pixel Art Contest

During the Final Transmission — which I'll post about tomorrow — we decided to open up a creative area on the Minecraft server for people to build whatever they wanted as part of a Pixel Art contest. Today we announce the winners of that art challenge and assign their points so that we can draw the overall winner of the Final Transmission. Each winner gets additional points added to their score. The entries were judged by Hackaday alum [Caleb Kraft], since he hadn't been involved in the shenanigans up to this point, was considered unbiased, and has a well-developed set of art chops himself. So, here goes…

3rd place : Hack A Tardis

Third place goes to the Dr. Who box by [Marcus1297], entitled 'Hack a Tardis'. This is a great rendition of the TARDIS; while it's only two-dimensional, it has fine detail, and the beacon beam coming out of the top is a nice finishing touch. Excellent work! [Marcus1297] gets an additional 2.5k points added to his score.

2nd place : Nikola Tesla Memorial

Second place goes to [st3al2012] for his stunning Tesla coil, which he dedicated to Nikola Tesla. This one was picked because the "Art was exceptional". There's a lot of detail in there: not only did he build the main structure of the coil, complete with the toroidal ring, but he also showed the core components, including the spark gap generator, the capacitors, and even the AC outlet. It looks fantastic at night, too. Great job, [st3al2012], you get an extra 5k points on your scoreboard.

1st place : Portal Cube

First place goes to [XDjackieXD] for his quite amazing Portal Cube. [Caleb] declared this the winner, saying that the "Art and execution were exemplary". We have to agree: the cube looks fantastic, but best of all he went to the trouble of recreating "Want You Gone" (the ending song from the game) using note blocks positioned inside the cube. Lovely work, and he thoroughly deserves the 10k points he has received for this.

Congratulations to the winners, and thank you to everyone who contributed. The Minecraft server is still up, so if you want to take a look at all the art for yourself, connect to it at 'minecraft.hackaday.com'. We have put up the world and all the plugins used to build it here. I'll be releasing the source for the MatrixMiner plugin that I developed for the teleporter display when I get a chance to finish it.

Continue reading “Hackaday Space: Pixel Art Contest”

A Z80 Retro Microcomputer For The Papilio Pro FPGA Board

[Will] built a 128MHz Z80-based retro microcomputer that runs on a Papilio Pro board. For those who don't know, the latter is built around a Spartan-6 LX9 FPGA, so you can imagine how much work was required to implement all of the computer's features in VHDL. The T80 CPU core was taken from OpenCores and the SDRAM controller was imported from Mike Field's work, but [Will] implemented several additional functions on his own:

– a 4KB paged Memory Management Unit to translate the 16-bit (64KB) logical address space into a 26-bit (64MB) physical address space (a rough model of this translation is sketched after the list).

– a 16KB direct mapped cache to hide the SDRAM latency, using the FPGA internal block RAM

– a UART interface for external communications
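
To make the MMU item a bit more concrete, here is a toy Python model of what a 4KB-paged translation like this has to do: the top four bits of the 16-bit logical address select one of sixteen page registers, and that register supplies the upper bits of the 26-bit physical address. The register widths and table layout below are our own guesses for illustration, not taken from [Will]'s VHDL.

```python
# Toy model of a 4KB-paged MMU mapping a 16-bit logical address space
# onto a 26-bit physical one. Register widths and the frame-table layout
# are illustrative guesses, not taken from the actual VHDL.

PAGE_BITS = 12                      # 4KB pages
NUM_PAGES = 1 << (16 - PAGE_BITS)   # 16 logical pages of 4KB each

# One 14-bit physical frame number per logical page (26 - 12 = 14 bits).
frame_table = [0] * NUM_PAGES

def map_page(logical_page: int, physical_frame: int) -> None:
    """Point one 4KB logical page at a 4KB physical frame."""
    frame_table[logical_page & 0xF] = physical_frame & 0x3FFF

def translate(logical_addr: int) -> int:
    """Translate a 16-bit logical address into a 26-bit physical address."""
    page = (logical_addr >> PAGE_BITS) & 0xF
    offset = logical_addr & ((1 << PAGE_BITS) - 1)
    return (frame_table[page] << PAGE_BITS) | offset

# Example: map logical page 1 (0x1000-0x1FFF) to physical frame 0x0123.
map_page(0x1, 0x0123)
assert translate(0x1ABC) == 0x123ABC
```

In hardware this amounts to a small register file plus some bit concatenation, which is why a paged MMU like this fits comfortably alongside the CPU core in an FPGA.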

He also ported CP/M 2.2, MP/M II and UZI (a UNIX-like system) to the computer. The project is completely open source, and all the source code can be downloaded at the end of [Will]'s write-up.

Thanks [hamster] for the tip.

Eye Tracking With The Oculus Rift

There's a lot you can do with eye and gaze tracking when it comes to interface design, so when [Diako] got his hands on an Oculus Rift, there was really only one thing to do.

Like a few other solutions for eye tracking we’ve seen, [Diako] is using a small camera with the IR filter removed to read the shape and location of an eye’s pupil to determine where the user is looking. This did require cutting a small hole near one of the Oculus’ eye cups, but the internal camera works great.

To get a window to the world, as it were, [Diako] slapped another camera onto the front of the Oculus. Both cameras are fed into the same computer, the gaze tracking is overlaid on the image from the front of the headset, and right away the user has a visual indication of where they're looking.
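
We don't know exactly how [Diako]'s software is put together, but the general recipe is simple enough to sketch. The Python/OpenCV snippet below is a rough illustration, assuming a dark-pupil IR eye camera and a front-facing scene camera on arbitrary device indices; the thresholds and the naive linear eye-to-scene mapping are placeholders, since any real setup needs a per-user calibration step.

```python
# Rough sketch of the general idea: find the pupil in the IR eye-camera
# frame, then mark the corresponding spot on the front-facing camera frame.
# Camera indices, thresholds, and the naive linear mapping are assumptions,
# not details from [Diako]'s build. Requires OpenCV 4.
import cv2

eye_cam = cv2.VideoCapture(0)     # IR camera watching the pupil
scene_cam = cv2.VideoCapture(1)   # camera on the front of the headset

def find_pupil(gray):
    """Return (x, y) of the largest dark blob, assumed to be the pupil."""
    blur = cv2.GaussianBlur(gray, (7, 7), 0)
    _, mask = cv2.threshold(blur, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

while True:
    ok_eye, eye = eye_cam.read()
    ok_scene, scene = scene_cam.read()
    if not (ok_eye and ok_scene):
        break

    pupil = find_pupil(cv2.cvtColor(eye, cv2.COLOR_BGR2GRAY))
    if pupil is not None:
        # Naive mapping: scale the pupil position straight into scene coords.
        sx = int(pupil[0] * scene.shape[1] / eye.shape[1])
        sy = int(pupil[1] * scene.shape[0] / eye.shape[0])
        cv2.circle(scene, (sx, sy), 15, (0, 255, 0), 2)

    cv2.imshow("gaze overlay", scene)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

eye_cam.release()
scene_cam.release()
cv2.destroyAllWindows()
```

Real gaze trackers replace that last mapping with a calibration routine (stare at a grid of known points, fit a mapping to them), which is where most of the accuracy comes from.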

Yes, using a computer to know where you're looking may seem like a rather useless build, but stuff like this is used in research and in extraordinarily high-tech heads-up displays. Although he's not using the motion tracking on the Oculus, if [Diako] were to do so, he'd have the makings of one of the most powerful heads-up displays possible.

Continue reading “Eye Tracking With The Oculus Rift”

Hackaday Space: Transmission 3 Puzzles Explained

Yesterday we did a rundown of Transmission 2 as part of a series of posts covering the ARG that we ran throughout April. Today I'm going to reveal all the details of Transmission 3, how we put it together, and what the answers were.

In classic Hackaday fashion we hadn't planned any of this, so by this point all our initial ideas were already used up and we were running out of creativity, making it a real slog to get Transmission 3 out of the gate. However, we somehow managed it, and opened Transmission 3 by posting a series of five images of space telescopes:

Continue reading “Hackaday Space: Transmission 3 Puzzles Explained”

Airchat, The Wireless Mesh Network From Lulzlabs

With the lessons learned from the Egyptian, Libyan, and Syrian revolutions, a few hardware and software hackers over at Lulzlabs have taken it upon themselves to create a free-as-in-beer and free-as-in-speech digital communications protocol that doesn't rely on expensive, highly surveilled, commercial and government-controlled infrastructure. They call it Airchat, and it's an impressive piece of work if you don't care about silly things like 'laws'.

Before going any further, we have to say yes, this does use amateur radio bands, and yes, they’re using (optional) encryption, and no, the team behind Airchat isn’t complying with all FCC and other amateur radio rules and regulations. Officially, we have to say the FCC (and similar agencies in other countries) have been granted the power – by the people – to regulate the radio spectrum, and you really shouldn’t disobey them. Notice the phrasing in that last sentence, and draw your own philosophical conclusions.

Airchat uses an off-the-shelf amateur transceiver (a Yaesu 897D in the example video below, although a $30 Chinese handheld radio will do) to create a mesh network between Airchat users running the same software. The protocol is based on the Lulzpacket, a small bundle of information that gives the message error correction and a random code to identify the packet. Each node in this mesh network is defined by its ability to decrypt messages. There's no hardware ID and no plain-text transmitter identification. It's the mesh network you want if you're under the thumb of an oppressive government.
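
The actual packet format lives in the Airchat source; as a purely illustrative stand-in, here is the shape of the idea in Python: a random identifier plus an error check wrapped around the payload. None of these field names come from the project, and the simple CRC used here only detects corruption, whereas real forward error correction can also repair it.

```python
# Purely illustrative stand-in for the idea behind the Lulzpacket:
# a random identifier plus an error check wrapped around the payload.
# Field names and the JSON framing are not taken from the Airchat source.
import json
import os
import zlib

def make_packet(payload: bytes) -> bytes:
    packet = {
        "id": os.urandom(8).hex(),    # random code identifying the packet
        "crc": zlib.crc32(payload),   # error *detection* stand-in, not FEC
        "data": payload.hex(),
    }
    return json.dumps(packet).encode()

def parse_packet(raw: bytes):
    packet = json.loads(raw)
    payload = bytes.fromhex(packet["data"])
    if zlib.crc32(payload) != packet["crc"]:
        raise ValueError("corrupted packet")  # real FEC could repair instead
    return packet["id"], payload

pkt = make_packet(b"CQ CQ de anonymous node")
packet_id, message = parse_packet(pkt)
```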

Airchat has already been used to play chess with people 180 miles away, control a 3D printer from over 80 miles away, and share pictures and voice chats. It's still a proof of concept, and the example use cases – NGOs working in Africa, disaster response, and expedition base camps – are noble enough not to dismiss this entirely.

Continue reading “Airchat, The Wireless Mesh Network From Lulzlabs”