Meet Jimmy: An Open Source Biped Robot From Intel


Intel’s CEO [Brian Krzanich] stopped by the Re/Code conference to announce Jimmy, the first robot from the 21st Century Robot project. The project is the brainchild of [Brian David Johnson], Intel’s resident futurist. We love the project’s manifesto:

 Robot Is: Imagined first. Easy to build. Completely open source. Fiercely social. Intentionally iterative. Filled with humanity and dreams. Thinking for her/him/itself.

Jimmy may not be all those things yet, but he definitely is exciting. For starters, he wasn’t built in some secret lab at Intel HQ. Much of Jimmy’s construction took place at Trossen Robotics, a name well known to Hackaday. [Matt] and [Andrew] at Trossen describe all the details in their video down past the break.

This version of Jimmy is a research robot, which means he’s not going to come cheap. Jimmy sports an Intel i5 NUC motherboard, 20 Dynamixel servos, a 5052 aluminum frame and a host of sensors. A 4S 14.8v 4000mAh LiPo battery will power Jimmy for 30 to 60 minutes between charges, so be sure to budget for a few spare packs. The most striking aspect of Jimmy is his 3D printed shell. The 21st Century Robot Project gave him large, friendly eyes and features, which will definitely help with the social aspect of their goals.

Jimmy is all about open source. He can run two flavors of Linux: Ubuntu 14.04 LTS or a custom version of Yocto Poky. There is a lot to be said for running and developing on the same hardware: no specialized toolchains for cross compiling, no NFS shares to move binaries around. If you need to make a change, you can plug in a monitor (or launch a VNC session) and do everything on Jimmy’s on-board computer. Jimmy’s software stack is based upon the DARwIn-OP platform, and a ROS port is in the works.

We’re excited about Jimmy, but at $16,000 USD, he’s a bit outside our budget. Thankfully a smaller consumer version of Jimmy will soon be available for around 1/10th the cost.


Intel Edison: A Desktop From 1998 In An SD Card

According to the barrage of press releases hitting the Hackaday tip line, the Consumer Electronics Show is upon us with announcements of amazing new technologies such as jackets with a cell phone pocket, alarm clocks with Bluetooth, and iPhone cases with a kickstand. What an age to live in.

Among the more interesting announcements at CES is the Intel Edison, a tiny device that combines a dual core Intel SoC with ‘a Pentium instruction set’, WiFi and Bluetooth adapter, and some amount of storage into an SD card form factor. Apart from that, little else is known about the Intel Edison and the only other primary source for this announcement appears to be Intel CEO [Brian Krzanich]’s CES keynote address.

The Edison will be able to run Linux, ‘other operating systems’, and will support Wolfram, the Mathematica-esque programming language where everything is a data type. Edison will also have an app store. Because that’s a thing now, apparently.

If you can’t wait for Edison to be released sometime in the middle of 2014, we’d suggest you check out the Intel Galileo. It’s an Arduino-compatible board based on the same Quark SoC found in the Edison, but in a significantly more convenient form factor. The Galileo doesn’t have on-board WiFi or Bluetooth, but at least you don’t have to wait for the Edison’s release or sort out a purpose-built breakout board for whatever application you’re thinking of.

The Intel-powered Arduino

Dev boards based on microcontrollers and ARM SoCs are everywhere, but a small, pocketable computer based on an Intel processor has been hard to come by. [Massimo] of Arduino just unveiled a new Intel-architecture, Arduino-compatible board at the Rome Maker Faire. It’s called the Galileo, and it has everything you’d expect from a juiced-up Arduino running x86.

The main chip is an Intel Quark SoC running at 400MHz with 256 MB of DRAM. On board is a Mini-PCIe slot, 100Mb Ethernet port, Micro SD slot, RS-232, and USB host and client ports. Here’s the datasheet for the Galileo with all the applicable information.

The Galileo can be programmed with the standard Arduino IDE, but from the getting started guide, it looks like this board is running Yocto, a stripped down Linux for embedded environments.

Realistically, what we have here is a board with about the same processing power as a Raspberry Pi, but with Arduino compatibility, and a Mini PCIe port for some really fun stuff. It will be interesting to see what can be made with this board, but if you have any ideas on what to do with a Galileo before it’s released in two months, drop a note in the comments.

Hackaday Links: July 25, 2012

Ever wonder what CPU dev boards look like?

In the realm of highly confidential hardware, it doesn’t get much more secret than upcoming CPUs coming out of Intel. Somehow, a few CPU dev boards wound up on eBay, and [Leon] was cool enough to save all the pictures (Polish, Google translation, or translate in the sidebar). There are a few ongoing auctions right now, but we’d settle for this LGA 1156 breakout board. So cool.

No, we’re not linking directly to the free stuff

TI is giving away a brushless motor controller powered by a Stellaris ARM processor. [Chris] says he’s ordering one to figure out how to make a Stellaris dev board out of the giveaway. This controller is designed for e-bikes, so at the very least we see a few ginormous UAVs in someone’s future.

More rocket stuff!

One of [Bill]’s older hacks was taking a CVS disposable digital camera (remember those?) and stuffing it into the nose cone of an Estes D-powered rocket. There are a ton of videos of the flights [Bill] put up on YouTube.

On another note, [CyberPunk] built a half-scale model of a swing-wing rocket launched glider (pics: 1, 2, 3, 4). He’s currently building the full-size version capable of carrying RC and video gear and wants some feedback.

So, CAD on a tablet?

[spuder] caught wind of a tablet-based engineering notebook a few people are working on. They’re looking for some feedback on their demo video. We think it’s cool – especially the ability to share stuff between devices – but CAD on a tablet makes us extremely skeptical. Tell them what you think; we’d love to see this make it to our phone.

Now if they only made one for editing WordPress posts….

Test-driven development just got cooler. Here’s a Tamagotchi for Eclipse that you ‘feed’ by going from red to green and refactoring your code. Be careful, because letting the same test come up red twice will kill your little code ninja.

And now I’ll rant about you.

A few days ago, I posted [Becky Stern]’s light-up handlebars project, and one comment surprised me. Who says guys can’t sew? It’s time to confront the gender roles that show up whenever sewing is used in a project. I’m doing a tutorial on how to sew a parachute, but I need your help. It’ll be a two-parter: one on how to actually use a sewing machine, and another for how to make a ‘chute. Is there anything else you’d like to see?

Binary Division When Your Processor Lacks Hardware Division

[Hamster] wanted to take a look at division operations when the chip you’re using doesn’t have a divide instruction. He makes the point that the divide instruction takes a lot of space on the die, and that’s why it’s sometimes excluded from a chip’s instruction set. For instance, he tells us the ARM processor used on the Raspberry Pi doesn’t have a divide instruction.

Without hardware division you’re left to implement a binary division algorithm. Eventually [Hamster] plans to do this in an FPGA, but started researching the project by comparing division algorithms in C on an AMD processor.
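
If you’ve never written one, the textbook approach is a shift-and-subtract loop. Here’s a minimal C sketch for unsigned 16-bit operands (a generic illustration of the algorithm, not [Hamster]’s actual test code):

```c
#include <stdint.h>
#include <stdio.h>

/* Classic shift-and-subtract binary division for unsigned 16-bit operands.
 * Walk the dividend's bits from MSB to LSB, shifting each one into a running
 * remainder and subtracting the divisor whenever it fits. Divisor must be
 * non-zero. */
static uint16_t div16(uint16_t dividend, uint16_t divisor, uint16_t *remainder)
{
    uint16_t quotient = 0;
    uint32_t rem = 0;

    for (int i = 15; i >= 0; i--) {
        rem = (rem << 1) | ((dividend >> i) & 1);  /* bring down the next bit */
        if (rem >= divisor) {
            rem -= divisor;
            quotient |= (uint16_t)(1u << i);       /* this quotient bit is 1 */
        }
    }
    *remainder = (uint16_t)rem;
    return quotient;
}

int main(void)
{
    uint16_t rem;
    uint16_t q = div16(50000, 7, &rem);
    printf("50000 / 7 = %u remainder %u\n", (unsigned)q, (unsigned)rem); /* 7142 r 6 */
    return 0;
}
```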

His test uses all 16-bit possibilities for dividend and divisor. He was shocked to find that binary division doesn’t take much longer than using the hardware instruction for the same tests. A bit of poking around in his code and he manages to beat the AMD hardware divide instruction by 175%. When testing with an Intel chip, the hardware beats his code by about 62%.

He’s got some theories on why he’s seeing these performance differences which we’ll let you check out on your own.

HDCP Falls To FPGA-based Man-in-the-middle Attack


It’s been a little while since we talked about HDCP around here, but recent developments in the area of digital content protection are proving very interesting.

You might remember that the Master Key for HDCP encryption was leaked last year, just a short while after Intel said that the protection had been cracked. While Intel admitted that HDCP had been broken, they shrugged off any suggestions that the information could be used to intercept HDCP data streams since they claimed a purpose-built processor would be required to do so. Citing that the process of creating such a component would be extremely cost-prohibitive, Intel hoped to quash interest in the subject, but things didn’t work out quite how they planned.

It seems that researchers in Germany have devised a way to build such a processor on an extremely reasonable budget. To achieve HDCP decryption on the fly, the researchers used a standard off the shelf Digilent Atlys Spartan-6 FPGA development board, which comes complete with HDMI input/output ports for easy access to the video stream in question. While not as cheap as this HDCP workaround we covered a few years ago, their solution should prove to be far more flexible than hard wiring an HDMI cable to your television’s mainboard.

The team claims that while their man-in-the-middle attack is effective and undetectable, it will be of little practical use to pirates. While we are aware that HDMI data streams generate a ton of data, this sort of talking in absolutes makes us laugh, as it often seems to backfire in the long run.

[via Tom’s Hardware]

Intel’s New Way Of Creating Randomness From Digital Orderliness

Random number generation is a frequent topic of discussion in projects that involve encryption and security. Intel has just announced a new feature, coming to many of their processors, that affects random number generation.

The random number generator, which they call Bull Mountain, marks a departure from Intel’s traditional method of generating random number seeds from analog hardware. Bull Mountain relies on all-digital hardware, pitting two inverters against each other and letting thermal noise tip the hand in one direction or the other. The system is monitored at several steps along the way, tuning the hardware to ensure that the random digits are not falling more frequently in one direction or the other. Pairs of 256-bit sequences are then run through a mathematical process to further offset the chance of predictability before they are used as a pseudorandom number seed. Why go through all of this? Transitioning to an all-digital process makes it easier and cheaper to reduce the size of microchips.

A new instruction has been added to access this hardware module: RdRand. If it works as promised, this should remove the need for elaborate external hardware as a random number source.
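
For the curious, here’s roughly how user code reaches the new instruction once toolchains catch up, using the compiler intrinsic rather than hand-rolled assembly. This is a quick sketch, assuming a GCC/Clang build with -mrdrnd and a CPU that actually reports the RDRAND feature:

```c
#include <immintrin.h>  /* _rdrand32_step(); build with -mrdrnd on GCC/Clang */
#include <stdio.h>

/* Pull one 32-bit value from the on-chip DRNG. RDRAND can transiently report
 * "no data ready" (the intrinsic returns 0), so the usual advice is to retry
 * a bounded number of times. Production code should also check the RDRAND
 * CPUID feature bit before calling this. */
static int hw_rand32(unsigned int *out)
{
    for (int tries = 0; tries < 10; tries++) {
        if (_rdrand32_step(out))
            return 1;   /* success: *out now holds a fresh random value */
    }
    return 0;           /* hardware never had entropy ready */
}

int main(void)
{
    unsigned int r;
    if (hw_rand32(&r))
        printf("RdRand says: 0x%08x\n", r);
    else
        printf("RdRand unavailable\n");
    return 0;
}
```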

[via Reddit]