Microsoft Bob was revolutionary. Normally you’d hear a phrase like that coming from an idiot blogger, but in this case a good argument could be made. Bob threw away the ‘files’ and ‘folders’ paradigm for the very beginnings of virtual reality. The word processor was just sitting down at a desk and writing a letter. Your Rolodex was a Rolodex. All abstractions were removed, bringing you closer than ever to living in your computer. If Microsoft Bob were released today, with multiple users interacting with each other in a virtual environment, it would be too far ahead of its time. It would be William Gibson’s most visible heir, instead of Melinda Gates’ only failure. Imagine a cyberpunk world that isn’t a dystopia, and your mind will turn to Microsoft Bob.
Not everyone will laugh at the above paragraph. Indeed, some people are trying to make the idea of a gigantic, virtual, 3D space populated by real people a reality. For the last few years, [alusion] has been working on Metaverse Lab as an experiment in 3D scanning, virtual web browsers, and turning interconnected 3D spaces into habitats for technonauts. The name comes from Snow Crash, and over the past twenty years, a number of projects have popped up to replicate this convergence of the digital and physical. By integrating this idea with the latest VR headsets, Metaverse Lab is the closest thing I’ve ever seen to the dream of awesome 80s sci-fi.
I’ve actually had the experience of using and interacting with Metaverse Lab on a few occasions. The only way to describe it is as the Internet someone would imagine if their only exposure to technology were the 1992 film The Lawnmower Man. It works, though, as a completely virtual environment where the potential is apparent, and the human mind is not limited by its physical embodiment.
Looking to add some activity to your day but don’t want to go through a lot of effort? [D10D3] has the perfect solution that enables you to take a leisurely bike ride through Skyrim. A standing bicycle combines with an HTC Vive (using the add-on driver VorpX, which allows non-VR games to be played with a VR headset) and a Makey Makey board to make slack-xercise — that’s a word now — part of your daily gaming regimen.
The Makey Makey is the backbone of the rig; it allows the user to set up their own inputs with electrical contacts that correspond to keyboard and mouse inputs, thereby allowing one to play a video game in some potentially unorthodox ways — in this case, riding a bicycle.
Setting up a couple of buttons to control the Dragonborn proved to be a simple process. Buttons controlling some of the main inputs were plugged into a breadboard circuit, which was then connected to the Makey Makey along with the ground wires using jumpers. As a neat addition, some aluminium foil on the handlebars served as excellent contacts for the look-left and look-right inputs. That proved to be a disorienting addition, considering the Vive’s head tracking also moves the camera.
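All the Makey Makey really does is turn closed contacts into ordinary keystrokes, so the whole rig boils down to a contact-to-key table. Here is a hypothetical sketch of such a mapping in Python; the contact names and key bindings are illustrative, not [D10D3]’s actual setup:

```python
# Hypothetical mapping from Makey Makey contacts to Skyrim inputs;
# the real board simply emits these as USB keyboard events.
CONTACT_MAP = {
    "pedal_switch":    "w",      # pedalling closes a contact = walk forward
    "left_handlebar":  "a",      # foil contact = turn left
    "right_handlebar": "d",      # foil contact = turn right
    "brake_button":    "space",  # any other bound action
}

def resolve(closed_contacts):
    """Return the keystrokes the game would receive for a set of
    currently closed contacts, in the order they were closed."""
    return [CONTACT_MAP[c] for c in closed_contacts if c in CONTACT_MAP]
```

On the real hardware this table is set up by wiring each contact to the corresponding Makey Makey input pad; no code runs on the PC at all.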
The HTC Vive is the clear winner of the oncoming VR war, and is ready to enter the hallowed halls of beloved consumer electronics behind the Apple Watch, Smart Home devices, the 3Com Audrey, and Microsoft’s MSN TV. This means there’s going to be a lot of Vives on the secondhand market very soon, opening the doors to some interesting repurposing of some very cool hardware.
The Vive’s Lighthouse is an exceptionally cool piece of tech: a bank of LEDs and a pair of scanning IR laser diodes that let the Vive sense its own position and orientation. It works by alternately flashing the LEDs as a sync pulse and sweeping a laser across the room, left to right and then top to bottom. From two Lighthouses you can measure three angles: the horizontal and vertical angles to the first Lighthouse, and the horizontal angle to the second. That’s all you need to locate the Vive in 3D space.
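Those three angles pin down a unique point. Here is a minimal Python sketch of the triangulation, assuming two idealized Lighthouses sharing an x axis and facing the same direction along +z; this is a toy model for intuition, not Valve’s actual solver:

```python
import math

def triangulate(h1, v1, h2, baseline):
    """Recover a 3D point from Lighthouse sweep angles (radians).

    h1, v1 -- horizontal and vertical angles seen from Lighthouse 1,
              which sits at the origin facing +z.
    h2     -- horizontal angle seen from Lighthouse 2, which sits at
              (baseline, 0, 0) facing the same direction.
    """
    # tan(h1) = x / z and tan(h2) = (x - baseline) / z; subtracting
    # the two relations lets us solve for depth z first.
    z = baseline / (math.tan(h1) - math.tan(h2))
    x = z * math.tan(h1)
    y = z * math.tan(v1)
    return x, y, z
```

Once a point can be recovered for one sensor, orientation follows from solving the same problem for several sensors at known positions on the tracked object.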
To get a simple microcontroller to do the same trick, [Trammell] is using a fast phototransistor with a 120° field of view. This setup only works out to about a meter away from the Lighthouses, but that’s enough for testing.
[Trammell] is working on a Lighthouse library for the Arduino and ESP8266, and so far, everything works. He’s able to get the angle of a breadboard to a Lighthouse with just a little bit of code. This is a great enabling build that is going to allow a lot of people to build some very cool stuff, and we can’t wait to see what happens next.
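A sensor never measures an angle directly; it measures time. The delay between the sync flash and the moment the sweeping laser crosses the phototransistor maps linearly onto the sweep angle. Here is a hedged sketch of that conversion, assuming the commonly cited 60 rev/s rotor speed (written in Python for clarity; [Trammell]’s actual library is Arduino/ESP8266 code):

```python
import math

ROTOR_HZ = 60.0  # assumed Lighthouse sweep motor speed, revolutions/second

def sweep_angle(t_sync_us, t_hit_us):
    """Convert the delay between a Lighthouse sync flash and the laser
    crossing the sensor (both in microseconds) into a sweep angle in
    radians. One full rotor revolution corresponds to 2*pi radians."""
    dt = (t_hit_us - t_sync_us) * 1e-6  # delay in seconds
    return 2.0 * math.pi * ROTOR_HZ * dt
```

On a microcontroller the two timestamps would come from pin-change interrupts on the phototransistor output, captured with a microsecond timer.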
Just in case anyone secretly had the idea that Valve Software’s VR and other hardware somehow sprang fully-formed from a lab, here are some great photos and video of early prototypes, and interviews with the people who made them. Some of the hardware is quite raw-looking, some of it is recognizable, and some are from directions that were explored but went nowhere, but it’s all fascinating.
The accompanying video (embedded below) has some great background and stories about the research process, which began with a mandate to explore the concepts of AR and VR and determine what could be done and what was holding things back.
One good peek into this process is the piece of hardware shown to the left. You look into the lens end like a little telescope. It has a projector that beams an image directly into your eye, and it has camera-based tracking that updates that image extremely quickly.
The result is a device that lets you look through a little window into a completely different world. In the video (2:16) one of the developers says “It really taught us just how important tracking was. No matter [how you moved] it was essentially perfect. It was really the first glimpse we had into what could be achieved if you had very low persistence displays, and very good tracking.” That set the direction for the research that followed.
Troy, New York’s Tech Valley Center of Gravity is following up their January IoT Hackathon with another installment. The April 16-17 event promises to be a doozy, and anyone close to the area with even a passing interest in gaming and AR/VR should really make an effort to be there.
Not content to just be a caffeine-fueled creative burst, TVCoG is raising the bar in a couple ways. First, they’re teaming up with some corporate sponsors with a strong presence in the VR and AR fields. Daydream.io, a new company based in the same building as the CoG, is contributing a bunch of its Daydream.VR smartphone headsets to hackathon attendees, as well as mentors to get your project up and running. Other sponsors include 1st Playable Productions and Vicarious Visions, game studios both located in the Troy area. And to draw in the hardcore game programmers, a concurrent Ludum Dare game jam will be run by the Tech Valley Game Space, with interaction and collaboration between the AR/VR hackers and the programmers encouraged. Teams will compete for $1000 in prizes and other giveaways.
This sounds like it’s going to be an amazing chance to hack, to collaborate, and to make connections in the growing AR/VR field. And did we mention the food? There was a ton of it last time, so much they were begging us to take it home on Sunday night. Go, hack, create, mingle, and eat. TVCoG knows how to hackathon, and you won’t be disappointed.
Thanks to [Duncan Crary] for the heads up on this.
When [Cassidy and Chad Lexcen]’s twin daughters were born in August, smaller twin [Teegan] was clearly in trouble. Diagnostics at the Minnesota hospital confirmed that she had been born with only one lung and half a heart. [Teegan]’s parents went home and prepared for the inevitable, but after two months, she was still alive. [Cassidy and Chad] started looking for second opinions, and after a few false starts, [Teegan]’s scans ended up at Miami’s Nicklaus Children’s Hospital, where the cardiac team looked them over. They ordered a 3D print of the scans to help visualize possible surgical fixes, but the 3D printer broke.
Not giving up, they threw [Teegan]’s scans into Sketchfab, slapped an iPhone into a Google Cardboard that one of the docs had been playing with in his office, and were able to see a surgical solution to [Teegan]’s problem. Not only was Cardboard able to make up for the wonky 3D printer, it was able to surpass it: the 3D print would only have been of the heart itself, while the VR images showed the heart in the context of the rest of the thoracic cavity. [Dr. Redmond Burke] and his team were able to fix [Teegan]’s heart in early December, and she should be able to go home in a few weeks to join her sister [Riley] and make a complete recovery.
We love the effect that creative use of technology can have on our lives. We’ve already seen a husband use the same Sketchfab tool to find a neurologist who could remove his wife’s brain tumor. This is a great example of doctors doing what it takes to better leverage the data at their disposal when making important decisions.
Modern DSLR cameras are amazing devices. Mechanics, electronics, and optics, all rolled up in a single package. All that technology is great, but it can make for a frustrating experience when attempting any sort of repair. Lenses can be especially difficult to work on. One misalignment of a lens group or element can lead to a fuzzy image.
[Kratz] knew all this, but it didn’t stop him from looking for a cheap lens deal over on eBay. He found a broken Nikon DSLR 55-200mm 1:4-5.6 AF-S VR camera lens for $30. This particular lens is relatively cheap – you can pick up a new one for around $150 online. Spending $30 to save $120 is a bit of a gamble, but [Kratz] went for it.
The lens he bought mostly worked: the auto-focus and vibration reduction systems seemed to be fine. The aperture blades, however, were stuck closed. The aperture blades form the iris of a lens, and with them stuck closed, the lens was limited to only the most brightly lit situations. All was not lost, though, as the aperture is a relatively simple mechanical system, which hopefully would be easy to repair.
Keeping screws and various parts in order is key when taking apart a lens. [Kratz] used a tip he learned right here on Hackaday: He drew a diagram of the screw positions on a thick piece of paper. He then stuck each screw right into the paper in its proper position.
Carefully removing each part, [Kratz] found that a pin had slipped out of the rod connecting the lens’ internal parts with the external aperture control arm. Fixing the pin was simple. Getting the lens back together was quite a bit harder, as several parts have to be aligned blindly. [Kratz] persevered, and eventually everything slipped into alignment. The finished lens works fine, albeit with a slightly noisy auto-focus.
It’s worth noting that there are service and repair manuals for many cameras and lenses out there in the dark corners of the internet, including [Kratz]’s 55-200 lens. Reading the repair procedures Nikon techs use shows just how many tools, fixtures, and custom bits of software go into making one of these lenses work.