CastAR Goes Live On Kickstarter

[Jeri, Rick and the Technical Illusions crew] have taken the castAR to Kickstarter. We’ve covered castAR a couple of times in the past, but the Kickstarter includes a few new features just ripe for hacking. First, castAR is no longer confined to a retro-reflective surface. In fact, it’s no longer confined to augmented reality at all: an optional clip-on adapter converts castAR into a “free” augmented reality system or a full virtual reality system.

[Jeri] has also posted a video on her YouTube channel detailing the entire saga of castAR’s development (embedded after the jump). The video has a real “heart to heart” feel to it, and is definitely worth watching. The story starts with the early days (and late nights) [Rick] and [Jeri] spent at Valve. She goes through the split with Valve and how the two set up a lab in [Rick’s] living room. [Jeri] also outlines some of the technical aspects of the system. She explains how the optics have been reduced from several pounds of projectors to the mere ounces we see today.

Another surprise addition is the campaign’s lower-tier rewards. The castAR tracking system is offered on its own; the campaign page says it can be mounted to anything from robots to other VR headsets, so the possibilities for hacking are almost endless. We’re curious about setting up our own swarm of quadcopters similar to the UPenn GRASP Lab’s. The RFID tracking grid is also offered as a separate option. In the gaming system this will be used for tracking tabletop game pieces. Based upon the Kickstarter page, it sounds as if the grid will use not only RFID but also a camera-based tracking system. We’re definitely curious what possibilities this will hold.

As of this writing, the castAR Kickstarter campaign is already well past the halfway mark on its way to a $400,000 USD goal.

http://www.youtube.com/watch?v=cc2NQVQK69A

53 thoughts on “CastAR Goes Live On Kickstarter”

      1. Meh, not enough room on the FPGA for the two 6502s needed to replicate the functionality of both the C64 CPU and the 1541 controller. Ellsworth’s contribution to the C64 scene remains substantial; she was 100% right to walk rather than spend years fitting a round peg into a square hole. If it were me, I would have implemented .64 tape emulation for an authentic C64 experience and walked. Ellsworth actually refined the design to an unprecedented level. The way Schönfeld talks, it’s as if he believes the C64 will one day transcend Moore’s law by some clever assembly hack and reclaim its rightful place as the greatest personal computer ever made.

        People talk about kids being fanboys about their PlayStations and Xboxes, but they have seen nothing compared to Commodore sceners.

        Anyway it all worked out in the end. The 1541 Ultimate and similar projects continued the work.

  1. I already pledged for the 4-player setup. Going back later and upgrading to the VR add-on for all 4 pairs, and probably adding the RFID tracking. They have a calculator to work out the cost of add-ons, and even a fully upgraded set of 4 comes out to less than one set of Google Glass (which I would never buy due to privacy concerns).

    1. With the assistance of the reward calculator you can get additional items, such as a tracker by itself, which “maybe” could be used with the Rift for head positioning. So if castAR tracking does work with the Rift, then it’s more than just comparing the two, it’s combining them.

          1. That’s just what shipping and insurance costs now. The retroreflective screen is quite heavy, and there’s no way on KS to figure out your exact cost, so every KS just picks a number where it all evens out.
            But if you want, you can have the castAR shipped to a friend in the USA and get it when you visit. :)

  2. I’m grabbing the $395 package. Still waiting on my muOptics thermal imaging camera at Indiegogo (was supposed to be ready by April…it’s October now…) I knew their target date was too early (developer’s deadline curse) but I absolutely trust any crew Jeri Ellsworth is on.

    I’ve been working with the Qualcomm Vuforia API on the Unity3D engine using my Galaxy S3 as a testing device for about 6 months now, and the potential is amazing. I just wonder whether or not consumers will be able to adapt to this new cyborg upgrade.

    I only hope I can get my hands on these glasses before the big dogs release theirs!

    1. I second the waiting-forever part. I’m a former backer of that project. I jumped ship and backed the Omate TrueSmart watch. Night and day difference between those projects. So glad I left. The delay isn’t why I left, though. It was the total lack of substantial updates. They can’t even post their useless updates on a weekly basis.

    2. Get a mU refund now if you can (they are getting slower and slower at refunds, probably close to running).

      The mu camera, with 99.9% certainty, does not exist. They have never produced any credible evidence of its existence: there has never been a thermal image released, never a single photo of the “nearly ready” unit or even a prototype, and all their updates are ridiculous hand-waving to make it sound like they are “really close” while contradicting themselves at every turn.

      Get out if you can; otherwise your money is gone.

      Read this thread.
      http://www.eevblog.com/forum/crowd-funded-projects/m-thermal-imager-real-or-fake/

      Read the backer comments in the IGG campaign.

      GET OUT.

    3. They’re quite late, and I get their email updates maybe once every two weeks, but I still have some confidence in the project. I’ve lost more than $150 for even stupider risks, so I’m not going to cry about it if John McGrath rips me off…I’ll just make his life difficult. ;)

      Here’s from the latest update (omitted the hand-waving apology paragraph…)

      “Right now, our hardware design is complete, and we are working on our image processing software. This is without a doubt the most repetitive and tedious portion of the project. Constant changes to the imaging filter and Non-Uniformity Correction filter in order to produce a truly high quality image. Looking back through the updates, I realize that I may not have made it clear that we have been getting image data from the sensor for quite some time now, and that now we are only clearing up that raw data and saving it as video.

      As soon as we have finished the firmware, we will cut the checks and give the manufacturers the go ahead! We can’t wait for everyone to start using the camera.”

  3. Neat project, but for commercial or everyday use most people will not be pleased with the results. The pico projectors just cannot produce the light needed for what most would consider even a decent picture, IMHO. That is probably the biggest deal breaker for most, I believe. It is a neat project, but not ready for primetime.

    I will say this: I hope I’m wrong and it is better than I expect or have seen in the prototypes.

    1. I believe it is the retroreflective surface that allows the low-lux pico projectors to shine in this application.

      I’m curious what material they’re using. If it is possible to use vinyl reflectors, this could be an interesting way to make a cheap VR cave.

  4. I can’t believe Valve passed on this idea. Especially when they already had it named Headcrab.

    I suggest calling all HMD’s headcrabs, so it becomes common usage before Valve realizes what they missed.

  5. Fair enough, but all this stuff still isn’t fast enough to track and project images without small but annoying delays and offsets (this mainly applies to augmented reality with AR markers). So cheap high-speed cameras could probably be a solution here.

        1. For good smooth overshoot-free transient tracking, you need a much higher sample rate / frame rate than you think you need.

          Look at optical mice, they might report position to the computer at 120 Hz, but they’ll often actually sample closer to 3000 Hz.

          The reason is that eliminating noise requires filtering across time, and doing so always increases your response time, so the only way to get a clean signal and a quick response is to have a stupidly fast sample rate. Merely 2x as fast is not good enough; you need closer to 100x to 1000x.

          You can only get away with 2x as fast (Nyquist sampling theorem) if your information is encoded purely in the frequency domain. When it is transient information (time domain!), you just can’t get around the requirement for more data to average across – and worse, the only filtering you can use must not exhibit any group-delay unsmoothness, otherwise time precision goes out the window.

          In other words, the time impulse response of your filtering kernel must be Gaussian – this is very obvious when filtering static 2D images to reduce 2D spatial noise, but less obvious when filtering along the time domain, yet just as inescapable. Gaussian kernels have poor frequency-domain performance, which raises yet more headaches if significant signal energy is possible above 0.5 fs, since you get aliasing. To see what that does, take one of the early optical mice and move it very suddenly – move too fast and the sensor will decide you’ve moved it in a completely random direction.

          Thankfully Gaussian filtering is really easy to approximate – daisy-chain about 3 or 4 moving-average filters and you’re there – but you still need to keep your sample rate high. For the love of god don’t decimate it!
          Truncate the high-order bits instead, because proper Gaussian filtering will prevent large sample-to-sample deltas, so you can take the derivative and then truncate with absolutely no loss of information (like a CIC filter, minus the decimation in the middle).

          Good transient performance of just one voltage is hard to achieve, let alone high speed video. The only thing in video’s favour is that the frame integration time allows for some integral time filtering, in a synced moving average sense. Sadly I don’t think a fast video camera chip exists that does gaussian time filtering / derivative encoding at the per pixel level – that’d be perfect for this sort of rapid & accurate motion measurement job.
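The daisy-chained moving-average trick described above is easy to demonstrate. Here is a minimal sketch (assuming NumPy; the filter widths and pass counts are illustrative, not from the comment): cascading a few boxcar filters yields an impulse response that is nearly Gaussian, by the central limit theorem applied to the kernel.

```python
import numpy as np

def moving_average(x, n):
    # simple boxcar (moving-average) filter of odd width n
    kernel = np.ones(n) / n
    return np.convolve(x, kernel, mode="same")

def approx_gaussian(x, n=9, passes=4):
    # daisy-chain several boxcar filters: the cascaded impulse
    # response converges toward a Gaussian shape
    for _ in range(passes):
        x = moving_average(x, n)
    return x

# feed a unit impulse through the cascade to see the effective kernel
impulse = np.zeros(101)
impulse[50] = 1.0
response = approx_gaussian(impulse)
# 'response' is a symmetric, bell-shaped curve centered on sample 50
```

Note that the cascade only shapes the kernel; as the comment stresses, you still keep the full sample rate rather than decimating.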

          1. Regarding the example about optical mice, it’s not just about filtering; it’s also about maximum velocity and the amount of surface area the little low-resolution camera views. I once did some calculations based on the maximum velocity and resolution specs in the datasheet of one optical mouse sensor and found that it had to be processing at something close to ten thousand fps.

            That optical mouse scenario is not particularly comparable to castAR, though, because that is relative movement tracking, whereas castAR tracks absolute position with respect to the projection surface.

          2. Yes, I understand that the high frame rate is used to average over a large number of samples and thereby reduce the possible error. But in my opinion that is not the most rational way to use a high FPS: it would be better spent smoothly updating the augmented reality display as the observer moves or turns their head. And why not use a higher-resolution sensor to reduce the error? If you increase the sensor resolution (number of pixels) by 4 times, accuracy grows and the error shrinks: quadrupling the resolution halves the error, and a 16x resolution increase improves accuracy 4 times.

          3. * For example, if the resolution is increased 4 times, the error decreases by half. And if the resolution is increased 16 times, accuracy grows by 4 times. *

            Sorry, automatic translation :)
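The square-root scaling claimed in the comments above (4x the samples halves the error, 16x quarters it) is standard statistics and can be checked numerically. This is a minimal sketch, assuming Gaussian sensor noise; the noise level and trial counts are made-up illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 1.0
noise_sigma = 0.1  # hypothetical per-sample noise level

def std_error(n_samples, trials=20000):
    # estimate the spread (standard error) of the mean of
    # n_samples independent noisy measurements
    samples = true_value + noise_sigma * rng.standard_normal((trials, n_samples))
    return samples.mean(axis=1).std()

e1 = std_error(1)
e4 = std_error(4)    # 4x the samples  -> roughly half the error
e16 = std_error(16)  # 16x the samples -> roughly a quarter of the error
```

The same 1/sqrt(N) law applies whether the extra samples come from a faster frame rate or from more pixels averaged over, which is why both commenters arrive at the same 4x-resolution-halves-the-error figure.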

  6. I have no doubt it’s very cool, but is it actually useful besides playing board games in a dark room?
    I mean, the amount of light generated is most likely very, very little…
    and what’s the resolution of that stuff?
    and ouch, $80 USD shipping for non-US… and add a nice shot of customs duties on top…

    1. The light is plenty. The innovation is that the generated light all gets reflected straight back into your eyes with the retroreflector. They’re deliberately underdriving the projectors as the image is already bright enough. As for the resolution, each eye gets a 720p projector.

      And besides board games? It’ll run Skyrim, TF2, L4D, and most Unity games in nausea-free augmented reality 3D. I’m looking forward to being able to whip my head around to see someone sneaking up on me in Skyrim. If they hit 700K, Rick is writing a video player, so you’ve effectively got your own home cinema as large as you want to make the retroreflective screen. Then there’s the potential for viewing 3D-printed *.STL files, medical imaging applications, flight sims, etc.

  7. Is there anyone else confused about what they call “AR”? I mean, they use projectors: The viewer can only view things within the castAR surface. It’s not like 3D objects “popup” on the table, they rather “popdown”.

    Whenever I read or hear about AR, I have “video goggles” in mind – Virtual 3D objects blending into real life, as if they were real. Like this:

    http://www.youtube.com/watch?v=7i1NYVaYv8g

    1. Neither seems that realistic. Popup stuff always seems fake because it’s tethered to something and never really interacts with its environment, just the sigil it’s tied to.

      When you can fly that helicopter around like an RC copter, and make it pass behind something, then the technology is getting somewhere.

    2. They have a “true AR” (and also a “true VR”) clip-on for the glasses that you can attach. I would assume that the “true AR” one uses a half-mirrored surface so you can see through (and the “true VR” one is fully reflective). See Jeri’s video, where she briefly shows a clip-on prototype.

  8. I wonder if castAR could deliberately refocus images in real time to compensate for people who wear glasses, so that they could take their glasses off and still use it. I’m not being funny, but I would expect that a large percentage of their initial target audience wears glasses :) I don’t, but I think it would be a cool selling point.

  9. CastAR is indeed pretty cool and Jeri and the gang are awesome for working so hard to get this to this stage.

    On the subject of Jeri, would it really kill *certain people* to treat her as a human being rather than a piece of meat? You disgust me, and I dread to think what she must think if she sees some of these comments. Grow up.

  10. Well, if this is what gets gamers their jollies, I’m not sure what to say… just look at the design and the application of the technology and you can see it’s just silly. Why not try progressing from the idea of the Rift instead of back-pedalling?

    1. What the hell are you talking about? Silly? What??
      And saying Jeri has some sort of issue with technology… right, sure.
      And suggesting they steal some other team’s project instead, on top of it all. Now really.
      You are quite the jester.

  11. Additional comment: I really hope this works out, even if I never get my hands on it for some reason. You just have to wish her/them success.
    And thanks to Gabe for doing the right thing and letting them have the project. He’s such a unique character.

    1. That’s indeed nice of Gabe. It’s all the more surprising that he’s then openly supporting a competing project (Oculus Rift) and telling people to support that kickstarter project instead, when he failed to get behind the castAR project that was developed at Valve, and fired the inventors instead: http://youtu.be/1QnXHe_MIx4?t=2m18s I guess “unique” is the right word ;) Of course, there might be other reasons behind all this, or he just didn’t see how this project was making sense for Valve. Still, it’s a bit peculiar.
