CES, the Consumer Electronics Show, is in full swing. Just for a second, let’s take a step back and assess the zeitgeist of the tech literati. Drones – or quadcopters, or UAVs, or UASes, whatever you call them – are huge. Self-driving cars are the next big thing. Flying cars have always been popular. On the technical side of things, batteries are getting really good, and China is slowly figuring out aerospace technologies. What could this possibly mean for CES? Self-flying drone cars.
The Ehang 184 is billed as the first autonomous drone that can carry a human. The idea is a flying version of the self-driving cars that are just over the horizon: hop in a whirring deathtrap, set your destination, and soar through the air above the plebs that just aren’t as special as you.
While the Ehang 184 sounds like a horrendously ill-conceived Indiegogo campaign, the company has released some specs for their self-flying drone car. It’s an octocopter, powered by eight brushless motors with a combined output of 106 kW. Flight time is about 23 minutes, with a range of about 10 miles. The empty weight of the aircraft is 200 kg (440 lbs), with a maximum payload of 100 kg (220 lbs). This puts the maximum takeoff weight (MTOW) of the Ehang 184 at 660 lbs, far below the 1,320 lbs cutoff for light sport aircraft as defined by the FAA, but far more than the definition of an ultralight – 254 lbs empty weight.
In any event, it’s a purely academic matter to consider how such a vehicle would be licensed by the FAA or any other civil aviation administration. It’s already illegal to test in the US, authorities haven’t really caught up to the idea of fixed-wing aircraft powered by batteries, and the idea of a legal autonomous aircraft carrying a passenger is ludicrous.
Is the Ehang 184 a real product? There is no price, and no conceivable way any government would allow an autonomous aircraft to fly with someone inside it. It is, however, a perfect embodiment of the insanity of CES.
Every year, more than 30,000 people are killed in motor vehicle accidents in the US, and many, many more are injured. Humans, in general, aren’t great drivers. Until dependable self-driving cars make their way into garages and driveways across the country, there is still a great deal of work that can be done to improve the safety of automobiles, and the best hope on the horizon is Vehicle to Vehicle communication (V2V). We keep hearing this technology mentioned in news stories, but the underlying technology is almost never discussed. So I decided to take a look at what hardware we can expect in early V2V, and the features you can expect to see when your car begins to build a social network with the others around it.
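The core of V2V is each car broadcasting a short status update – position, speed, heading, brake state – several times a second over a dedicated short-range radio link. As a rough illustration only (the field layout below is a simplification I made up for clarity, not the actual SAE J2735 Basic Safety Message encoding), such a status frame might look like:

```python
import struct

# Illustrative V2V "basic safety message" -- a simplified stand-in for the
# kind of status update cars would broadcast. NOT the real J2735 wire format.
_FMT = ">I d d f f ?"  # big-endian: id, lat, lon, speed, heading, brake flag

def pack_bsm(vehicle_id, lat_deg, lon_deg, speed_mps, heading_deg, braking):
    """Pack one status update into a fixed-size binary frame."""
    return struct.pack(_FMT, vehicle_id, lat_deg, lon_deg,
                       speed_mps, heading_deg, braking)

def unpack_bsm(frame):
    """Decode a frame back into a dict of fields."""
    vehicle_id, lat, lon, speed, heading, braking = struct.unpack(_FMT, frame)
    return {"id": vehicle_id, "lat": lat, "lon": lon,
            "speed": speed, "heading": heading, "braking": braking}
```

A receiving car would decode a stream of these from each neighbor and, for example, warn the driver when a vehicle a few cars ahead reports hard braking before the brake lights are even visible.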
Continue reading “V2V: A Safer Commute with Cars Sharing Status Updates”
Humanity has taken one step closer to Skynet becoming fully aware. [Ahmed], [Muhammad], [Salman], and [Suleman] have created a vehicle that can “chase” another vehicle as part of their senior design project. Now it’s just a matter of time before the machines take over.
The project itself is based on a gasoline-powered quad bike that the students first converted to electric for the sake of their project. It uses a single webcam to get information about its surroundings. This is a plus because it frees the robot from needing a stereoscopic camera or any other complicated equipment like a radar or laser rangefinder. With this information, it can follow a lead vehicle without getting any other telemetry.
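With a single camera, following can be boiled down to two feedback loops: steer to keep the lead vehicle centered in the frame, and adjust throttle to hold its apparent size (and thus distance) constant. Here is a minimal proportional-control sketch of that idea – the gains, frame size, and target width are made-up values for illustration, and the students’ actual vision pipeline and control code are not shown:

```python
# Minimal sketch of monocular vehicle-following: steer toward the lead
# vehicle's horizontal position in the image, throttle to hold its
# apparent size. All constants are illustrative, not from the project.

FRAME_WIDTH = 640     # camera frame width in pixels
TARGET_WIDTH = 120    # apparent width (px) we want the lead vehicle to have
K_STEER = 0.004       # steering gain per pixel of horizontal error
K_THROTTLE = 0.01     # throttle gain per pixel of size error

def _clamp(x):
    return max(-1.0, min(1.0, x))

def follow_step(bbox_center_x, bbox_width):
    """One control step from a detected bounding box.

    Returns (steering, throttle), each clamped to [-1, 1]:
    negative steering turns left, negative throttle brakes.
    """
    # Steer proportionally to how far the target sits from frame center.
    steering = _clamp(K_STEER * (bbox_center_x - FRAME_WIDTH / 2))

    # A smaller-than-target box means the lead vehicle is far: speed up.
    throttle = _clamp(K_THROTTLE * (TARGET_WIDTH - bbox_width))
    return steering, throttle
```

In a real system, a detector would supply the bounding box every frame, and the gains would be tuned on the actual vehicle; a pure P-controller like this is just the simplest version of the loop.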
This project is interesting because it could potentially allow large convoys with only one human operator at the front. Once self-driving cars become more mainstream, this could also cut costs considerably: only the vehicle in front would need the full self-driving equipment, while the vehicles behind could operate with much less hardware. Either way, we love seeing senior design projects with great real-world applications!
Continue reading “Autonomous Vehicle-Following Vehicle”
If you’ve ever wanted your own self-driving car, this is your chance. [Sebastian Thrun], co-lecturer (along with the great [Peter Norvig]) of the Stanford AI class is opening up a new class that will teach everyone who enrolls how to program a self-driving car in seven weeks.
The robotic car class is being taught alongside a CS 101 “intro to programming” course. If you don’t know the difference between an interpreter and a compiler, this is the class for you. You’ll learn how to make a search engine from scratch in seven weeks. The “Building a Search Engine” class is taught by [Thrun] and [David Evans], a professor from the University of Virginia. The driverless car course is taught solely by [Thrun], who helped win the 2005 DARPA Grand Challenge with his robot car.
In case you’re wondering if this is going to be another one-time deal like the online AI class, don’t worry. [Thrun] resigned as a tenured professor at Stanford to concentrate on teaching over the Internet. He’s still staying at Stanford as an associate professor but now he’s spending his time on his online university, Udacity. It looks like he might have his hands full with his new project; so far, classes on the theory of computation, operating systems, distributed systems, and computer security are all planned for 2012.