The leap to self-driving cars could be as game-changing as the one from horse power to engine power. If cars prove able to drive themselves better than humans do, the safety gains could be enormous: auto accidents were the eighth leading cause of death worldwide in 2016. And who doesn’t want to turn travel time into something truly restful, or genuinely productive?
But getting there is a big challenge, as Alfred Jones knows all too well. As Head of Mechanical Engineering at Lyft’s Level 5 self-driving division, he leads the team building the roof racks and other gear that give the vehicles their sensors and computational hardware. In his keynote talk at Hackaday Remoticon, he walks us through what each level of self-driving means, how the problem is being approached, and where the sticking points lie between what’s being tested now and a truly steering-wheel-free future.
Check out the video below, and take a deeper dive into the details of his talk.
It is with a heavy heart that we must report the National Science Foundation (NSF) has decided to dismantle the Arecibo Observatory. Following the failure of two support cables, engineers have determined the structure is on the verge of collapse and that the necessary repairs would be too expensive and dangerous to conduct. At the same time, allowing the structure to collapse on its own would endanger nearby facilities and surely destroy the valuable research equipment suspended high above the 305 meter dish. Through controlled demolition, the NSF hopes to preserve as much of the facility and its hardware as possible.
In 1974, it was even used to broadcast the goodwill of humankind to any intelligent lifeforms that might be listening. Known as the “Arecibo Message”, the transmission can be decoded to reveal an assortment of pictograms that convey everything from the atomic numbers of common elements to the shape of the human body. The final icon in the series was a simple diagram of Arecibo itself, so that anyone who intercepted the message would have an idea of how such a relatively primitive species had managed to reach out and touch the stars.
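The message’s format itself hints at how it was meant to be decoded: the transmission is 1679 bits long, and 1679 is the product of two primes, 23 and 73, so the bit stream only arranges into a rectangle in two ways. A quick Python sketch (ours, not anything from the original transmission documentation) shows why a recipient would land on the 23-wide, 73-tall grid:

```python
# Why the Arecibo Message's length of 1679 bits is decodable:
# 1679 is a semiprime, so the stream can only tile a rectangle
# as 23 x 73 or 73 x 23. The intended layout is 23 columns wide
# by 73 rows tall, revealing the pictograms.

def grid_shapes(n):
    """Return every (rows, cols) pair that tiles n bits into a rectangle."""
    return [(r, n // r) for r in range(2, n) if n % r == 0]

print(grid_shapes(1679))  # [(23, 73), (73, 23)]
```

An alien (or human) decoder trying every rectangle would find that only one of the two orientations produces recognizable pictures.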
There is no replacement for the Arecibo Observatory, nor is there likely to be one in the near future. The Five-hundred-meter Aperture Spherical radio Telescope (FAST) in China is larger than Arecibo, but doesn’t have the crucial transmission capability. The Goldstone Deep Space Communications Complex in California can transmit, but as it’s primarily concerned with communicating with distant spacecraft, there’s little free time to engage in scientific observations. Even when it’s available for research, the largest dish in the Goldstone array is only about a quarter the diameter of the reflector at Arecibo.
While you might think of radar as pointing toward the skies, applications for radar have found their way underground as well. Ground-penetrating radar (GPR) is a tool that sends signals into the earth and measures their return to make determinations about what’s buried underground, in much the same way that distant aircraft can be located or identified by looking for radar reflections. This technology can now be built at home with a few common items, and for a relatively small cost.
This is a project from [Mirel] who built the system around an Arduino Mega 2560 and antipodal Vivaldi antennas, a type of directional antenna. Everything is mounted into a small cart that can be rolled along the ground. A switch attached to the wheels triggers the radar at regular intervals as it rolls, and at each point the radar emits a signal and listens for reflections. It operates over a frequency range from 323 MHz to 910 MHz, and a small graph of what it “sees” is displayed on an LCD screen paired with the Arduino.
Using this tool allows you to see the different densities of materials located underground, as well as their depths. This can be very handy when starting a large excavation project, such as detecting rock layers or underground utilities before digging. [Mirel] made all of the hardware and software for this project open-source, and if you’d like to see another take on GPR then head over to this project, which includes a lot of technical discussion of how it works.
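As a back-of-the-envelope illustration (our sketch, not anything from [Mirel]’s code), the depth of a reflector comes from the echo’s two-way travel time and the wave’s speed in the ground, which is slowed relative to vacuum by the square root of the soil’s relative permittivity:

```python
# Converting a GPR echo's two-way travel time into an approximate depth.
# Radio waves travel slower in soil: v = c / sqrt(relative permittivity).

C = 299_792_458.0  # speed of light in vacuum, m/s

def echo_depth(two_way_time_ns, rel_permittivity=9.0):
    """Depth in meters for a round-trip time given in nanoseconds.

    rel_permittivity of ~9 is a typical guess for moist soil; dry sand
    is closer to 4, so real surveys calibrate against a known target.
    """
    v = C / rel_permittivity ** 0.5          # propagation speed in the ground
    return v * (two_way_time_ns * 1e-9) / 2  # halve: signal goes down and back

# A reflection arriving 20 ns after transmit, in typical soil:
print(round(echo_depth(20.0), 2))  # ~1.0 m
```

The permittivity guess is the weak link, which is why hobbyist and professional GPR rigs alike are usually calibrated against an object buried at a known depth.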
Caring for elderly and vulnerable people while preserving their privacy and independence is a challenging proposition. Reaching a panic button or calling for help may not be possible in an emergency, but constant supervision or camera surveillance is often neither practical nor considerate. Researchers from MIT CSAIL have been working on this problem for a few years and have come up with a possible solution called RF Diary. Using RF signals, a floor plan, and machine learning, it can recognize activities and emergencies, through obstacles and in the dark. If this sounds familiar, it’s because it builds on previous research by CSAIL.
The RF system used is effectively frequency-modulated continuous-wave (FMCW) radar, sweeping across the 5.4-7.2 GHz RF spectrum. The limited resolution of the RF system does not allow for the recognition of most objects, so a floor plan supplies the size and location of specific features like rooms, beds, tables, and sinks. This information helps the machine learning model recognize activities within the context of the surroundings. Effectively training an activity captioning model requires thousands of training examples, and no such dataset currently exists for RF radar. However, massive video datasets are available, so the researchers employed a “multi-modal feature alignment training strategy” that allowed them to use video datasets to refine their RF activity captioning model.
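As a rough illustration of the FMCW principle the system relies on (the chirp duration and beat frequency below are made-up numbers for the example; only the 5.4-7.2 GHz sweep comes from the research), the radar measures the frequency offset, or “beat”, between the outgoing chirp and its returning echo, which is proportional to the target’s range:

```python
# FMCW ranging sketch: during a linear chirp of bandwidth B over time T,
# an echo delayed by the round trip produces a beat frequency f_b, and
# range follows as R = c * f_b * T / (2 * B).

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(beat_hz, sweep_s, bandwidth_hz=1.8e9):
    """Target range in meters; default bandwidth is the 5.4-7.2 GHz sweep."""
    return C * beat_hz * sweep_s / (2 * bandwidth_hz)

# With a hypothetical 1 ms chirp, a 120 kHz beat corresponds to roughly 10 m:
print(round(fmcw_range(120e3, 1e-3), 2))  # ~10 m
```

The wide 1.8 GHz sweep is what buys resolution: range resolution for FMCW is c/(2B), which works out to under 10 cm here, enough to track a person moving through a home but far too coarse to recognize individual objects, hence the floor plan.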
There are still some privacy concerns with this solution, but the researchers did propose some improvements. One interesting idea is for the monitored person to give an “activation” signal by performing a specified set of activities in sequence.
In the early morning hours of August 10th, a support cable at the Arecibo Observatory pulled loose from its mount and crashed through the face of the primary reflector below. Images taken from below the iconic 305 meter dish, made famous by films such as Contact and GoldenEye, show an incredible amount of damage. The section of thick cable, estimated to weigh in at around 6,000 kilograms (13,000 pounds), had little difficulty tearing through the reflector’s thin mesh construction.
Worse still, the cable also struck the so-called “Gregorian dome”, the structure suspended over the dish where the sensitive instruments are mounted. At the time of this writing it’s still unclear whether any of that instrumentation has been damaged, though NASA at least has said that the equipment they operate inside the dome appears to have survived unscathed. At the very least, the damage to the dome structure itself will need to be addressed before the Observatory can resume normal operations.
But how long will the repairs take, and who’s going to pay for them? It’s no secret that funding for the nearly 60-year-old telescope has been difficult to come by since at least the early 2000s. The cost of repairing the relatively minor damage the telescope sustained during Hurricane Maria in 2017 may have been enough to shutter the installation permanently if it hadn’t been for a consortium led by the University of Central Florida, which agreed to share the burden of operating the Observatory with the National Science Foundation and put up several million dollars of additional funding.
It’s far too early to know how much time and money it will take to get the Arecibo Observatory back up to operational status, but with the current world situation, it seems likely the telescope will be out of commission for at least the rest of the year. Given that repairs from the 2017 damage still haven’t been completed, it could be even longer than that. In the meantime, astronomers around the globe are left without this wholly unique resource.
Radars are simply cool, and their portrayal in movies and TV has a lot to do with that. You get a sweet glowing screen that shows you where the bad guys are, and a visual representation of your missiles on their way to blow them up. Sadly, or perhaps thankfully, day-to-day life for most of us is a little less exhilarating. But we can make do with a facsimile of the experience instead.
The project consists of an Arduino Uno outfitted with an ultrasound module that can do basic range measurements on the order of tens of centimeters. The module is placed on a servo and swept through a 180-degree rotation. This data is passed back to a computer running a Python application, which plots the results on a Plan Position Indicator (PPI), the sweeping display we’re all so familiar with.
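The math on the Python side is straightforward. This sketch (our own guess at how such a display works, not the project’s actual code) converts an echo time and servo angle into a distance and a point to draw on the PPI:

```python
# PPI plotting math: the Arduino reports an ultrasound round-trip time
# and the servo angle; the host turns that into a distance and an (x, y)
# point on the sweep display.

import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def echo_to_point(echo_us, angle_deg):
    """Return (distance_m, x, y) for an echo time in microseconds
    and a servo angle in degrees (90 = straight ahead)."""
    distance = SPEED_OF_SOUND * (echo_us * 1e-6) / 2  # halve the round trip
    theta = math.radians(angle_deg)
    return distance, distance * math.cos(theta), distance * math.sin(theta)

# An echo returning after ~1166 us with the servo at 90 degrees:
d, x, y = echo_to_point(1166, 90)
print(round(d, 2))  # ~0.2 m, plotted directly "up" on the display
```

Redrawing only the wedge under the sweep line, and letting old points fade, is what gives the classic glowing-radar look.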
There appears to be no shortage of reasons to hate on wind farms. That’s especially the case if you live close to one, and as studies have shown, their general acceptance indeed grows with distance. Whatever your favorite flavor of renewable energy might be, that’s at least something it has in common with nuclear or fossil power plants: not in my back yard. The difference, of course, is that it takes a lot more wind turbines to achieve the same output, and therefore a lot more back yards are affected in total, in constantly increasing numbers globally.
Personally, as someone who only encounters them occasionally and from a distance, I find wind turbines mostly to be an eyesore, particularly in scenic mountainous landscapes, though they can add a futuristic vibe to some otherwise boring flatlands. In other words, I cannot judge the claims actual residents make about their impact on humans or the environment. So let’s leave opinions and emotions out of it and look at the facts and tech of one issue in particular: light pollution.
This might not be the first issue that comes to mind when thinking about wind farms. But wind turbines are tall enough to require warning lights for air traffic safety, and those can be seen for miles, blinking away in the night sky. From a pure efficiency standpoint, this hardly seems reasonable, considering how rarely an aircraft actually passes by on average. Most of the time, those lights simply blink for nothing, lighting up the countryside. Can we change this?