Review: Ear Wax Cleaning Cameras As Cheap Microscopes, We Take A Closer Look

Those of us who trawl the world of cheap imported goods will most often stay in our own comfortable zones as we search for new items to amaze and entertain us. We’ll stick to listings of electronic goods or tools, and so perhaps miss out on the scores of other wonders that can be ours for only a few dollars and a week or two’s wait for postage.

Who knew sticky ears were such big business!

Just occasionally though something will burst out of another of those zones and unexpectedly catch our eye, and we are sent down an entirely new avenue in the global online supermarket.

Thus it was that when a few weeks ago I was looking for an inspection camera, I had a listing appear from the world of personal grooming products. It seems that aural hygiene is a big market, and among the many other products devoted to it is an entire category of ear wax removal tools equipped with cameras. These can get you up close and personal with your ear canal, presumably so you can have a satisfying scoop at any accumulated bodily goop. I have a ton of electronics-related uses for a cheap USB close-up camera, so I bought one of these so I could — if you’ll excuse the expression — get a closer look.

Continue reading “Review: Ear Wax Cleaning Cameras As Cheap Microscopes, We Take A Closer Look”

Network Booting The Pi 4

We’ve talked about PXE booting the Raspberry Pi 3B+, and then looked at the Raspberry Pi 4 as a desktop replacement. But there’s more! The Pi 4 sports a very useful new feature, the flashable bootloader. Just recently a beta version of that bootloader was released that supports PXE — booting up over the network — which has become a must-have for those of us who have had consistently bad experiences with root filesystems on SD cards.

Pi with no SD Card

What are the downsides, I hear you ask? You might see slower speeds going across the network compared to a high quality SD card, particularly with the Pi 4 and its improved SD card slot. PXE does require an Ethernet cable; WiFi is not enough, so you have that restriction to contend with. And finally, this isn’t a portable option — you are tethered to that network cable while running, and tethered to your network to boot at all.

On the other hand, if you’re doing a permanent or semi-permanent install of a Pi, PXE is absolutely a winner. There are few things worse than dragging a ladder out to access a Pi that’s cooked its SD card, not to mention the possibility that you firewalled yourself out of it. Need to start over with a fresh Raspbian image? Easy, just rebuild it on the PXE server and reboot the Pi remotely.
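For a taste of what’s involved on the server side, it can be as simple as a single dnsmasq instance providing proxy-DHCP and TFTP on your LAN. What follows is a sketch only; the subnet, paths, and boot-order value are assumptions to verify against the current Raspberry Pi bootloader and dnsmasq documentation.

```
# /etc/dnsmasq.conf -- proxy-DHCP plus TFTP for Pi 4 netboot (sketch).
# The subnet, paths, and service string are assumptions for a typical
# home network; adjust them to match yours.
port=0                          # disable DNS, we only want DHCP/TFTP
dhcp-range=192.168.1.0,proxy    # answer PXE queries without handing out leases
log-dhcp                        # log requests while you debug the setup
enable-tftp
tftp-root=/srv/tftpboot         # boot files live here, usually in a
                                # subdirectory named after the Pi's serial number
pxe-service=0,"Raspberry Pi Boot"
```

On the Pi itself, the beta EEPROM bootloader has to be told to try the network: `sudo rpi-eeprom-config --edit` lets you set a boot order such as `BOOT_ORDER=0xf21` (SD card first, then network). Treat that exact value as an assumption and check it against the rpi-eeprom documentation for your firmware release.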

Convinced PXE is for you? Let’s get started! Continue reading “Network Booting The Pi 4”

Found Footage: Elliot Williams Talks Nexus Technologies

Back at the 2017 Superconference, Hackaday Managing Editor Elliot Williams started his talk about the so-called “Internet of Things” by explaining the only part he doesn’t like about the idea is the Internet… and the things. It’s a statement that most of us would still agree with today. If anything, the situation has gotten worse in the intervening years. Commercial smart gadgets are now cheaper and more plentiful than they’ve ever been, but it seems like precious little has been done to improve their inherent privacy and security issues.

But his talk doesn’t serve to bash the companies producing these devices or even the services that ultimately folded and left their customers with nigh useless gadgets. That’s not his style. The central theme of “Nexus Technologies: Or How I Learned to Love WiFi” is that a smart home can be a wonderful thing, assuming it works the way you want it to. Elliot argues that between low-cost modular hardware and open source software, the average hacker has everything they need to build their own self-contained home automation ecosystem. One that’s not only cheaper than what they’re selling at the Big Box electronics store, but also doesn’t invite any of the corporate giants to the party.

Of course, it wasn’t always so. A decade ago it would have been all but impossible, and five years ago it would have been too expensive to be practical. As Elliot details his journey towards a truly personal smart home, he explains the advances in hardware and software that have made it not just possible on the DIY level, but approachable. The real takeaway is that once more people realize how cheap and easy it is to roll your own smart home gadgets, they may end up more than willing to kick Big Brother to the curb and do IoT on their own terms.

This previously unpublished recording somehow ended up on the editing room floor, but upon its recent rediscovery it’s still just as relevant today. Take a look at Elliot’s view on Nexus Technologies, then join us after the break for a deeper dive. Make sure to subscribe to Hackaday’s YouTube channel to get in on the 2019 Hackaday Superconference live stream starting Saturday, November 16th.

Continue reading “Found Footage: Elliot Williams Talks Nexus Technologies”

Tesla’s Smart Summon – Gimmick Or Greatness?

Tesla have always aimed to position themselves as part automaker, part tech company. Their unique offering is that their vehicles feature cutting-edge technology not available from their market rivals. The company has long touted its “full self-driving” technology, and regular software updates have progressively unlocked new functionality in their cars over the years.

The latest “V10” update brought a new feature to the fore – known as Smart Summon. Allowing the driver to summon their car remotely from across a car park, this feature promises to be of great help on rainy days and when carrying heavy loads. Of course, the gulf between promises and reality can sometimes be a yawning chasm.

How Does It Work?

Holding the “Come To Me” button summons the vehicle to the user’s location. Releasing the button stops the car immediately.

Smart Summon is activated through the Tesla smartphone app. Users are instructed to check the vehicle’s surroundings and ensure they have line of sight to the vehicle when using the feature. This is combined with a 200 foot (61 m) hard limit, meaning that Smart Summon won’t deliver your car from the back end of a crowded mall carpark. Instead, it’s more suited to smaller parking areas with clear sightlines.

Once activated, the car will back out of its parking space, and begin to crawl towards the user. As the user holds down the button, the car moves, and will stop instantly when let go. Using its suite of sensors to detect pedestrians and other obstacles, the vehicle is touted to be able to navigate the average parking environment and pick up its owners with ease.

No Plan Survives First Contact With The Enemy

With updates rolled out over the air, Tesla owners jumped at the chance to try out this new functionality. Almost immediately, a cavalcade of videos began appearing online of the technology. Many of these show that things rarely work as well in the field as they do in the lab.

As any driver knows, body language and communication are key to navigating a busy parking area. Whether it’s a polite nod, an instructional wave, or simply direct eye contact, humans have become well-rehearsed at self-managing the flow of traffic in parking areas. When several cars are trying to navigate the area at once, a confused human can negotiate with others to take turns to exit the jam. Unfortunately, a driverless car lacks all of these abilities.

This situation proved all too much for the Tesla, and the owner was forced to intervene.

A great example is this drone video of a Model 3 owner attempting a Smart Summon in a small linear carpark. Conditions are close to ideal – a sunny day, with little traffic, and a handful of well-behaved pedestrians. In the first attempt, the hesitation of the vehicle is readily apparent. After backing out of the space, the car simply remains motionless, as two human drivers are also attempting to navigate the area. After backing up further, the Model 3 again begins to inch forward, with seemingly little ability to choose between driving on the left or the right. Spotting the increasing frustration of the other road users, the owner is forced to walk to the car and take over. In a second attempt, the car is again flummoxed by an approaching car, and simply grinds to a halt, unable to continue. Communication between autonomous vehicles and humans is an active topic of research, and likely one that will need to be solved sooner rather than later to truly advance this technology.

Pulling straight out of a wide garage onto an empty driveway is a corner case they haven’t quite mastered yet.

An expensive repair bill, courtesy of Smart Summon.

Other drivers have had worse experiences. One owner had their Tesla drive straight into the wall of their garage, an embarrassing mistake even most learner drivers wouldn’t make. Another had a scary near miss, when the Tesla seemingly failed to understand its lack of right of way. The human operator can be seen to recognise an SUV approaching at speed from the vehicle’s left, but the Tesla fails to yield, only stopping at the very last minute. It’s likely that the Smart Summon software doesn’t have the ability to understand right of way in parking environments, where signage is minimal and it’s largely left up to human intuition to figure out.

This is one reason why the line of sight requirement is key – had the user let go of the button when first noticing the approaching vehicle, the incident would have been avoided entirely. Much like other self-driving technologies, it’s not always clear how much responsibility still lies with the human in the loop, which can have dire results. And more to the point, how much responsibility should the user have, when he or she can’t know what the car is going to decide to do?

More amusingly, an Arizona man was caught chasing down a Tesla Model 3 in Phoenix, after seeing the vehicle rolling through the carpark without a driver behind the wheel. While the embarrassing incident ended without injury, it goes to show that until familiarity with this technology spreads, there’s scope for misunderstandings to cause problems.

It’s Not All Bad, Though

Some users have had more luck with the feature. While it’s primarily intended to summon the car to the user’s GPS location, it can also be used to direct the car to a point within a 200 foot radius. In this video, a Tesla can be seen successfully navigating around a sparsely populated carpark, albeit with some trepidation. The vehicle appears to have difficulty initially understanding the structure of the area, first attempting a direct route before properly making its way around the curbed grass area. The progress is more akin to a basic line-following robot than an advanced robotic vehicle. However, it does successfully avoid running down its owner, who attempts walking in front of the moving vehicle to test its collision avoidance abilities. If you value your limbs, probably don’t try this at home.

No, not like that!

Wanting to explore a variety of straightforward and oddball situations, [DirtyTesla] decided to give the tech a rundown himself. The first run in a quiet carpark is successful, albeit with the car weaving, reversing unnecessarily, and ignoring a stop sign. Later runs are more confident, with the car clearly choosing the correct lane to drive in, and stopping to check for cross traffic. Testing on a gravel driveway was also positive, with the car properly recognising the grass boundaries and driving around them. That is, until the fourth attempt, when the car gently runs off the road and comes to a stop in the weeds. Further tests show that dark conditions and heavy rain aren’t a show stopper for the system, but it’s still definitely imperfect in operation.

Reality Check

Fundamentally, there are plenty of examples out there suggesting this technology isn’t ready for prime time. Unlike other driver-in-the-loop aids, like parallel parking assists, it appears that users put a lot more confidence in the ability of Smart Summon to detect obstacles on its own, leading to many near misses and collisions.

If all it takes is a user holding a button down to drive a 4000 pound vehicle into a wall, perhaps this isn’t the way to go. It draws parallels to users falling asleep on the highway when using Tesla’s Autopilot – drivers are putting ultimate trust in a system that is, at best, only capable when used in combination with a human’s careful oversight. But even then, how is the user supposed to know what the car sees? Tesla’s tools seem to have a way of lulling users into a false sense of confidence, only to be betrayed almost instantly to the delight of YouTube viewers around the world.

While it’s impossible to make anything truly foolproof, it would appear that Tesla has a ways to go to get Smart Summon up to scratch. Combine this with the fact that in 90% of videos, it would have been far quicker for an able-bodied driver to simply walk to the vehicle and drive themselves, and it definitely appears to be more of a gimmick than a useful feature. If it can be improved, and limitations such as line-of-sight and distance can be lifted, it will quickly become a must-have item on luxury vehicles. That may yet be some years away, however. Watch this space, as it’s unlikely other automakers will rest for long!

Blend Your Last Frogs. Google Turns A Blind Eye To Flash.

Google has announced that it will no longer index Flash files.

Journey with me to a time in a faraway internet; a time before we had monetized social media. A time when the page you shared with your friends was your page and not a page on someone’s network. Way back when Visual Basic was what Python is now and JavaScript was a hack mostly used for cool effects. A hero arose. Macromedia Flash opened the gates to the interactive web, and for a chunk of time it consumed more than a decent portion of humanity’s attention and artistic output.

Computer art was growing, but was it public? How many grandmothers would see a demo?

New ground was broken, and anyone who wanted to become an animator or a web designer could manage it in a few tutorials. Only a few years before Flash took off, people had started talking about computers as a source for art in mostly theoretical terms. There were demoscenes, university studies, and professional communities, of course, but were they truly public? Suddenly Flash made computer art an everyday thing. How could computers not be used for art? In schools and offices all over the world people of varying technical skill would get links to games, animation, and clever sites sent by their friends and colleagues.

For 23 years Flash has had this incredible creative legacy. Yet it’s not perfect by any means. It’s a constant headache for our friendly neighborhood super-conglomerates. Apple hates how it drains the battery on their mobile devices, and that it’s a little village outside of their walled garden. Microsoft sees it as another endless security violation. They all saw it as a competitor product eating their proprietary code bases. Continue reading “Blend Your Last Frogs. Google Turns A Blind Eye To Flash.”

Supercon Keynote: Dr. Megan Wachs On RISC-V

Hackaday has open-source running deep in our veins — and that goes for hardware as well as software. After all, it’s great to run open-source software, but if it’s running on black-box hardware, the system is only half open. While software has benefited mightily from all of the advantages of community development, the hardware world has been only recently catching up. And so we’ve been following the RISC-V open-source CPU development with our full attention.

Dr. Wachs, making her own wedding ring.

Our keynote speaker for the 2019 Hackaday Superconference is Dr. Megan Wachs, the VP of Engineering at SiFive, the company founded by the creators of the RISC-V instruction-set architecture (ISA). She has also chaired the RISC-V Foundation Debug Task Group, so it’s safe to say that she knows RISC-V inside and out. If there’s one talk we’d like to hear on the past, present, and future of the architecture, this is it.

RISC-V isn’t a particular chip, but rather a design for how a CPU works, and a standard for the lowest-level language that the machine speaks. In contrast to proprietary CPUs, RISC-V CPUs from disparate vendors can all use the same software tools, unifying and opening their development. Moreover, open hardware implementations for the silicon itself mean that new players can enter the space more easily, bring their unique ideas to life faster, and we’ll all benefit. We can all work together.

It’s no coincidence that this year’s Supercon badge has two RISC-V cores running in its FPGA fabric. When we went shopping around for an open CPU core design, we had a few complete RISC-V systems to pick from, full compiler and development toolchains to write code for them, and of course, implementations in Verilog ready to flash into the FPGA. The rich, open ecosystem around RISC-V made it a no-brainer for us, just as it does for companies making neural-network peripherals or even commodity microcontrollers. You’ll be seeing a lot more RISC-V systems in the near future, on your workbench and in your pocket.

We’re tremendously excited to hear more about the project from the inside, and absolutely looking forward to Megan’s keynote speech!

The Hackaday Superconference is completely sold out, but that doesn’t mean that you have to miss out. We’ll be live-streaming the keynote and all other talks on the Supercon main stage, so subscribe to our YouTube channel and you won’t miss a thing.

Qantas’ Research Flight Travels 115% Of Range With Undercrowded Cabin

Long-haul flights can be a real pain when you’re trying to get around the world. Typically, they’re achieved by including a stop along the way, with the layover forcing passengers to deplane and kill time before continuing the flight. As planes have improved over the years, airlines have begun to introduce more direct flights where possible, negating this frustration.

Australian flag carrier Qantas are at the forefront of this push, recently attempting a direct flight from New York to Sydney. This required careful planning and preparation, and the research flight is intended to be a trial run ahead of future commercial operations. How did they keep the plane — and the passengers — in the air for this extremely long haul? The short answer is that they cheated a little, carrying no cargo and pampering a passenger cabin that was 85% empty. Yet they plan to leverage what they learn to begin operating 10,000+ mile non-stop passenger flights — besting the current record by 10% — as soon as four years from now.
Continue reading “Qantas’ Research Flight Travels 115% Of Range With Undercrowded Cabin”