How much would you pay for a 3D printer? Granted, when we started, a decent printer might have run over $1,000, but the cost has come way down. Unless, of course, you go pro. We were disappointed that this [All3DP] post didn’t include prices, but we noticed a trend: if your 3D printer has stairs, it is probably a big purchase. According to the tagline on the post, the printers are all north of $500,000.
Expensive printers usually have unique technology, higher degrees of automation, large capacity, or some combination of these and other factors. At least two of the printers mentioned had stairs to reach the top parts of the machine. And the Black Buffalo — a cement printer — uses a gantry that looks like it is part of a light show at a concert. It is scalable, but apparently can go up to three stories tall!
Everyone knows what a chatbot is, but how about a deadbot? A deadbot is a chatbot whose training data — that which shapes how and what it communicates — is based on a deceased person. Now let’s consider the case of a fellow named Joshua Barbeau, who created a chatbot to simulate conversation with his deceased fiancée. Add to this the fact that OpenAI, provider of the GPT-3 API that ultimately powered the project, had a problem with this, as their terms explicitly forbid use of their API for (among other things) “amorous” purposes.
[Sara Suárez-Gonzalo], a postdoctoral researcher, observed that this story’s facts were getting covered well enough, but nobody was looking at it from any other perspective. We all certainly have ideas about what flavor of right or wrong saturates the different elements of the case, but can we explain exactly why it would be either good or bad to develop a deadbot?
[Sara] makes the case that creating a deadbot could be done ethically, under certain conditions. Briefly, the key points are that both the person being mimicked and the person developing and interacting with the bot should have given their consent, complete with as detailed a description as possible of the scope, design, and intended uses of the system. (Such a statement is important because machine learning in general changes rapidly. What if the system or its capabilities someday no longer resemble what one originally imagined?) Responsibility for any potential negative outcomes should be shared by those who develop the system and those who profit from it.
[Sara] points out that this case is a perfect example of why the ethics of machine learning really do matter, and without attention being paid to such things, we can expect awkward problems to continue to crop up.
How important is it to identify killer asteroids before they strike your planet? Ask any dinosaur. Oh, wait… Granted, you also need a way to redirect them, but interest in finding them has picked up lately, including a new privately funded program called the Asteroid Institute.
Using an open-source cloud platform known as ADAM — Asteroid Discovery Analysis and Mapping — the program, affiliated with the B612 Foundation along with partners including the University of Washington, has already discovered 104 new asteroids and plotted their orbits.
What’s interesting is that the Institute doesn’t acquire any images itself. Instead, it uses new techniques to search through existing optical records to identify previously unnoticed asteroids and compute their trajectories.
You have to wonder how many other data sets are floating around that hold unknown discoveries waiting for the right algorithm and computing power. Of course, once you find the next extinction asteroid, you have to decide what to do about it. Laser? Bomb? A gentle push at a distance? Or hope for an alien obelisk to produce a deflector ray? How would you do it?
In CPU design, there is Amdahl’s law. Simply put, it means that if some process contributes 10% of your execution time, optimizing it can’t improve things by more than 10%. Common sense, really, but it illustrates the importance of knowing how fast or slow various parts of your system are. So how fast are Linux pipes? That’s a good question and one that [Mazzo] sets out to answer.
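As a quick illustration (ours, not from the original post), Amdahl’s law says that if a fraction p of the work is sped up by a factor s, the overall speedup is 1 / ((1 − p) + p/s). A minimal sketch:

```python
def amdahl_speedup(p, s):
    """Overall speedup when a fraction p of the runtime
    is accelerated by a factor s (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / s)

# Even an effectively infinite speedup of a 10% slice
# caps the overall gain at about 1.11x:
print(amdahl_speedup(0.10, 1e9))  # ~1.111

# Doubling the speed of half the work yields 1.33x, not 2x:
print(amdahl_speedup(0.50, 2))    # ~1.333
```

The takeaway is the same one the article leans on: measure where the time actually goes before optimizing anything.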
The inspiration was a highly-optimized FizzBuzz program that clocked in at over 36 GB/s on his laptop. Is that a common speed? Nope. A simple program using pipes on the same machine turned in not quite 4 GB/s. What accounts for the difference?
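[Mazzo]’s benchmarks are tuned C; purely as a rough, hedged sketch of how you might measure pipe throughput yourself (Unix-only, and the numbers will land well below the optimized figures above), something like this works:

```python
import os
import time

def pipe_throughput(chunk_size=65536, total=1 << 28):
    """Rough pipe throughput in GB/s: fork a child that drains
    the pipe while the parent writes `total` bytes in chunks."""
    r, w = os.pipe()
    pid = os.fork()
    if pid == 0:               # child: read until EOF, then exit
        os.close(w)
        while os.read(r, chunk_size):
            pass
        os._exit(0)
    os.close(r)
    buf = b"\0" * chunk_size
    start = time.perf_counter()
    written = 0
    while written < total:     # os.write may be partial on pipes
        written += os.write(w, buf)
    os.close(w)                # EOF for the child
    os.waitpid(pid, 0)
    elapsed = time.perf_counter() - start
    return written / elapsed / 1e9

print(f"{pipe_throughput():.2f} GB/s")
```

Varying `chunk_size` here already hints at part of the answer: per-syscall overhead and pipe buffer sizes matter a great deal.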
When NASA astronauts aboard the International Space Station have to clamber around on the outside of the orbiting facility for maintenance or repairs, they don a spacesuit known as the Extravehicular Mobility Unit (EMU). Essentially a small self-contained spacecraft in its own right, the bulky garment was introduced in 1981 to allow Space Shuttle crews to exit the Orbiter and work in the craft’s cavernous cargo bay. While the suits did get a minor upgrade in the late 90s, they remain largely the product of 1970s technology.
Not only are the existing EMUs outdated, but they were only designed to be used in space — not on the surface. With NASA’s eyes on the Moon, and eventually Mars, it was no secret that the agency would need to outfit their astronauts with upgraded and modernized suits before moving beyond the ISS. As such, development of what would eventually become the Exploration Extravehicular Mobility Unit (xEMU) dates back to at least 2005, when it was part of the ultimately canceled Constellation program.
NASA’s own xEMU suit won’t be ready by 2025.
Unfortunately, after more than a decade of development and reportedly $420 million in development costs, the xEMU still isn’t ready. With a crewed landing on the Moon still tentatively scheduled for 2025, NASA has decided to let their commercial partners take a swing at the problem, and has recently awarded contracts to two companies for a spacesuit that can both work on the Moon and replace the aging EMU for orbital use on the ISS.
As part of the Exploration Extravehicular Activity Services (xEVAS) contract, both companies will be given the data collected during the development of the xEMU, though they are expected to create new designs rather than a copy of what NASA’s already been working on. Inspired by the success of the Commercial Crew program that gave birth to SpaceX’s Crew Dragon, the contract also stipulates that the companies will retain complete ownership and control over the spacesuits developed during the program. In fact, NASA is even encouraging the companies to seek out additional commercial customers for the finished suits in hopes a competitive market will help drive down costs.
There’s no denying that NASA’s partnerships with commercial providers have paid off for cargo and crew, so it stands to reason that they’d go back to the well for their next-generation spacesuit needs. There’s also plenty of incentive for the companies to deliver a viable product, as the contract has a potential maximum value of $3.5 billion. But with 2025 quickly approaching, and the contract requiring an orbital shakedown test before the suits are sent to the Moon, the big question is whether or not there’s still enough time for either company to make it across the finish line.
Depending on who you ask, there are either two vulnerabilities at play in Follina, just one, or, according to Microsoft as of a week ago, no security problem whatsoever. On the 27th of last month, a .docx file was uploaded to VirusTotal, and most of the tools there thought it was perfectly normal. That didn’t seem right to [@nao_sec], who raised the alarm on Twitter. It seems this suspicious file originated somewhere in Belarus, and it uses a series of tricks to run a malicious PowerShell script.
There’s a danger in security research that we’ve discussed a few times before. If you discover a security vulnerability on a production system, and there’s no bug bounty, you’ve likely broken a handful of computer laws. Turn over the flaw you’ve found, and you’re most likely to get a “thank you”, but there’s a tiny chance that you’ll get charged for a computer crime instead. Security research in the US is just a little safer now, as the US Department of Justice has issued a new policy stating that “good-faith security research should not be charged.”
While this is a welcome injection of good sense, it would be even better for such a protection to be codified into law. The other caveat is that this policy only applies to federal cases in the US. Other nations, or even individual states, are free to bring charges. So while this is good news, continue to be careful. There are also some caveats about what counts as good faith — if a researcher uses a flaw discovery to extort, it’s not good faith.