Slab Casting – A New Way To Combine 3D Printing And Ceramics

Slip casting can be messy, both in processing and in making the original plaster mold. What if there were a better way, thanks to 3D printing?

[Allie Katz] has developed a new technique using 3D printed slab molds to make ceramics. By combining the ability of 3D printing to produce intricate designs with the formability of clay, they have found a way to make reproducible clay objects without all that tedious mucking about with liquid clay.

[Katz] takes us through a quick “Mould Making 101” before showing how the slab casting press molds were made. Starting with a positive CAD design, the molds were designed to eliminate undercuts and to allow for air infiltration, since a plastic mold can’t suck the water out of the clay the way a plaster one would. Some cookie-cutter style clay cutters were also designed to help with the trickier bits of geometry. Once everything was printed, the molds were coated with cornstarch and clay was pressed in. After removal, any final details like handles can be added, and the pieces are then fired as normal.
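If you’re wondering what “eliminating undercuts” means in practice, the usual test is whether any surface of the part faces away from the direction the part gets pulled out of the mold. Here is a minimal Python sketch of that check on a mesh’s face normals; the function name, tolerance, and pull direction are illustrative assumptions, not anything from [Katz]’s workflow.

```python
import numpy as np

def undercut_faces(normals, pull_dir=(0, 0, 1), tolerance=1e-6):
    """Flag faces that would lock a part into a one-piece press mold.

    A face is an undercut candidate if its outward normal points away
    from the pull direction, meaning the part cannot be lifted straight
    out without the mold trapping that surface.

    normals  -- (N, 3) array of unit face normals from the part mesh
    pull_dir -- direction the part is pulled out of the mold
    """
    pull = np.asarray(pull_dir, dtype=float)
    pull /= np.linalg.norm(pull)
    dots = np.asarray(normals, dtype=float) @ pull
    return dots < -tolerance  # True where the face points against the pull

# Toy example: a vertical wall (fine) and a downward-facing ledge (undercut)
faces = np.array([[1.0, 0.0, 0.0],    # vertical side wall
                  [0.0, 0.0, -1.0]])  # overhanging ledge facing the mold floor
print(undercut_faces(faces))  # [False  True]
```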

If you’d like to see some more 3D printing mixed up with ceramics, check out 3D printing glass with a laser, reliable ceramic slurry printing, or this TPU-based approach.

Continue reading “Slab Casting – A New Way To Combine 3D Printing And Ceramics”


Retrotechtacular: 1960s Doc Calls Computers The Universal Machine

It’s weird to think that an abacus was still in use sixty years ago, or so posits the documentary series The Computer and the Mind of Man. This six-part series originally aired on San Francisco local television station KQED in 1962, a time when few people outside of academia had even stood next to a computer.

Episode 3, titled “The Universal Machine”, was dedicated to teaching the public how a computer can enhance every type of business, provided humans can sufficiently describe the work in coded logic. Though mainly filtered through IBM’s perspective, as the company was responsible for funding the set of films, learning how experts of the time contextualized the computer’s potential was illuminating.

Continue reading “Retrotechtacular: 1960s Doc Calls Computers The Universal Machine”


Keebin’ With Kristina: The One With All The LEGO

It seems like mechanical keyboard enthusiasts are more spoiled for choice with each passing day. But as broad as the open source pool has become, there’s still no perfect keyboard for everyone. So, as people innovate toward their own personal endgame peripherals and make them open source, the pool just grows and grows.

Image by [Bo Yao] via Hackaday.IO
This beautiful addition to the glittering pool — [Bo Yao]’s Carpenter Tau keyboard — is meant to provide an elegant option at a particular intersection where no keyboards currently exist: the holy trinity of open source design, programmability, and tri-mode connectivity (wired, Bluetooth, and 2.4 GHz).

Come for the lovely wooden everything, and stay for the in-depth logs as [Bo Yao] introduces the project and its roots, reviews various options for the controller, discusses the manufacture of the wooden parts, and creates the schematic for the 61-key version. Don’t want to build one yourself? It’ll be on Crowd Supply soon enough.

Continue reading “Keebin’ With Kristina: The One With All The LEGO”

A Transistor, But For Heat Instead Of Electrons

Researchers at UCLA recently developed what they are calling a thermal transistor: a solid-state device able to control the flow of heat with an electric field. This opens the door to controlling the transfer of heat in some of the same ways we are used to controlling electronics.

Heat management can be a crucial task, especially where electronics are involved. The usual way to manage heat is to draw it out with things like heat sinks. If heat isn’t radiating away fast enough, a fan can be turned on (or sped up) to meet targets. Compared to the precision and control with which modern semiconductors shuttle electrons about, the ability to actively manage heat seems lacking.

This new device can rapidly adjust the thermal conductivity of a channel based on an electric field input, which is very similar to what a transistor does for electrical conductivity. Applying an electric field modifies the strength of molecular bonds in a cage-like array of molecules, which in turn adjusts their thermal conductivity.
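As a back-of-the-envelope way to picture the analogy, here is a toy Python model that treats the channel’s thermal conductance like a transistor channel modulated by a gate voltage and plugs it into Fourier’s law. The on/off conductances and the linear control law are made-up placeholders, not figures from the UCLA work.

```python
def thermal_conductance(v_gate, g_off=0.05, g_on=0.65, v_full=10.0):
    """Toy model: thermal conductance (W/K) rising with gate voltage.

    g_off, g_on, and the linear ramp are illustrative numbers only; the
    real device's conductivity change comes from field-dependent
    molecular bonding, not from this simple law.
    """
    fraction = min(max(v_gate / v_full, 0.0), 1.0)
    return g_off + (g_on - g_off) * fraction

def heat_flow(v_gate, delta_t):
    """Fourier's law: Q = G * dT, with G set by the 'gate'."""
    return thermal_conductance(v_gate) * delta_t

for v in (0.0, 5.0, 10.0):
    print(f"gate = {v:4.1f} V -> Q = {heat_flow(v, delta_t=20.0):5.2f} W")
```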

It’s still early, but this research may open the door to better control of heat within semiconductor systems. This is especially interesting considering that 3D chips have been picking up speed for years (stacking components is already a thing; it’s called Package-on-Package assembly), and the denser and deeper semiconductors get, the harder it is to passively pull heat out.

Thanks to [Jacob] for the tip!

Beyond The Basics: Exploring Exotic Scope Trigger Modes

Will Rogers once said that veterinarians are the best doctors because their patients can’t tell them where it hurts. I’ve often thought that electronics people have a similar problem. In many cases, what’s wrong with our circuits isn’t visible. Sure, you can visually identify a backward diode, a bad solder joint, or a blown fuse. But you can’t look at a battery and see that it is dead or that a clock signal isn’t reaching some voltage. There are lots of ways to look at what’s really going on, but there is no substitute for a scope. It used to be hard for the average person to own a scope, but these days, it doesn’t require much. If you aren’t shopping for the best tech, or if you are willing to use one with a PC, oscilloscopes are quite affordable. If you spend even a little, you can now get scopes that are surprisingly capable, with features undreamed of in years past. For example, many modern scopes have a dizzying array of triggering options. Do you need them? What do they do? Let’s find out.

I’ll be using a relatively new Rigol DHO924S, but none of the triggering modes are unique to that instrument. Sometimes, they have different names, and, of course, their setup might look different than my pictures, but you should be able to figure it out.

What is Triggering?

In simple terms, an oscilloscope plots time across the X-axis and voltage vertically on the Y-axis. So you can look at two peaks, for example, and measure the distance between them to understand how far apart they are in time. If the signal you are measuring happens repeatedly — like a square or sine wave, for example — it hardly matters which set of peaks you look at. After all, they are all the same for practical purposes.

Pretty square waves all in a row. Channel 2 is 180 degrees out of phase (inverted). But is that all there is?

The problem occurs when you want to see something relative to a particular event. Basic scopes often have level triggering. They “start” when the input voltage goes above or below a certain value. Suppose you are looking at a square wave that goes from 0 V to 5 V. You could trigger at about 2.5 V, and the scope will never start in the middle of a cycle.

Digital scopes tend to capture data before and after the trigger, so the center of the screen will be right on an edge, and you’ll be able to see the square waves on either side. The picture shows two square waves on the screen with the trigger point marked with a T in the top center of the display. The trigger level is shown in the top bar and is also marked with a T on the right side of the screen.
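To make the idea concrete, here is a small Python sketch of roughly what a digital scope does internally: watch for a rising crossing of the trigger level, then return a window of samples centered on that crossing. The function name, window size, and sample data are mine, purely for illustration.

```python
import numpy as np

def rising_edge_trigger(samples, level, pre=100, post=100):
    """Return a window of samples centered on the first rising crossing.

    samples  -- 1-D array of voltage samples
    level    -- trigger level in volts
    pre/post -- samples to keep before/after the trigger point
    """
    below = samples[:-1] < level
    above = samples[1:] >= level
    crossings = np.flatnonzero(below & above) + 1
    # Only accept crossings with enough history and future to fill the window
    valid = crossings[(crossings >= pre) & (crossings + post <= len(samples))]
    if valid.size == 0:
        return None  # no trigger found
    t = valid[0]
    return samples[t - pre : t + post]

# 5 V square wave, triggered at 2.5 V: the capture is centered on a rising edge
t = np.arange(2000)
square = 5.0 * ((t // 250) % 2)
window = rising_edge_trigger(square, level=2.5)
print(window[99], window[100])  # 0.0 just before the edge, 5.0 right after
```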

What happens if there are no pulses on the trigger source channel? That depends. If you are in auto mode, the scope will eventually get impatient and trigger at random. This lets you see what’s going on, but there’s no reference. If you are in normal mode, though, the scope will either show nothing or show the last thing it displayed. Either way, the green text near the top left corner will read WAIT until the trigger event occurs. Then it will say T’D.
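The auto/normal distinction can be sketched the same way, building on the rising_edge_trigger function above: auto mode gives up after a timeout and grabs whatever is in the buffer, while normal mode just keeps waiting. Again, this is an illustrative model, not any particular scope’s firmware.

```python
import numpy as np

def acquire(samples, level, mode="auto", pre=100, post=100):
    """Wrap the edge trigger with auto/normal behavior.

    auto   -- if no trigger is found, 'time out' and return an
              untriggered chunk from the start of the buffer
    normal -- if no trigger is found, return None (the scope keeps
              showing WAIT until a real trigger arrives)
    """
    window = rising_edge_trigger(samples, level, pre, post)
    if window is not None:
        return "T'D", window
    if mode == "auto":
        return "AUTO", samples[: pre + post]  # free-run capture, no reference
    return "WAIT", None

flat = np.zeros(2000)                        # a dead signal: no edges at all
print(acquire(flat, 2.5, mode="auto")[0])    # AUTO
print(acquire(flat, 2.5, mode="normal")[0])  # WAIT
```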

Continue reading “Beyond The Basics: Exploring Exotic Scope Trigger Modes”

Up Close And Personal With A MEMS Microphone

If you’ve ever wondered what lies beneath the barely visible hole in the can of a MEMS microphone, you’re in luck, because [Zach Tong] has a $10 pair of earbuds to sacrifice for the cause and an electron microscope.

For the uninitiated, MEMS stands for microelectromechanical systems, the tiny silicon machines that power some of the more miraculous functions of smartphones and other modern electronics. The most familiar MEMS device might be the accelerometer that gives your phone a sense of where it is in space; [Zach] has a deep dive into MEMS accelerometers that we covered a while back.

MEMS microphones seem a little bit easier to understand mechanically, since all they have to do is change vibrations in air into an electrical signal. The microphone that [Zach] tore down for this video is ridiculously small; the SMD device is only about 3 mm long, with the MEMS chip under the can a fraction of a millimeter on a side. After some overall views with the optical microscope, [Zach] opened the can and put the guts under his scanning electron microscope. The SEM shots are pretty amazing, revealing a dimpled silicon diaphragm over a second layer with holes etched right through it. The dimples on the diaphragm nest into the holes, forming an air-dielectric capacitor whose capacitance varies as sound waves vibrate the diaphragm.
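The transduction itself is just a variable parallel-plate capacitor, so the principle fits in a few lines of Python. The plate area, gap, deflection, and bias voltage below are rough guesses for a device in this size class, not measurements from [Zach]’s part.

```python
EPS0 = 8.854e-12  # permittivity of free space, F/m

def mems_capacitance(gap_m, area_m2=(0.5e-3) ** 2):
    """Parallel-plate capacitance of the diaphragm/backplate pair."""
    return EPS0 * area_m2 / gap_m

# Nominal ~4 um air gap, diaphragm deflecting ~5 nm with the sound wave
nominal = mems_capacitance(4.0e-6)
compressed = mems_capacitance(4.0e-6 - 5e-9)
print(f"rest: {nominal * 1e12:.3f} pF, peak: {compressed * 1e12:.3f} pF")

# With a fixed charge Q = C * V_bias held on the plates, the output voltage
# swings as the gap (and hence the capacitance) changes: V = Q / C
v_bias = 10.0
q = nominal * v_bias
print(f"signal swing: {abs(q / compressed - v_bias) * 1e3:.1f} mV")
```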

The most visually interesting feature, though, might be the deep cavity lying behind the two upper surfaces. The cavity, which [Zach] says bears evidence of having been etched by the deep reactive ion etching method, has cool-looking corrugations in its walls. The sheer size of the cavity relative to the thin layers covering it suggests it’s a resonating cavity for the sound waves.

Thanks to [Zach] for this in-depth look at a device that’s amazingly complex yet remarkably simple.

Continue reading “Up Close And Personal With A MEMS Microphone”

Explore Neural Radiance Fields In Real-time, Even On A Phone

Neural Radiance Fields (NeRF) is a method of reconstructing complex 3D scenes from sparse 2D inputs, and the field has been growing by leaps and bounds. Viewing a reconstructed scene is still nontrivial, but there’s a new innovation on the block: SMERF is a browser-based method of enabling full 3D navigation of even large scenes, efficient enough to render in real time on phones and laptops.
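Under the hood, every NeRF-style renderer comes down to the same volume-rendering sum: march a ray through the field, query density and color at each sample, and alpha-composite front to back. Here is a minimal Python sketch of that accumulation with a hand-made toy ray; there is nothing SMERF-specific about it.

```python
import numpy as np

def render_ray(sigmas, colors, deltas):
    """Front-to-back volume rendering used by NeRF-style methods.

    sigmas -- (N,) densities sampled along the ray
    colors -- (N, 3) RGB values at the same samples
    deltas -- (N,) spacing between consecutive samples

    C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i,
    where T_i is the transmittance accumulated before sample i.
    """
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance: probability the ray reaches sample i unoccluded
    trans = np.cumprod(np.concatenate(([1.0], 1.0 - alphas[:-1])))
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)

# Toy ray: empty space, then a dense red surface the ray stops at
sigmas = np.array([0.0, 0.0, 50.0, 50.0])
colors = np.array([[0, 0, 0], [0, 0, 0], [1.0, 0.1, 0.1], [1.0, 0.1, 0.1]])
deltas = np.full(4, 0.1)
print(render_ray(sigmas, colors, deltas))  # ~[1.0, 0.1, 0.1]
```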

Don’t miss the gallery of demos, which will run on anything from powerful desktops to smartphones. Notable is the distinct lack of blurry, cloudy, or distorted areas that tend to appear in under-observed parts of a NeRF scene (such as indoor corners and ceilings). The technical paper explains SMERF’s approach in more detail.

NeRFs as a concept first hit the scene in 2020 and the rate of advancement has been simply astounding, especially compared to demos from just last year. Watch the short video summarizing SMERF below, and marvel at how it compares to other methods, some of which are themselves only months old.

Continue reading “Explore Neural Radiance Fields In Real-time, Even On A Phone”