Edge-Lit, Thin LCD TVs Are Having Early Heat Death Issues

Canadian consumer goods testing site RTINGS has been running an accelerated longevity test on 100 TVs, which have so far racked up over 10,000 hours of on-time, equating to about six years of regular use in a US household. The test has already turned up a range of interesting issues and defects, including among the OLED-based TVs, but the most recent issue they covered concerns uniformity problems with edge-lit TVs. These manifest as uneven backlighting, including striping and very bright spots, which teardowns traced to warped reflector sheets, cracked light guides, and burned-out LEDs.

Excluding the 18 OLED TVs, which by now suffer from severe burn-in, over a quarter of the remaining TVs in the test show uniformity issues. Things get interesting when comparing full-array local dimming (FALD), direct-lit (DL), and edge-lit (EL) LCD TVs. Of the EL types, 7 out of 11 (64%) have uniformity issues, with one having failed outright and others in the process of doing so. Among the FALD and DL types the rate is 14 out of 71 (20%), which is still not ideal after a simulated six years of use, but far less dramatic.
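The gap between the two groups is easy to sanity-check from the counts quoted above (these are just the article's own numbers, recomputed):

```python
# Failure-rate comparison from the RTINGS uniformity data quoted above.
edge_lit_failures, edge_lit_total = 7, 11
other_failures, other_total = 14, 71      # FALD + direct-lit combined

edge_lit_rate = edge_lit_failures / edge_lit_total   # ~0.64, i.e. 64%
other_rate = other_failures / other_total            # ~0.20, i.e. 20%

# Edge-lit sets fail at roughly three times the rate of the rest
ratio = edge_lit_rate / other_rate
```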

Cracks in the Samsung AU8000's Light Guide Plate (Credit: RTINGS)

As part of the RTINGS longevity test, failures and issues are investigated, with a teardown performed for analysis and repair when necessary. For these uniformity issues, the EL LCD teardowns revealed burned-out LEDs in the edge-lighting strips, cracks in the light guide plate (LGP) that distributes the light across the panel, and warped reflector sheets. The LGP is held slightly away from the very hot LEDs by plastic standoffs, but these standoffs can melt, letting the LGP contact the hot LEDs. With a damaged LGP, the LCD backlighting inevitably becomes horribly uneven.

In the LG QNED80 (2022) TV, the edge-lighting LEDs were measured with a thermocouple to be running at a searing 123 °C at the maximum brightness setting. Since HDR (high dynamic range) content in particular demands high brightness levels, this scenario is likely more common in EL TVs than one might think. Why do EL LCDs still exist, when they apparently require extreme heatsinking to keep the LEDs from melting straight through the LCD? RTINGS figures it's because edge lighting allows LCD TVs to be thinner, letting them compete with OLEDs while selling at a premium compared even to FALD LCDs.


The experimental setup for entanglement-distribution experiments. (Credit: Craddock et al., PRX Quantum, 2024)

Entangled Photons Maintained Using Existing Fiber Under NYC’s Streets

Entangled photons are an ideal choice for large-scale networks employing quantum encryption or similar schemes, as photons can be transmitted over fiber-optic cables. One issue with using existing commercial fiber-optic lines for this purpose is that their imperfections can disrupt photon entanglement. This can be worked around by delaying one member of the pair slightly, but that makes the pairs harder to use. Instead, a team at New York-based startup Qunnect used polarization entanglement to successfully transmit and maintain thousands of entangled photons per second over the course of weeks through a section of existing commercial fiber, as detailed in the recently published paper by [Alexander N. Craddock] et al. in PRX Quantum (with accompanying press release).

The entangled photons were created via spontaneous four-wave mixing in a warm rubidium vapor. This process creates one photon with a wavelength of 795 nm and one at 1324 nm, the latter of which is compatible with the fiber network and was thus transmitted over the 34 kilometers. To measure the polarization drift experienced by the transmitted photons, non-entangled photons with a known polarization were sent along with the entangled ones. Measuring the drift on these reference photons then allowed the polarization of the entangled photons to be compensated. Overall, the team reported an uptime of nearly 100% while transmitting about 20,000 entangled photons per second.
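The reference-photon trick can be illustrated with a toy Jones-calculus model. This is a deliberate simplification of what Qunnect actually does: here the fiber's birefringence is reduced to a single rotation angle, where a real system must track a full, slowly wandering unitary transformation.

```python
import numpy as np

# Toy model: the fiber rotates the polarization (Jones) vector by an
# unknown angle. A reference photon of known polarization reveals that
# angle, and the inverse rotation is applied as compensation.

def rotation(theta):
    """Jones matrix for a polarization rotation by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

# Reference state sent alongside the entangled photons: horizontal |H>
reference_in = np.array([1.0, 0.0])

# Unknown drift the fiber imposes (in reality this wanders over time)
fiber = rotation(0.3)
reference_out = fiber @ reference_in

# Estimate the drift angle from the measured reference polarization...
theta_est = np.arctan2(reference_out[1], reference_out[0])

# ...and build a compensator that undoes it for the quantum channel
compensator = rotation(-theta_est)

# A vertical |V> component of the entangled state, drifted then recovered
drifted = fiber @ np.array([0.0, 1.0])
recovered = compensator @ drifted
```

After compensation the vertical component comes back out as vertical, which is the property that keeps the measured entanglement correlations intact.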

As a proof of concept, it shows that existing fiber-optic lines could conceivably be used for quantum networking and encryption in the future without upgrades.

Wacky Science: Using Mayonnaise To Study Rayleigh-Taylor Instability

Sometimes a paper pops up in a scientific journal that makes you do a triple-take, case in point being a recent paper by [Aren Boyaci] and [Arindam Banerjee] in Physical Review E titled "Transition to plastic regime for Rayleigh-Taylor instability in soft solids". The title doesn't quite do their methodology justice, as the paper describes zipping a container filled with mayonnaise along a figure-eight track to study the surface transitions. With the paper paywalled and no preprint available, we mostly have to rely on the Lehigh University press releases pertaining to the original 2019 paper and this 2024 follow-up.

Rayleigh-Taylor instability (RTI) is an instability of the interface between two fluids of different densities which occurs when the less dense fluid pushes on the denser fluid. Examples include water suspended above oil, as well as the expanding mushroom cloud during an explosion or eruption. It also plays a major role in plasma physics, especially as it pertains to nuclear fusion. In inertial confinement fusion (ICF), the rapidly laser-heated pellet of deuterium-tritium fuel expands, with the boundary interface of the expanding D-T fuel subject to RTI, negatively affecting the ignition efficiency and fusion rate. A simulation of this can be found in a January 2024 research paper by [Y. Y. Lei] et al.
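For a feel of the physics, the classical linear-stability result (a textbook formula, not something from the paywalled paper) says a small interface ripple of wavenumber k grows exponentially at a rate set by the density contrast, expressed as the Atwood number:

```python
import math

# Textbook linear theory for two inviscid fluids: a perturbation of
# wavenumber k at the interface grows as exp(sigma * t), with
# sigma = sqrt(A * g * k) and Atwood number A in [0, 1].

def atwood_number(rho_heavy, rho_light):
    """Dimensionless density contrast between the two fluids."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

def rti_growth_rate(rho_heavy, rho_light, k, g=9.81):
    """Classical linear RTI growth rate sigma = sqrt(A * g * k), in 1/s."""
    return math.sqrt(atwood_number(rho_heavy, rho_light) * g * k)

# Example: water (1000 kg/m^3) suspended above oil (900 kg/m^3),
# perturbed with a 1 cm wavelength ripple
k = 2 * math.pi / 0.01                     # wavenumber in 1/m
sigma = rti_growth_rate(1000.0, 900.0, k)  # growth rate in 1/s
```

The short growth timescale (a fraction of a second even for this mild density contrast) hints at why RTI is so hard to study and why physical analogues like mayonnaise are attractive.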

As a fairly chaotic process, RTI is hard to simulate, making a physical model a more practical research subject. Mayonnaise is definitely among the wackiest choices here; other researchers, like [Samar Alqatari] et al. in Science Advances, opted to use a Hele-Shaw cell with dyed glycerol-water mixtures for a less messy and less mechanically convoluted experimental setup.

What’s notable here is that the Lehigh University studies were funded by the Lawrence Livermore National Laboratory (LLNL), which explains the focus on ICF, as the National Ignition Facility (NIF) is based there.

This also makes the breathless hype about ‘mayo enabling fusion power’ somewhat silly, as ICF is even less likely to lead to net power production, far behind even Z-pinch fusion. That said, a better understanding of RTI is always welcome, even if one has to question the practical benefit of studying it in a container of mayonnaise.

Possible Discovery Of Liquid Water In Mars’ Mid-Crust By The InSight Lander

One of the most sought-after substances in the Universe is water, especially in its liquid form, as its presence on a planet makes the presence of life (as we know it) significantly more likely. While there are potentially oceans' worth of liquid water on, for example, Jupiter's moon Europa, for now Mars is significantly easier to explore, as evidenced by the many probes we have landed on its surface so far. One of these was the InSight lander, which was capable of a unique feat: looking inside the planet's crust with its seismometer to perform geophysical measurements. These measurements have now led to the fascinating prospect that liquid water may in fact exist on Mars right now, according to a paper published by [Vashan Wright] and colleagues in PNAS (with easy-to-read BBC coverage).

Achieving Human Level Competitive Robot Table Tennis

A team at Google has spent a lot of time recently playing table tennis, purportedly only for science. Their goal was to see whether they could construct a robot which would not only play table tennis, but even keep up with practiced human players. In a paper available on arXiv, they detail what it took to make it happen. The team also set up a site with a simplified explanation and some videos of the robot in action.

Table tennis robot vs human match outcomes. B is beginner, I is intermediate, A is advanced. (Credit: Google)

In the end, it took twenty motion-capture cameras, a pair of 125 FPS cameras, a 6-DOF robot on two linear rails, a special table tennis paddle, and a very large annotated dataset to train the multiple convolutional neural networks (CNNs) that analyze the incoming visual data. This visual data was then combined with details like the paddle's position to produce a value for the look-up table that forms the core of the high-level controller (HLC). This look-up table then decides which low-level controller (LLC) is picked to perform a given action. To prevent the LLCs' networks from 'forgetting' their training data, a total of 17 different CNNs were used, one per LLC.
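The HLC-picks-an-LLC structure can be sketched roughly as a table dispatch. To be clear, the skill names, ball features, and scores below are all invented for illustration; Google's actual table is learned and far richer than this:

```python
# Hypothetical sketch of the hierarchical controller described above:
# a high-level controller (HLC) consults a look-up table of per-skill
# scores to pick one of several low-level controllers (LLCs).

from dataclasses import dataclass

@dataclass
class BallEstimate:
    speed: float   # m/s, as estimated by the vision pipeline
    spin: str      # 'topspin' or 'backspin'

# Each LLC is a separately trained policy; keeping them separate is
# what avoids one skill's training overwriting another's.
def forehand_drive(ball):
    return "forehand_drive"

def backhand_push(ball):
    return "backhand_push"

LLCS = {"forehand_drive": forehand_drive, "backhand_push": backhand_push}

# The HLC's look-up table: preference scores keyed on a coarse
# description of the incoming ball (all values invented here).
SKILL_TABLE = {
    ("fast", "topspin"): {"forehand_drive": 0.8, "backhand_push": 0.2},
    ("slow", "backspin"): {"forehand_drive": 0.3, "backhand_push": 0.7},
}

def high_level_controller(ball):
    """Classify the ball coarsely, then dispatch to the best-rated LLC."""
    key = ("fast" if ball.speed > 5.0 else "slow", ball.spin)
    scores = SKILL_TABLE[key]
    chosen = max(scores, key=scores.get)
    return LLCS[chosen](ball)
```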

The robot was tested against a range of players from a local table tennis club, which made clear that while it could easily defeat beginners, intermediate players posed a serious threat, and advanced players completely demolished it. Clearly we do not have to fear our robotic table tennis-playing overlords just yet, but the robot did receive praise for being an interesting practice partner.

A Giemsa-stained blood smear from a person with beta thalassemia (Credit: Dr Graham Beards, Wikimedia Commons)

Potential Cure For All Of England’s Beta Thalassemia Patients Within Reach

Beta thalassemia and sickle cell disease are two red blood cell disorders that come with massive health implications and shortened lifespans, but at least for UK-based patients the former may soon be curable with a fairly new CRISPR-Cas9 gene therapy (Casgevy) via the UK's National Health Service (NHS). Starting with the NHS in England, the therapy will be offered within the coming weeks to the approximately 460 β thalassemia patients in that part of the UK, at seven different NHS centers.

We previously covered this therapy and the way it might offer a one-time treatment that definitively cures a patient's blood disorder. In the case of β thalassemia, this is done by turning off the defective adult hemoglobin (HbA) production and turning fetal hemoglobin (HbF) production back on. After eradicating the bone marrow cells carrying the defective genes, the patient's own stem cells, modified externally with CRISPR-Cas9, are reintroduced as in a bone marrow transplant. Since these are the patient's own cells, no immune-suppressing medication is necessary, and eventually the new cells should produce enough HbF for the patient to be considered cured.

So far in international trials over 90% of those treated in this manner were still symptom-free, raising the hope that this β thalassemia treatment is indeed a life-long cure.

Top image: A Giemsa-stained blood smear from a person with beta thalassemia. Note the lack of coloring. (Credit: Dr Graham Beards, Wikimedia Commons)

The First Fitbit: Engineering And Industrial Design Lessons

It could happen to any of us: suddenly you get an inkling of an idea for a product that you think might just be pretty useful or even cool. Some of us then go on to develop a prototype and manage to get enough seed funding to begin the long and arduous journey of turning a sloppy prototype into a sleek, mass-produced product. This is basically the story of how the Fitbit came to be, with a pretty in-depth article by [Tekla S. Perry] in IEEE Spectrum covering the development process and the countless lessons learned along the way.

Of note is that the idea of an accelerometer-based activity tracker was not new in 2006; a range of products already existed, from 1960s mechanical pedometers to 1990s medical sensors and the shoe-based Nike+ step tracker, which used Apple's iPod with a receiver. Where the Fitbit was new was in targeting a wide audience with a small, convenient, and affordable device. That also set its two inventors up for a major nightmare as they were plunged into the wonderfully terrifying world of industrial design and hardware development.

One thing that helped a lot was outsourcing what they could to skilled people, along with having solid seed funding. That still left many hardware decisions in the quest to make the device as small as possible, as well as waterproof and low-power. Using the ANT protocol instead of Bluetooth saved a lot of battery, but meant a base station was needed to connect to a PC. Making things waterproof required ultrasonic welding, but a lack of antenna testing meant the closed case massively reduced signal strength until a foam shim added some space. The external reset pin for the base station had a low voltage on it at all times, which led to corrosion issues, and so on.

While much of this was standard development and testing fun, the real challenge was interpreting the data from the accelerometer. After all, what does a footstep look like to an accelerometer, and when is a reading just a pothole encountered while travelling by car? Developing a good algorithm took gathering a lot of real-world data with prototype hardware, which then needed tweaking when later Fitbits moved from being clipped on to being worn on the wrist. These days Fitbit is hardly the only game in town for fitness trackers, but you can definitely credit them with laying much of the groundwork for the countless options available today.
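The flavor of the problem can be seen in even the crudest step detector. Fitbit's real algorithm is proprietary and far more robust; the sketch below just counts upward threshold crossings of the acceleration magnitude, with a refractory window so one stride isn't counted twice, and the threshold and window values are made up:

```python
# Illustrative peak-detection step counter (not Fitbit's actual
# algorithm): count a step when |acceleration| crosses a threshold on
# the way up, then ignore samples for a short refractory window.

def count_steps(accel_mag, threshold=11.0, refractory=5):
    """accel_mag: list of acceleration-magnitude samples in m/s^2."""
    steps = 0
    cooldown = 0
    prev = accel_mag[0]
    for sample in accel_mag[1:]:
        if cooldown > 0:
            cooldown -= 1
        elif prev < threshold <= sample:   # upward threshold crossing
            steps += 1
            cooldown = refractory          # debounce the rest of the stride
        prev = sample
    return steps

# A crude walking trace: baseline gravity (~9.8) with periodic impacts
trace = ([9.8] * 8 + [12.5] + [9.8] * 8) * 3
```

A single pothole would register as a step here, which is exactly why the real product needed mountains of labeled real-world data rather than a hand-tuned threshold.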