Video Ram Transplant Doubles RTX 3070 Memory To 16 GB

Making unobtainium graphics cards even more unobtainable, [VIK-on] has swapped out the RAM chips on an Nvidia RTX 3070. This makes it the only 3070 in the world to work with 16 GB.

If this sounds familiar, it’s because he tried the same trick with the RTX 2070 back in January but couldn’t get it working. When he first published the video showing the process of desoldering the 3070’s eight Hynix 1 GB memory chips and replacing them with eight Samsung 2 GB chips, he hit the same wall — the card would boot and detect the increased RAM, but was unstable and would eventually crash. Helpful hints from his viewers led him to use an EVGA configuration GUI to lock the operating frequency, which fixed the problem. Further troubleshooting (a YouTube comment in Russian and a machine translation of it) showed that the “max performance mode” setting in the Nvidia tool also stabilizes the card.

The new memory chips don’t self-report their specs to the configuration tool. Instead, a set of three resistors is used to electronically identify which hardware is present. The problem was that [VIK-on] had no idea which resistors were involved or what the different configurations accomplished. It sounds like you can just start moving the zero-ohm resistors around and watch the effect in the GUI, as they configure both the brand of memory and the size available. The fact that this board is not currently sold with a 16 GB option, yet the configuration tool has settings for it when the resistors are configured correctly, is kismet.

So did it make a huge difference? That’s difficult to say. He runs some benchmarks in the video, and both Unigine 2 SuperPosition and 3DMark Time Spy results are shown. However, we didn’t see any tests run prior to the chip swap, which would have been the key to characterizing the true impact of the hack. That said, reworking these chips with a handheld hot-air station and working your way through the resistor configuration is darn impressive no matter what the performance bump ends up being.

Continue reading “Video Ram Transplant Doubles RTX 3070 Memory To 16 GB”


Hackaday Links: February 16, 2021

This is it; after a relatively short transit time of roughly seven months, the Mars 2020 mission carrying the Perseverance rover has almost reached the Red Planet. The passage has been pretty calm, but that’s all about to end on Thursday as the Entry, Descent, and Landing phase begins. The “Seven Minutes of Terror”, which includes a supersonic parachute deployment, machine-vision-assisted landing site navigation, and a “sky-crane” maneuver to set the rover down gently in Jezero crater, will all transpire autonomously at the end of a journey of some 480 million km. We’ll only learn how it went after the eleven-minute propagation delay between Mars and Earth, but we’ll be glued to the NASA YouTube live stream nonetheless. Coverage starts on February 18, 2021 at 11:15 AM Pacific Standard Time (UTC-8). We’ve created a handy time zone converter and countdown so you don’t miss the show.

As amazing as the engineering on display Thursday will be, it looks like the US Navy has plans to unveil technology that will make NASA as relevant as a buggy-whip company was at the turn of the last century. That is, if you believe the “UFO Patents” are for real. The inventor listed on these patents, Dr. Salvatore Pais, apparently really exists; he’s had peer-reviewed papers published in mainstream journals as recently as 2019. Patents credited to Dr. Pais stretch back to 2004, when he invented a laser-augmented turbojet propulsion system that was assigned to defense contractor Northrop Grumman. The rest of the patents are more recent, all seemingly assigned to the US Navy, and cover things like a “high-frequency gravitational wave generator” and a “craft using an inertial mass-reduction device”. There’s also a patent that seems to cover a compact fusion generator. If any of this is remotely true, and we remain highly skeptical, the good news is that maybe we’ll get things like the Epstein Drive. Of course, that didn’t end well for Solomon Epstein. Or for Manéo Jung-Espinoza.

Of course, if you’re going to capitalize on all these alien patents, you’re going to need some funding. If you missed out on the GME short squeeze megabucks, fret not — there’s still plenty of speculative froth to go around. You might want to try your hand at cryptocurrency mining, but with GPUs becoming near-unobtainium, you’ll have to get creative, like throwing together a crypto mining farm with a bunch of laptops. It looks like the Weibo user who posted the photos has laptops propped up on every available surface of their apartment, and there’s also a short video showing a more industrial setup with rack after rack of laptops. These aren’t exactly throw-aways from some grade school, either — they appear to be brand new laptops that retail for like $1,300 a pop. The ironic part is that the miner says this is better than the sweatshop he used to work in. Pretty sure with all that power being dissipated in his house, it’ll still be a sweatshop come summer.

A lot of people have recently learned the hard lesson that when the service is free, you’re the product, and that what Google giveth, Google can taketh away in a heartbeat, and for no discernible reason. Indie game studio Re-Logic and its lead developer Andrew Spinks found that out last week when a vaguely worded terms-of-service violation notice arrived from Google. The developer of the popular game Terraria was at a loss to understand the TOS violation, which resulted in a loss of access to all of the company’s Google services. He spent three weeks going down the hellhole of Google’s automated support system, getting nothing but canned messages that were either irrelevant to his case or technically impossible; it’s kinda hard to check your Gmail account when Google has shut it down. The lesson here is that building a business around services that can be taken away on a whim is perhaps not the best business plan.

And finally, we watched with great interest Big Clive’s secrets to getting those crisp, clean macro shots that he uses to reverse-engineer PCBs. We’ve always wondered how he accomplished that, and figured it involved some fancy ring-lights around the camera lens or a specialized lightbox. Either way, we figured Clive had to plow a bunch of that sweet YouTube cash into the setup, but we were surprised to learn that in true hacker fashion, it’s really just a translucent food container ringed with an LED strip, with a hole cut in the top for his cellphone camera. It may be simple, but you can’t argue with the results.

Continue reading “Hackaday Links: February 16, 2021”

Add An Extra 8GB Of VRAM To Your 2070

Most of us make do with the VRAM that came with our graphics cards. We can just wait until the next one comes out and get a little more memory. After all, it’d be madness to try and delicately solder on new components as timing-sensitive as RAM chips, right?

[VIK-on] took it upon himself to do just that. The inspiration came when a leaked diagram suggested that the RTX 2000 line could support 16 GB of RAM by using 2 GB chips. NVIDIA never did release a 16 GB version of the 2070, so this card is truly one of a kind. After some careful scouring of the internet, the GDDR6 chips were procured and soldered on with a hot-air gun. A few resistors had to be moved to accommodate the new RAM chips. During power-on, [VIK-on] saw all 16 GB enumerate and was able to run some stress tests. Unfortunately, the card wasn’t stable and started having black screen issues and wonky clocks. Whether it was a bad solder joint or a firmware issue is hard to say, but he’s pretty convinced it’s a BIOS error. Switching the resistors back to the 8 GB configuration yielded a stable system.

While it’s a little more recent, this isn’t the only RAM upgrade we’ve covered in the last few months. Video after the break (it’s not in English, but captions are available).
Continue reading “Add An Extra 8GB Of VRAM To Your 2070”

A Look At How Nintendo Mastered Dual Screens

When it was first announced, many people were skeptical of the Nintendo DS. Rather than pushing raw power, the unique dual-screen handheld was designed to explore new styles of play. Compared to more traditional handhelds like the Game Boy Advance (GBA) or even Sony’s PlayStation Portable (PSP), the DS seemed like a huge gamble for the Japanese gaming giant.

But it paid off. The Nintendo DS ended up being one of the most successful gaming platforms of all time, and as [Modern Vintage Gamer] explains in a recent video, at least part of that was due to its surprising graphical prowess. While it was technically inferior to the PSP in almost every way, Nintendo’s decades of experience in pushing the limits of 2D graphics allowed them to squeeze more out of the hardware than many would have thought possible.

On one level, the Nintendo DS could be seen as an upgraded GBA. Developers who were already used to the 2D capabilities of that system would feel right at home when they made the switch to the DS. As with previous 2D consoles, the DS had several screen modes, complete with hardware-accelerated support for moving, scaling, rotating, and reflecting up to four background layers. This made it easy and computationally efficient to pull off pseudo-3D effects such as having multiple backdrop images scroll by at different speeds to convey a sense of depth.
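For the curious, here’s roughly what that parallax trick looks like from the homebrew side. This is a minimal sketch assuming the devkitPro libnds API rather than Nintendo’s official SDK; the layer sizes, scroll speeds, and VRAM layout are arbitrary, and no real tile or map data is loaded:

```c
#include <nds.h>

int main(void) {
    int scroll = 0;

    videoSetMode(MODE_0_2D);        // four tiled background layers, no 3D
    vramSetBankA(VRAM_A_MAIN_BG);   // hand the main engine some BG VRAM

    // Two text-mode backgrounds; a real game would load tiles and maps here.
    int far  = bgInit(0, BgType_Text8bpp, BgSize_T_512x256, 0, 1);
    int near = bgInit(1, BgType_Text8bpp, BgSize_T_512x256, 2, 1);

    while (1) {
        scroll++;
        bgSetScroll(far,  scroll / 4, 0);  // distant layer crawls slowly...
        bgSetScroll(near, scroll,     0);  // ...near layer moves at full speed
        swiWaitForVBlank();
        bgUpdate();                        // latch the scroll registers
    }
}
```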

On top of its GBA-inherited tile and sprite 2D engine, the DS also featured a rudimentary GPU responsible for handling 3D geometry and rendering. Hardware-accelerated 3D could only be used on one screen at a time, which meant most games would keep the close-up view of the action on one display and use the second panel to show 2D imagery such as an overhead map. But developers did have the option of flipping between the displays on each frame to render 3D on both panels at a reduced frame rate. The hardware could also handle shadows and included integrated support for cel shading, which was a particularly popular graphical effect at the time.
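As a rough sketch of that per-frame flip, again assuming the libnds homebrew API: the single 3D core always feeds the “main” 2D engine, so alternating which physical LCD the main engine drives gives each screen a 3D image at half the frame rate. Real games also use the display-capture hardware to keep the previous frame visible on the idle screen, which is omitted here for brevity:

```c
#include <nds.h>

int main(void) {
    videoSetMode(MODE_0_3D);   // main engine: 3D output appears on layer 0
    glInit();                  // libnds' GL-like wrapper around the 3D core

    int top = 1;
    while (1) {
        if (top)
            lcdMainOnTop();    // 3D-capable main engine drives the top LCD
        else
            lcdMainOnBottom(); // ...and the bottom LCD on the next frame

        // draw_scene(top);    // hypothetical per-screen geometry goes here
        glFlush(0);            // hand the display list to the 3D hardware
        swiWaitForVBlank();
        top = !top;            // each screen effectively updates at ~30 fps
    }
}
```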

By combining the 2D and 3D hardware capabilities of the Nintendo DS on a single screen, developers could produce complex graphical effects. [Modern Vintage Gamer] uses the example of New Super Mario Bros, which places a detailed 3D model of Mario over several layers of moving 2D bitmaps. Ultimately the 3D capabilities of the DS were hindered by the limited resolution of its 256 x 192 LCD panels, but considering most people were still using flip phones when the DS came out, it was impressive for the time.

Compared to the Game Boy Advance, or even the original “brick” Game Boy, it doesn’t seem like hackers have had much luck coming up with ways to exploit the capabilities of the Nintendo DS. But perhaps with more detailed retrospectives like this, the community will be inspired to take another look at this unique entry in gaming history.

Continue reading “A Look At How Nintendo Mastered Dual Screens”

Trying (And Failing) To Use GPUs With The Compute Module 4

The Raspberry Pi platform grows more capable and powerful with each iteration. That said, it’s still not the go-to for high-powered computing, and its external interfaces are limited for reasons of cost and scope. Despite this, people like [Jeff Geerling] strive to push the platform to its limits on a regular basis. Unfortunately, [Jeff’s] recent experiments with GPUs hit a hard stop that he’s so far been unable to overcome.

With the release of the new Compute Module 4, the Raspberry Pi ecosystem now has a device with a PCI Express 2.0 x1 interface as stock. This led to many questioning whether or not GPUs could be used with the hardware. [Jeff] was determined to find out, buying a pair of older ATI and NVIDIA GPUs to play with.

Immediate results were underwhelming, with no output whatsoever after plugging the cards in. Of course, [Jeff] didn’t expect things to be plug-and-play, so he dug into the kernel messages to find out where the problems lay. The first problem was the Pi’s limited Base Address Register (BAR) space; GPUs need a significant chunk of memory mapped through their BARs to work. With the CM4’s BAR allocation expanded from 64 MB to 1 GB, the cards appeared to be properly recognised and ARM drivers could be installed.
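As an aside, you can see exactly how much BAR space a card requests once Linux enumerates it, since the kernel exposes each region in sysfs. Here’s a small illustrative helper, not taken from [Jeff]’s write-up; the PCI address below is a placeholder that will differ on a real CM4 carrier board:

```c
#include <stdio.h>

int main(void) {
    // Each line of this sysfs file is "start end flags" in hex, one per BAR.
    const char *path = "/sys/bus/pci/devices/0000:01:00.0/resource";
    FILE *f = fopen(path, "r");
    if (!f) { perror(path); return 1; }

    unsigned long long start, end, flags;
    int bar = 0;
    while (fscanf(f, "%llx %llx %llx", &start, &end, &flags) == 3) {
        if (end > start)   // zero-length entries are unused BARs
            printf("BAR %d: %llu MiB\n", bar,
                   (end - start + 1) / (1024 * 1024));
        bar++;
    }
    fclose(f);
    return 0;
}
```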

Alas, the story ends for now without success. Both the NVIDIA and ATI drivers failed to properly initialise the cards. The latter throws an error because the Raspberry Pi doesn’t account for I/O BAR space, a legacy x86 feature, though others suggest the problem may lie elsewhere. While [Jeff] may not have pulled off the feat yet, he got close, and we suspect with a little more work the community will find a solution. Given that ARM drivers exist for these GPUs, we’re sure it’s just a matter of time.

For more of a breakdown on the Compute Module 4, check out our comprehensive article. Video after the break.

Continue reading “Trying (And Failing) To Use GPUs With The Compute Module 4”

Finding The Random Seed Of Minecraft’s Title Screen

Minecraft is a game about exploring procedurally generated worlds. Each world is generated from a particular “seed” value, and sharing this seed allows others to generate the same world in their own game. Recently, the distributed computing project Minecraft@Home set about finding the seed value of the world shown on the Minecraft title screen, and it has succeeded in its goal.

The amount of work required to complete this task should not be underestimated. 137 users contributed 181 hosts with 231 GPUs to the effort, finding a solution in under 24 hours. The list of contributors to the project is a long one. It appears the method to find the seed involved comparing screenshots from various seed worlds to the original image. This took a lot of reverse engineering in order to calculate the camera FOV and other settings of the original capture, such that the results could be compared accurately. Interestingly, the group found two seeds that can generate the requisite world, suggesting the world generator code has some collisions between seed values.
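We haven’t seen the project’s search code, but one plausible explanation for those collisions is that much of Minecraft’s generation is driven by java.util.Random, whose internal state is only 48 bits wide; any two 64-bit seeds that agree in their low 48 bits feed the generator identical random numbers. A simplified C sketch of that LCG (omitting Random’s output shifting) shows the effect:

```c
#include <stdint.h>
#include <stdio.h>

#define MASK48 ((UINT64_C(1) << 48) - 1)

// java.util.Random: setSeed() scrambles the seed, next() is a 48-bit LCG.
static uint64_t java_set_seed(uint64_t seed) {
    return (seed ^ UINT64_C(0x5DEECE66D)) & MASK48;
}
static uint64_t java_next(uint64_t *state) {
    *state = (*state * UINT64_C(0x5DEECE66D) + 0xB) & MASK48;
    return *state;
}

int main(void) {
    // Two seeds that differ only above bit 47 collapse to the same state...
    uint64_t a = java_set_seed(UINT64_C(1234567890123456789));
    uint64_t b = java_set_seed(UINT64_C(1234567890123456789) + (UINT64_C(1) << 48));

    // ...so their "random" streams, and any terrain derived from them, match.
    for (int i = 0; i < 3; i++)
        printf("%llu %llu\n",
               (unsigned long long)java_next(&a),
               (unsigned long long)java_next(&b));
    return 0;
}
```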

We’re not sure what’s more astounding, the amount of work that went into the project, or that there’s a distributed computing project tackling advanced Minecraft research. Either way, we’re no strangers to Minecraft hacks around these parts. Video after the break. Continue reading “Finding The Random Seed Of Minecraft’s Title Screen”

A Dead MacBook GPU Shouldn’t Stop You, With This BGA Soldering Hack

On some 2011 MacBook Pro models, there is a tendency for the Radeon GPU to fail. This should mean game over for the computer, but surprisingly, salvation is offered by its having not one but two GPUs on board. The Intel processor also has a GPU, and Apple use a pile of logic in an FPGA to switch at will between them. The community have produced fresh FPGA code to revive a dead Mac on its Intel GPU, but at the expense of losing brightness control. [Ayilm1] has brought back the brightness with a clever BGA reworking hack that gains access to a brightness-control line present on the Intel BD82HM65 Platform Controller Hub chip but not used in the MacBook.

We’re used to impressive soldering work here at Hackaday, and we’ve seen our share of wiring run directly to the balls on an upturned BGA chip. This is a similar idea but at another level, as a section of the top insulation on an in-place BGA is removed to expose the microvia above the ball carrying the required signal. A tiny wire is soldered to the exposed pad and taken to a piece of copper tape stuck down for mechanical strength, and a length of enameled copper wire runs from there to its destination on the other side of the PCB. It comes with FPGA code to take advantage of it, but even for non-MacBook owners, it’s an extremely impressive piece of work. It’s not the first fine-soldering MacBook fix we’ve seen, either.

Thanks to [lightpink784] for the tip.