The Core Duo processor from Intel may not have been the first multi-core processor available to consumers, but it was arguably the one that brought multi-core computing to the masses. Unfortunately, the first Core Duo chips were limited to 32-bit at a time when the industry was shifting toward 64-bit. The Core 2 Duo eventually filled this gap, and [dosdude1] recently completed an upgrade to a MacBook Pro that he had always wanted to do, replacing its original Core Duo processor with a Core 2 Duo from a dead motherboard.
The upgrade does require a bit more tooling than many of us may have access to, but the process isn’t completely out of reach. It centers on desoldering the donor processor and making sure the MacBook’s motherboard gets heated appropriately when removing the old chip and installing the new one. These motherboards are prone to moisture ingress, which adds a pre-heating step; skipping it had been the cause of [dosdude1]’s failures in previous attempts. But with the new chip cleaned up, prepared with fresh solder balls, and placed on the target motherboard, it was ready to solder into its new home.
Upon booting the upgraded machine, the only hiccup was that the system didn’t correctly identify the clock speed. A firmware update solved this problem, though, and the machine is ready for use. For those wondering why one would do something like this given the obsolete hardware, we’d note that beyond the satisfaction of doing it for its own sake, these older MacBooks are among the few machines that can run free and open firmware, and that MacBooks a decade old or more can easily make excellent Linux machines even given their hardware limitations.

What’s old is new again… this can also be done on a Mac Mini as I wrote up sooooo many years ago here: https://www.ambor.com/public/meromswap/meromswap
I did something like this with a Mac Mini, but the processor was socketed, so it was easy (well, to the extent getting into a Mac Mini is easy — there is always some cable that never gets reconnected)
I’d be very surprised if the BIOS supported such a jump in chip generation and bitness, but then again, if it can run coreboot/libreboot it might be possible to rectify.
At-home BGA is really not that big of a deal. (Free tip: a quartz space heater can do the work of preheating the board. A little fabricobbling to build a shelf/rack/whatever to hold the board the correct distance above the heater so that the max temp doesn’t exceed ~200 °C no matter how long you leave it running, a $50 hot air gun, and some stencils…)
I do believe it’s easily possible to go from zero to your first completed reballing for less than $200 all-in.
I have heard of people using bigger halogen lamps as PCB heaters.
If only I had known they were paying that much! I’d be “all-in”!
“but it was arguably the one that brought it to the masses” and arguably not. But good story. I have an old MacBook Pro that I really like; too bad Apple long ago killed the features that I wanted: M.2 or SATA, SO-DIMMs. I upgraded my MacBook for a fraction of the price Apple wanted. I could probably even have upgraded the battery to an improved one.
I know that Arm is different in many ways, but really, folks, expandable memory and M.2 drives are not that hard. I really don’t care if it is super thin.
If I were the King of Apple, which I know I am not, I would have left the Pro thick and expandable, the Air thin and not, and the less expensive plastic MacBook for students with the expandability. Of course they are still making money hand over fist, so they don’t need me as a customer, I guess. That is fine.
So which CPU would you argue that title goes to?
Expandable memory is kinda tricky when it’s soldered onto the CPU package directly, and HBM really does need to be soldered onto the CPU. That’s the only practical way to retain signal integrity, while getting better than GDDR7 throughput at lower latency than DRAM.
HBM yes, but Apple is not using HBM :) Apple is soldering ordinary LPDDR to lock you into fixed RAM and milk buyers of the higher models.
As with cache, why wouldn’t different levels (speeds) of RAM be implemented on the same system? Very high-speed, low-latency RAM soldered in, and slower but still high-speed RAM in a removable package (SO-DIMM, (LP)CAMM2) for extending the usefulness of a system.
That, or we teach software engineers to stop shipping bloated programs, to stop the constant need for memory upgrades. But as it’s been a business driven by MBAs for a few decades now, and not a science anymore, that’s not going to happen.
With the current RAM price bubble (driven by AI), any memory upgrade is a pure XD unless you’re shitting money from breakfast.
Yeah, that sucks. It’s the first time one of my hoarding habits has been rewarded: now my friends who used to laugh about how I was keeping any stick of RAM I was given (from EDO to DDR4, including some funky ones from Sun servers) are a bit quiet.
You will be “happy” to know Apple has a patent on that, and it at least appears that was the plan with the M-whatever Macs in the beginning, but then they discovered it’s more lucrative to make machines obsolete by design.
https://www.techpowerup.com/277760/apple-patents-multi-level-hybrid-memory-subsystem
Dang it. The Apple from my 1989 SE/30 is not the same as the Apple of today.
They definitely didn’t invent this, but challenging an Apple patent might be painful.
I would argue that the inherent and fundamental limitations of soldered-on unified RAM would disallow expandable memory or make it prohibitively expensive and/or complex.
The GPU and CPU share this RAM pool, which means you need low latency AND high bandwidth, which is unfeasible with traditional RAM sticks. Even if you were to have them in addition to the soldered-on RAM, Apple would have to dedicate extra wafer space, PCB space, and power to a feature that would potentially go unused. You also have to solve the fact that you have two vastly different RAM types to manage. Normal PCs usually do this by setting the clock to the lowest common denominator, which you really can’t do here; otherwise you’d tank the performance, and an 8 GB MacBook would perform better than a 16 GB one with 8 GB soldered and 8 GB slotted.
You can probably manage it by treating the soldered RAM as some sort of L4 and the slotted RAM as L5, but that introduces additional complexity which, if pulled off, would be impressive. But it’s impressive for a reason.
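The tiering idea above can be sketched as a simple placement policy: prefer the fast soldered tier and spill to the slower slotted tier when it fills up. This is a toy model only (not how macOS actually manages memory); all names and capacities here are made up for illustration.

```python
# Hypothetical two-tier placement sketch: fast soldered RAM preferred,
# slower slotted RAM used as overflow. Capacities are in GB for brevity.

class TieredMemory:
    def __init__(self, fast_capacity, slow_capacity):
        self.fast_free = fast_capacity   # e.g. soldered LPDDR, low latency
        self.slow_free = slow_capacity   # e.g. slotted CAMM2, higher latency
        self.placements = {}             # allocation id -> tier name

    def allocate(self, alloc_id, size):
        """Place an allocation in the fastest tier with room, else fail."""
        if size <= self.fast_free:
            self.fast_free -= size
            self.placements[alloc_id] = "fast"
        elif size <= self.slow_free:
            self.slow_free -= size
            self.placements[alloc_id] = "slow"
        else:
            raise MemoryError(f"no tier can hold {size} GB")
        return self.placements[alloc_id]

mem = TieredMemory(fast_capacity=8, slow_capacity=8)
print(mem.allocate("gpu_textures", 6))   # fits in the fast tier -> "fast"
print(mem.allocate("file_cache", 4))     # fast tier full, spills -> "slow"
```

The real complexity the comment alludes to is everything this sketch omits: migrating hot pages back to the fast tier, keeping the GPU’s bandwidth needs satisfied, and doing it all transparently to applications.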
Slotted SSDs, on the other hand… Apple is preventing you from using your own as a result of greed. I would gladly accept a chassis half a cm or so thicker if it meant I don’t have to use a bulky external drive that ruins portability much more than thickness does, or spend 200 USD for a measly 256 GB of SSD storage.
Ha! I had one of those first-gen MacBook Pros as a work machine, which I also used personally (it was the mid-2000s and I was the young, brash IT decision maker). It ran hotter than hell, but I sorely missed it when I changed jobs.
I couldn’t initially get a Mac at the new place, but I made a hackintosh from a junk-pile Core Duo Gateway laptop that I was delighted to find had the same Intel chipset and graphics as the Intel-transition Apple developer kits. It ran reliably for two weeks before I did the same swap… except my Core 2 Duo came from a desktop machine. Even at the time, I was surprised to find a socketed desktop CPU in a laptop, but it was nonetheless nice of Gateway to make the upgrade easy on me.
I eventually took a Dremel to the palm rest and epoxied on a trackpad from an early plastic MacBook. Of course, the lid bore the ubiquitous Apple logo sticker that came in the box with my iPhone 3G. I got some good use out of that machine before we hit the OS update treadmill that made hackintoshes fraught for daily driving, long before Apple silicon landed.
Laptops in particular have improved so much since then, I just can’t see the appeal. My Core 2 Duo laptop was the last garbage laptop I ever had. Since then, they’ve all been much less expensive, much cooler, and had much better battery life. Haha, I’m actually angry at the thought of someone doing all this work only to have an old-fashioned burns-your-lap PC experience. Fans in laptops are obsolete, man.