A Laptop With An External Graphics Card?

[Image: laptop with external GPU]

It used to be that desktop computers reigned supreme in the world of powerful computing, and to some extent, they still do. But laptops are pretty powerful these days, and in our experience, a lot of engineering companies have actually swapped over to them for resource-hungry 3D CAD applications. But what if you still need a bit more power?

Well, [Kamueone] wasn’t satisfied with the performance of his Razer Blade laptop’s GTX 870M, so he decided to hack it and give it its own external graphics card.

Now unfortunately this really isn’t quite as simple as running some PCIe extender cables. Nope. You’ll have to modify the BIOS first, which according to [Kamueone], isn’t that bad. But after that’s done you’ll also need a way to mount your graphics card outside of the laptop. He’s using an EXP GDC Beast V6, which uses a Mini PCIe cable that can be connected directly to the laptop motherboard. You’re also going to need an external power supply.

[Kamueone] ran some benchmarks, and upgrading from the stock onboard GTX 870M to an external GTX 780 Ti resulted in over three times the frame rate capability: 40 fps stock, 130 fps upgraded!
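
For the curious, here is the back-of-envelope math on those numbers. Only the two frame rates come from the benchmark above; everything else is just arithmetic for illustration:

```python
# Rough math on the quoted benchmark numbers: stock onboard GPU vs. external card.
stock_fps = 40    # onboard GTX 870M
egpu_fps = 130    # external GTX 780 Ti

speedup = egpu_fps / stock_fps
print(f"Speedup: {speedup:.2f}x")  # -> 3.25x, i.e. "over three times"
print(f"Frame time: {1000 / stock_fps:.1f} ms -> {1000 / egpu_fps:.1f} ms per frame")
```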

80 thoughts on “A Laptop With An External Graphics Card?”

  1. This is an extremely cool hack but I don’t really see the point. If you need to use an external power supply, monitor and video card… why do it on your laptop? Just not worth it except for the cool factor.

          1. Exactly! You, too, can become a captain of industry by increasing your electric bill and offsetting it by the few cents you would earn from GPU mining.

          2. While Bitcoin mining on GPUs might be largely obsolete, there are other altcoins that are sometimes worth mining on GPUs.

            Also, the few who happen to live where electricity is “free” (dorm rooms, a few apartments) will always profit from GPU mining given they already own the machine for other uses.

      1. This is not a “dockable” solution. You have a ZIF ribbon connector which is kinda fragile and not so easy to connect. I have seen some mPCIe to PCIe adapters using HDMI cables instead (AFAIR 2 of them) – that would make it more movable.

        I have used something similar in my Dell Latitude (no hacks required, just adapter + PSU). I stopped because:
        * I was always worried about the ZIF ribbon, which I hid in the clearance between the second disk bay and the laptop case during transport
        * my NVS card used a not-so-new chip, meaning software SLI (without an additional GPU-GPU cable) was not supported (BTW this is quite a cool feature in itself if you don’t care about performance but about a multi-monitor setup like I did)
        * a multi-card setup (except for SLI) is a PITA on Linux, as you will end up with two X servers… Also, while I think it is possible to do PCIe hotswap on Linux, I doubt the connector is up to it, and even if it were, I wish you luck configuring the X server so you don’t lose your work each time you connect/disconnect the eGPU ;)

    1. My work issues laptop workstations due to remote usage, meetings, lab visits, etc. I do a lot of CAD and FEA/CFD (though the FEA/CFD is usually run remotely on a better computer). CAD programs can be greatly accelerated by better graphics cards. My company was too cheap to get a better graphics card on my current laptop unfortunately (possibly a communication issue in ordering). If I did this, my productivity would increase a decent amount for some tasks. I do most of my work at my desk, so this would be a part of my docking station at work. Makes complete sense and I’ve seen it done before, though it’s still somewhat of a cobbled-together solution.

    1. That’s just a benchmark, this setup will be capable of playing games at a higher fidelity and/or framerate.

      It is a pointless endeavor though, a simple desktop built around the card wouldn’t have cost much more, and has the added benefits of not melting under load and simply being a separate device.

        1. I am dumb. I know that I am dumb.

          Please explain:

          If a PC built around that graphic card costs LESS than the solution presented here – in what way would the PC built around the graphic card HINDER anyone from still taking the laptop with him, and thus having it portable? In what way would a PC built around a graphic card render a laptop not-portable?

          I do not get it. Did I mention that I am dumb?

          1. A desktop brings about another PITA: synchronising the data back and forth. With a GPU dock you can have a large display and high CPU power at the workplace, while you can take your WORK data along with the machine you run it on.
            With a desktop the simplest solution would be having the work computer as a virtual machine on an external SSD (my preferred way, as you can afford even having the notebook stolen / destroyed and all you have to do is get a new one, plug in the external disk, install VirtualBox and proceed with the work) and running its backups to a corporate NAS in the background.

    2. Brian, your reasoning is flawed. Yes, there are a lot of good reasons why someone would be willing to pay more for faster hardware.

      Game writers know their games will be played on machines of vastly different capabilities. Games can tune a bunch of parameters up and down (resolution, fine/coarse texture maps, realistic physics, smoke effects, degree of object tessellation). Powerful machines get better quality images; slower machines worse, such that a 10x more powerful machine won’t have 10x the frame rate.

      Also, for a variety of reasons, the workload per frame varies wildly, so having an over-powered machine means that on simple frames the graphics card might be idle much of the time, but on demanding scenes it can keep going at 60 Hz or whatever, while a weaker machine will start suffering from frame stutter.

      Another reason is to somewhat future-proof your investment. If you buy a card which barely handles today’s most demanding game, it will start having to bail water with the next generation game. Buying 50% more power than you need today may prevent you from buying a whole new card next year.

      Some games can be played with LCD shutter glasses sync’d to the LCD frame rate; if your hardware can manage it, you can generate 60 fps for each eye independently on one display.

      1. But humans can’t tell the difference between 30fps and 60fps (persistence of vision). Although I understand the increased frame rate (through interpolation or more image frames) can reduce motion blur and image “tearing”. And the increased power of the graphics processor may off-load the CPU, allowing more intensive work to be done by the CPU without causing frame stutter.

        1. See, you were doing so well, then you tipped your hand as a troll by trotting out the thoroughly-debunked “humans can’t tell the difference between 30fps and 60fps” pile of steaming horseshit. You got a few bites, but you could have drawn this out much longer if you had kept your ace in the hole a little longer.

        2. You get the illusion of motion around 25 fps, but that’s not really what’s important. What you really should look for is the ability to detect a single black frame between a load of white ones and vice versa. I don’t remember the exact numbers, but detecting a black frame stops being possible around 60 fps and a white one around 100.

        3. Here’s a readily accessible and real-world example that this is false. The Hobbit series was the first widely-released motion picture to be filmed and projected at 48fps instead of the traditional 24fps. Try Googling “The Hobbit fps” and check out some of the widely polarized opinions on this choice. Some hated it, some loved it, but just about everyone noticed a significant difference. (Granted it’s not 30 vs. 60 fps as in your claim, but if the human ability to detect increasing frame rates ends at 30fps, the reaction to a mere additional 6fps would not have been so drastic.)

          1. Yeah, in the ’80s/’90s the industry standard for CRT monitors was 60 Hz. People who were sensitive (got headaches or dizziness) to the CRT scan could turn them up (to 75 or 80) and have most of their symptoms relieved. Some people can detect flicker of LED bulbs up to 120 Hz (the US standard for most LED bulb flicker). 25/50 Hz movies from Europe look foreign to US eyes. Point is that refresh rate is detectable up to about 120 Hz for some people. Some frequencies are pleasing, while some are horrible. Each person has their own set of frequencies they like/can view. But calling BS about people not noticing the difference is just stupid.

        4. The fps isn’t only about what you see. That’s a number related to how fast the system can output the graphics, certainly. Within that is also how fast the software is able to push information through the system. If you slow the graphics you can slow the overall program and/or create all sorts of visual artifacts.

        5. Whoa, whoa, whoa, pump the brakes. Human eye perception is not at 30 fps; the threshold is 56 fps and above. You absolutely can tell the difference between 30 and 60, just try watching a soap opera and a movie, it’s 59 vs 24, big difference, and absolutely noticeable. Nvidia and other companies have lots of research on this, more than just reasoning from past experiences.

        6. We can very much tell the difference between 30 and 60 fps. But that’s hardly a noteworthy point for this discussion.

          It’s not as if the previous card ran all games/application at 40 fps, and now it will run them all at 130 fps. Rather, if there was a game that he had that only ran at 10 fps, he’ll likely be able to run it at the same settings with 30 fps now. Likewise, if he had a game that he could run at a decent 40-50 fps with low settings, he could run it at much higher settings(so that it looks better) and keep the same frame rate.

          Hell, I can run the first Diablo game at 40 fps or better on the computer I had in 1999. Why did I upgrade 3 times to the computer I have now? I mean, my newest computer can run Diablo at something like 5000 fps. The answer is because I want to be able to run more modern games with higher quality settings.

    3. If you’re referring to the myth that humans can’t see images faster than a given framerate (I’ve heard all common framerates quoted, from 24 Hz, to 30 Hz, 60 Hz and 75 Hz), it’s absolute BS. Human eyes do not run off a 60 Hz clock, and can see details significantly faster than that (note that recognising what is being seen can take somewhat longer than just seeing it).

      On old CRT monitors I could see the scanlines on lower refresh rates and had to increase it to the maximum that my monitor supported (85Hz) to mostly eliminate it from the center of my vision – but even then I could still just barely see it in the center of my vision and it was still very clear in my peripheral vision.

      Many pro-gamers play at 120fps (or higher) so that they can respond much faster to the stimulus on the screen, and even that is still not the real limit of what the eye can see.

      Personally I play games in Stereo 3D (using Nvidia 3D Vision – I’m one of the community modders). Stereo effectively halves the framerate, so you need a significantly more powerful GPU (or two) to maintain a playable framerate. 3D Vision runs at 60 Hz per eye, so the experience will be best when the GPU can render the game at a *minimum* of 120 fps. As Virtual Reality becomes mainstream, even this will be insufficient – Oculus has found 75 fps per eye to be the minimum (150 fps overall), and Valve is suggesting that games really need to be hitting 90 fps per eye (180 fps overall) to eliminate VR sickness, and in addition to all this we need higher and higher resolutions to eliminate the “screen door” effect…
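
      To put those targets in perspective, here is a tiny illustrative sketch of the frame-time budget each refresh target mentioned above leaves the GPU. The per-eye figures are the ones quoted in the comment; the arithmetic is the only addition:

```python
# Frame-time budget implied by each rendering target mentioned above.
targets = {
    "60 Hz single display": 60,
    "3D Vision (60 fps per eye)": 120,
    "Oculus minimum (75 fps per eye)": 150,
    "Valve suggestion (90 fps per eye)": 180,
}

for name, fps in targets.items():
    budget_ms = 1000.0 / fps  # milliseconds the GPU has to render each frame
    print(f"{name}: {fps} rendered frames/s -> {budget_ms:.2f} ms per frame")
```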

      1. I couldn’t see scan lines at 60 Hz, but on most CRTs 59 or 60 Hz would give me a headache after a short period of use. Just bumping it to 70 Hz got rid of it. Could be the brightness peaking at the same time as the brightness of the lightbulbs in the room, or it could be something else. While I didn’t do any double-blind testing on myself (that would be tricky), I did find that I could walk into a computer lab or look at a friend’s monitor and determine if it was 59/60 Hz or “something else”.

        But, 3D Vision doesn’t require a framerate of 120, just a monitor with 120Hz refresh. The frame rate of the game may be higher or lower.

  2. It’s too bad laptops with the ports to do this never really took off. I remember a Sony Vaio model several years ago that had a proprietary USB3 port with optical fiber PCIe (the original Intel Thunderbolt / Light Peak, before Apple made it their own thing on a copper interface) — it was extremely thin (would be called an ultrabook these days) but could be docked (with just that special USB connector) to an external unit with a graphics card (and an optical drive IIRC) — and it could actually feed the graphics from that card back to the laptop’s screen, much like laptops with switchable integrated and discrete graphics do these days. Dock it at home and play games, or pick it up and have an ultra-portable laptop. Of course, it was Sony-expensive, and as I recall the USB people got mad at them for not meeting the standards, and it never went anywhere beyond that model. Would have been cool though!

    1. Good point. How much would it cost to put external PCIe slots on laptops? Industrial data collection systems use them – if PCIe over cable were more common, it would be cheaper.

      1. Man I really wanted that Sony laptop.

        There were external PCIe x1 slots in the form of ExpressCard, which was supposed to replace PCMCIA, which was used to add USB ports, WiFi, networking and card readers. All things that come standard on laptops now or work fine hanging off USB.

        People did do ExpressCard to graphics card adapters. They even brought out the internal mPCIe to a hacked-on mini HDMI port so that you could get PCIe x2 by combining them both.

        But the PCIe bandwidth isn’t there, and you’ve already spent a fair bit on a decent enough GPU, so you’re just better off spending the cash you’d spend on weird adapters on a cheap mobo, CPU and HDD and actually having a gaming PC.

  3. THIS IS A SERIOUS HACK. I have wanted exactly this for a loooong time. I do 3D modeling with my main and only computer, a 17″ Inspiron 9300 with a separate GeForce 6800 card inside, but it’s a laptop from 2005. I run Autodesk Inventor, and over the last 10 years I have moved frequently, including 3 times to Japan from the US. I needed a laptop all this time for daily portability, but I needed high graphics power, mainly for Inventor, and some gaming.

    Even with the separate dedicated graphics card, it is all circa 2005, so not that powerful. Was awesome – in 2005. Everything since, even the damn Java applets and YouTube, makes this thing lag. I have needed something like this for years: something that would let me keep the laptop, which is fine, and give me graphics capability that I could upgrade just a card for, and not the entire laptop. Cards stay cheap compared to the equivalent in a laptop. I only need the capability of the card, not the cost of an entirely new computer.

    Yes, I *think* they still make some laptops with dedicated cards, but they’re rare. And extremely expensive. And large, and heavy. With very, very bad battery life. I wanted something small, light, easily taken around the world, but with plug-in serious graphics, upgradeable at my choosing, when I needed it for modeling my 100+ part assemblies.

    I know this sounds ridiculous – but we have had laptops for 20+ years. I want my computer to be portable – because it can be, and I need it to be. Every time I hear someone tell me “go buy a desktop” I think, hey, why do you need an iPod? There are perfectly good Victrolas lying around… People who aren’t willing to make a solution to a problem are just losing a business opportunity.

    There are some people out there that this hack was made for – because no one sells my solution. Or they do, and the laptop and its capability will just be worthless in 2-3 years. I really like this!

      1. Dell alone makes plenty of laptops with video cards. Hell, their Alienware line has one with dual dedicated graphics cards. Look up Dell’s Precision line for some nice ones where you aren’t paying for all the extra lights and garbage like on the Alienware.

        1. No, no no no! They aren’t the only ones, and never buy from Dell/Alienware.

          I say this as someone who had to deal with the quality, or lack thereof, from them. It might work fine; otherwise, you have a nasty waste of time. (It occurred just outside the refund period.) End result, another laptop was purchased to be actually usable. I think I probably should have looked more at lemon laws, but I wasn’t the one handling the initial interactions. 3/12 months usable the first year, between shipping it to them, them claiming it didn’t have a problem, getting it back, sending it back, them replacing the motherboard, and claiming everything was fine. Repeat for most of the components (RAM, CPU, motherboard again). When I finally was asked to look at it, I popped in memtest86, and I’ve never seen that much red on memtest. I think that was after they’d replaced the RAM.

          Eventually they replaced it. Because of that experience, I will never buy another Dell/Alienware, and strongly recommend against getting something from them. I’ve never seen people get jerked around so much. The BBB had been contacted as well by the person, and they lied to the BBB. I think it was when they were called on it (the original person had kept good notes about each time they called) that they decided to replace it with a different model.

          Please note, this was purchased right around the time Dell acquired Alienware, and took over the support. They had a good reputation for customer support prior to that time.

      2. If the old one is otherwise running fine, why on earth would he want to replace it? Just because something is disposable doesn’t mean it should be disposed… The total cost to produce all these plastic, metal, and chemical devices is faaaaaar higher than the price you pay to get one. Kudos to [Drew] for not giving in to the consumer hamster wheel that demands that we replace everything as soon as possible, over and over and over again!

          1. Because his other problems with YouTube and Java lagging aren’t because of the graphics card in the first place, but because he’s running a 10-year-old laptop with a comparatively slow processor, memory, and hard drive.

          Upgrading the video card would be pointless because his other hardware is ten years obsolete AND it’s laptop hardware, which is slower to begin with. There’s significant driver/API overhead in, for example, DirectX, which slows the whole thing down if you don’t have enough CPU power and memory bandwidth to deal with it.

            1. Ok – for anyone that cares – I neglected to mention specs. Yes, I get that the lag isn’t entirely due to the old graphics card. That is really small. I just like this hack because it would allow a graphics upgrade for rendering!

            The laptop I use is a Dell Inspiron 9300 with a version of Ubuntu Linux from a couple of years ago installed. I have so much custom stuff set up for multilingual Japanese predictive character input, plus all the other software workarounds I wrote code for to deal with malfunctioning components, that I don’t feel like taking 3 months minimum to remember and re-set up all of it, and pray it works on a new computer with new Linux. I just don’t want the damn hassle. My time is precious to me. I don’t have the 3 months to figure it all out again that I did when I was unemployed during the recession.

            At the time I bought it- I knew Dell was still shitty back then, but I took a chance, and got every upgrade I could. It’s still decently fast for most stuff, honestly. That’s why I bother.

          2. Well, the old processor and the bus solution would be gimping the new GPU’s speed to roughly half of what it could achieve in a proper setup because there’s significant processing done in the drivers.

            The whole thing is rather a waste of money. If you start counting hours as to how long you’d have to tweak it to make your Ubuntu even work with the latest hardware, you’d be back in square one.

      3. I drive a brand new car now. But I didn’t do it because I can afford it. I did it because I didn’t feel like continuing to pour money into a broken piece of shit, which is what I drove before for many, many years. I just *happen* to be able to afford leasing it. I felt better pouring the same money toward buying something that WASN’T falling apart.

        Sometimes people choose to keep old things and not buy new things. Not because they can’t afford it – but because they really like the old thing. Or because the old thing isn’t broken. I get the idea of buying a new computer every few years. That doesn’t mean I agree with it, though. If it ain’t broke, and it works, why throw it out? It cost me over $2500 when I bought it – I’m going to get my money’s worth out of it, dammit!

        I don’t believe in pissing away my hard-earned money on things I don’t need. My laptop works fine – except for graphics, which works on here just fine, but every website with a shitton of Java applets and autoloading movie ads, and the inability to render Inventor, pisses me off. Clearly, you didn’t read my reasoning.

        And for the record- I have been getting a student copy of Inventor since 2005. That’s cheap. I never said I had the ridiculous sum of thousands of dollars of software on my computer for the full version. I use that at work.

        This will sound incredibly biting – but I forgive you. You’re like most people – quick to assume, quick to judge and feign offence, rather than try to understand the situation.

    1. While the hack is a work of art and something I’ve never seen attempted and succeeded at, it’s about damn near useless. The idea behind the hack is to disconnect from the card when you gotta go mobile. Obviously you don’t take a power supply and external video card with you. The point was to plug in at home for more rendering power. But then you probably add in the bigger monitor sitting on the desk next, because you can’t stand looking at a 15 inch screen. Then you get yourself keyboard and mouse since the laptop will be closed and pushed under the monitor.

      See, by the time you are done buying shit, you could have just put the desktop in, and left the laptop to when you need to have some power on the road. So this hack invalidates itself.

      1. Thank you. It was hard work.

        I want to thank all the people for reading my documentation. The purpose of hacking this thing was to have a mobile laptop I can take with me without the eGPU. And when I am at home I can also use it as a “Desktop PC” with a big LCD screen. The processor is speedy enough; just the GPU needed a little upgrade. Now I do not need a Desktop PC anymore. I can take all my data with me. Another reason was to evolve myself. Like I always do. I need something to learn from and mod. The whole reason I am buying stuff like that is to open it up, learn it, break it, repair it, give respect to it and understand it. I put knowledge above money. Otherwise I won’t learn a thing because of the fear of losing that money when it breaks… It’s a hacker’s and modder’s way of thinking.

    2. Don’t you want some monitor space for heavy-duty CAD work and editing??? Well, not many people think it’s reasonable to edit full-length 4K movies on their iPod either, so not really a good example. If you throw all of that money into a laptop, when any part of it is unusable, the rest is junk too. With a desktop, you can put lots of money into a monitor or a wall full of monitors, and keep them, meanwhile updating the processor, motherboard, graphics, storage, etc. Yes, you will eventually replace all of it, but much more slowly if you are smart.

      If laptop manufacturers put external PCIe connectors on them, that would be ideal. They could also just stick to a darn uniform, modular hardware standard, so you could replace and upgrade parts.

      I do not think they want laptops and all-in-one PCs (like Apple’s) to be modular or upgradeable, because the computer you want to buy, exactly what you are describing, would be the end of you buying a new one every time Microsoft releases the next “upgrade” (loosely speaking) to their OS. The manufacturers have everyone trained to view $800!!!!! phones as throwaway items, and your dream laptop would not be a throwaway consumer product.

      I have also kept laptops for 7 years for work, and I think they are doing their best to make sure we stop doing that. It makes no difference that the machine really does actual work just fine, just that it runs the latest thing to sell new consumer gadgets.

      1. Totally agreed. I can afford a new $2500 laptop. I just don’t like throwing away that kind of money every 3 years to stay whatever people think is “current”.

        People are now trained to not only pay $600 for a phone, but even over $1k a month on their phone bill! And to throw the phone away after a couple of years!!

        What is wrong with the human race? How do people actually think that mindset is sustainable, acceptable, and not at all likely to bankrupt you?? It is crazy!

        I understand the specs of what I need well enough to do what I need to with a laptop. I used to sell electronics for a living, and build computers as well. I’m not some schmuck that has no clue about actual specs. But why in god’s name do people honestly believe that a laptop 10 years old, properly upgraded over time, is worth throwing out?

        Because it’s mostly plastic, and not metal?

        Using that logic, most people should throw out their cars every 3-4 years. The same people who are fine carrying around an easily dropped and destroyed $600 smartphone in their pocket. One that you sign a contract for years to get.

        I swear, I have to will myself daily to stay sane in the face of this madness. Modern society makes absolutely no sense to me, and I am only 31!

          1. Totally agree with you. We are in an insane society. I believe we’re all a bit sociopathic considering our behavior, largely pushed this way by those that run society, and also because we’ve accepted this way of life. I just moved into a house with a big grass yard and I saw I used 26,000 gallons of water in my first month (I live in AZ). Seriously. I can’t in good conscience continue to do that.

          But with regards to your technological problems, isn’t it possible to take your hard drive and put it in other computers to meet your processing needs? Like, clone your drive onto a new one so it is exactly the same setup? 10 years is getting your money’s worth, but $2500 is way too much to spend on a laptop. That’s $250/year. You could have picked up a $1200 laptop twice and been better off. Pick up a used ultrabook with discrete graphics and you’ll be set for a while. Asus makes an ultrabook with a discrete Nvidia card that ships with 12 GB of RAM for like $1200 new. I bought a Lenovo ThinkPad Yoga with discrete graphics, like new (used for less than a month), for $800.

  4. This is exactly what I’m looking for. I only need to wait for a decent 12″ laptop and DDR4… A home gaming system and a portable office at once… Plus the docking station (with PSU and GPU) would act as pluggable Ethernet, USB ports, additional fans, power supply…

    1. I’m in the same boat, I’ve got a nice older laptop: quad-core i7, 8 gigs of RAM, two internal SATA drives, Blu-ray, 120 Hz LCD, 3D Vision, I think a GTX 460M, all the perks. It was a used/returned model that had a broken keyboard key, so half price when these were brand new parts.

      The only part that has needed updating is the GPU. Sure, I could drop some coin on a case, get a mobo and CPU, swap the drives over or pick up some SSDs (the laptop is so old that SSDs weren’t yet the go-to)… but why!? With this option (if it would work on my Toshiba; time to poke around the BIOS) I would just need a monitor (sure, >120 Hz ones are not cheap, but I want one anyway because my laptop is just 15″), a PSU (which could be really low power), and a GPU. Worst case? I buy a mobo, CPU and case, and have a new desktop!

      Been moving and traveling so much that I haven’t had a desktop in . . . seems like almost 10 years. It was at least 6 years, 3 apartments ago.

  5. Am I the only one that thinks this should have never been a hack? Why is this not a simple case of adding PCI extender cables? Besides power. What about laptops becoming MORE flexible rather than less?

      1. This.

        Someone design a lightweight magnesium alloy chassis 17″ widescreen laptop, less than 5 pounds, maybe with a touchscreen, with serious power, a 32 GB SSD for the operating system, a 1 TB 7200 RPM traditional HDD for better long-term data stability, a dedicated soundcard with multiple optical and digital coaxial outputs, 16 GB of DDR3 RAM, a serious graphics card and decent battery life. Like, 5 hours or more unplugged.

        Do that, make it something that will last, machined connectors and buttons, for less than $2k, and I will throw my goddamned wallet at you and demand you take my money before I shove it down your throat.

        I wish there was a bespoke laptop maker that really offered serious possibility. If they can make the Sprout 3D scanner workstation, why can’t they make a serious portable and powerful laptop to order that lasts?

        1. The problem is that the 1TB spinning drive eats into your battery life like mad; better to have 2 SSDs. Additionally, all of those components would weigh close to 5lbs, not counting the chassis. Sure, someone out there might be able to custom build a motherboard with selected audio and network driver chips and ports for all you are asking for, but that price wouldn’t be made up for by selling the laptop at 2 grand.

  6. A laptop with external desktop graphics is not that uncommon.
    There are even commercial ones available for some laptops.

    Dell/Alienware has one called Alienware Graphics Amplifier.
    And there is one for MSI, the MSI GamingDock.

    These are external cases, with power supply and a full PCIe x16 Slot.

    1. A guy I know has been using essentially the same setup as OP for about a year now. I don’t remember him doing much hacking in the BIOS, though.

      Then again, I also know a guy who’s been trying to make it work for half a year. Depends on the notebook, I think.

  7. There are ExpressCard to PCIe adapters, but they’re limited to an x1 single lane connection. Mini PCIe also has but a single lane, though some pins are reserved for a second, if any manufacturer implements it.

    So unless this particular laptop model has a two-lane Mini PCIe connector in it, the hack to extend it out was pointless; he should have simply plugged into the ExpressCard slot.

    What I’d like to see is an update to ExpressCard that widens the connector to the full width of the wide card and uses 100% of the expansion for more PCIe lanes.
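
    For reference, the raw per-lane numbers behind that argument work out roughly as sketched below. This uses the published per-lane transfer rates and encoding overheads; real-world throughput is lower once protocol overhead is counted:

```python
# Approximate usable bandwidth per PCIe lane, by generation.
# Each entry: (transfer rate in GT/s, payload bits, total bits per encoded group)
LINKS = {
    "PCIe 1.1": (2.5, 8, 10),     # 8b/10b encoding
    "PCIe 2.0": (5.0, 8, 10),     # 8b/10b encoding
    "PCIe 3.0": (8.0, 128, 130),  # 128b/130b encoding
}

def lane_gbit(gen: str) -> float:
    """Usable bits per second on a single lane, in Gbit/s."""
    rate_gt, payload, total = LINKS[gen]
    return rate_gt * payload / total

for gen in LINKS:
    x1 = lane_gbit(gen)
    print(f"{gen}: x1 = {x1:.2f} Gbit/s, x16 = {16 * x1:.2f} Gbit/s")

# ExpressCard and most Mini PCIe slots expose a single lane, so an external
# GPU hung off them sees roughly 2-4 Gbit/s depending on the generation.
```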

  8. This hack is nothing new. Just Google for “egpu expresscard” and you will find quite a bit of information on this topic. I’ve used such an adapter for two years. Nevertheless, it’s nice if your notebook only contains an onboard GPU.

  9. Most of you ask why and not why not. This person had a laptop that they loved and a video card so they put 2 and 2 together and got 9. Yes a desktop would have been a faster solution. This is Hackaday not WhatshitcanIbuyatWalmart.com.

  10. I tried this. I wanted to make a laptop with the vSphere Hypervisor installed, and then run VMs on it with the GPU passed through. Maybe even the onboard GPU as well, if I am lucky. Like I do here with desktop hardware:

    https://hackaday.io/project/4927-hydra-multi-os-gaming-console-controller

    However, the onboard Intel graphics would have to be disabled, as I found on the above build on a Mini-ITX board that also had Intel graphics. Or something along those lines; you really can’t tell what exactly is required until it works. I installed VMware on another drive, and could pass through the GPU to VMs, but would get blue screens or the same error I got on the Mini-ITX before disabling the onboard GPU. It’s my kid’s laptop, so I had to stick to non-invasive methods. I would probably have to do it while he’s sleeping……

    It also turns out the external PCIe -> ExpressCard device I bought is probably not the best one for hacking, but the others seem to use a huge desktop PSU, and this one happened to be very cheap on eBay. Maybe I’ll try again, but it’s hard to find a laptop with an ExpressCard slot + VT-d + a BIOS I can edit (I have the Phoenix editor, and have monkeyed with the 440BX reference BIOS in ESXi – NO fear when editing the BIOS on a VM) + an onboard GPU I can pass through; otherwise I’m probably booting ESXi off a stick with a partitioned internal drive.
    Unless it “magically” worked 100%, this would have just been a stunt after all.

    A laptop designed for what I tried would be another story.

  11. I did this 3 years ago with a PCIe 2.0 adapter ($50), but the best thing to do is install the Nvidia Optimus drivers so the external GPU renders on the internal screen (it compresses the frame buffer and passes it back to the iGPU (Intel only)), which makes it truly plug and play (literally, you can hot-plug it).

  12. “Hacked” the BIOS of my Dell E6430 to enable the secondary GFX card (apparently fitted for power-saving measures, though it comes free on the chipset) and now I’m rocking 3 monitors on it. It’s pretty beefy as it is, but the productivity increase going from 2 to 3 monitors is actually quite substantial for the work I do.

    Hacked – as in googled and followed experiences of others and applied it to my machine. It had been disabled by Dell from factory.

    Oh, and I didn’t lose my WiFi either…

  13. I was thinking of doing this as well. For portability, and, like people have said, to play games while docked at home. My question is: this is a Mini PCIe x1 slot, correct? If I have a removable graphics card, which sits in a PCIe 3.0 x16 slot on my laptop, it would have a lot more bandwidth and should show more improvement than the WiFi slot, which is Mini PCIe 2.0 x1, correct? There would be about a 30x-plus difference looking at the specs: PCIe 3.0 x16 is 126.032 Gbit/s vs PCIe 2.0 x1 which is 4 Gbit/s. Am I right? Is there a considerable difference? Doing the math I’d say definitely. Now if the dedicated graphics card is soldered I’d probably just not take full advantage of the full card, but it would be better than this dedicated GPU. The dedicated GPU is an AMD 7670M. The one I’m planning to get is the Gigabyte G1 Gaming 980 with the Windforce cooler. I’d have to guess I would be getting a substantial upgrade with this compared to the DDR3 memory this AMD card has. Any suggestions? A desktop will have to wait for now, until NVLink, PCIe 4.0, USB 3.1 and Wireless AD all come together along with Skylake or Cannonlake CPUs. Until then, I’m stuck with this temporary solution.
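
    For what it’s worth, the ratio in that comparison checks out. A quick sketch using only the figures quoted above:

```python
# Sanity check on the bandwidth figures quoted in the comment above.
pcie3_x16_gbit = 126.032  # PCIe 3.0 x16
pcie2_x1_gbit = 4.0       # PCIe 2.0 x1 (typical Mini PCIe / wifi slot)

ratio = pcie3_x16_gbit / pcie2_x1_gbit
print(f"PCIe 3.0 x16 has about {ratio:.1f}x the raw bandwidth of PCIe 2.0 x1")
# -> about 31.5x, so "a 30-plus-times difference" is about right; whether a
#    single GPU actually saturates it is a separate question.
```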
