Intel Discontinues Joule, Galileo, And Edison Product Lines

Sometimes the end of a product’s production run is surrounded by publicity, a mix of a party atmosphere celebrating its impact either good or bad, and perhaps a tinge of regret at its passing. Think of the last rear-engined Volkswagens rolling off their South American production lines for an example.

Then again, there are the products that die with a whimper, their passing marked only by a barely visible press release in an obscure corner of the Internet. Such as this week’s discontinuances from Intel, in a series of PDFs lodged on a document management server announcing the end of their Galileo (PDF), Joule (PDF), and Edison (PDF) lines. The documents set out a timetable for each of the boards: for now they are still available, but the last units will have shipped by the end of 2017.

It’s important to remember that this does not mark the end of the semiconductor giant’s foray into the world of IoT development boards; there is no announcement of the demise of their Curie chip, as found in the Arduino 101. But it does mark an ignominious end to their efforts over the past few years to bring the full power of their x86 platforms to this particular market, for the Curie is an extremely limited device compared to those being discontinued.

Will the departure of these products affect our community, other than those who have already invested in them? It’s true that they haven’t made the impression Intel might have hoped for; over the years only a sprinkling of projects featuring them have come our way, compared to the flood featuring an Arduino or a Raspberry Pi. They do seem to have found a niche, though, where raw computing power rather than a simple microcontroller is needed, so perhaps some of the legion of similarly powerful ARM boards will plug that gap.

So where did Intel get it wrong? How did what were, on the face of it, such promising products fizzle out in such a disappointing manner? Was the software support not up to scratch, were they too difficult to code for, or were they simply not competitively priced in a world of dirt-cheap boards from China? As always, the comments are open.

Header image: Mwilde2 [CC BY-SA 4.0].

195 thoughts on “Intel Discontinues Joule, Galileo, And Edison Product Lines”

    1. No, they still don’t understand anything.

      The failure of Intel’s IoT stuff was purely related to the documentation. The documentation simply wasn’t there. You couldn’t find a datasheet with a list of registers anywhere. There were zero templates/examples. I know people who had a direct line to Intel engineers that tried to get a simple SPI thing going. Didn’t happen, even with the best help you could have.

      So, instead of solving the problem by 1) releasing documentation and 2) putting a few Edison and Galileo boards, SparkFun’s entire sensor inventory and a few interns in a room and telling them to build shit, they just pulled the plug. But before that, they spent millions on two seasons of a reality show to advertise their IoT solution.

      But hey, we got a nifty defcon badge out of the whole thing. That was probably the best marketing move they made, but again, poor documentation. I didn’t see anyone successfully (electronically) hack that badge.

      1. This right here. It failed purely because of the lack of documentation.

        I looked and looked but couldn’t find a simple picture that detailed what pins are what and where on the breakout board. Ended up mapping the pins by trial and error.

        1. Every now and then, we here at work (small design company) get a call from someone who wants us to try Intel parts in our designs. We invite them in, and the fun begins. No docs unless Intel likes you, and that involves money and annual commitments, multiple NDAs signed in blood, etc.

          One does not simply “buy” an Intel part, and design it in. Oh, no. That would be too easy.

          1. I’ve colleagues with similar experience. We wanted to use Thunderbolt, and trying to get the specs even to establish whether we could was impossible (only licensees had access, and given how much that costs it’s quite an outlay when it may not even be possible; our request was refused). There is some light on the horizon, because Intel claims it is opening the standard to be free for OEMs and device manufacturers (announced May 2017), but we’ll see if Intel actually follows through and allows it to be more widely adopted and used.

          2. I work extensively with both Broadcom and Intel at work, and they’re both absolutely terrible to deal with for the same reasons. And we’re large enough to ship $millions of each of their parts annually. The worst isn’t the number of hoops you have to go through to get docs, it’s not knowing about the existence of some errata, appnote, or subcomponent documentation because you haven’t been granted access to that corner of their library.

          3. The stupid part about this is that excellent understanding of the standard Intel peripherals, going back decades, is part of what makes OS development on PCs so much better than on anything else. People know how these things work, nobody needs an NDA for it, they just write the driver. Now we have the Intel that made the IME.

          4. Good god, and if you think Intel is bad, try Samsung. I was told by a Samsung rep once that I couldn’t see the NDA required to see the datasheet for one of their SoCs until their marketing and finance folks were satisfied that we would have the necessary credit to make the one million unit minimum order quantity for this chip.

            Since we inherited the design from the customer who inherited it (and a crate of the SoC in question that had presumably fallen off the back of a truck somewhere) I had to work from a pirate copy of the datasheet from Russia to get the required register maps to write the device drivers the customer required… (Which is probably why they said “hell with it, let’s hire a consultant”).

            Many companies of that scale really don’t want the hassle of any customer ordering in small volume for personal or custom small volume commercial / industrial purposes and even their reps and resellers are discouraged from providing documentation to the “little guy”.

          5. drenehtsral: on the other side, the giants also have contracts with their BIG buyers. They have to give those buyers some advantage to corner the market before we hack and exploit the system, which would mess up the sales margins.

            Only a company or chip maker that doesn’t provide a complete solution, e.g. TI or Microchip, can provide the whole datasheet. If they also provide an end solution, they have to feed the downstream with those advantages.

          1. Link, please. You claim to have seen documentation, so I’m sure you can share it here, right? I mean, who am I more likely to believe: the one guy claiming that everything is okay, which is completely uncharacteristic of the vendor, or the many other complaining voices after the product flopped spectacularly?

            Speaking of, you built a product around it? What are you stupid or something?

      2. +1 on this,
        No *useful* documentation for registers, peripherals or layout.
        We had a direct line to Intel and could not get any info on doing designs with these devices, no luck with buying them at all.
        Cool devices with lots of potential, but no way for us to put them to use in the same way we use ARM devices… :(

      3. “The failure of Intel’s IoT stuff was purely related to the documentation.”

        …and the difference between this and the Raspberry Pi situation, when one needs all the details and minutiae associated with the machine-level operation of a processor/GPU is, exactly?…

      4. I agree that for the Edison, documentation was a huge issue. I found the Galileo 2 documentation to be OK, I think that one issue with the Joule was the price. So I guess I am down to using the Curie if I still want x86.

      5. Exactly this. I had a chance to speak with one of their marketing guys for the Edison line at SXSW a couple years ago. He asked my opinion after he overheard me tell someone I had to drop the Edison from a project due to lack of support. I simply told him, with the lack of documentation and/or community support, it simply wasn’t ready for the IoT market. In order for it to resonate, it would have to be a much more open platform.

        He pretty much just walked away.

      6. Of course they won’t put shit like registers out there. That’s Intel’s main thing: protecting their IP. It is just a matter of time before Intel crashes and burns in all aspects of their business. They are not about putting their product in every household, they are about monopoly and milking the most money out of the market. That’s why I haven’t used Intel products for about the last 17 years. The same story goes for Microsoft and its latest in-line fail, number 10. People are so against their products that they had to bully people into installing Windows 10 against their will. No wonder the world is failing badly in every aspect there is. People aren’t that dumb anymore; you will see a massive migration from Google soon as well, since it’s no longer a search engine but a poorly made advertisement platform. It’s only a matter of time before people realize the need for something new. Since the maker community is exploding and there are lots of people getting involved in all that crap that was hidden from them for years, problems get solved fast these days. If you want to make the world a really better place, Intel is like the dumbest money on the planet. But the nice thing about all this is that people are becoming aware of where they will put their money and will start thinking ahead and invest in companies that are really making things better, not just doing campaigns to look like they are green, socially aware, etc. But don’t worry, revolution is a natural evolutionary thing for people; things tend to change after years of oppression. I would also like to address the people at Intel and say your time is at an end, and the same goes for Microsoft, Google, etc.

    2. Microsoft didn’t like that Linux was used by Intel on their brand new IoT line of devices. Microsoft sabotaged all of those devices and their market. Apple didn’t like that Intel provided a powerful alternative to Apple’s overpriced junk, so they sabotaged Intel’s IoT too. No wonder the great independent Intel IoT product line failed. In the end, most of the monkeys who turn on their laptops every day do what is written on the screen by Microsoft or Apple. There was no Paul Allen & Bill Gates in that new generation of people to step in and lift up what Intel conceived to be the next great generation of micro devices.

    1. Better sell it to some sucker while it’s worth something; it will never have any historical value anyway. Or maybe I’m wrong, maybe it will be worth something when they write the history of Intel and how it failed.

  1. Wow this really came out of the blue – a three month phaseout period with no replacement is pretty rapid. I know some of these modules made it into a number of commercial products, whose manufacturers are likely rather unhappy right now.

    1. It’s pretty much Intel’s way of doing things (only the paranoid survive).
      That’s why nobody trusts them for long-term support outside the PC world. Their next failure will be automotive.
      Auto makers want 10+ years of support and don’t believe Intel can (or wants to) do it.

      1. They’ve already gotten priced out of the automotive game without any major wins. TI, NXP, Renesas, and nVidia are the historical incumbents. And now that Qualcomm and Samsung have dived head-first into automotive, Intel is dead on arrival.

        1. Also Freescale as PPCs are more commonly used in automotive control modules than anything else.
          A lot of them used to use 68HC11 and 68000 family processors and they moved to PPC in the mid 2000s.

          1. Freescale/NXP is great about guaranteed availability and long-term support. If they could come out with an SoC with even half the horsepower of some of nVidia’s automotive-focused Tegras, they would own the self-driving car market by default.

          2. I was lumping Freescale in with NXP since there isn’t a Freescale any longer. Yes, NXP e200/300 MCUs are everywhere in automotive, but the MCU space is not relevant to the Intel and competition topic. And even they are slowly pushing ARM CMx series MCUs over their own IP now (PPC & HC11/S*12).

            NXP does have many ARM CAx MPUs like Vybrid, i.MX6, etc. But I’m not sure whether or not they made a mistake by canceling the i.MX7 and initially retreating from the automotive infotainment/ADAS high ground some years ago. Cell phone makers have a default economy-of-scale advantage in automotive, as that is essentially what is in your dash (in multiple places). Qualcomm has announced some big wins and auto-qualification of Krait SoCs. And Samsung’s purchase of Harman has guaranteed their eventual entry.

            There is some room to innovate in the automotive space and pick up new part markets as the architectures evolve. But a generic x86/64 AEC-Q100 compute module from Intel isn’t one of them.

          3. Freescale is now NXP and soon to become Qualcomm. You may not be as safe as you think you are. In all fairness, every chip manufacturer has become a sea of uselessness in terms of documentation and support. All they want is a commitment to several million dollars in revenue. The maker community is stuck with Raspberry Pi and Arduino.

    1. Pretty much, or anything that requires x86, or perhaps where it’s just easier to use x86 than another arch. That being said, ARM is pretty much the name when it comes to embedded CPUs in IoT, or hell, just in general.

    2. Exactly, the only thing going for x86 always was that it could run Windows programs, but that is next to irrelevant in the embedded world these days, people are just not that keen on doing the whole WinCE thing all over again with Win10…

      1. The tool-chain is extensive, from free to proprietary. There’s a large pool of people experienced with it. Extensive documentation, of all kinds from beginner to experienced. Yes THIS product is a counter-example, but x86 in general isn’t.

    3. If they made something that had a basic, slow x86 processor and full OpenGL from an onboard Intel GPU, this would be great for many things where you could use a Pi.

      Being able to use ordinary drivers, with no special OpenGL driver hoops, would certainly have shaved about a month off my last project (as opposed to the Pi with Dispmanx etc.).

      I can’t see Intel hitting the price point though.

      1. Pi price point? Definitely not.

        Pi form factor (or quite close) – take a look at Upboard – not a bad board at all IMO, but it is much more expensive. (but not much more expensive than a complete Edison setup including carrier – probably one big reason Edison is dead)

      2. x86 embedded boards are quite plentiful, in thin clients. Some of the newer ones can take 4 gigs RAM, SATA hard drives and have decent integrated GPUs. A few even have one PCI slot. Wouldn’t be surprised if the latest models have PCI Express.

        I’m going to put MS-DOS on an old WYSE Sx0 to run the DOS control program for a Light Machines PML 2000 CNC milling machine. That mill only needs a single RS232C connection to its control computer, and the Sx0 has that. Hopefully I’ll be able to get the USB ports working with DOS for loading GCODE files from a flash drive.

        It’ll boot from a 64 meg Apacer 2.5″ IDE flash module and load all the software into a RAM disk in 512 meg DDR1 on a single SODIMM.

      > If they made something that had a basic, slow x86 processor and full OpenGL from an onboard Intel GPU, this would be great for many things where you could use a Pi.

        Err, if you want basic and slow, the outgoing devices would be exactly that… A Pi runs circles around that crappy i586-based Quark core. Even an Atom (which is also available on small boards) is many times faster…

  2. The thing which always made me steer away from the Edison was the insane connector. It’s kinda hard to experiment with something when you need an electron microscope just to deal with its pin pitch.

    1. Yeah. Once you included a breakout board, the price/performance of their solution was simply not compelling – ignoring all of the OTHER issues others have brought up.

      The only case where it was compelling was if you needed the significantly reduced size – IF one of the smaller breakout boards had the I/O you needed. I know of only one case (the DIYAPS project) where this turned out to be the case.

  3. In the universe of things, only one can survive, and it is not by chance that the contenders are open source. And check out how much smaller and easier it is to work with the Onion, Yún, or Raspberry. Intel’s gear was destined to fail the moment they did not support developers the way others do.

    1. I’m a software engineer who decided to use the Edison for a personal wearable computing system. The electrical pinouts and data sheets for the breakout board were okay — no worse than any other board I’ve tried to use. Even the Hirose connector wasn’t /that/ big of a deal, since the breakout board was available.

      Honestly, the documentation and electricals weren’t that big of a deal — the biggest issue I had with them was the fact that the operating system was a steaming pile of hacked together garbage based on Yocto Linux with Arduino compatibility heaped on top. I mean, the SoC was capable of running full blown Android installs (prior experience using it at work) and yet it was targeted for… IoT?! Literally no reason to hamstring the board the way they did.

      1. I was first gifted an Intel Edison as conference swag, which was around 2015.

        I was able to find decent enough documentation at that time for what I needed. From around 2015-2017, they had a community forum that was pretty decent too for asking questions and getting answers from Intel engineers. They were pretty helpful in my personal experience. I even found and fixed a bug in their WiFi setup script. At one point, I figured out how to get a custom Yocto build working with the “bitbake” toolset. It was very unique compared to other systems that I’ve encountered, and Yocto didn’t seem to have great documentation even compared to other more advanced Linux distros such as Gentoo or CoreOS. So that part was pretty unfortunate. The worst experiences I had with the platform were actually resolved by moving to ResinOS (now known as BalenaOS).

        This distro still supports the Intel Edison to this day! It’s very similar to CoreOS in that it’s a container-centric, bare-bones OS for IoT boards. They support the Raspberry Pi, Intel Edison / Galileo, and many other hardware platforms.

        It’s a great option for breathing new life into these old abandoned Intel boards, because you can avoid the limitations of Yocto and its build system by using any base distro image of your choosing with Docker containers.

        I’m running containers using Ubuntu Linux right now on an Intel Edison, and I’m able to compile and install custom kernel modules on the Intel Edison running BalenaOS using the privileged Docker container mode. This platform makes it much easier than Yocto ever did!
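
        For anyone curious what that looks like in practice, here’s a minimal sketch using the Docker SDK for Python. It’s purely illustrative, assuming a Docker-compatible engine is reachable on the board, and the image name and command are placeholders:

```python
# Minimal, illustrative sketch: launch a privileged Ubuntu container,
# e.g. so a build inside it can load a freshly compiled kernel module.
# Assumes the Docker SDK for Python ("docker" package) and a reachable,
# Docker-compatible engine; image and command are placeholders.
import docker

client = docker.from_env()

# privileged=True is the "privileged container mode" mentioned above: it
# gives the container access to the host's devices and kernel interfaces.
logs = client.containers.run(
    "ubuntu:20.04",   # placeholder base image
    "uname -a",       # placeholder command; a real build would run make, insmod, etc.
    privileged=True,
    remove=True,
)
print(logs.decode())
```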

  4. While the Galileo doesn’t shock me, nor does the Edison to be honest, I am surprised at how quickly the Joule was consigned to join them in the graveyard!

    The Joule must barely have made it into distributors’ warehouses before it was canned… Out of the 3, that’s the one I had some real applications in mind for. Ah well. Anyone want some Edison modules, going cheap? ;)

  5. My guess is that the ARM boards out there are better value, with a bigger community, better available documentation, code (drivers, kernel trees, etc.) and support, and that unless you absolutely need an x86 CPU for some reason the ARM stuff is the better choice.

    1. The Raspberry Pi was never the fastest SoC around, but it did have very well documented hardware.
      Recall, Broadcom slowly opened up their IP over several years as the bare metal hackers mined the hardware.

  6. This is why my company is doing the transition over to ARM-based chips. We were merrily humming along, liked the x86 chipset our products were built on, when suddenly an EOL for the chipset came out. Last-time buys were announced and no real “use this as a direct replacement” was offered. At least none that was going to be supported by our stack (Win CE). So we looked at what was going on and found that ARM at least had published availability-until dates with most vendors. So we started redesigning for a different architecture.

    Intel suddenly saying “and we’re done” has happened before. Designers just have to accept it and move on/up in the product lines.

  7. This is another proof that marketing is not everything: you can’t spend millions on freebies and then prevent people from actually using them. I saw the boards. WTF? They don’t understand their market!
    Intel was being Intel, and now shareholders can breathe again.

    1. “I saw the boards, WTF? They don’t understand their market!” My thoughts as well. When I got mine I thought this was going to be really hard for makers and beginners to use, let alone understand its potential. Now if you are a full-blown engineer with plenty of PCB layout experience the Edison could be very useful; the only people I knew of using them were full-blown EEs with years of experience.

      1. Meh, I helped design a product with them (an IoT power meter) and it was not that bad. I did find a couple of little quirks with them. At startup they inrush 10 amps on the main power rail. If you can’t supply this 10 amp inrush, the module will not start up. I tried limiting the inrush current, and if I did anything to limit it the module would just not start. Ended up redesigning the power front end for it, which was not really that big of a deal; I just copied the circuit on their dev boards and it worked perfectly.

    1. I like the fact that Intel gave it an attempt. Market diversity is more important in the long run for an industry than an arduino-raspi monopoly, even if they are more open and have a better design.

      1. Thanks, this news sure isn’t the best one regarding Keymu, but (oh how much) I agree with all the other comments out there: for a computer module designed for everyone, it sure lacks documentation and support from Intel. Getting easy things done with this module was and is A PAIN, and it shouldn’t be!
        I still remember how a trivial thing like using SPI with DMA turned out to be a nightmare on the Edison.

        Anyway, it’s sad for the project but it’s for the best really; Keymu, like many other projects out there, deserves better than the Edison.

        1. “The console does work but there are always improvements to be made, and it just seems useless now to keep working on it. I guess I have to sit and wait for an equivalent alternative.”

          Wait… what?! I discovered your Keymu console prototype on Had.io two weeks ago and I bought an Edison module after that, expecting to give it a go with your project. I never heard good things about this way-too-expensive Intel IoT module (with its overpriced breakout boards), but thought that you had probably chosen it wisely for emulation purposes or some nice features, and not only for x86 in a tiny package.

          I’m kind of broke but was lucky enough to find it sold as new on Amazon at 17€ (it has since been corrected to 94€!). The price was reasonable and I was planning to use it later with your board because you wrote “everything will be provided so that you can build your own”, but it doesn’t look like such a judicious purchase now that you seem almost relieved to drop it before providing anything.

          Your comment doesn’t sound as if a lot of specific software development is being thrown away, but more like an unreleased product (even open source) that will have to find a replacement and place an order at a different supplier, and, if you weren’t to publish Edison-based software and schematics after all, one no longer really intended for the (many) people who already have an unused module in a drawer, like me now, thanks to you :((( (no big deal, I’m used to making bad decisions based on others’ broken promises anyway ;)).

          1. To be honest I’m glad that you’re almost yelling at me, because it made me understand that people actually bought an Edison module especially to build Keymu for themselves, and that’s very gratifying. So for the few crazy adventurers like you out there, of course I will provide it all.
            I never said I wouldn’t (and it was never going to be the case), but to make it clear: despite yesterday’s announcement, I will finish the article and build the website on which I will provide the code and the electrical and mechanical design, with tutorials and notes. I am still working on it and I will not leave these things unfinished.

            I will not, however, keep developing with the Intel Edison; that would be unproductive now. Instead, I will try to find another computer module that could fit this project. If mechanical changes need to be made, that’s perfectly fine by me; however, the size HAS to be either the same or smaller.
            It might be the smallest emulation console in the world but it is barely small enough for a keychain, so the number one requirement for the next iteration with another chip/computer module will be to either keep it the same size or make it smaller.
            And sadly for now I do not know any other existing solution, the Intel Edison had a lot of flaws, but its size/power ratio was unique.

          2. Thank you for your reply Vincent, I was rather disappointed just after reading you; sorry if I was bitter, now I’m better :) When I saw your project, I thought that finally the reviled module was getting some love, when in fact it was a marriage of (in)convenience. Now that the platform is dead for sure, it’s a perfectly fine time for you to move on to greener grass.

            Don’t feel too committed to publishing the Edison-based project if you have better things to do (I’m not even sure I’ll build this whole tiny console myself; personally I was more into the software aspect and board design than the finished object). And you are probably right, this EOL announcement could dampen general interest in this cool DIY project you want to provide to people. Now I understand your point of view better, and I wish you good luck; I’ll stay tuned.

  8. The Edison was too expensive and not marketed to the right group
    There are plenty of different ways to control a neo-pixel over wifi, plenty of ways to text you when a motion sensor activates and plenty of ways to make an rc car
    They should have aimed at more complex projects … The hardware was there

    Plus using a connector never designed to be mated over and over in a device designed to be mated over and over was a bad move

  9. I bought an Edison because it seemingly offered a lot of power in a small package. Apart from a connector that requires a breakout to do anything useful, thus nullifying the size advantage, the problem was just about zero support and documentation. So, after testing the module with a blinking-light routine, it now just sits on the shelf. I also bought a Samsung Artik 5 module to evaluate, primarily for the same reason, a lot of power in a small package, but also because it has built-in crypto hardware. Once again, no support or documentation, especially for the crypto hardware, and an almost complete focus on using their centralized cloud makes a nice piece of hardware essentially useless.

  10. Why did they fail? Because they were shipping too powerful a product, with too many features, and seriously overpriced as a result. It also lacked any solid documentation, but quite frankly I think that was a side issue. Look at the ESP8266, it flat out had no documentation, but because it was priced correctly the community wrote their own.

      1. It didn’t have that on release, though. On release the community was left with minimalist Chinese documentation. I don’t remember how long it took Espressif to realise they had a winner on their hands, but at least months, possibly over a year.

        1. Agreed about the minimal documentation on launch, and I’m an Espressif engineer :) the story was that at the time, the ESP8266 as a fully functional open development platform wasn’t something that was really envisioned; the focus was more on customizable ODM applications. Only when the community got a hold of the ESP8266 and ran with it, we decided it may be a good idea to actually document stuff, but obviously, with a relatively small team already having their hands full of the actual planned stuff, adding unplanned stuff takes a while.

          1. It seems the very cheap price point for a very capable chip, on an already tinker-friendly prototype PCB layout, is what drew the full, all-scrutinizing eye of the community, not just merely caught their eye.


            If Intel had a $5 to $25 x86 SoC platform at >=1200 MHz/core with tonnes of GPIO*, with built-in >=512 MB of fast on-die RAM (they have the fabs to do it)… Hmm, and maybe dropping the IME alongside other useless locked-down energy burners. *Featuring most of what they can be asked to document… It wouldn’t be a Pi-killer, but it’d be a start in the right direction.

            *=sentences are relative to each-other.

      1. Not just any old 400 MHz x86, but a 400 MHz 486. CISC design from 1989 vs RISC designs from 2004-2014. On paper at least I think it could well be slower than the sub-$2 STM32 boards from China running at 72 MHz simply because of the absurd number of clock cycles required for stuff like multiplication.
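
        As a purely back-of-envelope illustration (the per-multiply cycle counts here are rough assumptions for the sake of the arithmetic, not datasheet figures), the clock-speed advantage really can evaporate for multiply-heavy code:

```python
# Rough, assumed numbers purely for illustration (not datasheet values):
# a 486-class core spending ~20 clocks per 32-bit multiply, versus the
# single-cycle 32x32 multiplier on a Cortex-M3 class STM32.
quark_clock_hz = 400e6
quark_cycles_per_mul = 20      # assumption for illustration
stm32_clock_hz = 72e6
stm32_cycles_per_mul = 1

quark_muls_per_sec = quark_clock_hz / quark_cycles_per_mul   # ~20 million/s
stm32_muls_per_sec = stm32_clock_hz / stm32_cycles_per_mul   # ~72 million/s

print(f"486-class @ 400 MHz: ~{quark_muls_per_sec/1e6:.0f} M multiplies/s")
print(f"Cortex-M3 @ 72 MHz:  ~{stm32_muls_per_sec/1e6:.0f} M multiplies/s")
```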

  11. Instructables/Intel gave me like 15 of the Edisons to give away/promote, but I just couldn’t find any good reason to use them. It did some cool stuff in such a small package, but the documentation was terrible, their implementation of BusyBox was non-standard and some coreutils were just broken, so building anything on it became a nightmare. It was expensive for what it did and I’m glad the IoT stuff is being spearheaded by ESP.

    1. Was close to saying the same thing. Tried to build some prototype stuff but kept having to try to build packages from source and giving up because I didn’t have time to solve all the build problems. It was always much quicker to just use a Pi and I was skeptical about the Edison being a solid long-term commercial option.

      Also, lol, actually clicked reply without noticing the name.

      1. Intel sent me 3 Edisons to play with. I plugged one in. The login prompt was buggy.

        Yes, that’s right. The login prompt.

        I told the Intel rep, as tactfully as I could, that I was seeing a severe shortfall in user-friendliness compared to Arduino, and never heard from him again.

  12. FiOT, the failed Internet of Things: welcome to the family, Intel. I wonder if all those documents were easier to get with an NDA, as that seems to be the trend: no NDA, no docs. The one thing x86 has never done for me that ARM devices have is bare-metal code; when you start involving a BIOS and areas of the board that have their own OS and code, like the IME, that I can’t look at, well, that just puts me right off.

      1. Qualcomm desired to be the Nintendo of the cellphone world. They set things up as a completely closed world with their Java knockoff called BREW: Binary Runtime Environment for Wireless.

        If you wanted to write a BREW app you first had to pay for the SDK. Then your app had to be approved by Qualcomm AND every cellphone service provider, for every phone model you wanted it to be available for. Copying apps from one device to another was very difficult due to Qualcomm’s DRM made to prevent sideloading.

        Thus there were/are very few BREW apps available and the only freeware ones came from the likes of Verizon, Sprint and AT&T. It’s not possible to write a freeware/shareware/donationware BREW app and put it on a public website for anyone with a BREW phone to download and install.

        Qualcomm’s desire for total software control of their system has made BREW an also-ran in the phone market.

    1. It all depends how much time you are willing to invest. Trust me, you need documentation. I admire your macho attitude, sadly misplaced though it is. I dearly love reverse engineering, but there is no excuse for not providing top quality low level documentation for a product like this. Without it, it is doomed to fail. The poster child in all of this is the 4000+ page technical reference manual for the chip in the Beaglebone Black. That is the way to do things!! Thank you Texas Instruments!!!

      On the flip side, I would rather be working with ARM than x86 anyway, so it is a total win to see these die.

  13. Please forgive my ignorance. Why does an x86 board need a ton of documentation? Beyond what IO address is mapped to what pin (they did at least document that right???) isn’t it pretty much the same as the desktop processors which people have been documenting since the 1970s? I thought that was the whole point of having an x86 board? I still have my college textbook about x86 assembler as well as a few assembler books I bought at garage sales in the early 90s with the intention of one day (that still has not come) actually reading them.

    My own only purposes for an x86 board would be to make a print server for a printer with no ARM drivers and maybe to serve as a really low end desktop that actually supports shitty internet videos that still haven’t been upgraded from Flash to HTML5. For everything else a much less expensive ARM board serves just fine. Intel’s offerings definitely did not fit my use case!

    My own uses for x86 don’t seem like much but had they actually matched ARM on price, power consumption and solderability I would have been saying “x86 does everything I want and is familiar, what do I need ARM for?”. I would have used x86 for everything! I think Intel has really screwed up and thrown away their own monopoly. Am I really alone on this?

    In other words, I prefer to keep everything on just one platform. One reason, which I suspect I share with many others is that there is only so much time in a day and I don’t want to learn two. A second reason, which I suspect is rare is that I always dreamed of having all my computers and devices sharing resources per something along the lines of Mosix/Open Mosix/Linux PMI. The fact that a couple little things require me to have x86 while ARM makes so much more sense for everything else really ticks me off.

    1. These weren’t PCs. You need significant patches to the kernel to deal with the lack of ACPI / e820 / mptables and standard PC peripheral I/O (no 16550 @ 3f8, no port 80, etc). On the Edison at least, all of the I/O was through a coprocessor (I think it was a Quark?), so any flash, SPI, GPIO, or WiFi access was through a mailbox to be serviced by the coprocessor’s RTOS. And Intel barely documented that.

      The only advantage would have been if it used a more common Linux distro (Debian, OpenSuSE, etc) since you could install x86 fork packages. But since Intel only pushed Yocto there was no point to the x86 compatibility.

      The only thing these boards had going for them was MIPS/W, and that was entirely due to Intel’s heavy investment into process technology and not any innovation whatsoever.

      1. … Hm. Odd. Pretty sure most motherboards load firmware off an SPI flash ROM? … And isn’t SMBus built on I2C? And how do motherboards monitor their voltages and fan speeds, and twiddle their hard disk or network LEDs, if they lack ADCs and GPIOs? (Hint: I2C GPIO expanders slaved to SMBus.)
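
        For what it’s worth, poking one of those SMBus-attached expanders from Linux userspace is about this involved. A hedged sketch using the smbus2 Python package; the bus number and the 0x20 address are assumptions for a PCF8574-style part, so check the schematic of whatever board you’re on:

```python
# Hedged sketch: read and toggle pins on an I2C/SMBus GPIO expander.
# A PCF8574-style 8-bit part is assumed at address 0x20 on bus 1; both
# values are guesses, check your board's schematic. Needs the smbus2
# package and the i2c-dev kernel module.
from smbus2 import SMBus

EXPANDER_ADDR = 0x20        # assumed expander address

with SMBus(1) as bus:       # bus number is board-specific
    pins = bus.read_byte(EXPANDER_ADDR)          # read all 8 pins
    print(f"pins = {pins:08b}")
    bus.write_byte(EXPANDER_ADDR, pins ^ 0x01)   # toggle P0
```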

  14. I was really hoping for a Edison-like board with a new HPS variant marrying an x86 core to a high speed Altera programmable fabric. While this announcement doesn’t significantly increase the precipitation chance of raining on my dream, it does start to cloud up the sky.

    1. They released an Atom + Altera FPGA SoC, the E600C (“Stellarton”), way back in 2010. Don’t know if it’s still available though. Haven’t seen any updates.

      Only made sense if you were going to make custom boards, otherwise too many pins to bring out from the FPGA.
      Also, these days you have the Zynq, which is the right way to do it, since the Zynq comes up like a regular ARM SoC with standard peripherals and the FPGA fabric can later be tapped into. The Stellarton, on the other hand, was an FPGA first; much more pain bringing it up from POR.

    1. They sure look interesting, they could be ESP32 contenders. I’m specifically eyeballing those hardware neurons for pattern matching, but where is the example code for that?

  15. Something else as well as all of the above. When I design home automation – for my own home – I want to know that whatever prepackaged parts I use will be available in a decade when these fail. No exceptions for “well this is shinier” or “better” or “cheaper” (well, intel fails on that anyway). I don’t even use my own custom PCB’s with PIC uPs like the old days, because…I don’t like having to make more. (I’m not young). This is the real reason I use overpriced/mip arduino or pi – there are so many – and yes, so much support – I’ll be able to buy them 10 years after I’m dead on ebay…
    Intel and others have a long, horrible track record, of which dropping these boards is merely the latest example. Can’t count on a vendor? Then don’t count on them. They should not be surprised at the lack of design-ins. They just want the one big score, the lottery win of some huge product, then they’re fine. Pretending they care about the (also enormous) money from us little guys? You’re kidding. They are defined by their actions.
    Companies I used to design products for would barely use a PC mobo for fear the vendor wouldn’t be around forever and they’d be stuck without a product. This few-months-or-a-year crap? No way.
    So it’s not just us they lose on this behavior. It’s their own badly defined target.

    1. Intel is great, but they do this to themselves: a very aggressive bean-counter group who plots the ROI on walking to the water cooler; it has to be profit or forget it. They will can projects that are 96% finished and on time.

  16. Was @ BAMF a few years ago when the Edison was released. Was out looking for the next core for a significant project at the time. Took one look at the development chain, docs, price and the goofball connector and put it back on the shelf. Went with a beefy Atmel ATMega chip that could be integrated into the design for under $10 single quantity, the docs were right on the website, and a tool chain that was already installed in our dev environment.

    On the other hand, don’t forget Intel was/is pretty strong in the embedded community, and has been for a very long time, with the 8085, 80186, and the i960. The i960, in particular, was a pretty rocking bit of kit for its day.

    On the intel NDA front, I get to see some distance into the future on several intel product lines at $DAY_JOB via NDA. These people know what they are doing, technology wise, and have a clear vision. This decision smells like bad marketing to me.

    Remember Captain Zilog anyone?

  17. The Intel Edison, while a neat product, had many flaws:
    – Was marketed to the wrong (almost non-existent) crowd,
    – Intel forced their build of Yocto on everyone instead of going with something straightforward like Debian,
    – Had a hard-to-use connector, meaning that the user had to pay at least as much for breakout boards as for the darned SBC to be able to do hardware interfacing,
    – Had the most obscure GPIO pin multiplexing that I’ve ever seen.

    I hope Intel learns a lesson from this experience. That lesson should be that they ought to stick to making i3/i5/i7/Xeon CPUs and forget about the IoT/hobbyist/SBC space. Their corporate mentality just doesn’t work well within that space.

    Goodbye Edison, Galileo and Joule.

    1. If you still have some and like Debian, check out Ubilinux. I was using Edisons as BTLE gateways long before the RPi shipped with Bluetooth. The added onboard NAND and USB client were always welcome benefits.

  18. The two biggest reasons it failed: no clear documentation, and no tools that allowed someone to get really low-level with the chip.
    Another mistake was the use of an oddball Hirose connector that was difficult to source and work with.

  19. I have my Galileo in a drawer close by, so I can comment on this with some authority. These things deserved to die.

    1) The utter lack of documentation (or miserable nature of it) is certainly a key factor, and has been mentioned many times already. People I know “with clout” could not get questions answered or decent documentation.

    2) Beyond that, the miserable price/performance ratio. I benchmarked these relative to an ARM-based board, and guess what? The 400 MHz x86 on the Galileo cranks Dhrystones at about 4/10 the rate of a less expensive and better documented ARM board running at 1 GHz. Why pay more for something that runs at 4/10 the speed of something smaller and cleaner? Unless you have some weird nostalgia for Intel and x86.
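
    As a quick sanity check on those figures (taking the 4/10 ratio above at face value rather than re-running any benchmark), the two cores actually come out roughly even per MHz, so the real gap is clock speed and price:

```python
# Back-of-envelope using only the figures quoted above: a 400 MHz Galileo
# doing ~0.4x the Dhrystone rate of a 1 GHz ARM board.
galileo_mhz, arm_mhz = 400, 1000
relative_rate = 0.4                      # Galileo / ARM, as stated above

per_mhz_ratio = relative_rate / (galileo_mhz / arm_mhz)
print(f"Per-MHz throughput ratio (Galileo/ARM): {per_mhz_ratio:.2f}")
# ~1.0, i.e. roughly even per clock; the real gap is clock speed and price.
```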

    1. On our algorithm the Joule gave a 6x boost compared to the Raspberry Pi 3, and 10x on a synthetic benchmark.

      The biggest benefit is connectivity: fast 24 MB/sec Wi-Fi and USB 3.0

      The next best vendor to work with remains nVIDIA

  20. Dumb question – why are x86 platforms still so bloody expensive? The design is ancient so I would think the silicon would be incredibly cheap by now (for lower end processors anyway). Any time I’ve ever seen an x86 based anything, I’ve assumed that it will be ridiculously expensive compared to its ARM counterpart.

    1. Three main reasons:

      1) x86 is pretty old and over time it has accumulated a lot of (perhaps superfluous) instructions and is a CISC architecture. These still need to be implemented, which bumps up the complexity and hence cost. ARM is a RISC instruction set (although still with a lot of instructions!).

      2) x86 is essentially a monopoly for Intel, AMD is the only real competition. In contrast, the ARM core is licensed to vendors, so there’s intense cost pressure between chip manufacturers.

      3) ARM chips are generally more integrated, so you don’t need as many external support devices which lowers the cost. Also, they are lower power so the power supply demands are greatly relaxed (this is actually a surprisingly large cost in a motherboard).

      Hope that’s helpful, and would love to hear other opinions :)

  21. I hope whoever handled their software support staff takes note. Their support was hopeless. One example: just take a look at how one re-flashes a new chip if the initial Java script does not work (https://software.intel.com/en-us/flashing-firmware-on-your-intel-edison-board-windows). Can you believe such a runaround in 2017? There was one good support person there, “Jose”; the rest were hopeless. I spent a lot of time with this chip, see:
    http://s100computers.com/My%20System%20Pages/Edison%20Board/Edison%20CPU%20Board.htm
    In the end I had to figure most of the stuff out myself.

  22. I saw an Intel ad the other day that advertised some “deep learning IoT future” muck aimed at small business, with no actual advertised product except for “cool futuristic Intel”. It was an effective ad, except for that fact. I think Intel is going the way of IBM: they feel their market share is guaranteed, so they focus on just keeping in the public attention.

  23. I have two robots using Edison and replaced a Beaglebone with one. Mainly, I gained space. An Edison on the small breakout board on a custom PCB can fit a lot in, with free WiFi. I don’t like Yocto Linux much (contrary to several comments, it doesn’t run DOS!) but the Eclipse-based tool chain was fine. Their mraa library is poorly designed but at least it’s open source. Overall, I’ve had success with it. I’ll buy a couple more before the end of the year and then maybe switch to Beaglebone Blue.
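
    For reference, mraa does at least make basic pin access short. A minimal blink sketch using its Python bindings; the pin number here is just an assumption, since the mapping depends on which breakout you’re on:

```python
# Minimal blink sketch with the Python bindings of Intel's libmraa.
# Pin 13 is an assumption; the mapping depends on the breakout board used.
import time
import mraa

led = mraa.Gpio(13)
led.dir(mraa.DIR_OUT)

for _ in range(10):
    led.write(1)
    time.sleep(0.5)
    led.write(0)
    time.sleep(0.5)
```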

  24. Whilst Intel refuse to licence out their architecture, they will struggle to make inroads. ARM will licence theirs to anyone; they are an IP company. That is why ARM is winning: it was very easy for phone manufacturers to create a SoC. If you design a good GPU and you need a CPU that has to be on the same die, you have one option: buy an ARM licence. All the other points raised just helped it along.

  25. I use the Edison for my autonomous robot project. I have had no major issues interfacing computer vision, lidar, and motors with the board. My biggest issue was the 1.8V I/O needing to be level-shifted to 5V for my MD22 motor driver. With that being said, I’d really prefer that the whole mess used 5V rather than 3.3V. This build was the first electronics project I’ve ever undertaken; in my experience it was pretty easy to use.

    I am interested in changing to a different platform that will have support for years to come. What does everyone recommend?

    I’ve looked at the BeagleBone and Raspberry Pi but I really would prefer a dual-core processor.

    1. Lucky for you if you didn’t have any problems; I couldn’t list all of mine off the top of my head.
      If you really need a multi-core CPU, you can go for the Raspberry Pi 3; it has a quad-core CPU.

  26. I was a greenhorn designing fast instrumentation in the early 1990s, and I decided to use an FPGA. I had to choose between products from Intel, Xilinx and Altera; all had products that I could use, and I felt that the two latter companies were small and unknown, so Intel would be the safe choice. I got an eval board, designed a proof-of-concept circuit… and Intel closed down their FPGA division. They have a pretty top-down and ruthless management: if they decide something is not in their roadmap, they get rid of it really quickly.

    1. This is how TI operates as well… If the Management sees a product line not making enough $$$ they will punt it in a heartbeat. TI has burned me a few times and so now I have learned.

  27. 20 years ago, everyone would beat you over the head for failing to use the x86 “industry standard”, but now that ARM is the industry standard across many sectors, the same argument is notable for its absence. Why? Because it was really just a marketing ploy for intel fanbois.

    Surely it’s time to just give up: at anything less than the most powerful desktop CPU out there, ARM is not only the dominant architecture, it’s the better one. And it did it from a position of market and financial insignificance. We moved on from DEC, it’s just time to move on from Intel.

  28. What I liked about the Edison was that you could simply spend upwards of $50 for the board (plus shipping). You could hold it in your hand, put in on a shelf, even hand it to other people. Then, if you wanted to turn it on, all you had to do was spend another $20 for an add on board (plus shipping), wait for that to arrive, then spend hours searching the Internet for someone who got it working and was nice enough and capable enough to write an understandable tutorial. I, for one, am looking forward to their next attempt, hopefully it’ll be even more expensive and even less powerful than anything else.

      1. Nah,
        Should be a black-box device called the Intel BlackBox. Guess its purpose:

        The advert actor/model should be a tall brunette with moderately largish tits, and a bright face.
        She should be wearing a “W*ore of Babylon” red-and-black coloured laced tight dress with full cleavage exposure above the bra section.
        The BlackBox should literally be a black box pressing from under the actor/model’s tits to raise them thus making them look bigger.

        Not subliminal advertising……. Much


