This looks like the end of the road for Intel’s brief foray into the “maker market”. Reader [Chris] sent in a tip that eventually leads to the discontinuation notice (PCN115582-00, PDF) for the Arduino 101 board. According to an Intel forum post, Intel is looking for an alternative manufacturer. We’re not holding our breath.
We previously reported that Intel was discontinuing its Joule, Galileo, and Edison lines, leaving only the Arduino 101 with its Curie chip still standing. At the time, we speculated that the first wave of discontinuations was due to the chips being too fast, too power-hungry, and too expensive for hobbyists. Now that Intel is pulling the plug on the more manageable Arduino 101, the fat lady has sung: they’re giving up on hardware hackers entirely after just a two-year effort.
According to the notice, you’ve got until September 17 to stock up on Arduino 101s. Intel is freezing its Curie community, but will keep it online until 2020, and they’re not cancelling their GitHub account. Arduino software support, being free and open, will continue as long as someone’s willing to port to the platform.
Who will mourn the Arduino 101? Documentation was sub-par, but a tiny bit better than their other hacker efforts, and it wasn’t overpriced. We’re a little misty-eyed, but we’re not crying. You?
81 thoughts on “The End Of Arduino 101: Intel Leaves Maker Market”
Last nail in the coffin…
Well since MIG was already shutting down, this is more like the last product on the shelf…
Wouldn’t give a damn, really.
I agree. Intel simply stuck a toe in the water to test the hacker community. I doubt the company ever saw the small-board computer market as useful. Intel might have considered this a marketing exercise to keep the company’s name in people’s minds. Oh well, with all the other boards, particularly those from Parallax with the multicore Propeller and the ones that use an ARM Cortex processor, the hacker community still has many excellent choices.
What’s an Arduino 101? :/
Exactly my thought. Didn’t know it existed.
did it come after Arduino XP or Arduino 80?
Wait… there’s Arduinos other than the $2 Nano knockoffs on AliExpress?
Seriously, I love those Nanos, but Arduino really has lost their way. They’ve been standing still for 5+ years while the maker movement blew up around them. Compare the Particle Photon to anything $19 gets you in Arduino-land, for example.
I think I coincided with Arduino Bob.
that’s an arduino-for-workgroups. don’t even ask about how many floppies it takes to load software onto it!
I bought a 101 not too long ago but didn’t get started with it yet, guess I’ll pass it on to someone else now.
Popularity bandwagon: Too Much, Too Late, and the prices were Too High…
Other boards go for less than a quarter of the price of Intel’s, and their marketing did not save them this time. Surprised they didn’t force retailers to sign agreements to only sell boards with Quark chips etc., because they’ve never made garbage before and never forced retailers into that kind of submission, ever.
I think that they just completely misunderstood the market.
I have just checked the specs of the Arduino 101 (another bad sign when you learn about a board only when it is discontinued!) and basically the Curie chip was a fairly slow x86 MCU with a decent set of peripherals and some rather mind-bogglingly dumb ideas, like hw acceleration for neural networks. Did they really believe someone would program neural nets on a 32MHz MCU with some 80kB of RAM and no meaningful high-speed interfaces for stuff like cameras, on a board marketed to beginners and non-engineering types (as is the entire Arduino ecosystem)? Or was it there only so that some manager could put an “Advanced AI support!” bullet point in the PowerPoint deck? What were they smoking?
Things like this forced them to sell the hw at cost or even at a loss, while Chinese competitors were selling basic, far less capable Arduinos (which still do 99% of what most people need them to do) for $10 and still making a profit. That, plus poor marketing and nonexistent documentation, is what doomed it.
Yeah it was strange, I never found anyone using the neural network stuff for serious applications. They now have the Movidius for AI acceleration, maybe that one makes a bigger impact.
The General Vision component inside Curie was horribly documented and “supported” very late in the game. Too much focus on selling a “Raspberry Pi Killer” and/or getting design wins from the top two tiers of the Fawkes Maker spectrum (Pro-Maker, Entrepreneur). Oh, the stories from America’s Greatest Makers, and in fact the entire history of the Maker Innovator Group [disclaimer: I was a software engineer in MIG and a mentor on the show, and if you were at an Intel-sponsored hackathon you may have run into me…]
Woulda coulda shoulda….
Intel and AMD…..
Intel has blobbiness issues with their mainline stuff, I’d expect the same for the “Maker’s” market… i.e. an equivalent to the ME in order to pre-boot a board into a boot-loader loadable state… RPi does a similar thing but at least the firmware is on an SD card instead of SPI/I2C-flash.
AMD are not much better either… Worse than Intel if you include their teasing of open-sourcing their pre-boot platform security processor (AMD-PSP) architecture: Would make a good solution for running critical tasks whilst in a shutdown/standby state to save batteries… An ultra-low-power execution mode. But nope.
RPi – the community are (presumably still) working to open source the GPU side of things…
This was before the Intel documentation crisis. One has to delve into the history of x86_64, or just plain x86, just to use the sandboxed x86 architecture environment whilst (usually) being restricted to a UEFI sandbox… That is as far as I could decipher from such x86-maker communities before giving up on post-2012 “modern” x86 as a platform for DIY/maker/hacker uses.
Don’t know how many felt the same, though others’ knowledge and experiences are likely to vary (and possibly more accurate than mine).
Seriously, x86 could have been a thing given the p-states capability, and in my experience the idle temperature of a heatsink-less 2nd-gen Core i5 is my reference for what x86 is capable of (power-saving-wise).
Intel should just stay away from making anything mobile. They keep coming up with new platforms and then ditching them after people have invested time/money on them.
They’re just firing in the air to keep the competition down. They won’t bother actually aiming at anything unless there’s some serious threat to their CPU market.
“We’re a little misty-eyed, but we’re not crying. You?”
I was really hoping for something in the price range of a RasPi with an Intel-compatible chip. I didn’t really want to do any “maker stuff” with it though. My plans were things such as a CUPS server for a printer with crappy Intel-only Linux support, and maybe some sort of cheap Pi-like desktop that supports Wine and Flash.
Intel acted like they wanted to be the Cadillac of IoT and I just wanted them to be the parts shop that keeps my old jalopy going.
The need for an x86 SBC with 1-2 GB RAM that can run Linux with performance similar to the Raspberry Pi is still there. The SBC must have things like WiFi, HDMI, USB, and preferably an Ethernet port. SATA would also be nice. Add some GPIOs, PWM, ADC, SPI, and I2C and you’ll be golden. Heck, even at $60 this would be an easy buy for me, because x86 generally has much better Linux support due to its unified ecosystem and popularity. Of course, Intel would have to prove their dedication to this platform, which could easily be done by guaranteeing that it can be sourced for a decade.
This, this, a thousand times this. Take the Compute Stick’s motherboard, add a SATA port and GPIO, and you’re close. Make a bigger version based off the NUC’s motherboard form factor.
Even if they don’t take off in the maker community like the Pi did, there’s a huge market for industrial control that could use replacements for their old IBM PS/2s full of Comtrol cards.
WiFi would be good but for 1/2 my own use cases I wouldn’t use it and could just use a cheap usb module if I had to for the other 1/2.
HDMI is nice for the desktop-like applications. I would leave that unused too for the more server or embedded type stuff. Actually.. if I am going to use this for a desktop-like application it’s only because I am looking for cheap. VGA LCDs abound in the local thrift shops here; HDMI-capable displays do not. I would really like to see some SBCs with built-in VGA! But again.. for 1/2 my applications the display would only be used for initial setup and I could make do with even composite for that.
Ethernet – yup, I use Ethernet a lot. I too want it in there. But then again… that too is easy to hang off of the board by USB. I don’t want to pay a lot just to have Ethernet.
SATA, Yes! Ok.. I would only use this in a few places myself. Elsewhere it would just be an empty connector. But.. let me add one more requirement. SATA that is built into the chip, not a SATA USB chipset just pasted onto the board. I use my Banana Pi as a file server and that works great for me! They say it only works for power-sipping laptop micro-drives or something like that. All that really means is that I ignore the HD-power output connector on the SBC and power the hard drive directly from the power supply. My plain old 3 1/2″ desktop drive works just fine this way!
Still.. of the SBCs that do have SATA, most are just USB2-SATA chipsets pasted onto the board. So your HD access speed is still bottlenecked by USB. Maybe someday they will use USB3 chipsets for this and that might be ok. For now though, these internal-USB-based SATA SBCs are no better than a plain SBC with a SATA adapter on its USB port. In that case.. what’s the point?
GPIOs, PWM, ADC… I don’t really get the point. SBCs usually run a multi-tasking operating system. That means you can’t guarantee real-time access to those pins. Just give me as many hardware UARTs as you can and I will dangle microcontrollers off of them for all this pin-access stuff. I don’t want my hardware physically crashing into itself because the SBC is busy maintaining its filesystem or anything like that!
SPI, I2C… sure, so long as it is implemented in hardware… with its own buffer. Actually, I2C might be a better solution for those microcontrollers I wanted UARTs for.
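The SBC-plus-microcontroller split described above usually means inventing a tiny serial protocol so the host can hand commands to the MCU that actually owns the pins. As a minimal sketch of the host side (the frame layout and checksum here are made up for illustration, not any standard your MCU firmware would already speak):

```python
# Minimal host-side framing for commands sent to an MCU over a UART.
# Frame layout (hypothetical, for illustration only):
#   0x7E | payload length | payload bytes | checksum
# The checksum is the two's-complement of (length + sum(payload)) mod 256,
# so length + payload + checksum of a valid frame sums to 0 mod 256.

START = 0x7E

def encode_frame(payload: bytes) -> bytes:
    if len(payload) > 255:
        raise ValueError("payload too long for one-byte length field")
    checksum = (-(len(payload) + sum(payload))) & 0xFF
    return bytes([START, len(payload)]) + payload + bytes([checksum])

def decode_frame(frame: bytes) -> bytes:
    if len(frame) < 3 or frame[0] != START:
        raise ValueError("bad frame")
    length = frame[1]
    if len(frame) != length + 3:
        raise ValueError("length mismatch")
    if (length + sum(frame[2:])) & 0xFF != 0:
        raise ValueError("checksum mismatch")
    return frame[2:2 + length]
```

On the SBC side you would write something like `encode_frame(b"PWM 3 128")` to the serial device and let the MCU firmware do the hard-realtime part; the checksum catches the occasional garbled byte on the wire.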
Intel made many mistakes, but among the worst is Arduino 101’s poor quality software. Arduino compatibility involves much, much more than simply porting the core library functions like digitalWrite and pinMode.
The Wire library (among the most important libs) in particular _still_ has terrible issues. Here are a couple of times I put quite a bit of effort into giving them reproducible test cases. I even shipped a hand-wired shield to one of their engineers.
As you can see, issue #238 (I2C is not reliable) remains unresolved to this day, despite multiple engineers contributing to these github issues for over a year. In contributing test cases, with code, photos and even sending them hardware, I had hoped they would manage to resolve these problems.
I wish I had known about the extremely poor software and support before buying one over an UNO. I bought it for the BLE and accelerometer, thinking that with these on board I’d have access to a lot more project ideas, etc. I’ve had problems with the poor software, as the libraries for it aren’t compatible with almost all RF transceiver libraries (and it’d be almost impossible for me to write a library that is compatible). The BLE is also really weird to work with, and I’ve found no good examples for writing Android apps that use it, only stuff like Blinky.
Now I know it would have been easier to get a cheap UNO and then separately get accelerometer and bluetooth.
Thank you very much for your effort, I greatly appreciate your work. :-)
Historically, Intel was never really an endorser of small-volume, hacker-ish customers, and so this is not surprising. Way back in the late 80’s and onward, companies like Motorola (which became Freescale and now NXP) actually created free compilers for their various microcontroller offerings (68xx, 68xxx, Coldfire, Kinetis, etc) so that companies, hackers, and students who could not afford the very expensive commercial compilers could still use their devices. The result was that a lot of university engineering courses and projects used Motorola (Freescale/NXP) devices, which promoted a familiarity with them that carried into the jobs those students eventually found. At the same time, Intel had no interest in this market and focused entirely on their relationship with Microsoft/Windows… which made them a ton of $$$. I find it interesting that Intel, seeing the Desktop/Windows market dwindling away, has “now” discovered the smaller student/hacker market… only to discover that they have no real idea of the market or the people within it. Given that they have punted their attempts, it will be extremely difficult for them to re-enter unless they have a complete reboot of their devices and strategy… and a long-term commitment with more emphasis on commitment than on profit.
Kind of like Microsoft and the cell-phone market.
MS dumped a metric ton of cash into phones. I think you do not understand what really happened. MS did not do what Jobs did. He broke the data ransom the carriers were running: they were charging $45+ for a couple of megabytes per month. Jobs said unlimited or I walk with my cool phone. If MS had done that, the world would be very different. They had some of the coolest devices well before the iPhone was even a twinkle in Jobs’ eye. The carriers then proceeded to charge an arm and a leg for them. The WinCE phones were quickly relegated to the sidelines.
Intel made the same mistake. My group was one of the first doing M2M/IoT type items. We went to Intel and said ‘make that, but as cheap as a Pi/Arduino’. They came back with a $200+ board with no case/power. We told them to come back when it was radically cheaper, as it was easy to see where the whole market was going. They ignored us and then suddenly launched this thing with a slightly lower price point. Yet all of the ARM boards were radically cheaper and in some cases faster. It is not surprising they got priced out of the market. IoT does not need crazy speed. It needs memory, local storage, and easy-to-use boards with a decent radio. They whiffed the one slight advantage they had and got greedy.
>after just a two-year effort.
what effort???? They didn’t even bother providing proper documentation; instead they tried selling us chips with 15Hz max speed GPIO
That was Galileo with the slow I/O, to be fair.
Arduino 101 does have decent GPIO speed. I am the maintainer of the OneWire library, and I can tell you I accepted a OneWire.h patch some time ago for Arduino 101 direct register access. It does work. I personally tested a DS18B20 temperature sensor with Arduino 101 when I merged the code.
However, much of the other Arduino 101 stuff was badly broken. See my comments above about the Wire library….
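For anyone curious what that DS18B20 test actually involves once OneWire has read the sensor: the scratchpad math is small. Here is a Python stand-in (not the Arduino code itself, just the same arithmetic, with the constants taken from the DS18B20 datasheet) for the Dallas/Maxim CRC-8 check and the raw-to-Celsius conversion:

```python
# DS18B20 scratchpad math, as done after OneWire reads the 9-byte scratchpad.
# Per the datasheet: byte 8 is the Dallas CRC-8 of bytes 0-7, and the
# temperature is a signed 16-bit value in bytes 0-1, LSB = 1/16 degC
# (12-bit default resolution).

def crc8(data: bytes) -> int:
    """Dallas/Maxim CRC-8 (poly x^8 + x^5 + x^4 + 1, reflected form 0x8C)."""
    crc = 0
    for byte in data:
        for _ in range(8):
            mix = (crc ^ byte) & 1
            crc >>= 1
            if mix:
                crc ^= 0x8C
            byte >>= 1
    return crc

def scratchpad_to_celsius(sp: bytes) -> float:
    """Validate a 9-byte scratchpad read and return temperature in Celsius."""
    # Appending the CRC byte to the data it covers drives the CRC to zero,
    # so a good 9-byte read has crc8(sp) == 0.
    if len(sp) != 9 or crc8(sp) != 0:
        raise ValueError("bad scratchpad CRC")
    raw = sp[0] | (sp[1] << 8)
    if raw & 0x8000:          # sign-extend the 16-bit two's-complement value
        raw -= 1 << 16
    return raw / 16.0
```

The datasheet’s +25.0625 °C example encodes as 0x0191 in scratchpad bytes 0 and 1; a flaky bus (the kind of thing the Wire/OneWire test cases above were chasing) shows up as CRC failures rather than silently wrong temperatures.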
It’s a shame that it took that many tries to get real GPIO performance out of our– I mean their– Maker products, in a sea of products that already do that just fine, eh? Extremely poor documentation and non-stop in-fighting, with a little bit of empire building and a lack of understanding of what Makers actually want. And yes, focus on design wins rather than hearts & minds. I should write a book…
It’s hard to compete in any market where the consumers are content with existing third-rate product.
Are you talking about the boards or the chips used? Atmel chips have proven themselves in the industrial market for years. Reliable, robust and stable with a good supply chain. And there are many other examples.
This is exactly what I expected when I heard the announcement they were going to try to get into this market…
Well, people who bought the UDOO x86 are now screwed. A major selling point of that board is that it had an Arduino 101 on board in addition to the x86 microprocessor.
This is Intel using the pasta approach to product development. Rather than making the effort and thinking through your offerings, you throw a bunch of half-cooked products at the wall and see what sticks.
I hope their “driverless car” and “connected city” support products are better thought through and supported.
Looking at Mouser, the Curie chip was something like $20 or more.
Then I look at the ARM blue-pill board: a full board that really works (it does not have all of Curie’s features, but then again, the blue pill WORKS), and the blue pill is like $3 for a fully working board. No ‘BGA from hell’ either, on the blue pill.
finally, intel canceled their ‘americas greatest makers’ season 2 event: https://en.wikipedia.org/wiki/America%27s_Greatest_Makers (‘As of April 4 2017 the show has been canceled.’)
sad that intel screwed up such a good chance to enter the real maker market. but intel was never aligned to makers; not even close. we could have used another player in this market, but intel isn’t it.
Well what can I say….CEOs and chairmen/women of the board need to eat. How could they provide themselves sustenance if they invest in this maker stuff..especially when the ROI isn’t that great. Poor Intel I feel so sorry for them. They’re so poor
I’m not surprised they cancelled it.
I saw the writing on the wall when that casting company got ahold of me via hackaday.io. We were required to use 1 of 3 Intel-based products, all 3 being badly documented. And not only that: the Hacker News thread this links to shows there are still significant issues with I2C not working, and other reported hardware errors that are summarily ignored.
Intel’s products are a ghetto, is what I keep getting. And there’s this big DON’T USE
Oh hey. I did a Skype interview to be on that show. It was kind of a surreal experience. They actually made me stop and redo parts of my interview because they didn’t like the way I did it. I guess they were planning to use the interview on the show?
In the interview I did mention that there was no datasheet for the parts that we were required to use, and the parts were too expensive to use in a commercial product, which probably disqualified me.
Before some of you start shooting your mouth about how bad Arduino 101 is – have you even programmed one? I have, and it’s pretty cool. RIP – https://www.youtube.com/watch?v=OGE8qXIJM94
Did you not run into any problems programing it?
I ran into several.
A major one is it’s not compatible with any RF transceiver library.
The Intel Maker products actually were cool IF you could put the time into tracking down solutions to some problems. The docs were almost universally cited as a major roadblock to real adoption… every hackathon started off with the same issues, the same confusing and missing docs, the same roadblocks to success. I always said “they should be able to work on their bugs, not ours,” but the politics were strong… Go pick up some tinyTILEs from element14 before they’re all gone and killed off.
I suppose the UDOO x86 can take up the reins. It’s using Intel chips and boasts being an Arduino 101 compatible board.
They use the Curie IC on their boards. Discontinuing the Curie means the future viability of the UDOO x86 board is dead in the water unless they come to some sort of agreement with Intel or replace the Curie with something like the ATSAMD21.
Ah, my mistake. I was only looking at the Braswell SoC and not the integrated microcontroller’s specs.
The last thing I wanted was closed-source binary-blob junk… that is, if you could even get the firmware to run them. I turned down that Intel Maker TV show thing precisely because they wanted to foist their crap on my development process. I told them I’d do it if they would publish the full specs on how to control the hardware. Never got a peep out of them (surprising?).
Hell, the Chinese companies, even with a language barrier, are **BETTER THAN INTEL** at documentation. Or look at Espressif’s early entrance – “Here’s a Ubuntu VM already set up. Have at.”
It needed Wi-Fi.
Intel tried a few times to do something other than x86 but failed every time.
Some well-known examples are Itanium (never a success, and now replaced by Xeon processors) and the XScale processor (bought by Marvell and still alive and kicking).
Even their “Embedded” board branding is misleading. Reselling old technology in a new package is tempting, but people are not so stupid.
By the way, the first Xbox was built around an old x86 board.
That was to be expected as the datacenter market (and top of the line CPUs) is way more profitable than any other market.
I thought the Arduino 101 was neat. Sure, there were some compatibility issues with advanced ARM libraries, but the built-in BLE and 6-axis accelerometer/gyro (IMU) were nice.
I guess they were expecting it to be a barn burner sales channel? A bit short sighted in my opinion to kill it just a few months after release.
What did you use yours for? and what are you going to replace it with in your projects that are using it?
I made some cool BLE rovers controlled by the Blynk app on my phone. Also used the IMU to trigger motion detection for a paired ESP8266 webserver app.
For replacement I will probably just use ESP8266 with external accelerometer / gyro, its the most economical and has WiFi to boot. I have yet to find an affordable BLE offering, Arduino 101 included, but BLE is quite complex and frustrating for simple DIY projects anyway…
is normal bluetooth easier to work with? I’m looking for a new way to interface my phone with my projects. I don’t want to use wifi as this typically requires being in a wifi zone.
Intel reminds me of IBM vs. Compaq.
When Compaq, a small, insignificant company (in comparison with IBM in those times), created the Compaq Portable, they beat IBM badly in the personal computer market. The Compaq Portable was a 100% IBM PC compatible system.
In the beginning, IBM didn’t take them seriously. After all, they were I-B-M!
Then IBM created their own version, which ended up being a failure. It was NOT 100% compatible with their own system: they introduced hardware changes that made their portable PC not backwards compatible. Software that would run on a regular IBM and on the Compaq Portable would fail on the IBM portable PC.
Later on, IBM lost all chances in the personal computer market and retreated to the corporate market.
The lesson? Stop looking down on others. Get down off your high horse and learn to be humble.
Also, there is a difference between leading a market and trying to force it under your control.
Bye bye Intel, it was about time!
They wanted quick (and big) money; they tried, they failed, and they stopped. I don’t care, but people that built products on top of one of these boards are probably somewhat angry or desperate now… A single supplier is bad…
I asked intel about a roadmap for this chip (series). never got an answer. without a promise of chip availability over time, this was a total non-starter for anyone wanting to build a business around it.
very very single-strung. with arm, you have choices. with atmel, you have choices. with curie, you are dead in the water if/when intel cancels it. and sure enough, they did, in a very short amount of time.
I knew this, I could see it coming, and I advised people who were doing curie devel to start to think about moving to ARM…
I think I bought a 101 board at Fry’s for $24-ish out of curiosity. It’s in a dev-boards box somewhere in my house. It was too much (feature-wise) for my needs and I never found a project suitable for it without it being wasted. Maybe one day. But I just bought 10 ESP8266-12 modules, so I don’t think I’ll be using the 101 very soon.
I can only speak from experience with the Galileo. It was doomed from a technical viewpoint from the very start. It ran at 4/10 of the speed of popular ARM based offerings (Raspi and BBB), was more expensive, and poorly documented.
What is there not to like?
So anyone wants to place bets on how long it will take before Intel kills their newest product, Movidius – the neural compute stick? https://www.movidius.com/
I say 6 months tops
I think they bought the company just to scalp the IP, which they will attempt to add to parts of their existing lines of chips.
I feel these products were mis-targeted from the beginning.
Hobbyists are classic for not caring about quality and only wanting the cheapest available option.
Where they should have been marketing these is industrial and enterprise applications. There are plenty of small holes these could cover, and they have the name that the business types and purchasing managers aren’t going to turn their nose up at. (You should have seen the reaction of some of these people when I pitched using a raspberry pi for an application)
Marketing a high-reliability, feature-laden product to a bunch of penny-pinching makers was ALWAYS going to fail.
There are some rules of thumb when it comes to companies like Intel, Google, and ANYTHING “cloud” based.
1. If it’s in the “cloud” it is NOT yours any more, and you will NEVER know when it vanishes.
2. If the hardware is Intel, it WILL be abandoned long before it is still useful. Intel makes its money through CHURN. It’s not only the hardware, it’s the binary blob drivers as well. Closed drivers equal Planned Obsolescence.
3. If the service is provided by Google, it will eventually VANISH. Google is like a kid playing in a sand-box. When Google gets bored – POOF – the service disappears (or if you’re lucky, it morphs into something entirely different).
This stuff is (sadly) Normal these days…
“or if you’re lucky, it morphs into something entirely different” So right! Instead of fixing bugs they just keep changing stuff.
Ask the Snopes people. You don’t need a cloud to have trouble with your stuff.
The current players have been in the market for a very long time, building support. The chips make their money in other fields; the maker market is really just a marketing tool for them. Intel thought it was a place to make money, and the maker market is not that yet. PLC manufacturers are not going to swap their Atmel/ARM-based devices for x86 ones, as they have to give at least a 10-year support window. Intel missed the boat about 20 years ago, if not more. The market is closed.
Intel had an ARM division and sold it because they decided that x86 everywhere was the direction to go, even though there is no real advantage to it.
Omron Sysmac PLCs, very fast ones, Intel inside.
To be more specific: Omron NJ.
$80 for a Dev kit. It uses the Caffe deep learning framework.
Enabling machine vision and learning was supposed to be part of what the 101 was for. I guess they decided that the 101 was seriously underpowered and bought Movidius.
I wouldn’t mind someone other than Intel putting out an x86 microcontroller on an Arduino board. An 80286 on a chip or something along those lines (maybe at 90 nm?). 16-bit DOS tends to be very compact and easy to extend (if you dig into the esoterics of .SYS files).
In terms of price and performance though, a Microchip PIC32MZ (MIPS w/ a substantial amount of SRAM) or one of the larger ARM Cortex-M4 are both cheap, powerful and well supported by current microcontroller tools. Unlike x86.
My local electronics shop (Jaycar) has a new product just added to their catalogue – the ‘MeetEdison Robot Kit’. Yes – it has an Intel Edison in it.
The news was sort of sprung on everybody fairly recently… a product already designed and shipped to distributors and re-sellers is a sunk cost, such is life.
Mouser reports that even the Curie module is EOL. Mouser’s gonna have a hard time selling those 8.7K units they have in stock.
Oddly, a whole lot of money, no innovation, and a me-too attitude failed.
Well, Brian K. got his fame being on TV and schmoozing with the celebs. I bet, to him, this was all worth it.
Massimo also blessed this chip and the 101 board. (cough) – what exactly does that say about Massimo?
(BTW, no EEPROM on the Curie. No one thought about that, and when others mentioned it, it was ignored. Typical of the Intel ‘we know better’ attitude.)
Massimo gets zero blame. Intel hubris takes the day.
What is Arduino 101….????????????
ESP32 will fill the gap
Not sure if everybody knows this, but the Arduino 101 can be reset with a strong camera flash (same problem the Raspberry Pi had before it). I was taking some pictures of a project and saw that it repeatedly resets every time I snap a photo.