At the keynote for the Intel Developers Forum, Intel CEO Brian Krzanich introduced the Intel Joule compute module, a ‘maker board’ targeted at Internet of Things developers. The high-end board in the lineup features a quad-core Intel Atom running at 2.4 GHz, 4GB of LPDDR4 RAM, 16GB of eMMC, 802.11ac, Bluetooth 4.1, USB 3.1, CSI and DSI interfaces, and multiple GPIO, I2C, and UART interfaces. According to the keynote, the Joule module will be useful for drones, robotics, and with support for Intel’s RealSense technology, it may find a use in VR and AR applications. The relevant specs can be found on the Intel News Fact Sheet (PDF).
This is not Intel’s first offering to the Internet of Things. A few years ago, Intel partnered up with Arduino (the Massimo one) to produce the Intel Galileo. This board featured the Intel Quark SoC, a 400MHz, 32-bit Intel Pentium ISA processor. It was x86 in an Arduino format. This was quickly followed by the Intel Edison based on the same Quark SoC, which was followed by the Intel Curie, found in the Arduino 101 and this year’s DEF CON badge.
We’ve seen plenty of Intel’s ‘maker’ and Internet of Things offerings, but we haven’t seen these platforms succeed. You could spend hundreds of thousands of dollars in market research to determine why these platforms haven’t seen much success, but the Hackaday comments will tell you the same thing for free: the documentation for these platforms is sparse, and nobody knows how to make these boards work.
Perhaps because of the failures of Intel’s IoT market, the Joule differs significantly from previous offerings. Although it can be easily compared to the Raspberry Pi, Beaglebone, and a hundred other tiny single board computers, the official literature for the Joule makes a comparison between it and the Nvidia Jetson easy. The Nvidia Jetson is a high-power, credit card-sized ‘supercomputer’ meant to be a building block for high-performance applications, such as drones and anything that requires video or a very fast processor. The Joule fits into this market splendidly, with demonstrated applications including augmented reality safety glasses for Airbus employees and highway patrol motorcycle helmet displays. Here, the Joule might just find a market. This might even be the main focus of the Joule – it can be integrated onto Gumstix carrier boards, providing a custom single board computer with configurable displays, connectors, and sensors.
The Intel Joule lineup consists of the Joule 570x and 550x, with the 550x being a bit slower, with a gig less RAM and half as much storage. They will be available in Q4 2016 from Mouser, Newegg, and other Intel reseller partners.
and it can be yours for only $369. -_-
Yeah, they’ve got themselves dug into that “Anything we make is worth 25x what anyone else makes” smughole again, I’m hoping AMD kick them “squaw inna nuts” with Zen, like they did with Athlon and earlier (briefly) with the K6-III
+1
I’ll take an Athlon-priced equivalent, as long as it doesn’t spew enough heat to cook eggs like the Athlons did.
+1
Athlons were cheaper and had a bonus egg cooking feature!
I don’t know, it’s nice in winter. It’s not even inefficient if it means I don’t turn on the heater… /s
Well, not inefficient in terms of energy. But possibly inefficient in terms of money, depending on the energy source of your heating and its cost per unit of energy.
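For what it's worth, the trade-off can be sketched with a couple of lines of arithmetic. The energy prices below are made-up illustrative rates, not real tariffs:

```python
# Back-of-envelope: a CPU used as a space heater is ~100% efficient
# (essentially all electrical power becomes heat), but a kWh of heat
# from electricity often costs more than a kWh of heat from gas.
# Both prices below are assumptions for illustration only.

ELECTRICITY_PER_KWH = 0.20  # $/kWh, assumed retail electricity rate
GAS_PER_KWH_HEAT = 0.07     # $/kWh of delivered heat, assumed gas rate

def heating_cost(kwh_of_heat: float, price_per_kwh: float) -> float:
    """Cost of producing a given amount of heat at a given energy price."""
    return kwh_of_heat * price_per_kwh

cpu_heater = heating_cost(10, ELECTRICITY_PER_KWH)  # 10 kWh of CPU waste heat
gas_heater = heating_cost(10, GAS_PER_KWH_HEAT)     # same heat from gas

print(f"CPU-as-heater: ${cpu_heater:.2f}, gas heater: ${gas_heater:.2f}")
```

So the waste heat is "free" only relative to an electric heater; against cheaper heat sources it can still lose on cost.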
AMD really should release a similar module that’s much cheaper as even something based on Puma could be way faster than this board.
They used to have some pretty tasty embedded x86 stuff; haven’t checked lately though.
AMD had their (paper?) launch of the G-Series Brown Falcon and Prairie Falcon single board computers a few months ago. Not exactly the same kind of product, since it’s geared more toward process-heavy user interface tasks: a 5-15 watt APU (bordering on passively cooled) loaded with the extras you’d need to make a modern PC, like SATA storage controllers, a PCI-E controller, display adapter, sound card, etc.
I guess if you removed all those PC-esque extras and added the same kind of GPIO and connectivity stuff, AMD could have something about equal.
Intel lets AMD – barely – survive to prevent a monopoly and avoid getting split up like AT&T.
$369? You can buy an actual internet of things for $369 and have enough cash left over to register a domain, host a site and brag about em.
It should be called the Intel Jewel (Faberge duck egg edition.)
It’s a SOM, not a single board computer. Very comparable in cost to Phytec or DAVE SOM modules. The comparison is unwarranted.
DAVE and Phytec are at the higher end of the price spectrum; I use Variscite for a lower price and the same performance.
But I did speak with DAVE and they seem to have really damn good design resources.
It’s a bit high, but not as high as some of the comments make it out to be; ’tis no RPi, it’s a lot more powerful. $275 or something would’ve been a good price point IMO. There are other ’embedded computers’ with similar specs (though usually a bunch larger) right now for that same price.
Right? What does that get you over an Edison module? A faster CPU, a tad more NAND, and better radios? Oh, and I guess video… but to go from ~$75 to ~$370 seems bonkers for that feature set. The only real win I see here is video for some applications… but why not go Pi or equivalent for that? This seems like a compute stick in different packaging (and IMHO the compute stick is a pile of turds). Also – they’ve been awfully quiet about power consumption. Anyone find a clue to how much this thing draws? Sorry Intel! I do LOVE your Edison (when running Debian).
So, better to buy an Archos computer stick at $99, with HDMI output and Windows 10.
The failure of Intel’s offerings in the maker/IoT range is attributable to the “me too” / “stay relevant” nature of their gadgets. They get rolled out with oversights in areas that matter to the maker community. Take the Edison, for example: its ridiculously fine-pitch connector is nuts. Rarely do you need more pins than an Arduino can offer. Or the first Galileo’s unfathomably slow I/O. This kind of thing makes you question whether the product team has a grip on people’s needs. And don’t get me started on the price.
Not only that, their USP is that their chips are high performance, in a market where things routinely run on 16MHz AVRs. There are simply so few use cases that they’ll never get the market penetration needed.
Not sure why they chose to use the Hirose connector; it was a poor design choice.
I can deal with lack of documentation, but not at $369. The price-point is where these Intel boards are failing.
And Benchoff fails to mention the Joule is based on the Broxton core that was recently cancelled by Intel. This is an astonishing oversight for such a ‘highly regarded’ tech ‘journalist’, and makes me wonder if he’s bought off. This is just a fluff piece for Intel, nothing more, disregard, and don’t read anything Benchoff writes ever again.
Yeeeeeesssss. Let hate floooow through you.
If you wanted to make a difference for the better, you could have provided the same information with a lighter tone. Someone with your knowledge would add content to the discussion, and that bit of new info would be well regarded by us.
Unfortunately, you instead turned the new data into an attack. You just sound arrogant, or like you have a personal grudge against Benchoff. Or both.
+1
+1
Maybe you should ask the Donald to give him a politically appropriate name…
We’re going to make the hackaday comments great again.
Have an upvote and some gold you sly son of a gun. Oh, sorry, wrong forum.
They’ll be yuuuuuuuge! Best comments you’ve ever seen. I personally guarantee it.
You are a complete tool. You tell people to stop reading what Benchoff writes – in the comments section of an article by Benchoff that you just read. I’m sure anyone could look up any article, add something that wasn’t mentioned, and claim the writer is incompetent. The fact is, when you write an article it needs to be to the point without being too abrupt. So you write the majority of the info you gather and leave other details out, so you don’t overload the reader or put them off with a wall of text.
It really seems you have a problem with, or are jealous of, Benchoff; there’s no other reason someone would go to all the trouble you have to discredit someone. I won’t be too harsh though, as I suspect you have mental health issues, and if you do I hope you get the help you need.
I have seen a lot of this lately, and the screen names being used keep changing. I suspect it is just one or two people throwing all the sh!t.
Do us all a favor and stick to one screen name so we can ignore email notifications from you.
Only a coward needs to hide behind a changing identity.
I’m not arguing with you in any way – you have a right to your own opinion, but I have a right to ignore you because I think you’re a clown. Really, if you don’t like [Benchoff]’s writing then don’t read it; it’s not rocket science.
Or are the actions of one bad apple (that would be you) going to cause changes here that inconvenience everyone?
I know, I know – don’t feed the trolls … I just got sick of it.
At least Benchoff signs his name to his work. Cowards hide behind anonymous insults hurled via sender identities, such as “Bench off fails again”. Grow a pair, punk!
BFYA. Broxton seems to be focused on key devices like Joule. Your information is limited and short-sighted. Intel runs a business – as Kenny would say, you gotta know when to hold them, know when to fold them. Get your facts straight, and get out of your mom’s basement. Find a job. You’ll feel better.
The arrogance of the comments crowd is sometimes unbearable. Generally, why do people think Intel would be *primarily* targeting the “Maker market”? That’s not a market, by Intel’s scales, that is a hobby.
Obviously, “in devices that routinely run with 16bitters” can’t really be an argument. So, we already have all the devices we want, like good HUD in bike helmets, or multisensorial location-aware trolleys for hospitals, or drones that actually are smart enough themselves to fly at high speeds, process and encode a video stream and so on?
This is “I’m happy with what I can do with my ox, don’t force the combustion engine on me” all over again. Seriously, people, cut it out. Just because higher-performance platforms don’t fulfill *your* needs doesn’t mean there’s no one else they could fulfill.
“Sparse docs”: I haven’t dipped into this myself. My gut feeling is: once you order a couple dozen thousand, you might suddenly have Intel application engineers on your side, and things will look different. And seriously, “there’s no guide explaining how to level-shift 1.8V GPIO and how to connect this thing to my PC” can hardly be an argument compared to the engineering problems potential manufacturers will actually face in building something *new* with it – not the nth toaster with internet access.
So, in a way, this is also a bit of editorial criticism: believing that Intel inherently “targets the Maker market” shows a bit of naivety. Which is totally OK, here, because they probably intend to produce interesting, beginner-friendly IoT platforms – but that’s definitely targeting young engineers/interns who try these things and build future, high-volume (or high-margin – hello, defense industry!) products, not the couple thousand eval boards they’ll sell at barely a profit over the next years.
The problem is that they specifically say “this is for the makers out there.” They must mean the people making consumer electronics, because none of us ‘makers’, as in Makezine or the typical Hackaday reader, is going to order 10,000 units without good documentation and proof that the part is the right solution. A set of documentation which they won’t or can’t give the typical ‘maker’.
TL;DR: their concept of a ‘maker’ is far from what most Hackaday readers would call a ‘maker’.
That trick, my friend, is called “marketing”.
Again, what Intel needs to do is get their products into the designs of high-volume or high-margin devices. Putting them in front of a lot of engineers who can see what can be done – without spending days just getting a competitor’s SoC module to blink an LED, leaving only a fraction of the project’s time allotment for the complex DSP stuff – is a pretty good form of marketing. The fact that these devices seem “a little too hard” for the average “maker” aligns nicely with that; to make an analogy, an engineer with embedded development experience might not really prefer tools that look and feel like Android Studio.
I agree that Intel doesn’t get the “maker” scene.
As some have noticed, it costs 10x the Raspberry Pi and has shitty documentation. As a maker I don’t want to lose 20 hours just getting “blinky” working – but somebody in industry can spend a work day or two figuring it out. A lot of “maker” ideas can be (and are) done on Pis (Rasp, Orange, Banana, etc.) because they are easy to use with community-made documentation.
So this project is really for industry.
But it’s marketed at makers. The only group of makers I can think of that would buy an “expensive” board with bad documentation is universities (mostly in the USA, the ones with good R&D). And they will buy 20-30-50 pieces for research, only after Intel donates a few boards in exchange for writing better documentation.
By the way, I’d like a bit of background on this whole “shitty docs” aspect. Where is this coming from? Was the Edison especially badly documented, compared to similarly highly integrated devices?
It was/is very well documented, but Yocto Linux is not as easy as Debian “apt-get install package-x.” You have to understand how to compile a custom kernel, because that’s what you do with Yocto. In fact you really have to know your way around make, something that your average Pi user running Python scripts isn’t going to be familiar with (I know I’m not).
They are very powerful devices in a very small package. But the learning curve is much steeper than a Pi running Raspbian with its large support community. They are taking a step in the right direction by using git, so fingers crossed they might get it right. But for $369 I could buy 10 Pi 3s and a few cables, so the use case had better be pretty compelling.
@k0jeg Reaching maximum thread depth, so replying on same level:
Ah, ok, admittedly, I’m having the fun of using OpenEmbedded/Yocto, too, and that is actually not underdocumented, but it’s a Moloch, at times devouring one of its own children, I agree. You need to understand the point of doing embedded development before you start with Yocto.
I can see how the maker scene can’t be bothered to care about why it’s desirable to have something that gathers all the sources in one place and builds images and SDKs to your specification, if “I can install GCC on the device itself and build my software on the embedded board itself” works for them. Exactly the “slightly too hard for enthusiasts” aspect I was mentioning in my other post.
so: thanks for clearing things up. It seems that the Edison is indeed not badly documented, it’s that the audience for that documentation isn’t hobbyists, so the hobbyists are getting lost. I can see that, and I fully appreciate that not everyone will ever be happy with things like Yocto.
The Odroid XU4 is a much more viable option on price alone, and the performance spec is very close. The only real advantage I can see is that the Joule has additional RAM, but at $74.00 for the board – and with the average kit costing around the $160.00 mark with all of the extras – the Odroid XU4 is a clear winner in the bang-for-buck stakes.
The problem with the ODROID boards (in my experience) is that EVERY peripheral is on USB. For UAV work this was nonsensical and too slow for the data acquisition rates we were looking at. I haven’t looked at the Joule’s specs or diagrams, so it may suffer from the same “lets put everything on the USB” issues.
Albert if you are talking UAV as in Drones then have a look at this http://diydrones.com/profiles/blogs/turn-odroid-into-a-powerful-companion-computer-for-your-advanced
or here with an XU4 https://wiredcraft.com/blog/drone-copter-uav-4g-network/
@Albert: To my knowledge that’s not correct. For the XU-4, the Gigabit Ethernet controller is connected via USB3: http://goo.gl/s2I0gz But everything else is not (except the USB ports, of course).
The Ethernet port is a 10/100/1000 it also has USB 3
I am sure the average “maker” market can easily prototype with whatever fine-pitch SMT connectors are needed to connect to Intel’s wonderful modules of the month. NOT.
These are people stuck with through-hole parts and breadboards, whose idea of a schematic is a Fritzing wiring diagram, who need breakout boards, complain about having to read datasheets and app notes, and instead rely on third-party YouTube videos and blogs for technical info, etc.
Somehow Intel doesn’t even try to understand the skill level of their target market. While that market is not as penny-wise as hackers/engineers, there is an upper limit on how much they are willing to spend to blink an LED.
1) IoT is about mobile devices.
2) Mobile devices run on batteries.
3) I can buy a bunch of wire-wound resistors that discharge batteries just as quickly as this for less than $10.
4) IoT things with flat batteries are useless.
1) Nope. IoT is, per definition, about /connected/ devices. These /might/ be mobile. But usually, they aren’t
2) Usually, yes, that is the case, but “mobile” doesn’t apply to all IoT devices
3) I’m absolutely positive you can buy resistors that have four x86 cores running at up to 2.4 GHz, Gigabytes of RAM, graphics acceleration, a high throughput WiFi controller, USB3 and mature Linux drivers, at the same time ending world hunger and wars. Praise the RÖB!
4) True, but so are 99% of IoT devices that you can find on the market generally. A connected water bottle, even with a battery runtime of a couple years, is still no better than a normal water bottle. I don’t understand why we’re comparing apples and oranges – this system clearly isn’t meant to compete with ATmega/Cortex-M sized MCUs. It’s competing with the need to have an industrial PC. Stop trying to make IoT about devices that need nearly no computational power of their own. That’s not all the IoT there is.
Actually, as a PC alternative with the sort of performance you only get from PC chips, I can see the point of it. Still a bit of a niche. Then again, there’s defence, where they can write their own cheques. Must be plenty of murderous gadgets they can stick these in and charge a fortune for. The US Army, as well as Britain’s, seems to be going after the Call Of Duty generation.
There was an army ad here (UK) a few years ago with a soldier flying a drone with an Xbox joypad! I was surprised, but it’s probably true, after all if it does the job, and is cheap enough. The subtext though was pretty much “it’s just like Xbox but the graphics are better, come use your skillz and get paid!”.
Dunno why Intel’s marketing are wasting their time hooking them up to blenders and stuff. They’re certainly aiming at the Arduino market, even if it’s completely utterly wrong for it. I dunno, marketing people are all about superficial trends. Maybe that’s how you get media exposure, which gets the thing in front of the eyes of the people who actually pay for the killing machines and everything else.
As said, every engineer, no matter how complex the platform is, will make an LED blink first. Then do something slightly more complex, acquiring the skills necessary to do the “full job”.
As a company, if you make your own demos, the “this is the blinking LED demo” is always the least compelling one – in the end, you’re forcing engineers through an hour of set up, getting familiar with tools, hardware connections etc just to blink an LED. The next level demos aren’t necessarily better – you can hook up all the blenders you want and claim it’s easy, but everyone who has used e.g. Xilinx ISE knows that the accusation of “it’s easy because you work for the company developing the tools themselves” is often justified.
That’s where volunteers (“tinkerers”) jump in – bringing a board, and might it be ever too expensive for the maker market, into the public and getting *some* third party to do something remotely sensible with it makes your platform a lot more appealing, instantly.
> Still a bit of a niche.
Well, with cars getting smarter, and all sorts of drones, self-driving carts, fabrication tools… I’d say the complete maker crowd is pretty much what I’d call niche; not the other way around.
Yeeeeeesssss. Let hate flooFEEDTHETROLLSoow through you.
Are you trolling the people feeding the trolls? Is this Troll inception………..
Seems their marketing department read somewhere that the maker movement is ‘the thing’ now, so they’re attaching it to anything they release. To me this looks like something only for set-top boxes, smart TVs, top-end VR headsets, or in-car smarts. None of these things are what I would call maker friendly.
For that money I bet you could buy a laptop that would outperform it.
For that money you can buy 3 of the same damn chips installed in tablets with touchscreens and batteries with free windows 10 LOL
why on earth are people surprised by that? For the price of a single motor development test stand you can buy >>10 Volkswagen. Surprising? Not really, considering one is an experimentation platform targeted at a small audience, while the other has been cost-optimized in every conceivable way for production in millions.
Really. Really. Really. Simple market mechanism.
point out to me the expensive instrumentation on there.
You really don’t get it. The point is not that there’s expensive equipment around the central components. It’s that everything about making something like this is expensive.
These are CISC processors with a lot of peripherals running at up to 2.4 GHz, optimized for flexible clocking and low power losses; that’s all very fine lithography that Intel uses there, usually. Producing a few of those comes at a different price tag than producing another billion ARM Cortex-M MCU implementations in a 180nm process on a paid-off fab. Like, orders of magnitude of a different price tag.
The point is that making a small run of high-perf processor boards is *incredibly* expensive. Design costs run into the multi-millions. A halfway usable simulator for parts of x86 designs costs $10M+; mask costs run from hundreds of thousands to millions per layer; fab configuration costs are in the hundreds of thousands of dollars. Not using an Intel fab to produce Xeons/i7s but a low-volume tech demo isn’t free, either.
High-quality, high-layer-count PCBs, power supply chains, oscillators, RF components, connectors, and housings don’t come for 1ct a piece, either.
Really, that price isn’t because Intel plans to rob a couple of misjudged enthusiasts of a couple hundred thousand dollars – assuming that makes no sense at all, given that Intel definitely has higher-revenue, lower-risk products that they could push.
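To put rough numbers on the fixed-cost argument – all figures below are assumed round numbers for illustration, not Intel’s actual costs – the amortization looks like this:

```python
# Illustrative NRE amortization: why a low-volume, high-performance module
# costs far more per unit than a mass-produced part. Both constants are
# assumptions made up for this sketch.

def unit_cost(nre: float, per_unit: float, volume: int) -> float:
    """Per-unit cost once fixed engineering costs are spread over volume."""
    return nre / volume + per_unit

NRE = 20_000_000  # assumed design + masks + tooling, in dollars
BOM = 40          # assumed marginal cost per module, in dollars

for volume in (50_000, 1_000_000, 100_000_000):
    print(f"{volume:>11,} units -> ${unit_cost(NRE, BOM, volume):,.2f} each")
```

At 50k units the fixed costs dominate the price; at phone-chip volumes they all but vanish – which is the gap between a niche SOM and a tablet SoC.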
Rage at me for your bad analogy why not.
Nothing on that board is running at much more than a few MHz; the fast stuff is all inside the commodity chip. A scrap of board and a few connectors – that’s all it is. All the rocket science happens inside the chip or outside the board; the board is not more than $20 worth in 10-unit quantities. Even if you had to source your chips by paying someone to dismount them from tablets, one could replicate the board for half the price.
Then you wail on about how “special” this CPU is – it’s the exact same one as in Win10/Intel tablets. I’m not talking equivalent ARM units here: EXACT SAME, already in bulk production. Not getting it? Pot, meet kettle.
Here’s two tips and a suggestion, girls and boys–
Intel doesn’t know what it should be doing in this new upside-down world. It’s grasping at straws.
The writer of this article does know what he should be doing, and does a good job of it.
Quit getting your tights in a wad; everything will turn out fine (translation: you will not buy Intel’s ill-conceived, poorly offered product: and Mr. Benchoff will keep objectively reporting on all the new, whizzy stuff available).
I like to think of it this way:
Every dollar Intel spends for Sparkfun and Adafruit to promote their stuff is one more dollar that can be spent taking a chance on more interesting open source products and community outreach. Whether or not Intel can succeed at buying their way into the hacker mindset is irrelevant.
Bingo!
I use Odroids; the XU4 is of comparable performance, but it only has 2GB of RAM. Still, an octa-core PC at $74.00 each is not bad.
Samsung Exynos5422 Cortex™-A15 2Ghz and Cortex™-A7 Octa core CPUs
* Mali-T628 MP6(OpenGL ES 3.0/2.0/1.1 and OpenCL 1.1 Full profile)
* 2Gbyte LPDDR3 RAM PoP stacked
* eMMC5.0 HS400 Flash Storage
* 2 x USB 3.0 Host, 1 x USB 2.0 Host
* Gigabit Ethernet port
* HDMI 1.4a for display
* Size : 82 x 58 x 22 mm approx.(including cooling fan)
The link to the web site is http://www.hardkernel.com/main/products/prdt_info.php?g_code=G143452239825
These Intel boards keep failing because nobody can afford them. When I can get a Pi 3 for $35, or a Pi Zero for $5, why would I shell out hundreds for something that’s billed as being the “same thing but better”, when the price doesn’t lend itself to permanent installations?
They need to stop billing these things as “Raspberry Pi/Arduino but better”, and start marketing them like an FPGA dev board.
And sure, x86 compatibility would be nice, but apart from niche applications, it’s not essential.
I’ve been hoping one of these models will drop cheap enough to dedicate to a handheld “Windows 9x era” gameboy knockoff, for some pocket podracer and starcraft.
This is clearly a swing at the Raspberry Pi and Nvidia Jetson dev kits. The module is slightly smaller than a Raspberry Pi Zero, the dev kit is slightly cheaper than the Jetson TX1, the boost clock is slightly higher than the Jetson TK1’s, and the base clock doesn’t beat an Atom 330. We’re supposed to assume the Atom’s x86 architecture is superior to the Cortex A53’s RISC, but how does Intel’s HD GPU hold up against the TK1’s 192-core Kepler GPU or the TX1’s 256-core Maxwell GPU? Those are perfectly suitable for the kind of heterogeneous computing necessary for the workloads Intel is marketing this device to. How much of an impact does the boost clock even make, anyway? I’d like to see some side-by-side comparisons with Nvidia’s offerings. Speaking of Nvidia, how well are they doing in the maker community to begin with?
Okay, time to crap on the price again. I see articles with sensationalized titles like “makes the Raspberry Pi look like garbage” but for 74x the price of a Pi0 it damn well better.
They seem to market it for image recognition functionality, boasting the RealSense connection. It would really be nice to see an XU4 vs Pi3 vs Jetson TK1/TX1 vs Joule comparison to see how far apart they really are.
I can see a market for robotics enthusiasts, but before spending this amount of cash, people would need to see it handle some real world scenarios.
So I agree, we need performance benchmarks. My personal interest would be to see it handle robotics sub-tasks like 3D map building with the RealSense (like Project Tango), marker ID and pose estimation, face recognition, neural-network object detection (as done with Nvidia’s Drive PX), and more of the standard OpenCV stuff.
Exactly; the Jetson was what I thought of when I read the spec sheet! Really: integrated, feasible-perf graphics, camera interfaces, a high-speed RAM interface (a notorious bottleneck on the Jetson), feasible IO interfaces for storage… this screams “cognitive vision” all over. And yes, the price tag just below the Jetson kits isn’t an accident.
Regarding comparisons:
Yep, a quad-core Atom running at 2.4 GHz beats the crap out of a Cortex A53 at nearly the same clock.
I’ve seen many applications run into trouble on the TK1 because you hit the memory bandwidth wall pretty quickly. If I might take a guess, the Intel quad-core arch will probably have larger caches, possibly more independent lanes, and thus better perf.
The CUDA/OpenCL performance advantage of a Kepler-gen GPU or even a Maxwell GPU is indisputable. But then again, what good is that if the GPU and CPU share the same memory bus? Given good parallelization (including branching, something GPUs don’t like – they block to synchronize divergent branches) on four CPU cores with proper SIMD extensions, many image-DSP algorithms may already be memory-limited, not FLOP-limited. But that’s definitely the claim that needs proof here!
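The memory-versus-FLOP question can be framed with a quick roofline estimate. The hardware numbers below are rough assumptions for a Jetson-TK1-class part, chosen for illustration rather than measured:

```python
# Roofline-style sanity check: is a kernel limited by memory bandwidth
# or by peak compute? Both hardware constants are assumed figures.

PEAK_GFLOPS = 300.0  # assumed peak compute, GFLOP/s
PEAK_BW_GBS = 15.0   # assumed peak memory bandwidth, GB/s

def attainable_gflops(flops_per_byte: float) -> float:
    """Roofline model: the lesser of compute peak and bandwidth-limited rate."""
    return min(PEAK_GFLOPS, PEAK_BW_GBS * flops_per_byte)

# A 3x3 single-channel float convolution does ~18 flops per output pixel
# while moving ~8 bytes (one read + one write, assuming cached neighbors),
# i.e. an arithmetic intensity of ~2.25 flops/byte.
intensity = 18 / 8
print(f"3x3 convolution: {attainable_gflops(intensity):.2f} GFLOP/s attainable")
```

With these assumed numbers the convolution tops out around 34 GFLOP/s – far below the compute peak – which is exactly the "memory-, not FLOP-limited" situation described above.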
The sole purpose of those boards is to try running desktop Windows on them, make a YouTube video, and forget about it.
It’s okay if it can run DOOM. That’s all we need, that’s all we can, that’s all we do.
From the first link in the article ” … targeted at Internet of Things (IoT) developers, entrepreneurs and established enterprises. “.
That doesn’t read as something “for makers or hobbyists”, same as the i7 isn’t marketed as a solution for making blinking LEDs.
Like many other of their products, it seems to be aimed at industry, or specific uses, etc. Nice to see it featured on HaD, but it is just something that will not be used by most people here, same as an ultra-high-end AVR or PIC with some peculiar peripherals.
While this wouldn’t end up at my home projects, as a daytime engineer I like seeing these things on HaD, helps me keep a little more up to speed on products that might be useful in a commercial setting.
I can buy a refurbished T420 with a 320GB HD, 4GB RAM and an i5 processor for $200. Being a laptop, it obviously comes with a 14″ screen, battery and keyboard/touchpad. That’s a much more powerful setup that can run OpenCV with two cameras in a stereoscopic configuration, giving me many of the advantages of RealSense on much more powerful hardware. It may not be as portable, but $200 is almost half of $369. And if I need I/O I can add an Arduino/STM32 board.
I don’t mind paying a bit – up to $149 for this setup. $369 is too steep… I’m not that rich.
What a timely comment.
I ain’t the brightest bulb in the bin, but I’ve been REALLY concerned with where the PC market is going, parTICularly (a) with regard to all of the gurus writing how hard the manufacturers are making it to install Linux on their new offerings, and (b) the general deterioration and increasing price of what’s being offered.
Found a refurbished Lenovo T420 with a clean Win7 install on A****n for $180.00. I never read any instructions, held down the blue “Thinkvantage” keyboard button–as per the on-screen startup instructions–and loaded Mint Linux 17.3. Guess what I ended up with? A real laptop running Linux, and the last of the real Wins, if I should ever need it.
You can still get a real laptop.
What does this have over any of the 96Boards designs? Size?
My first thought: another toy named Joule? There is the sous-vide cooker, some clothing lines, and now this . . . Intel needs better marketing research.
Second thought: for $400 I can buy a damn Skylake chip and ITX motherboard. Really, do some marketing research, guys!
1) the defcon badge was based on the D2000 aka Mint Field, not the Curie
2) the Mint Field CRB is also one of the Intel attempts in this area that Hackaday reviewed: http://hackaday.com/2016/03/31/intel-ups-the-dev-board-ante-with-the-quark-d2000/
This article seems to focus too much on the internet-of-things obligatory buzz word in the press release. It’s obvious this is a board aimed at vision processing in a small package. As such it’s not intended for people who think they are awesome haxxorz when they connect two break-out boards together.
As a plug-in computation module it reduces the effort someone has to put in when designing a compact vision system and there’s not a lot of competition in this computation power per volume category. And with Intel being good at low power design it’s probably a healthy choice from that aspect as well.
People designing such systems laugh at the cost of this module. They pay for the value of computation/volume as you can’t stick an entire laptop inside e.g. a small quadcopter.
So the last Atom product is an overpriced, over-specced compute module. It’s a real pity that Intel ended the whole low-power mobile line, because their recent stuff was quite fast for what it was.
Fail! My fail that is. I saw the announcement the other day and was so excited about a small module with realsense, but…
” and with support for Intel’s RealSense technology”
It’s only SUPPORT for realsense. I thought the module was a bargain!
Awesome..
This appears to be a hybrid of a NUC and a Xeon-D in compute stick format. I can think of uses for this, but none are in the IoT/II/maker space. This makes a nice platform for a media terminal with a multitude of OS choices.
Yeah listen Brian, you might want to change that to USB3.0, I mean it sounds a small difference of only 0.1, but it’s actually a huge difference and this thing doesn’t have 3.1.
Also while fixing things: it isn’t running at 2.4GHz but at 1.7GHz ‘with bursts up to 2.4GHz’.
Just checked Newegg for pricing…. 570x development board only $369.99!
I just bought an Irulu 10″ Tablet / laptop convertible with Windows 10 for $150.
Do the math Intel…