Buyouts, acquisitions, and mergers of semiconductor companies are not unfamiliar territory for anyone who deals with chips and components for a living. Remember Mostek? That’s STMicroelectronics now. The switches used to type this post – Cherry blues – were made by ON Semiconductor. Remember Motorola? Freescale.
Today marks another merger, this time between NXP and Freescale. The deal will create a $40 billion company, putting it among the ten largest semiconductor companies.
Hackaday readers should know NXP for being the only company ever to produce an ARM microcontroller in a DIP package along with thousands of other cool components. Freescale is perhaps best known for their i.MX6 series of ARM processors, but of course both companies have a portfolio that stretches back decades and is filled with tens of thousands of parts.
Is this good or bad?
It is.
Whoever is in charge of putting all the data sheets and app notes in one place is going to fuck something up. They always do. Other than that…
Hopefully the NXP side will benefit from Freescale’s amazing dev tools. NXP’s dev tools aren’t really the greatest. LPCXpresso is confusing at best.
Eh.. I always thought that Codewarrior was pretty terrible.
The text-formatting in it is terrible. But overall better than rolling your own IDE eh?
It’s always better to rely on your (kn)own IDE. And your own bunch of Makefiles. Yes, CodeWarrior is terrible (as are all Eclipse-based IDEs).
Will Freescale remain free, or be taken down by NXP?
Will NXP become more free, maybe release some of their NDA’d datasheets, like DESFire EV1?
I doubt their NDA’d stuff is getting touched, it is under an NDA after all.
No, NXP isn’t going to go publishing things that would help make their competitors smart cards better and help hackers gain access to their smart cards whilst providing no real benefit to NXP. (anyone who is going to be using a system like that will be able to convince NXP they are genuine and will be able to get NDA access to the datasheets)
Let’s do the math.
Add one Dutch company taking over one US company, carry the tax loophole. Adds up to one “Dutch Sandwich” as a way to avoid higher US corporate tax rates.
It’s more like Freescale and NXP are finally realizing that Motorola and Philips threw them overboard without a liferaft. Together they will also sink, again because neither has control over its destiny. Together or separately they will get out-maneuvered by Samsung and Apple and ST and Nvidia and AMD, because those companies can make actual products with their own chips. Freescale and NXP are toast, stick a fork in them, they are done. As one company, you will only need one fork to skewer them both.
Do you really think so?
They have completely different areas of interest. NXP is one of the largest suppliers for automotive, and one of the major industrial ARM producers. NXP produces a lot of discrete and more special-purpose chips. The only company on your list that fits their market is ST.
I don’t see any clear reason why Apple would like to start producing 120A automotive-grade MOSFETs.
Name changes and spinoffs too… NXP started life as Philips semiconductor. Creators of i2c and Coinventors of nfc.
And don’t forget Signetics, which was swallowed by Philips. I think they were Philips Signetics for a while.
when electronics companies stop purchasing each other, that’s when you should worry
they have been consolidating and merging and splitting etc since they made tubes
Well, one of the companies making an ARM in DIL – the other one is Tesla Roznov: http://mcu.cz/news.php?extend.1713
Before you get excited – it’s an April Fools’ joke :)
So is this the nail in the coffin for the old HC12, then?
God I hope so. They had us using that thing in intro programming classes, and even to someone who knew nothing it seemed dated.
That’s because it is dated… we learnt on the Motorola HC11. I can’t see any good reason for us to be using this platform other than our professors being too lazy to learn another one…
Because ultimately it doesn’t matter – unless you’re in a training class to learn a specific piece of hardware or a specific software package, the basic concepts and the process are what’s important. You can re-apply those every time you have to learn a new tool.
For example, once you’ve learned how one assembly language works (what it means in terms of a processor’s operation, not just the syntax), you can apply the same concepts to any other ones you need to learn.
I actually agree – these days, if you are quite smart and clever, you can do quite a lot with an FPGA, but we are probably decades away from the ‘hobbyist’ minting their own ASICs.
I further think I can reinforce your point with one quick question for the ‘novice C/C++’ student: “What’s the deal with these ‘strange’ memory registers, all the manual management [i.e. not Java], etc.?”
Well, after all, it is all hardware and it has a logical ‘root’.
Just an addition, but agreeing with you.
I’d hazard a guess and say the hobbyist, and anyone with less than gigantic volumes, will not be making ASICs in the future either (at least for digital). Mask sets in the latest technology are horrendously expensive, so an FPGA makes more sense even if it has stuff you don’t need.
Quite often, it’s an intentional and wise choice. To stop you learning bad habits before they can teach you better…
All I know is that after the course…
1. My assembly was good (Important)
2. I knew about memory mapping, indirect addressing, understood pointers WELL, registers, masks, interrupts, flags, etc… all VERY important stuff
3. My embedded C sucked; the IDE we used was limited, and setting registers, dealing with interrupts, etc. was done with inline assembly, which is a horrible way to do things. It wasn’t even Win7 compatible…
4. No idea what DMA, I2C, SPI, USB, CAN, or PORT = (1<<PBX) meant… No idea how to bootstrap a project from a cheap platform (the board we bought was $100). PLCC packages… wtf, is this even still relevant?
5. No experience with DSP, digital filters, or high-speed algorithms… the clock speed was too slow for any FLOPs
6. We used a bootloader; we didn’t learn how the code actually gets onto the uC…
I’ve learnt way more on my own using other (non-archaic) architectures. MAYBE using a really old uC doesn’t yield any disadvantages… MAYBE, but honestly I can’t see any advantages.
Interestingly, when I was an undergraduate, in the mid to late 1980s; our assembler/embedded classes used a 68000-based system (yep, the 16/32-bit one, not a 6800). Each one was connected via serial to a Mac Plus and we learned to program them in assembler and Aztec C (which was a poor compiler, though it was interesting to program interrupt routines using it).
So why have courses gone backwards? The obvious answer is that packaging has become less accessible, but newbies need accessible technology to learn on. The 68HC11 (and 68HC12 I guess) were available in PLCC packaging, which is fairly accessible. Hence other Unis have taught on horrible stuff like 8-bit PIC processors*, though perhaps AVRs are more common now.
The nice thing that simple processors like the 68HC11 really have is that the peripherals are relatively easy to set up – and the user manual that describes them is… well, none of them seem to be small any more, but at least far smaller than any Cortex manual.
However, it would have been far better, IMO for Motorola to have based their later MCUs on the 6809 instead of hacking with the 6800 :-(
*(apologies for my blatant bias)
The answer – if it’s not blindingly obvious – is “because it’s what they do, and there is no compelling argument not to.”
I learned on the 68HC11 at uni as well…
Aside from the older and simpler argument.
Consider that the uni had staff who were specialists in it (some had actually written books on it), decades of tried and tested course material, licenses for compilers, and 40-odd dev boards (enough for a large lab to run learning on these things)…
You can’t replace that overnight.
The same as the HC12 likely won’t be leaving any automotive applications in the near future, where there are design teams, factories, and code bases already tooled up to use these devices.
You could argue that the HC11 was out of date at the time (though given that the HC11/HC12 was in current production use within the auto industry at the time, you’d be wrong to argue that).
You could argue that 8-bit processors are so old that they aren’t worth bothering with (but then you’re thinking like someone who knows, rather than possibly being someone learning who has never seen anything like it before).
Not only easy to set up, but simple enough (usually) that a novice can get their head around everything, and not just parts of the package. That’s a big help in several ways.
I guess I’m an old fogie in this hobby/business, my first home computer was a VIC-20. I’d already worked as a programmer before then, but it was really nice to find something that I could understand completely without thousands of pages of documentation … and maybe even more so, something that I could explain to others as well.
The SOC manual for the Beaglebone Black (a TI OMAP) is just under 5000 pages … which – unless I’m missing something – doesn’t even include details on the processor core itself. The ARM core manuals are pretty substantial too.
I did my undergrad degree with an 8-bit PIC… graduated last year and have gone postgrad. The uni is now switching to ARM mbed-based boards, because nobody uses the 8-bit Microchip PICs in their projects, as there are no abstraction libraries. At least with the mbeds you can learn the nitty gritty and move on to the advanced stuff in a single package. Shame it’s taken this long though.
Though obviously they don’t charge ‘top dollar’ (likely due to both fabrication methods and competition), I always wince a bit when a ‘heavyweight’ like Intel reports a ‘bad quarter’ – if only because they make the vast majority of the chips that power ‘most’ of the computers, not just here but literally all around the planet. Their very existence allows others, even if they know naught about how they ‘work’, to log on every day and produce other great things (or cat videos).
But Apple is now the ‘richest’ company in the world – well, basically, it is completely a ‘marketing endeavour’, with a small added ounce of R&D.
As a HaD reader I bring this up because: why aren’t the technologists, if not the ‘makers’ (and under this umbrella one could claim Apple is a ‘maker’, but one with the heaviest of wallets – the kind that excludes all the rest), earning the coin on this?
Mergers in this space tend to make life feel a bit ‘backwards’, I guess.
“with a small added ounce of R&D.”
“As a point of reference, Apple in 2013 spent US$4.5 billion on R&D initiatives”
http://www.tuaw.com/2014/02/12/a-look-at-apples-randd-expenditures-from-1995-2013/
Their sales are WAY better than Microsoft’s and they spend WAY less on R&D
YOU complain and think this is bad, but the market keeps driving their stock price higher and higher:
https://finance.yahoo.com/echarts?s=AAPL+Interactive#{%22range%22%3A%225y%22%2C%22scale%22%3A%22linear%22}
Plus, Apple brought the first 64-bit ARM to market, using their wholly in-house “Cyclone” microarchitecture, before any other ARM licensee did.
Just because Apple don’t talk about tech specs in their marketing doesn’t make their absolute tech achievements any less – they just worked out that the average buyer doesn’t care. They care about something working well, and that’s what they deliver.
(speaking as someone who worked at apple in R&D for 5 years and was blown away by the deep knowledge and commitment in the company. You have any basis to comment, or is it just armchair grandstanding?)
Like
Apple products work. It’s as simple as that.
“Hackaday readers should know NXP for being the only company ever to produce an ARM microcontroller in a DIP package”
NXP was producing only two MCUs in DIP packages, the LPC1114 and the LPC810. A while ago I received an “end of product” notification from Digikey concerning the LPC1114, so there is only the LPC810 left. I hope they won’t drop this one.
I remember the LPC1114 was on an EOL list, then NXP backtracked and said it was not EOL. NXP’s website doesn’t seem to mention its EOL. Anyone have more info?
For the next three years, as a user, you probably won’t notice many changes at all for either NXP or Freescale products. But there will be consolidation. This being Hackaday, it is most relevant to discuss MCUs. I would have to imagine that Freescale’s MCU products are going to be phased in and NXP’s are going to be phased out.
What we have to look forward to are more MCU+NFC SoCs. Perhaps even MCU+RF+NFC combo SoCs, which can lead to all manner of cool wireless devices.
I really hope they keep the LPC4300 line; there aren’t any low-level dual-core ARM controllers anywhere else on the market.
I imagine they would do the opposite – why would NXP ditch their own designs? Cost saving means closing one of the design teams, and that surely means closing down the Freescale design team.
There is less reason to stop making existing Freescale chips though, so I don’t see a problem with availability. Just that future designs will be a mix of NXP and Freescale IP under an NXP brand.
However, almost anything could happen, which is causing a lot of uncertainty amongst customers.
Strangely, I was designing with various LPCs at my last job, and the i.MX6 at the current job.
The ARM lines are pretty complementary between the two companies. Freescale’s PPC stuff probably looks good, too, for higher-performance networking. And they’re moving ARM into that QorIQ line, too.
Not sure what they do with ColdFire, which is more of a peer to the NXP ARM series. Not that two lines are always a problem, but one big reason I went to the i.MX6 is that another ARM supplier decided they really wanted to be a MIPS company.
I betcha all of their product lines will continue as before. There might be some synergy in the long run but these semiconductor lines have value precisely because they continue to be supported. It’s why companies choose to use these products, they can count on them being available for decades. If they make radical changes that annoy their customers they will lose them right quick.
” The switches used to type this post – Cherry blues – were made by ON Semiconductor.”
Not sure where that idea comes from. Cherry is still independent. Why would a semiconductor fab make switches anyway..?
Because they do automotive stuff, and because Cherry is owned by ON Semiconductor.
They’re actually screwing up Cherry pretty bad. Most of that might be from the MX switch patent expiring, but holy crap are they not producing enough switches.
These are two different companies.
Cherry Corp. of keyswitch fame was founded in 1953. It moved from Illinois to Germany in 1979, and was renamed ZF Electronics in 2008.
Cherry Semiconductor was founded in 1972. It is based in Rhode Island and was bought by ON Semiconductor in 2000.
More precisely, Cherry Semiconductor is a subsidiary that Cherry Corp sold to ON Semiconductor.
I’m not in favor of EU companies taking over US companies not due to some sort of nationalism but rather lessons from the past. Anyone remember Airborne Express :) DHL lost ton of money and a great profitable US company was lost. They just don’t know how to do biz in the US. Stick with what you are best at: welfare EU business community!
DHL is an awesome company, they have planet-wide overnight delivery, I’ve used it several times. If they chewed up and digested one of their competitors then I say good for them.
I’m not in favor of EU companies taking over US companies not due to some sort of nationalism but rather lessons from the past. Anyone remember International Signal and Control :) Ferranti lost ton of money and a great profitable UK company was lost. They just didn’t know how to do biz legally in the US. Stick with what you are best at: welfare EU business community!
This is going to put a lot of architectures under one roof: PowerPC, ColdFire, the 68XX legacy lines, 8051, XA, and two complementary and overlapping ARM product lines. I don’t envy whoever has to try to bring all this together into one rational set.
who says they have to “bring all this together into one rational set.”
For example, look at HP: over the years they bought company after company and made no attempt to “bring them together”. They had old Digital products and old Tandem products and old Compaq products, and they are all still separate products today that have nothing to do with each other, even many years later.
Shoot, even within NXP and Freescale they made no attempt to “bring things together”. Freescale has ARM and 68K and PowerPC architectures, again going back decades, and no attempt was made to merge them or “bring them together”.
We are going to see some interesting products if the sensor gurus at NXP and Freescale start working together.
Just a thought:
An A15/LPC4xxx/FXLC95xxxCL big.LITTLE.micro