Hacker News
Deep inside ARM's new Intel killer (theregister.co.uk)
54 points by protomyth on Oct 21, 2011 | hide | past | favorite | 42 comments


This is about our inability to use multi-core.

From the performance graph (http://regmedia.co.uk/2011/10/20/arm_performance.jpg), two A7s would have the same raw processing power as one A15 - at less than half the power. The problem is we can't utilize that raw power effectively. So, instead, this awful, inelegant, stupid design is probably the absolute best we can do at the moment.

People claim that functional programming's immutability solves this problem. But it's interesting that while Erlang, one of the most effective multi-processor languages, is functional, it doesn't rely on that for multi-core: instead, it uses a shared-nothing architecture (like Smalltalk), which is independent of the fp paradigm, and STM (Software Transactional Memory). The second interesting thing is that incredible rewards are available today for a solution (and have been for a while) - so where are the fp solutions? Why haven't they taken over, or had substantial success - or, at least, appeared?
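The shared-nothing idea described above can be sketched outside Erlang too. Below is a minimal actor-style counter in Python (illustrative only; the worker name and message protocol are invented): the worker owns its state privately and is reachable only through message queues, so no locks on shared data are needed.

```python
import threading
import queue

def counter_actor(inbox: queue.Queue, replies: queue.Queue):
    """A 'process' that shares nothing: its state is a local variable."""
    count = 0  # private state, never visible to other threads
    while True:
        msg = inbox.get()
        if msg == "get":
            replies.put(count)   # answer by message, not shared memory
        elif msg == "stop":
            break
        else:                    # treat anything else as an increment
            count += 1

inbox, replies = queue.Queue(), queue.Queue()
worker = threading.Thread(target=counter_actor, args=(inbox, replies))
worker.start()

for _ in range(3):
    inbox.put("inc")
inbox.put("get")
result = replies.get()           # -> 3
inbox.put("stop")
worker.join()
print(result)
```

Because the only coordination points are the queues, the same code works whether the actors share a core or not, which is the property the comment is gesturing at.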

Multi-core is a hard problem. Silicon is cheap but we don't know how to use it (excepting GPUs/CUDA). I think this will not be solved with any existing methods (including 50-year old fp). Instead, it will require a complete reframing of the question, from the silicon-up. That is, it won't even look like a programming language - or a computer.

It's hard to overstate the rewards for solving this.</provocativerant>


Erlang wasn't purpose built for multi-core though. It just so happens that everything desired for ultra reliability and continuity ended up working well for multi-core applications. The share-nothing architecture is not "for multi-core" it is for stability and reliability. The multi-core fit is a great bonus that happens to now be very important. Ditto for functional programming's immutability.


True; I was commenting on what makes it good for multi-core. Share-nothing helps; immutability doesn't (because they don't share data, unless on the same core). Unless there's some other benefit to immutability for multi-core...?


There is a non-intuitive asymmetry in the nature of systems like multi-core. Do four 10mph car crashes do the same damage as one 40mph crash? Only if they can be properly coordinated. And if you can coordinate them, how much does the coordination cost? At worst you only get the power of one 10mph crash, and often that's the best case too.

Programming for multi-core can be a hard problem, but the problem doesn't always lie in our abilities; many times it's computationally impossible.

Playing games and video is a must, right? We want the most from our battery, right? A fact of the situation is that high-power activities are often mutually exclusive with low-power activities, right? So what is awful, inelegant or stupid about the design? The asymmetry? That the interaction is asymmetric? Should you carry around a dumb-phone, a Game Boy, and a video player? Maybe more elegant, but it sure is more awful.

P.S. Don't talk about FP son ;)


"Do four 10mph car crashes do the same damage as one 40mph crash? If it can be properly coordinated."

I know this is totally unrelated, but it is worth correcting: kinetic energy is 1/2 m v^2, so the faster car has 16 times the kinetic energy of each of the slower cars. That is why braking distances do not go up linearly with car speed and why high-speed accidents are far more lethal than low-speed ones.
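To make the arithmetic concrete, here is the comparison in a few lines of Python (the mass value is arbitrary and purely illustrative):

```python
# Kinetic energy scales with the square of speed: KE = 0.5 * m * v**2.
m = 1000.0                    # kg; same mass for every car (illustrative)
ke_10 = 0.5 * m * 10**2      # one 10 mph crash (units illustrative)
ke_40 = 0.5 * m * 40**2      # one 40 mph crash

print(ke_40 / ke_10)          # 16.0: the fast crash vs. ONE slow crash
print(ke_40 / (4 * ke_10))    # 4.0:  still 4x even vs. all FOUR slow crashes
```

So even granting the analogy, four small crashes combined carry only a quarter of the energy of the single big one.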

I guess there is an analogy here that one double-speed CPU beats two single-speed ones for most tasks, but I do not think it is very apt.


I enjoyed reading Ars Technica's take on this a while back, entitled "All this has happened before: NVIDIA 3.0, ARM, and the fate of x86"

http://arstechnica.com/business/news/2011/02/nvidia-30-and-t...

Predictions of x86's imminent demise are, well, wrong. But predicting ARM's destruction at the hands of Intel is just as unlikely.

There's lots of market for both companies. They have different origins, different goals, and different areas of expertise.


> They have different origins, different goals, and different areas of expertise.

The greatest threat to Intel is the decreasing importance of the PC and the increasing importance of tablets, überphones and other non-Intel-based devices. Intel is common on desktops because the most popular desktop OS runs on nothing but x86, and on servers largely because what we call a server these days is just a (sometimes exquisitely) well-built PC with rack-mounting rails; it has to be a PC because a very popular server OS can't run on anything that isn't an x86 PC.

So, as x86s vanish from desks, there will be less reason to run desktop-derived OSes on servers and less reason to run them on x86 servers. With Windows 8 there is even less reason to have x86s in the remaining desktops, since users can opt for more power-efficient ARM engines. I am not sure whether Windows Server 8 will support ARM-based servers, but, if it does, I expect the x86 to become a legacy platform.


I'll chime in on the "Post-PC Era" buzzword ... but only with a disclaimer: I don't buy it.

Intel wants to grow their market, so they're investing in UMPCs and MIDs. I think it's pretty telling they aren't in any phones (yet). They're going to try that in 2012.

Ok. "Post PC?" I don't buy it. PCs may change to ARM, but there will always be IT, science, engineering, data visualization, education, ... -- stuff that might work on a phone or a tablet, but the keyboard is a "killer app."

I actually see the two markets as symbiotic. Only in corner cases can you find people who prefer a smartphone over a PC, or vice versa. It looks like mainstream applications benefit from having both.


The "Post PC" era is coming, have little doubt of that. However, it'll be more of a phase transition than a cataclysm. Mostly it'll come down to different form factors, different operating systems, and different UIs. And yes, ultimately the markets will be even more symbiotic than they are today.


>The "Post PC" era is coming, have little doubt of that.

I see a lot of people saying this, but I don't see much evidence for it. Anybody who has tried to edit a document or spreadsheet on a tablet knows PCs aren't going anywhere.


Do you agree we are in the post-mainframe and post-mini eras? Yet you can still buy mainframes and (probably) minicomputers. The end of an era does not mean the end of that type of computing, just that new technology development won't center around it.

We are in the post-PC era.


Mainframes and minis were replaced by PCs. Tablets and mobiles may complement PCs, but they won't replace PCs. Like I said, I don't see any actual evidence there's a "post-PC" era. It's more like a "Post-DS" era.


Mainframes continue to be used to this day. Minis were squeezed out. PCs will likely continue to be used.

Eras don't end with a switch flipping. Things fade out. PCs have started their fade as we move to mobile. We still have little idea what mobile will evolve into, but there is no question it and not PCs is what is evolving.


Mainframes were replaced by PCs scaled up to mainframe levels; the old mainframe lines died off. That will probably happen to some degree with mobile and PCs as well. Laptops in the future will be more like tablets with keyboards than like the laptops of today, for example. And future desktop OSes may owe as much heritage to today's mobile OSes as to today's OS X and Windows.


> Mainframes were replaced by PCs scaled up to mainframe levels

At the same time, mainframes moved on, acquiring new features PC-based servers cannot yet replicate. You'll never get the 5 nines a zSeries gives you out of a Dell. Or an xSeries IBM box. It's an incredibly sophisticated stack that goes from the user-facing applications and how you design them down to the processor microcode and processor cores dedicated to hardware management functions.


Post-PC devices are not necessarily keyboardless tablets. The transition we'll see has much more to do with where the data is stored and where programs are running than with the physical format of the device.


Then it needs a different name, right? "Post-PC device" implies it has left the PC model behind, and although new input methods may arrive, PCs will still be around. Specifically, laptops are what I'm thinking of when I say PC, and even though the form factor isn't evolving rapidly, tablets have been around a long time and their form factor hasn't evolved that much either.

Now, maybe you'd say that the shape of an iPad is way different from the shape of a Windows tablet from a decade ago, but it isn't. They're both as flat as possible with just a screen.

The transition to SaaS and the cloud does not mark the end of the laptop.


Your idea of a personal computer is tied to a keyboard or to the device's physical form. I interpret it as something much more related to function: a device that holds and processes data locally, with locally installed applications, that you have to manage yourself, dealing with files that can be processed by different applications. This is what a personal computer has been since their invention. A "post-PC" device (and I see it as mostly a marketing term, but a useful way to refer to these devices nevertheless) is something that won't make you manage your files, that has a simple way to install/remove applications, and that is tied to always-available remote storage and services. A Chromebook is much more post-PC-ish than an iPad, but that's another discussion.


People in, say, 2030 won't stop using keyboards, and they won't stop using devices which have a lot of similarities to what we would call a PC today, even down to the metal chassis with PCBs inside connected to a monitor and keyboard in some cases. However, there are a lot of other very fundamental elements to the computing experience which pretty much every computer system will have which will be inherited from non-PCs (specifically tablets and smartphones).

To be clear: the tablet is the first hint of the post-PC era we have, but it does not and will not define the entire extent of the post-PC phenomenon.


Cool, I can agree with that.

New input methods will take on a greater role. E.g. Kinect (time of flight sensors), Multi-touch, Voice, Light fields.


If symbiotic, then not "Post PC." I don't buy it. Maybe I need to phrase it as a question:

Why would you get rid of your PC once you have more form factors, more OS's, and more UIs? Especially, why would you get rid of your _keyboard_?


See my other reply. I don't take "post PC" to mean "no keyboard ever", it has a different meaning for me, mostly about the difference in default (though not exclusive) form-factor and especially in the OS and UI. But beyond that there's still plenty of flexibility.

Also, I don't think the PC as we know it will die per se, it'll just fill a different niche than it does today.


Ars had a better take on it, but that is to be expected.

I was reminded of how MIPS used to be the best-selling ISA, but that was because it was in HP and Lexmark printers, so tons were sold and nobody realized it.

The tablet/smartphone markets have really pushed ARM into a position where the sheer number of implementations justifies the time and effort to build better software. This helps the architecture achieve better tool parity, and that is something that is vital for its longevity.


> because it was in HP and Lexmark printers

And game consoles. PS1, PS2 and N64, IIRC, were MIPS boxes.


And a huge amount of communications gear, e.g. routers, cable/dsl modems, wifi access points etc


I saw MIPS in embedded solutions too - engines, industrial processes... It was a very popular architecture.


According to the article, Intel's strength is in per-thread performance. But most of today's heavy applications benefit from multi-threaded performance: multi-user servers, heavy math applications, and other applications written with today's better multi-programming tools.

So that means selling a lot fewer chips.

But that's bad for Intel's business model, and it gets worse with Moore's law (they'd have to sell double the number of transistors).

Now about demise: it's hard to tell, because it depends on the minimum sales Intel needs to sustain its business model. But Intel is trying to counteract that with several interesting moves:

1. An integrated PC platform strategy: CPU+GPU integration, chipsets with better integration with their flash modules (z86 chipset), buying an anti-virus company. This way they can get more money and have more control over the PC market. This may compensate them for decreasing volumes, and hurt GPU manufacturers along the way.

2. Finding new sources for chip-manufacturing R&D investment: manufacturing flash, letting non-competing companies manufacture chips using Intel fabs.

3. Trying hard to compete in the multi-core performance game: letting old code (C++/Fortran, etc.) run faster on a 64-core Intel platform might be more appealing than needing to port that code to a GPU. Working on parallel development tools also helps there.


I don't know quite why every recent competitive relationship in tech has been portrayed as an imminent existential battle, but it's an annoying trend.


It sucks. I think it's as simple as: page views.

Reduce the news to something as simple as possible, give people a side to cheer for. Who would have imagined that tech would turn in to a spectator sport?


Because tech writers need something to write about, and rises and falls are more exciting than coexistence. I got tired of reading endless "PCs are dying, so run out and buy more stuff!!!11" articles and decided to write a response to them: http://jseliger.com/2011/10/09/desktop-pcs-arent-going-anywh... that applies to the analogous situation you bring up.


On the other hand, this will probably head off MIPS's attempts to encroach on ARM's markets.


I'd really like to know if MIPS is actually competitive at the same performance-per-watt levels as ARM.

As I understand it, MIPS is easy to license and cheap to build, so when cost is the biggest factor you go with MIPS. That's nothing to sneeze at.

The way I see it ARM dominates in phones and tablets because these devices do much more than, say, a wireless router.


ARM, especially ARM-thumb mode, is the gold standard for instructions/watt. That's why it's been in phones since way back. I don't see MIPS challenging that.


This is the A7 mentioned yesterday. The article is worth a peek if only for the two contrasted CPU instruction diagrams showing the difference in complexity and the graph of performance versus power for the two chips.

It all makes sense when you see those.


Here's ARM's whitepaper that the article links and summarizes: http://www.arm.com/files/downloads/big.LITTLE_Final.pdf


I've always thought it strange to think that ARM (or any other CPU design) company could take out Intel.

Worst case in the ARM scenario, Intel buys a company with an ARM license and is still a step ahead on process and fab technology.

To boot, increased mobile device usage (even on ARM) is probably good overall for Intel as it helps drive demand for the servers powering the services those devices consume.


ARM chips are priced an order of magnitude lower than Intel's chips. Their process advantage would only last one generation because the fat profit margins aren't there to pay for the next one.

If ARM kills Intel, it's because cheap chips are finally "good enough" for nearly all purposes.


Intel already bought and then sold an ARM licensee:

http://en.wikipedia.org/wiki/XScale


I'm curious to see how the Linux process scheduler will be tweaked to load-balance processes across the big dog and little dog processors. IO-bound processes could be migrated to the little dog and CPU-bound processes to the big dog.
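For illustration, a toy version of such a policy might look like the sketch below. This is not the actual Linux scheduler; the function name, the utilization heuristic, and the threshold are all invented:

```python
# Toy big.LITTLE placement policy (invented, not the real kernel code):
# route a task to the "big" or "little" cluster based on what fraction
# of its recent wall-clock time was spent actually running on a CPU.

def pick_cluster(cpu_time: float, wall_time: float, threshold: float = 0.5) -> str:
    """Return 'big' for CPU-bound tasks, 'little' for IO-bound ones."""
    if wall_time <= 0:
        return "little"            # no history yet: default to efficiency
    utilization = cpu_time / wall_time
    return "big" if utilization >= threshold else "little"

print(pick_cluster(cpu_time=0.9, wall_time=1.0))  # CPU-bound -> 'big'
print(pick_cluster(cpu_time=0.1, wall_time=1.0))  # mostly waiting on IO -> 'little'
```

The interesting real-world questions are how cheaply utilization can be tracked and how expensive the migration between clusters is, since the article's whole pitch is that the switch has to be invisible to software.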


From the description, it seemed like there is a mode to handle this transparently in hardware.


I believe the OS can, optionally, make these decisions itself if it wants to.


"big.LITTLE processing".

Funniest name since Blast Processing.

I imagine little Sackboys running around my smartphone, carrying bits of soft polyester data.



