Intel's Core 2011 Mobile Roadmap Revealed: Sandy Bridge Part II
by Anand Lal Shimpi on August 30, 2010 8:01 AM EST
The Roadmap
The roadmap for mobile is a lot easier to read through than the desktop one. There aren't as many competing products within a given price class. I've put together the Q3 2010 - Q3 2011 mobile CPU roadmap below, but I've left out the value segments. Sandy Bridge won't make it down there until late next year at this point, so Celerons are off limits for now.
If you're an Apple user, the parts you'll want to pay attention to are the 2620, 2540 and 2520 - these will likely be in the next 15-inch MacBook Pro. Clock speeds are up slightly compared to what Apple is shipping today, which means you'll probably see at least a 10% performance improvement across the board. I'd expect that number to grow to as high as 15 - 20% depending on the application.
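To put rough numbers on that (the clocks and the IPC figure below are my own assumptions for illustration, not confirmed specs), a quick back-of-envelope sketch:

```python
# Back-of-envelope estimate. All figures are assumptions for illustration,
# not confirmed clocks or measured IPC gains.
current_clock_ghz = 2.53      # assumed clock of a current 15-inch MacBook Pro CPU
sandy_bridge_clock_ghz = 2.7  # assumed base clock of the Core i7-2620M
ipc_gain = 1.10               # assumed ~10% average IPC improvement for Sandy Bridge

clock_gain = sandy_bridge_clock_ghz / current_clock_ghz   # ~1.07x from frequency alone
total_gain = clock_gain * ipc_gain                         # ~1.17x clock + IPC combined
print(f"clock only: +{(clock_gain - 1) * 100:.0f}%  clock + IPC: +{(total_gain - 1) * 100:.0f}%")
```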
Despite Sandy Bridge's vastly improved graphics, I don't believe Apple will abandon NVIDIA, given SB's lack of OpenCL support.
Final Words
Sandy Bridge looks to be very capable, both on the desktop and mobile side. Both CPU and GPU performance are much improved, the latter particularly in notebooks as all launch mobile Sandy Bridge parts will ship with the higher end 12 EU configuration. Intel is clearly going after the low hanging fruit in the GPU market, though I'm curious to see how far upstream Intel will push its advance.
It's not very hard for Intel to more than double integrated graphics performance. The question is how it will compare to AMD's Llano, a part that will undoubtedly have a competent GPU but a CPU core based on AMD's Phenom II architecture. 2011 is going to be an exciting time for the semiconductor market.
52 Comments
IntelUser2000 - Monday, August 30, 2010 - link
Holy crap. It doesn't make sense, but 650/1300MHz with 12 EUs seems amazing on mobile. I'd WAG the 650/1300MHz IGP is clock speed equivalent to a 1.1GHz "stable" frequency GPU, while the 850/1350MHz on desktop is equal to 1.25GHz.
Still, the 12 EUs is a big thing. So the question is, 1 vs 2 core. Does it really mean EU-wise or more than just EU? If it's just EUs the difference might be much smaller than 50%.
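One way to read that guess (purely an illustration with made-up turbo residency fractions, not Intel's method) is as a residency-weighted average of the base and max turbo clocks:

```python
# Residency-weighted "stable frequency equivalent" for the guess above.
# The residency fractions are invented for illustration only.
def effective_clock_mhz(base_mhz, turbo_mhz, turbo_residency):
    """Average clock if the IGP holds max turbo for `turbo_residency` of the time."""
    return turbo_residency * turbo_mhz + (1 - turbo_residency) * base_mhz

print(effective_clock_mhz(650, 1300, 0.70))   # ~1105 MHz, near the ~1.1 GHz guess (mobile)
print(effective_clock_mhz(850, 1350, 0.80))   # ~1250 MHz, near the ~1.25 GHz guess (desktop)
```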
CharonPDX - Monday, August 30, 2010 - link
The difference is that the mobile space has a lower power threshold to hit, and Intel would like to muscle discrete GPUs out of the equation completely in both the low end and the midrange. So by offering such aggressive turbo (plus all SKUs having more EUs), they make it more likely that midrange systems will forgo discrete GPUs. For example, we may very well see Intel integrated graphics as the standard, even on the highest-end Apple products, with a discrete GPU solely as an additional-cost option.
In the mobile space, you can trade off CPU power usage for GPU power usage more readily. On the desktop you don't need to do that nearly as much, given the larger power envelope and the fairly common acceptance of a low-end discrete GPU in mid-range systems.
For the majority of games, even two CPU cores aren't running at full load, so the CPU half could run at stock speeds, possibly even idling one or two cores on the quad-core parts, while the GPU could take up all the power headroom to boost itself up all the way. Once you're back at your desktop doing some video compression, though, the GPU can drop to idle, and all (potentially four) cores of the CPU can take all the power they can muster to get that over with.
Now, if you play a game that fully stresses all your CPU cores *AND* your GPU... you end up with a mess.
I can't wait to see benchmarks of CPU-heavy games, tweaked with different settings. You'll probably find that settings that should be purely CPU-intensive will drop your framerate noticeably, as the GPU doesn't have the headroom to clock up when the CPU is drawing the power.
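A minimal sketch of the kind of shared-budget arbitration described above (the wattages and policy are invented; this is not Intel's actual turbo algorithm):

```python
# Hypothetical sketch of sharing one package power budget between CPU and GPU turbo.
# The wattages and policy are invented for illustration; this is not Intel's algorithm.
PACKAGE_TDP_W = 35.0          # assumed mobile package budget
CPU_BASE_W, GPU_BASE_W = 15.0, 8.0

def allocate_budget(cpu_load, gpu_load):
    """Split the leftover headroom in proportion to which side is busier."""
    headroom = PACKAGE_TDP_W - CPU_BASE_W - GPU_BASE_W
    gpu_share = gpu_load / max(cpu_load + gpu_load, 1e-6)
    cpu_limit = CPU_BASE_W + headroom * (1 - gpu_share)
    gpu_limit = GPU_BASE_W + headroom * gpu_share
    return cpu_limit, gpu_limit

print(allocate_budget(cpu_load=0.3, gpu_load=1.0))  # gaming: most of the headroom goes to the GPU
print(allocate_budget(cpu_load=1.0, gpu_load=0.1))  # video encode: the headroom swings back to the CPU
```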
IntelUser2000 - Monday, August 30, 2010 - link
You don't see a full-fledged GeForce GTX 480 in a laptop because the thermals don't allow it. If you're offering a faster configuration on the mobile variant, it just means your desktop part is crippled and not showing its full potential. They can't take such a chance when Llano's GPU is rumored to be substantial. They need all the advantages they can get.
About Turbo: I think it's too early to judge how it'll do in CPU-intensive games. If implemented well it can work pretty well. The Turbo driver for graphics on Arrandale has a frame rate counter. If they can see that boosting the GPU works better than boosting the CPU, it can be programmed so GPU Turbo kicks in more often.
It's not "oh we'll let the software decide what to do", but rather "hmm most games require GPU power so we'll make the software Turbo more on GPU".
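A toy sketch of the feedback loop being described, using a frame-rate counter to decide which side gets the turbo bias (hypothetical, not the actual driver logic):

```python
# Hypothetical frame-rate-feedback bias, in the spirit of the comment above.
# Not actual Arrandale/Sandy Bridge driver code; numbers are illustrative only.
def turbo_bias(fps_gain_when_gpu_boosted, fps_gain_when_cpu_boosted):
    """Prefer boosting whichever side has historically improved frame rate more."""
    gpu_avg = sum(fps_gain_when_gpu_boosted) / len(fps_gain_when_gpu_boosted)
    cpu_avg = sum(fps_gain_when_cpu_boosted) / len(fps_gain_when_cpu_boosted)
    return "GPU" if gpu_avg >= cpu_avg else "CPU"

# Most games respond better to a GPU boost, so the bias usually lands there.
print(turbo_bias([8.0, 6.5, 7.2], [1.5, 2.0, 0.8]))  # -> "GPU"
```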
IntelUser2000 - Monday, August 30, 2010 - link
Sorry for the double post. Everything else is good for Anandtech except for the retarded posting system.
iwod - Tuesday, August 31, 2010 - link
Unless Apple and Intel develop OpenCL drivers for this new Intel IGP, Apple using it on its own in their laptops will be out of the question.
Calin - Tuesday, August 31, 2010 - link
A game that fully stresses both the CPU and the GPU belongs on a gaming rig (or a gaming laptop), not on a regular laptop.
This "powerful integrated graphics" will also greatly simplify the cooling (one heat sink, one fan) and the power delivery (probably cutting the number of components in half). Along with genuinely "fast enough" graphics, it looks like NVIDIA is out of the market for midrange laptop graphics (and AMD/ATI is also out of midrange laptop graphics for everything outside its own processors/chipsets).
bennyg - Thursday, September 2, 2010 - link
lol. My G51J has only 1 fan for 100W+ worth of i7-720QM CPU and GTX 260M GPU and manages to keep it below boiling point. Not by much, mind you...
Midrange graphics won't be touched. We're only talking about 2-3x the power of current integrated gfx, which only puts the low-end dedicated GPUs under threat, where arguably the feature set is more of a selling point than grunt anyway. If these iGPUs can match on feature set then we'll see the "6400"/"6500"/"410M" et al have a tough time justifying their existence.
SteelCity1981 - Monday, August 30, 2010 - link
I'm surprised Intel didn't name the 2720QM or 2820QM something along the lines of 2750QM or 2850QM, considering the clock speeds are a bit faster than the original 720QM's and 820QM's, which sounds like a more evolutionary step to the next level beyond the 740QM and 840QM's. Also, I take it that Q4 of 2011 will probably see a refresh of the 2nd gen mobile Core i-series lineup, much like the current mobile Core i-series saw this year.
bennyg - Thursday, September 2, 2010 - link
740/840/940 wasn't a refresh. It was a rebadging of the kind we hate Nvidia for. It was just a redrawing of the lines between speed bins. They are EXACTLY the same chips as were used in 720/820/920, just with a one-step-higher multiplier across the board.
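For the curious, the multiplier arithmetic behind that point (BCLK is roughly 133 MHz for these parts; the exact multiplier pairings are my reading of the public spec sheets):

```python
# Multiplier arithmetic behind the rebadging point. Clarksfield's BCLK is ~133 MHz;
# the multiplier pairings below are my reading of the public spec sheets.
BCLK_MHZ = 133.33
steps = {"i7-720QM -> 740QM": (12, 13),
         "i7-820QM -> 840QM": (13, 14),
         "i7-920XM -> 940XM": (15, 16)}
for name, (old_mult, new_mult) in steps.items():
    old_ghz = old_mult * BCLK_MHZ / 1000
    new_ghz = new_mult * BCLK_MHZ / 1000
    # Intel's spec sheets round these to 1.60/1.73, 1.73/1.86 and 2.00/2.13 GHz.
    print(f"{name}: {old_ghz:.2f} GHz -> {new_ghz:.2f} GHz (one multiplier step)")
```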
I also see the confusion everywhere... go onto any forum where people ask "what notebook should I buy" and you'll see an "i5 vs i7" or "dual vs quad" thread on a daily basis, with a totally confused OP, and sometimes the replies are even more wrong.