Ok, I am going off on a tangent. I understand technology is expanding, and that CPUs have a huge number of transistors now on a single die. So with our knowledge and manufacturing capabilities, why are we still making motherboards as big as a VCR? Seems to me a lot of space could be saved by creating a semiconductor die for all those capacitors, resistors, inductors, and other semiconductor packages. After all, you're spending all that money to develop a new system. Why not make it sportier and lighter in weight, tie it in with the newest VR gear, and have a game-anywhere super system?
The embedded RAM + DDR3 paradigm is sort of a gimme to Intel. It probably will not even be that long before Intel starts shipping a CPU with embedded RAM and their equivalent of 768 AMD shaders (3x HD 5200?). When they do, they will be able to snatch up a handful of developers with lots of experience working with that exact setup. It seems like a mistake for AMD to provide such a design knowing it is going to hurt them down the road. This is another one of those "what the hell were they thinking" moments.
I think Shadowmaster625 was implying that AMD is giving developers experience working on such a platform ready for Intel to come in and sweep them up. I don't think it's relevant as there's nothing particularly exotic about having a small amount of fast memory when you consider how developers have had to deal with the Cell chip.
I think the first microprocessor with embedded memory was the 1975 Fairchild F8 which had 64 bytes of scratchpad. The Motorola 68k line also had internal cache before Intel's x86. The 68040 from 1990 had the same amount of internal cache, 8KiB, as the Pentium Pro. The Pentium Pro, though, had a large on package cache (256KiB to 1 MiB).
So AMD should give up a guaranteed revenue stream from the two biggest console manufacturers for 10+ years just so they won't (as a side effect) train developers on an obvious architecture? That doesn't make any sense.
I wonder how close the DDR3 plus small fast eSRAM can get to the GDDR5's peak performance in the PS4. The GDDR5 will be better in general for the GPU, no doubt, but how much of that will be offset by the eSRAM? And how much will GDDR5's high latency hurt the CPU in the PS4?
The CPU is running at a low frequency, ~1.6 GHz, which is half the frequency of most mainstream processors. And the GDDR5 latency shouldn't be more than double the DDR3 latency. So in effect the latency stays the same, relatively speaking.
GDDR5 actually has around ~8-10x worse latency compared to DDR3. So the CPU in the PS4 is going to be hurt. Everybody's talking about bandwidth but the Xbox One is going to have such a huge latency advantage that maybe in the end it's going to be better off.
GDDR5 having much worse latency is a myth. The underlying memory technology is all the same after all; just the interface is different. Though yes, the memory controllers of GPUs are more optimized for bandwidth rather than latency, that's not inherent to GDDR5. The latency may be very slightly higher, but it probably won't be significant enough to be noticeable (no way is it a factor of even 2, let alone 8 as you're claiming). I don't know anything about the specific memory controller implementations of the PS4 or Xbox One (well, other than one using DDR3 and the other GDDR5...) but I'd have to guess latency will be similar.
Are you talking latency in cycles (i.e. relative to memory's clock rate) or latency in seconds (absolute)? Latency in cycles is going to be worse, latency in seconds is going to be similar. If I understand it correctly, the absolute (objective) latency expressed in seconds is the deciding factor.
I got my info from Beyond3D, but I went to dig into whitepapers from Micron and Hynix and it seems that my info was wrong. Micron's DDR3-2133 has a CL14 read latency specification, possibly set as low as CL11 on the Xbox. Hynix's GDDR5 (I don't know which brand of GDDR5 the PS4 will use, but they'll all be more or less the same) has CL18 up to CL20 for GDDR5-5500. So even though this doesn't give actual latency information, since that depends a lot on the memory controller, it probably won't be worse than 2x.
Nowhere near as bad as I thought GDDR5 would be given what everyone is saying about it to defend DDR3, and given that it runs at such a high clock rate the CL effect will be reduced even more (that's measured in clock cycles, right?).
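For what it's worth, here's a rough sketch of that conversion in code. It assumes the CAS figures are counted in command-clock cycles (data rate ÷ 2 for DDR3, ÷ 4 for GDDR5); that convention and these specific speed grades are my assumption, not datasheet values:

```python
# Rough CAS-latency-to-nanoseconds conversion.
# Assumption: CL is counted in command-clock cycles, where the command clock
# is data rate / 2 for DDR3 and data rate / 4 for GDDR5.
def cas_ns(data_rate_mtps, cl, transfers_per_clock):
    command_clock_mhz = data_rate_mtps / transfers_per_clock
    return cl / command_clock_mhz * 1000.0  # cycles / MHz = microseconds; x1000 -> ns

print(f"DDR3-2133  CL14: {cas_ns(2133, 14, 2):.1f} ns")   # ~13.1 ns
print(f"DDR3-2133  CL11: {cas_ns(2133, 11, 2):.1f} ns")   # ~10.3 ns
print(f"GDDR5-5500 CL18: {cas_ns(5500, 18, 4):.1f} ns")   # ~13.1 ns
print(f"GDDR5-5500 CL20: {cas_ns(5500, 20, 4):.1f} ns")   # ~14.5 ns
```

If that convention holds, the CAS portion of the latency lands in the same ~13 ns ballpark for both, and the real differences would come from controller and queuing behaviour rather than the DRAM itself.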
For game performance, GDDR5 has no equal at the moment. There is a reason why it's used in GPUs. MS is building a media center, while Sony is building a gaming console. Sony won't need to worry so much about latency for a console that puts games first and everything else second. Overall the PS4 will play games better than the Xbone. Also, eSRAM isn't a good thing; the only reason Sony didn't use it is because it would complicate things more than they should be. This is why Sony went with GDDR5: it's a much simpler design that will streamline everything. This time around it will be MS with the more complicated console.
Also, let's not forget you only have 32 MB worth of eSRAM. At 1080p, devs will push for more demanding effects. On the PS4 they have 8 GB of RAM with around 70 GB/s more bandwidth. Since DDR3 isn't good for graphics, that only leaves 32 MB of true VRAM. That said, the Xbone can use the DDR3 RAM for graphics, the issue being that DDR3 has low bandwidth. MS had no choice but to use eSRAM to claw back some performance.
I've been a long-term Xbox fan, but the silly Kinect requirement scares me. It's only a matter of time before somebody hacks that. And I'm a casual sit-down kind of gamer. Who wants to stand up and wave arm motions playing Call of Duty? Or shout multiple voice commands that are never recognized the first time around?
If the PS4 eliminates the camera requirement, gets rid of the phone-home Internet connections, and lets me buy used games, then I'm willing to reconsider my console loyalty.
Are you required to use the Kinect controller, or can you set what controller you want? I'm betting you can choose the one you want and not be forced to use Kinect. It wouldn't surprise me to see a later/lower-cost version sold without Kinect, though.
MS had no choice but to use eSRAM; DDR3 has 1/3 the bandwidth of GDDR5. It's like using a band-aid on a sinking ship. Sony made the PS4 like a PC, down to the way the GPU uses GDDR5. Latency-wise it won't be overly bad for the CPU.
On the size and cooling, it's a 100 watt APU, they said in their technical talk with Engadget. Much lower than what the 360 debuted at. The reason it's as large as the launch 360 is that the PSU is internal this time.
I don't think either was confirmed at this point, I did see the prototype power brick but who knows with retail hardware. If the One is that big PLUS a huge external brick, wtf.
An MS exec in one interview indicated they prefer an external PSU for simplifying shipping the product into different regions. The core box remains identical with the external brick handling the power localization. It also gets a big heat source out of the box. They're likely to be intent on low cooling noise after the valid criticism the first couple iterations of the 360 received.
There was a certain amount of rush to the 360's development due to the need to end production of the original Xbox and not leave a huge time gap in the shift to the new platform. The success of the 360, after those early debacles, meant they could take their time on this design and do a lot more testing for industrial factors like noise suppression.
If it's a basic Jaguar in there, what do you think Microsoft meant by saying it can do 6 operations per core per clock? Jaguar would be 4. Unless they meant load and store as the two extra.
Yes exactly, load/store count separately. 2 Int + 2 FP + 2 LS making it 6. The more important number is probably dispatch/retire/decode width anyway (which is 2 for all of these).
@Anand: "We already have the Windows kernel running on phones, tablets, PCs and the Xbox, now we just need the Xbox OS across all platforms as well."
Agreed, this would be a nice selling point for a future Surface Pro device. It would be a relatively cheap way to bring the added value necessary to justify the higher cost in the eyes of the mass market.
So the Surface would have to run a hypervisor which would require like 8+ GB of memory if the user wanted a traditional Windows experience along with Xbox games.
2 options there: 1) Dump memory to flash if you *really* feel the need to run both simultaneously at full-tilt (daft) 2) 16GB RAM, which will cost a whole lot of not much by the time this idea becomes feasible.
There are a lot more iPads than Xboxes out there... Microsoft wants the Surface to have its place there, and well, that will probably make them more money than simply winning this generation's 'console war' - makes some sense. It wasn't messaged like that at all by the company, but you've got to appreciate its potential merit nonetheless.
Meh. If they kill the box and make tablets and mobile their gaming focus, that gives the iPad and Android an ample opportunity to become their prime competitors, and the iPad already has a lot of headway on mobile gaming.
I don't think we can connect 2 or more controllers plus LCD TV to our smartphones just yet. And neither a smartphone nor a tablet can play high end games, certainly not at anything close to reasonable frame rates. Those who think a smartphone is going to replace consoles, PCs, and NASA supercomputers in the next 8 years all live in fantasy land.
Sure we can. There's a fair few phones with HDMI connectors, and it wouldn't take much in the way of software tweaks to pair multiple bluetooth controllers (assuming it's not already supported).
Actually, I have a Note 2: attached to the smart dock and a couple of powered USB hubs, I can connect 4 controllers to it and watch it on my 28" monitor or 55" LED panel, hook up my external drives and play my Blu-ray movies, as well as surf, watch Netflix and the rest of it, all while listening to my stereo Bluetooth headset. Convergence is happening already. Take off your blinders and look around!
They wouldn't. First you have SmartGlass, which makes smartphones and tablets into controller peripherals as a first step of integration. Then these peripherals would evolve into an accessory gaming controller paired with the Xbox system. After that, the Xbox system grows from the living room into tablets and smartphones, where games are played cross-platform. Every device does its job and can't replace the other. If people argue that we may hook the phone up to the TV instead of the console, it would be possible, but imagine getting a phone call mid-game. This may increase MS sales if they can build on the groundwork they've laid now. Every product they've made has excellent potential and just needs to be smoothed out... hopefully. Sony has already spoken about integrating different devices into the same ecosystem via Android. I don't think Apple can compete with it unless they use their TV box as a console too.
I believe that Microsoft secured exclusive rights to all EA Sports titles for the XO, at least FIFA, Madden, NBA Live and UFC so far. Then there's COD:G, which is exclusive as well. Whether these are timed exclusives, DLC exclusives, or something else, I don't know.
Sorry to say, but they're not exclusive at all. They might have some exclusive features ('Ignite'?) or early DLC, but the games are confirmed for PS4, too.
What are you smoking? It's news but not surprising that EA is dumping the Wii U, but the PS4? That would be insane. And in fact, such a thing has not happened.
Same with COD:G. All of the ones you have mentioned are cross-platform.
Here is what is exclusive regarding EA Sports titles and CoD Ghosts.
X1 will receive the Ultimate Teams feature exclusively. They haven't mentioned what that is. But the games themselves will release simultaneously on 360, PS3, X1, PS4 (assuming the last two arrive before the sports title is released).
X1 will receive the CoD Ghosts DLC pack first, before it is released on PS4. Similarly to how it has been with every release of COD this gen for the past few years. CoD Ghosts will release simultaneously on 360, PS3, X1 and PS4.
I'm personally waiting to see someone strip Xbox OS from this and stick plain-old Windows on it. This might make a good HTPC without having to deal with any sort of specialized OS
Why bother? You can build a mini-ITX HTPC using an AMD APU right now for less than the XO will likely cost at launch. I'm building a mini-ITX system right now that is thus far not breaking the bank. (To be fair, I got the Core i7-3770K for an unusually low price but I'd have settled for a low-power A10 model for about the same price and lower cost on the motherboard.) In fact, by taking it slow and gathering parts as deals come along, it has totaled remarkably little so far.
I suspect Newegg has my phone bugged. Whenever I mention in a conversation not being able to find a good price on a particular needed item, I seem to get an email within hours with a sale on that item.
As alluded to in the article, only PS4 exclusives are likely to take advantage of the additional processing power. Most developers will probably use the same textures/lighting/etc. on both platforms to lower porting costs so you'd never see an improvement.
I think they were correct to focus more on the Kinect 2.
It's not difficult at all to include different levels of textures and lighting. As we all know, PC game makers have been doing that for years. And these new consoles are nothing but PCs.
Frankly, even if they don't program for it, it means that everything will run just a little smoother on the PS4. I'm now leaning toward the PS4 to replace my 360. If they go the same pay for multiplayer route they did this generation it will cement my decision.
The reality is that games are never fully optimized for any hardware configuration, so even if PS4 users never see higher res textures or higher poly models, having 50% (!!!) more GPU power means they will see smoother framerates with less dips.
I'd take that over some Big Brother contraption in my living room (Kinect) that will be broken into by creepy hackers trying to spy on teenage girls. Or I would, if I were buying a console, which still hasn't been decided (cost/affordability rather than any ideological divides).
Sony's cam is not required to be plugged in for the rest of the console to work. XB1, yes. No Kinect, no console, period.
Sony's cam isn't hooked to an always-on console. I could be offline forever if I want and the console would still work, and it would be impossible to hack if it's not online. If your XB1 is off the net for more than 24 hours, no console, period.
Sony's cam can actually be turned off, and I mean completely off. XB1, no. It's on, even when everything else is off. Just in case you are too lazy to get off the couch and press power on your console, your Kinect is always on to accept your voice command.
Always-online console + always-on cam and mic and no way to shut any of those things off = sooner or later some hackers WILL record you f*cking your wife on that couch and blackmail you for money by threatening to post that video on YouTube.
FYI The new PS Eye (Kinect like Camera) will be included in every PS4 box. This was confirmed in February. Whether it is required to be plugged in like the X1 remains to be seen, but I wouldn't be surprised.
Seriously? LOD, rendering distance, anti-aliasing level, texture filtering level, post-processing level, tessellation, shadows and every other conceivable GFX parameter can be turned up or down on the fly by ANY modern game engine. Even fluids and particles are now fully simulated and not fixed, prerendered objects, so their level of complexity can be easily adjusted up or down.
The PS4 would be displaying GFX on high or very high and the XB1 only on medium. The same identical textures, the same number of polygons, etc... The foundation would remain the same, but the PS4 would display a more complex scene at higher quality. You don't need 2 different sets of games to produce drastically different results; it's all built into the engine already. It's only a matter of processing power, and the PS4 GPU and GDDR5 are just that, more powerful.
The only real inherent limitation would be the number of items/accessories simultaneously on screen (trees, cars, spectators, chairs, cups, books, etc...) in a given scene, which would be fixed by the game developers, and even then they could build some flexibility into it to allow the more powerful system to display more stuff.
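To illustrate the idea (this is a toy sketch, not any real engine's API; all names and numbers are made up):

```python
# Hypothetical per-platform quality presets: identical assets, different knobs.
PRESETS = {
    "platform_a": {"msaa": 2, "shadow_map": 1024, "tessellation": "low",
                   "draw_distance_m": 600, "particles_per_emitter": 256},
    "platform_b": {"msaa": 4, "shadow_map": 2048, "tessellation": "high",
                   "draw_distance_m": 900, "particles_per_emitter": 1024},
}

def settings_for(platform):
    """Return the runtime knobs for a platform; the art assets never change."""
    return PRESETS[platform]

print(settings_for("platform_b"))
```

The geometry and textures shipped on disc stay identical; only the runtime knobs differ per platform.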
I don't think it will be just exclusives per se. If we think about what types of games both systems will get, then we will likely see better performance on the PS4 for games that are graphically intensive.
Think about RTSs for example that tend to have tons of stuff happening on screen or something like Dungeon Defenders.
Skyrim, Dragon Age, Dragon's Dogma, etc. also all have tons of lighting and graphical effects going on at the same time that should see benefits from the difference in RAM.
Anyone who has played Dragon's Dogma on the PS3 and uses a Mystic Knight with 3 great cannons firing all at the same time has seen slowdown happen, and that's because of the lower amount of RAM. This shouldn't be a problem on the PS4.
"Why didn't they just match X?" Because the design of these chips to have them tested, validated, and shipping on time is on the order of years, not weeks.
I hate it when websites get info wrong or interpret it badly: Sony matched shipped numbers, not sales numbers. If you go to VGChartz you will see the Xbox was still up by around a million units last time I checked.
I think when both systems are hovering around the same numbers and are only off by a million, it's fairly negligible in the grand scheme of things. It's impressive for Sony how close they are, considering how colossally Sony screwed up early in this gen and started a year after the 360.
I kind of feel sorry for Sony; it took them that long to get even, and remember they had Asia all to themselves. It also happens to be the biggest continent in the world.
Biggest continent in the world doesn't mean anything. Their purchasing power pales in comparison to the US and Europe.
Their purchasing power is increasing, which is more beneficial for Sony in the long term, but even then it still pales in comparison to what the average North American can spend.
And just how do you expect them to do that? Decisions on what hardware to use were made a lot earlier than Sony's PS4 presentation, meaning that train has already left the station. I'm guessing AMD is mass-producing the hardware by now. Microsoft: "Oh, we saw that Sony is going for a much more powerful architecture and we don't want any of the millions of APUs you've just produced for us!"
If AMD is using Jaguar here, isn't that basically an admission that Bulldozer/Piledriver is junk, at least for gaming/desktop usage? Why don't they use a scaled-up Jaguar in their desktop APUs instead of Piledriver? The only thing Bulldozer/Piledriver seems to be good for is very heavily threaded loads - i.e. servers. Most desktop users are well served by even 4 cores, and it looks like they've already scaled Jaguar to 8. And AMD is getting absolutely killed on the IPC front on the desktop - if Jaguar is a step in the right direction then by all means it should be taken. BD/PD is a sunk cost, it should be written off, or restricted to Opterons only.
Bulldozer/Piledriver needs SOI. Steamroller is not ready yet, and it is not portable outside of GlobalFoundries' gate-first 28nm process. Jaguar is bulk 28nm and gate-last, which can be made by TSMC in large quantities at lower cost per wafer.
All the more reason for AMD to switch to Jaguar in their mass-market CPUs and APUs. I'd be willing to bet money that a 4-core Jaguar clocked up to 3 GHz would handily beat a 4-module ("8-core") Piledriver clocked to 4 GHz. BD/PD is AMD's Netburst, a total FAIL of an architecture that needs to be dropped before it takes the whole company down with it.
Jaguar can't be clocked at 3GHz - 2GHz is closer to the hard limit as far as we currently know. It's clock limited by design, just look at the clock latency of FPU operations. IPC is at best similar to Piledriver (in practice probably a little worse), so in tasks heavily limited by single threaded performance Jaguar will do much worse. Consoles can bear limited single threaded performance to some extent but PCs can't.
It's effectively a low-power optimised Athlon 64 with added bits, so it's not going to scale any higher than Phenom did. That already ran out of steam on the desktop. Bulldozer/Piledriver may not have been the knockout blow AMD needed but they're scaling better than die-shrinking the same architecture yet again would have.
Bobcat/Jaguar is a new architecture specifically designed for low-power usage. It's not the same as the K10 design, though it wouldn't surprise me if they did share some parts. And even just keeping K10 with tweaks and die-shrinks would have worked better on the desktop than the Faildozer series. Phenom II X6 1100T was made on an outdated 45nm process, and still beat the top 32nm Bulldozer in most benchmarks. A die-shrink to 28nm would not only be much cheaper to manufacture per chip than Bulldozer/Piledriver, but would perform better as well. It's only pride and the refusal to admit sunk costs that has kept AMD on their trail of fail.
That's a nice bit of FUD there. K10 had pretty much been pushed as far as it was going to go. Die-shrinking and tweaking it was not going to cut it. AMD needed a new architecture.
Piledriver already handily surpasses K10 in every metric, including single-threaded performance.
In terms of single-threaded performance *per clock*, Thuban > Piledriver. Sure, if you crank up the clock rate *and the heat and power consumption* on Piledriver, you can barely edge out Deneb and Thuban on single-threaded benchmarks. But if you clock them the same, the Thuban uses less power, generates less heat, and performs better. Tom's Hardware once ran a similar test with Netburst vs Pentium M, and the conclusion was quite blunt: the test called into question the P4's "right to exist". The same is true of the Bulldozer/Piledriver line. And I don't buy the argument that K10 is too old to be fixable. Remember that Ivy Bridge and Haswell are part of a line stretching all the way back to the original Pentium Pro. The one time Intel tried a clean break with the past (Netburst) it was an utter fail. The same is true of AMD's excavation equipment line, and for the same reason - IPC is terrible, so the only way to get acceptable performance is to crank up clock rate, power, noise, and thermals.
It's true that K10 is generally more effective per clock, but look at it this way - AMD believed that the third AGU was unnecessary as it was barely used, much like when VLIW4 took over from VLIW5 as the average slot utilisation within a streaming processor was 3.4 at any given time. Put simply, they made trade-offs where it made sense to make them. Additionally, K10 was most likely hampered by its 3-issue front end, but it also lacked a whole load of ISA extensions - SSE4.1 and 4.2 are good examples.
Thuban compares well with the FX-8150 in most cases and favourably so when we're considering lighter workloads. The work done to rectify some of Bulldozer's ills shows that Piledriver is not only about 7% faster per clock, but can clock higher within the same power envelope. AMD was obviously aiming for more performance within a given TDP. The FX-83xx series is out of reach of Thuban in terms of performance.
Oddly, one of your arguments for having a Thuban in the first place was power consumption. The very reason a Thuban isn't clocked as high as the top X4s is to keep power consumption in check. Those six cores perform very admirably against even a 2600K in some circumstances, and generally with Bulldozer and Piledriver you'd look to the FX-8xxx CPUs if comparing with Thuban; however, I expect the FX-6350 will be just enough to edge the 1100T BE in pretty much any area.
The two main issues with the current "excavation equipment line", as you put it, are a lack of single-threaded power, plus the inherent inability to switch between threads more than once per clock - clocking Bulldozer high may offset the latter in some way, but at the expense of power usage. The very idea that Steamroller fixes the latter with some work done to help the former, and that Excavator improves IPC whilst (supposedly) significantly reducing power consumption, should be evidence enough that whilst it started off bad, AMD truly believes it will get better. In any case, how much juice does anybody expect eight cores to use at 4GHz with a shedload of cache? Does anybody remember how hungry Nehalem was, let alone the P4?
I doubt that Jaguar could come anywhere near even a downclocked A10-4600M. The latter has a high-speed dual channel architecture and a 4-issue front end; to be perfectly honest, I think that even with its faults, it would easily beat Jaguar at the same clock speed.
Tacking bits onto K10 is a lost cause. AMD doesn't have the money, and even if it did, Bulldozer isn't actually a bad idea. Give them a chance - how much faster was Phenom II over the original Phenom once AMD worked on the problem for a year?
The previous Athlon had a higher clock speed and the same amount of cache, but Regor crushes it by almost 30% in Far Cry 2. It is 10% faster across the board despite being lower clocked and consuming far less power. Had they continued with Thuban, it is possible they would have continued to squeeze 10% per year out of it as well as reduce power consumption by 15%, which, if you do the math, leaves us with something relatively competitive today. Not to mention they would have saved a LOT of money. They could have easily added AVX or any other extensions to it.
Per clock Thuban > Piledriver, but power consumption favors Piledriver. Compare two chips of similar performance. The PhII 965 is a 125W CPU and the FX4300 is a 95W CPU and they perform similarly with the FX4300 actually beating the PhII by a small margin.
... Lol? You can't simply clock a low-power architecture up to 4GHz. Even if you could, a 4GHz Jaguar-based CPU would still be slower than a 4GHz Piledriver-based one.
Jaguar is a low-power architecture. It's not able (or meant to) compete with full-power CPUs in raw processing power. It's being used in the Xbox One and PS4 for two reasons: power efficiency, and cost. It's not because of its processing power (although it's still a big step up from the CPUs in the 360/PS3).
BD/PD have plenty of viability in big-power-envelope, big/liquid-cooler desktop PC arrangements. Consoles aspire to be much quieter, cooler, and more energy efficient, thus the sensible Jaguar selection. Even the best ITX gaming builds out there are still quite massive and relatively unsightly versus what seems achievable with Jaguar... Now for laptops, on the other hand, a dual-Jaguar 'netbook' could be very, very interesting. You can probably cook your eggs on it, too, but still interesting.
It isn't a step in the right direction in IPC. Piledriver is 40% faster than Jaguar at the same clocks and also clocks higher.
Stop spreading the FUD about Piledriver -- my A8-4500m is a very solid processor with very strong graphics performance and excellent CPU performance for all but the most taxing tasks.
Embedded memory latency is MUCH closer to L1/L2 cache latency than system memory. System memory is Brian and Stewie taking the airline to Vegas vs the Teleporter to Vegas that would be cache/embedded memory...
1) XBOX OS - plays games
2) App OS - runs apps
3) Hypervisor OS - based on the Windows NT kernel, manages the resource split between the two.
Why 3? Why not just one? Likely because splitting them up improves portability of code between the Xbox One and Windows proper (Windows 8 and its future versions).
The reason, AFAIK, is that there are fundamental design issues with x86 that would allow third parties, if given access to a direct OS / bare-metal switch, the ability to hack the console in trivial amounts of time. XBMC, and the hacking that enabled it, is the primary reason why Microsoft went to PowerPC for the 360, so it only makes sense that, coming back to x86 for the One, they need a "supervisor" of some sort that prevents exploits on the bare-metal x86 side from being carried back over to the "full-fat" OS side.
@Anand ~ In regard to x86 and backward game compatibility... Would all Xbox Live game content currently available be incompatible with the Xbox One? I'm mainly talking about the downloadable stuff, i.e. Castle Crashers or the TMNT arcade game.
Microsoft confirmed yesterday that there is no backwards compatibility with X360 software. That includes both disc and downloadable.
Sad really, and a missed opportunity, IMO. Since Sony was transitioning from Cell architecture to X86, and had already announced no backwards compatibility, Microsoft could have gained a significant strategic advantage by integrating backwards compatibility, even at a slight cost premium.
Even if the chips were free, you'd still need to stick them inside the console and provide cooling for them, making the system even bigger. It's a major pain and not worth the cost.
Chalk it up to time delays. It sucks that there's no BC out of the box, but I'm hopeful with MS having a straight up hypervisor (VM) in the box, that we'll see it down the road.
Curious, is the door open for a "Steam-like" app for the Xbox One? Whereby the app handles the "driver" issues related to running legacy games, etc. I think Anand mentioned this concept in one of the podcasts. We have Windows OS...
Two things are stalling PC development: 1) No one is making an operating system that requires more power. Win 8 has the same basic system requirements as Vista, which is 6 years old. In that time, compute power has doubled three times, so a PC could be 2x2x2 = 8 times as powerful... but no one is pushing the boundaries. Think about it: some are still using Crysis to validate hardware! (Crysis is as old as Vista... imagine that!)
2) Game developers design to the lowest common denominator (consoles). While PC compute power has doubled three times, consoles have been stagnant. The Xbox 360 is 8 years old. Again, about as old as Vista!
We need a shake-up. We need someone to stand up and make an operating system and software that uses what we have: a PC system that is 8 times as powerful as anything the market currently demands. Give us that... and no one will be talking about the demise of the PC anymore.
I've heard the complaints of how stagnated visuals have been and I'm sick of it! Sure graphics haven't advanced as much as we thought, but look how far we've come with animation and creating extremely fluid sprites on screen. I would easily take the faces and animations from Halo 4 (console game) over Metro: Last Light because of how well animated and human the characters look in Halo 4. The textures are so much more complex in Metro, but lack compelling animation of facial features.
I believe with this new console generation we will see awesome visual increases across the board with more PC games on the way.
Games have been on an increasing visual detail trend, which I enjoy. But... think about the visuals in a basic PC, tablet or phone device. They are not just stagnated... they are trending backwards. (And that goes for Apple, Google, and Microsoft)
If anyone else remembers the days of the original Windows, when Microsoft battled it out with Amiga and Commodore, you remember that each company had its own GUI. They were all blocky and 8-bit. And "Metro" in Win 8 reminds me of that era. Why do I have an interface that looks like it was designed in 1983? That's 30 years old!
When I start up my PC, I should be greeted with stunning visuals, real time updates of weather and news in novel graphic ways, and a file system that is fun and intuitive in a graphically artistic fashion.
Yes, I know the article was about consoles... forgive me... I'm rattling on about PCs. Carry on then.
The PC is held back because of the complexity of designing games for an infinite combination of hardware platforms and OSes. Yeah, you can make a game with great visuals, but from a profitability standpoint there is no way to measure what percentage of your gamers can actually benefit. You could put a lot of money into a game that only a handful of people can enjoy. It's simply risky on the PC side to spend a lot of R&D/game development dollars.
The console provides stability and predictability on the hardware side. On day 1, you know the install base is X million users for the Xbox or PS3. And everyone has the same hardware.
If it is so obvious, then why isn't AMD doing it? Why would they instead opt for a PS4-style design for their own next-gen APU? Either way it is still an epic fail. They're either throwing their competitor a huge bone, or throwing themselves on the floor. Take your pick.
Well, between the two, Sony has nailed the 3D-game end of the console business this go-around, IMO. Intel, of course, has nothing powerful enough in the IGP department to garner this business, so it's no wonder both companies selected essentially the same architecture from AMD.

The CoD snippet run at the end of yesterday's demonstration, announced as running in real time on an Xbox One, was extremely telling, I thought. First, it did not appear to me to be running @ 1080p--but possibly @ 480p: the on-screen imagery was definitely low-res, exhibited noticeable pixel aliasing (I was surprised to see it), and seemed to generate a good deal of pixellation that was very noticeable in the scene transitions. It also looked like nobody wanted to show off XB1 rendering in longer scenes where you could really see the frame rate and get a solid feel for the performance of the game--the whole demo for CoD consisted of one rapid scene transition after another. The rendering problems I observed could all have been caused by a streaming bottleneck--or else by the limits of the hardware (I *hope* it was the streaming, because if not then I think Microsoft is going to have some problems with this design). It was easy to see why the CoD real-time demo was saved for last and was so very brief...;)
But, now that consoles are going x86, there's no earthly reason why either Microsoft or Sony could not update the hardware every couple of years or so when new tech hits the price/performance marks they require. Since we're talking x86, there would never be a question of backwards compatibility for their games as it would always be 100%. I think the days of 8-10 year frozen console designs are over. I think that's great news for console customers.
However, depending on whether Sony handles it correctly, the PS4 could walk away with practically everything as Microsoft is building in some fairly heavy DRM restrictions that involve the basic operation of the device--"storage in the cloud," etc. Involuntary storage, it would appear. If Sony comes out with a gaming console that is not only more capable in terms of the standard hardware, but one which is also customer-friendly in that it allows the customer to control his software environment--I think Sony will walk away with it. The people who will wind up buying the xb1 will be the people who aren't buying it as a game console. To be honest, though, set-top boxes are as common as dirt these days, etc. It should be very interesting to watch as this all shakes out...It's great, though--we've got some competition! (I'm not a console customer, but this is always fun to watch!)
Most of that first paragraph seems like rubbish to me. Sony made a game console; Microsoft made an all-in-one media device. It was well known before the announcement that Microsoft would be showing very little in the way of games yesterday, and that they were saving that for E3. 360 games already render at higher than 480p.
So you are saying you saw artifacts in a demo through a live stream? Tell me you are joking...
As for Sony/Microsoft upgrading console hardware during the current generation, I mean anything's possible, but they would be leaving a lot of customers behind on older hardware. Developers would have to make sacrifices in framerate or quality to achieve compatibility. This places a lot of demands on game developers for testing more environments. Additionally, there's nothing about x86 which makes this upgrade more achievable than on PowerPC architecture. They could have released upgraded consoles if they saw a benefit.
I like the 8 year or longer console cycle. It means that I can focus on enjoying games more than upgrading every couple of years to the latest and greatest that isn't really any more fun to play, just has more eye candy.
"We already have the Windows kernel running on phones, tablets, PCs and the Xbox, now we just need the Xbox OS across all platforms as well." That, 100 infinity BAGILLION times that!
I'd actually like to see Nintendo release a console in time for Christmas 2014 with comparable hardware performance, just because otherwise I don't see how that company will survive, and I really don't want Nintendo to go away. I don't know if that's within their realm of possibility, but they need to do something because the Wii U is pretty terrible.
But they won't; as laughable as Wii U sales are, that would still anger however many people bought one, likely their core base. They'll survive anyway - see their cash reserves - plus any platform that Mario, Zelda, et al. come to will be fine. Nintendo survives on first party; it always has.
Seriously? Have you even tried the GamePad for an extended period of time? The thing is incredible and very useful. Also, the Wii U CPU/GPU is very customized and tailored for gaming. It's smart that Nintendo didn't release tech specs, because most everybody wouldn't understand how it would perform in real time. Custom silicon is a magnificent thing. Heck, look at the 4-core CPU and 12-core GPU of the Tegra 3 and pit that against the paltry-looking dual-core CPU/GPU of the Apple A5: you wouldn't have any competition, right (on paper at least)? And who would have thought that the A5, with a fourth the cores and a much slower clock speed, delivered about twice the game performance of the "superior" Tegra 3.
A great thing MS could do is find a way to put Windows Phone 8 inside of the Nvidia Shield - and then have the option to stream your game from the Xbox One to the Shield.
That would be awesome: family could be watching TV in the living room and you could have high-quality gaming anywhere - even if it would not be possible to play on the console AND the Shield at the same time, of course.
Streaming games from the X1 to the Shield (full control scheme) or any WP8 phone/tablet (simpler games) would be that killer app that MS needs so badly to boost its phone business.
Interesting read, and I'm excited that GCN is being used, but on the CPU side of things I have to wonder how it will actually perform. Bobcat was fairly weak (think of a Pentium 4) and was terribly starved of memory bandwidth, but the worst part was that the L2 cache only worked at half the core clock. If the Jaguar cores still have the same sluggish L2 cache then even 8 of them is going to be painful, but I suppose devs are going to offload what they can onto the GPU (OpenCL).
As for the 32 MB of on-die memory, as stated in the article it all comes down to how it is used. If used for the frame buffer it will limit resolution but make for a huge fps boost, as the ROPs and TMUs are bandwidth hogs GPU-wise, while leaving the rest of the bandwidth for the CPU and shaders. The CPU, being as weak as it is, won't need much provided the GPU doesn't hog too much of the bandwidth. If used as a cache it will make up for the weak L2 cache and provide a unified pool for all 8 cores; if it's software-managed only, then we might have to wait to find out what it's for.
Overall this is good news for the PC, no more games like GTA4 :D
"The funny thing about game consoles is that it’s usually the lowest common denominator that determines the bulk of the experience across all platforms."
This is the key point that Microsoft have realised. I bet the developers too told both MS and Sony not to bother going crazy as they will develop to the minimum standard.
This is not the age of the games console anymore. It's the age of the media/entertainment center.
When I got my 360 back in 2006 it was mainly used for games. Now, seven years, a whole lot of bandwidth upgrades and a media explosion later, I use it mainly for... well... media. Gaming takes very much a backseat on my 360. It's a very convenient media portal that happens to play games as well.
The world has moved on, and I can imagine that gamers might feel left behind, but there is a whole load more out there to occupy people's time than there was in 2005. Microsoft has to capture that.
Anand, in terms of the product positioning, I agree with your assessment, but I also think Microsoft would be better off creating a disruptive (rather than sustaining) product. It'll be even better if they launch one in parallel with the Xbox One. It will surely cannibalize sales, but that's the price for solving the innovator's dilemma. Moreover, it's not Sony or Nintendo that MSFT should be very afraid of, but rather Apple and Google. Apple will surely enter the market from the low end of the value chain. More details here -- https://www.facebook.com/notes/itvale-the-blog/xbo... , would love to hear your thoughts.
Both were built to work alongside each other. If it were the former, you'd expect to run Office on it (an HTPC is still a PC), and if it were the latter you'd need to quit a game before running any media apps (since the game would demand all the system RAM available).
As such, it really is neither of the scenarios you presented.
I think most of the hardware choices are designed with the display device in mind. There is a huge market of 1080p devices and the price point for those is well established. Most households now have one or will have one. Locking in hardware and performance at a set resolution is good for console costs and game developers. (it does worry me a bit whether advances in PC display technology will equate to higher graphics displays in PC games if the rest of the market is set at 1080p... why develop higher res models etc.) Sony going for a bit higher graphics performance could be an advantage someday if display technology changes to utilize the headroom but Microsoft has solid hardware for their target resolution.
The hypervisor approach is particularly interesting to me, as it might be a window into the future of where MS may take OS development. Virtual machines optimized for particular tasks can give you faster spreadsheets and higher game fps on the same box by selecting which OS module is running. Is there a plan somewhere to put Office 365 on the Xbox One? Microsoft would like nothing better than to be selling software suites that use MS cloud services across multiple platforms to each and every one of us.
More power can be advantageous at the same resolution... My Radeon X1650 could run games at 1920x1080; that doesn't mean it could do everything a GTX 680 can at the same res.
A visual comparison between the games demonstrated in the PS4 and XB1 presentations clearly gives the graphical edge to the PS4. PS4 games look distinctly next-gen, and are approaching CGI in fidelity. These are early days, and this comparison is hardly scientific, but it seems to corroborate the stronger, easier-to-develop-for hardware in the PS4.
But I think the underestimated feature for the PS4 is the 'share' button on the controller. Game spectating is a big deal, and this gives the PS4 a fan-based advertising engine. Due to the simplicity of sharing video, expect a flood of high-quality PS4 videos to be uploaded to the web, making the PS4 and PS4 games much more visible online. This turns regular players into advertisers for the system which should significantly help its popularity with cool 'look what I did' videos, walkthroughs, and competitions.
I am also very interested to see how Sony uses the second, low-power, always-on processor in the PS4. Certainly it would be possible to include voice-commands ala XB1, but I think that this can open up interesting new uses to keep the system competitive over the coming years.
I think you're reading too much into presentations that could very well have been 100% pre-processed CGI. I expect that the final games will look quite similar on both.
You might want to check the Xbox One presentation - one of the things they mention is that gameplay sharing is easy for developers to include because of all the connections to Azure cloud computing. So that just leaves the share button, which is actually horrible compared to Kinect, which will be on every Xbox One. Instead of hitting a button in the middle of your controller and losing your momentum in the game, on the Xbox One you should just be able to yell, "Xbox, start recording gameplay."
I think the PS4 will share to more people. I expect the Xbox One sharing to be either Xbox Live only or the MS universe only. I think Sony's sharing won't be as limited. At least that is the impression I got from the presentations.
If these new boxes are more media- than gaming-oriented going forward, it could mean far shorter life cycles for them. We could be going to a 3-4 year cycle rather than the current 8-year trend.
"The day Microsoft treats Xbox as a platform and not a console is the day that Apple and Google have a much more formidable competitor."
I'd say the reverse. The day that Apple and Google decide to become competitors to Xbox is the day that Xbox (and Playstation) go extinct. Right now, MS and Sony are getting by because the HDTV efforts by Apple and Google are "experiments" and not taken seriously. Imagine an AppleTV where Apple allows app installations and a GoogleTV that's focused on gaming with decent hardware.
And imagine how little that GoogleTV (for games) would cost. Imagine it opens up Android and, just like that, bajillions of apps descend upon it.
Hell, it's debatable whether they even need to bother making more than a streaming device that receives the image from your tablet and/or smartphone to do just that. Really, all Google needs is an AppleTV-like AirPlay connection. You can already plug in whatever USB/Bluetooth controller you like.
Within a few generations of Google taking HDTV gaming seriously, they could walk all over Sony and MS because while consoles sit and languish for longer and longer periods of time, tablets are constantly evolving year after year, iterating upward in specs at an impressive rate.
How long before even the Xbox One isn't pushing out graphics far enough ahead of a Nexus tablet that people just go with the $100-$200 tablet with the free to $1 games instead?
No one will make a $1 game with the visuals of CoD, BF2, Halo - the list goes on. They would make zero money.
Google taking HDTV gaming seriously? They make all their money on ads; you honestly think people constantly want ads in a video game? And not product placement... ads. "Before you matchmake, just watch this 30-second video about Vagisil"... yeah, right...
Also, what is a few generations? A few is more than 2; 3 generations ago we were at the PS1, 14 years ago.
You're telling me that it's going to take 19 years for a tablet to have the graphics of today's Xbox One? By that time, what the hell will the PS5 or the X5 have...
The biggest thing the X1 has going for it, that everyone is forgetting... cloud/Azure.
This is huge, so huge that time will show just how little the X1 will need to compute locally in multiplayer games.
I think you are putting way too much stake in the cloud especially when we are talking about computing anything graphics or otherwise. People can barely download music on a steady connection right now. Consoles can't even get you solid updates in a timely manner and you are talking about offloading real work over the internet?
After reading a lot of articles about these two consoles and their SoCs, there are some things we can extrapolate from this info.
Both systems are based on the same 8-core x86 AMD64 CPU, which means the main logic and memory controllers in the APUs are the exact same. The comment about the PS4 being married to GDDR5 may not be true; as we all know, GPUs can also run on DDR3, plus it may be possible that the CPU memory controller is capable of running either GDDR5 or DDR3 in either system.
Both systems are using a 256-bit memory bus. Since these are x86 AMD CPUs, that likely points to Jaguar using a quad-channel memory controller (64+64+64+64 = 256), which could be good news when they hit the desktop, if they retain said quad-channel controller. It would be nice to see that in AMD's mainstream chips as well.
Going with eSRAM is an odd choice. I would have thought capacity would have been more important than absolute latency. By merit of being on-die, eDRAM would also have lower latency than external DDR3. If they had chosen eDRAM, they could have had 128 MB on die. That is enough for three 32-bit, 4K-resolution buffers. In such a case, I'd have that 128 MB of eDRAM directly accessible and not a cache. Sure, software would need to be aware of the two different memory pools for optimizations, but most of that would be handled by API calls (i.e., a DirectX function call would set up a frame buffer in the eDRAM for the programmer).
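A quick back-of-the-envelope check (assuming plain uncompressed 32-bit colour targets and ignoring Z/MSAA; the function is just illustrative) backs up that capacity math:

```python
# Uncompressed 32-bit render target sizes vs the embedded memory pools.
def buffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

fb_1080p = buffer_mib(1920, 1080)   # ~7.9 MiB
fb_4k    = buffer_mib(3840, 2160)   # ~31.6 MiB

print(f"1080p buffer: {fb_1080p:.1f} MiB -> the 32 MB eSRAM holds a few of these")
print(f"4K buffer:    {fb_4k:.1f} MiB -> 3 x 4K = {3 * fb_4k:.0f} MiB, fits in 128 MB eDRAM")
```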
The bandwidth figures for the eSRAM seem to be a bit on the low side too. The Xbox 360 had 256 GB/s of bandwidth between the ROPs and eDRAM. With high clock speeds and a wider bus, I would have thought the Xbox One had ~820 GB/s bandwidth there.
I'm also puzzled by MS using DDR3 for main memory. While lower power than GDDR5, for a console plugged into a wall the bandwidth benefits would outweigh the power savings in my judgement. There is also another option: DDR4. Going for a 3.2 GHz effective clock on DDR4 should be feasible as long as MS could get a manufacturer to start producing those chips this year. (DDR4 is ready for manufacture now, but they're holding off volume production until a CPU with an on-die DDR4 memory controller becomes available.) With 3.2 GHz DDR4, bandwidth would move to 102.4 GB/s. Still less than what the PS4 has, but not drastically so. By the end of the Xbox One's life span, I'd see DDR4 being cheaper to acquire than DDR3.
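Those peak figures fall straight out of transfer rate times bus width; a rough sketch (theoretical peaks only, sustained bandwidth will of course be lower):

```python
# Theoretical peak bandwidth = transfers/s * bus width in bytes.
def peak_gbps(data_rate_mtps, bus_bits):
    return data_rate_mtps * 1e6 * (bus_bits / 8) / 1e9

print(f"DDR3-2133, 256-bit:  {peak_gbps(2133, 256):.1f} GB/s")   # ~68.3
print(f"DDR4-3200, 256-bit:  {peak_gbps(3200, 256):.1f} GB/s")   # 102.4
print(f"GDDR5-5500, 256-bit: {peak_gbps(5500, 256):.1f} GB/s")   # 176.0
```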
As far as the Xbox One's AV capabilities, I'd personally have released two consoles: one with the basic HDMI switching and another with CableCARD + tuner + DVR. And for good measure, the model with CableCARD + tuner + DVR would also have an Xbox 360 SoC to provide backwards compatibility and run the DVR software while the x86 CPUs handle gaming and the basic apps. If MS is going to go in this direction, might as well go all the way.
Good to see 802.11n and dual band support. With HDMI in and out, I'd have also included HDMI+Ethernet support there as well. Basically the Xbox One would have a nice embedded router between the Gigabit Ethernet port, the two HDMI ports and the 802.11n wireless.
Remember though that the DDR3 in the Xbox will be hardwired directly, with no legacy or other PC-related stuff getting in the way. This will be optimised DDR3, not working exactly how it's standardised in our PCs.
The only advantage DDR3 in the Xbox One has over a PC is that it is soldered. This allows for marginally better signaling without the edge connector of a DIMM.
That was surprisingly fair, considering a lot of what I've seen since yesterday. Sony tried hard to do what it thought would "improve the gaming experience" and ended up with a lot of social integration and considerably more aggressive hardware. Microsoft didn't really add much to actually playing games (though they do have some cloud-based stuff including game streaming) but has made a play for becoming a full living room experience, with games, live and recorded television, no hassle cable integration, and seemingly several exclusive partnerships. I'm not convinced that core gamers will see much use for those options (though most of the people I know in that group were already PC-focused if not exclusive) or the social things with the PS4, but the raw power would be a nice draw, assuming Sony doesn't accidentally pull a 360 and overheat with any noteworthy extended use.
Of course, if the rumors of Microsoft pushing developers toward always-online DRM, including on-console mandatory check-ins every 24 hours, fees for pre-owned or shared games, forced hard drive installs, etc. all pan out, a lot of people are going to boycott on principle even if they don't buy used games and have great internet.
I fall into that category of not buying used games and having decent internet (but capped - damn you, Canadian duopoly!!), but I definitely won't be picking up the X1 if this holds (at least early on).
Additionally, I hate paying for XBL and have no intention of doing it going forward, hopefully Sony doesn't follow this route and maintains PS+ as value added and not a requirement for playing games online.
I hope it's just weak marketing, but what worries me is that the non-gaming extras don't sound all that new or interesting. A compelling, if unlikely, possibility would be to sell an upgrade that sticks Pro-compatible Windows 8 on the non-XBox side. MS is already selling a portable computer; why not sell a desktop/HTPC, too?
If this was Top Gear, Jeremy Clarkson would be busy saying unpleasant things about the Xbox One right now.
"MORE SPEED and POWER!!!"
Oddly enough, I dreamt the night before the Xbox One launch that the new Xbox had 16 GB of DDR4 RAM and a shader count equivalent to Tahiti @ 1 GHz. I hoped that Microsoft, with their virtually bottomless pockets, could somehow improve on the leaked Durango and Orbis specs, which didn't bode too well for Durango. I mean, Sony doubled the RAM from 4 to 8 GB. And it's GDDR5 to boot!
Don't think anyone has mentioned this yet, BUT another reason why MS can have slightly lower specs than the PS4: XBO and PS4 are close enough that cross platform games are likely going to target the weaker platform. Sony will be spending money on more powerful hardware that will only be utilized in first party/exclusive games. Side by side comparisons will (in most cases) show minimal - if any - differences.
That always-on Kinect thing is really creeping the hell out of me.
This THING is always watching. Always. Unblinking. And it can see you in the dark. It can hear you. It can even measure your heart rate just by looking at you! It has a huge HAL9000 eye staring at you.
One thing I think the article failed to point out is that even though there is a 33% difference in GPU hardware, there is NOT a 33% difference in performance. As the GPU grows in parts, the gains go up logarithmically, not linearly like this article suggests. I'd say there is likely less than a 10% performance gain from the PS4 part. Yet it will use almost twice the power and be much hotter, requiring louder cooling. Yuck! Add to that the performance of the Windows kernel for processing, something Sony could never match, and I'd bet that they are probably almost equal in the end.
I believe I should have said "reverse logarithmically" meaning more parts equal less gain. I'm sure you guys get the point. Sony is betting on tech specs and market power rather than real processing power.
In the absence of another bottleneck like memory bandwidth, adding shader cores within a GPU is actually a pretty good indicator of performance. And it's not just 12 vs 18 CUs; the PS4 also has double the ROPs and 176 GB/s for its entire pool of memory.
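A rough sketch of the peak math (assuming GCN's 64 ALUs per CU, an FMA counted as two ops, and the widely reported ~800 MHz GPU clocks at announcement):

```python
# Theoretical peak single-precision throughput for a GCN-style GPU.
def peak_tflops(compute_units, clock_ghz, alus_per_cu=64, ops_per_alu=2):
    return compute_units * alus_per_cu * ops_per_alu * clock_ghz / 1000

print(f"Xbox One (12 CUs @ 0.8 GHz): {peak_tflops(12, 0.8):.2f} TFLOPS")  # ~1.23
print(f"PS4      (18 CUs @ 0.8 GHz): {peak_tflops(18, 0.8):.2f} TFLOPS")  # ~1.84
```

Peak figures aren't delivered performance, but a 50% gap in raw shader throughput plus double the ROPs is hard to hide whenever memory bandwidth isn't the limiter.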
You have all processes fighting for attention on higher-latency memory.
There is a reason PCs are still using DDR3, and it's not just because of memory controllers; GDDR5 has been in use for many, many years.
Heck, with AMD producing the memory controllers for their own video cards, one could simply conclude: just dump a GDDR5 controller on the APU and have them go GDDR5 on the desktop.
But cost is not the only factor; bandwidth only goes so far.
The X1 will be saving costs and reaping the benefits with eSRAM; Sony's bandwidth advantage starts to go out the window.
Throw in cloud computing, and it becomes even more moot to have GDDR5.
Given how similar the X1 and PS4 are, is there any chance developers would be able to ship a single disc with support for both platforms? Not that they would... just wondering if it's technically possible to have shared assets/textures etc. and separate binaries on a single disc that could be read by both machines.
Can I turn it into a Windows computer? That would be a selling point.
Can I turn it into a DVR? Selling point.
I would question not the choices here, but the underlying principle of wanting to check stats etc on your tv in a side bar using Kinect, game controller or Smartglass instead of just using your phone or tablet directly.
I would question the attraction of this to someone not interested in games. Google TV hasn't made a case for a box-on-top-of-your-TV product, and it is hundreds of dollars cheaper than the nextbox will be. It doesn't seem like this market will open up until, at the least, you don't need that second box.
I suppose, though, that this stuff is a value-add to convince mom or dad, or someone on the fence about their gaming interest, to buy it. Or someone only interested in one or two franchises.
I think the scenario arises for having ESPN or a news site pinned to the side when you have multiple people in the room watching. A tablet would provide the same info, but that's a personal experience. If you have the guys over for a playoff game while another important game is going on at the same time, instead of having everyone individually looking down at their phones/tablets or switching channels, you can have one game on full time and the other's box score pinned to the side, so everyone can see everything without having to look away from the screen. Or have a news or Twitter feed going on the side, which, depending on the circumstances, could be really interesting.
The example they showed of buying tickets for a movie while watching a movie was a stupid one; that's entirely a personal experience and can be done on a phone or tablet anyway, especially since everything on the TV had to be manipulated by a phone to begin with.
I honestly find this really compelling and potentially awesome, but all the gamer (or anti-gamer) things they've mentioned so far as well as the XBL gold still being required for playing online are really dissuading me from thinking about getting one any time soon.
In the introduction: "This last round was much longer that it ever should have been, so the Xbox One arrives to a very welcoming crowd." Change that "that" to "than".
All pretty much expected on the hardware side of things. And I like it as well, as a non-console gamer (I bought an Xbox 360 for the Kinect to introduce my wife and larger family to gaming... it's mostly left unused, since our living room is on the small side and we need to rearrange a lot to play). The current high end PC should be able to run all the games developed for the consoles very well, which is good. There is less chance of getting rubbish ports since all 3 consoles are so similar. All positives for me as a PC gamer. For the software/entertainment side, that doesn't interest me in the least. The Xbox could never be my media hub because I have an A/V receiver for that, as I expect many people do. And all the software stuff, streaming etc. I have my doubts about everything trickling down to the German market as they envision it. Plus, I'm not a TV guy or a rent guy. I like to own my stuff as much as possible.
It seems to me that the Microsoft approach is superior to the Sony one if the One + Kinect is priced competitively, or at not too big a premium over the PS4.
Sony has a history of hacker-friendly internet platforms, and it is often in denial. Would I trust the PlayStation Network? No way.
If consoles become the center of the TV/entertainment setup, if people really use the console as a Skype terminal or, as they do today, use the Kinect to exercise, then I think the Microsoft offering is the more attractive one, and its features will interest buyers more than knowing that one console has more shader units than the other.
In addition, it would not be difficult for Microsoft to extend the platform with BYOD features such as SkyDrive access, Office 365, and so forth.
This annoys the hell out of me. Who cares about power? I want performance. I want my console or PC so power hungry it trips a breaker. It's obvious everyone wants a Kinect instead of a better GPU, right? TV on Xbox is useless. So I don't have to hit the input button on my remote? Seriously? It's not like you get the service from MS; you have to pay for Xbox Live Gold and be a paying cable customer. It offers absolutely nothing. I have two 360s, but I have no plans to buy this garbage cable box; I already have a DVR. I'll take that $500+ and buy a 27" 1440p monitor and another video card for SLI. MS already proved to me over the past 3 years that they don't care about core gamers anymore, and this just reaffirms it. I hope this console fails miserably.
Because everyone wants a PS4 EyeToy in their PS4, right? Instead of a better GPU, right?
Oh wait... the PS4 did the same thing.
Everyone wants a camera with their laptop instead of a better CPU/GPU, right? Oh wait...
In my household, finding the remote is an issue. Not because we forget where it is, but with 8 people in a home, each one grabbing it at different times, it becomes an issue.
You have a DVR? Great, the Xbox is not designed to replace it; in fact, it's already been stated that it works alongside any cable box.
Why do people pay for TiVo? There is a service to be had, and it's much better IMO than Comcast's DVR or the offering from DirecTV. I was skeptical about TiVo till my damn ex got me hooked on it.
Not to mention how many times I have to tell my parents the channels of anything other than local news. Now they just have to say... HGTV, Animal Planet, History. They don't live with me, but they will be getting one. This is also a plus for me, as my kids don't have to worry about memorizing channel numbers, even though they currently do.
Your mentality is what the losers of the tablet wars are facing...
Why would people want a tablet? I have a laptop, PC, TV, iPod... the list goes on...
Convenience and ease. This is what the X1 is offering, along with cloud computing, dedicated servers, and games, games, games, games.
Really? Sony hasn't announced that the PS Eye is going to be bundled with all PS4s. In fact, there are links to Sony videos that say "PlayStation 4 camera may be required and is sold separately." Even if it is bundled, the PS4 has a 30% faster GPU than the Xbox, so MS should have spent that Kinect money on making the GPU more competitive. The 360 already showed Kinect was a failure for gaming. Unless you are a girl who wants to play Dance Central, there is no reason to get a Kinect. This is why bundling the Kinect boggles my mind. It shows me MS either doesn't get it or doesn't care whether the Xbox is a gaming console.
The best defense of Xbox TV you can come up with is that you constantly lose your remote? My remote is always in the room with the TV. Not very hard to find.
I know it's not going to replace my DVR. That doesn't change the fact that I see zero benefit to using the Xbox while still having to pay for my DVR. I forgot, you can use voice commands and wave your hands around like an idiot. No thanks, I'll pass and stick with my remote.
"Games games games games"? I must have missed that part of the conference. When they announce their exclusives at E3, expect them to mostly be Kinect and XBLA games, not the non-Kinect AAA games everyone wants. Just like they have over the last 3 years.
I have an iPad and never use it. The only thing it's good for is quickly checking email or watching a YouTube video. Beyond that, my laptop or desktop is 10x faster and easier to use. People are buying tablets because it's the cool thing to do and everyone already has multiple PCs and laptops. Once everyone has a tablet, sales will slow down.
It appears MS has dumped the WMC extender abilities. It's not a surprise, but I don't give two hoots about the DVR overlay. With my HTPC and extenders I save a lot of money not having a DVR. MS wants me to go back to a DVR, and I won't. Given all else it looks like PS4 for me with continued 360 gaming (which MS said will also get updated in some sort of way, won't know till E3 what that means). But pricing and other factors will be part of that equation too.
I'm surprised you hooked up an HTPC, considering you have a hard time understanding that it works alongside your cable box/DVR via infrared. Meaning when you tell it to change the channel, it will do so at your cable box/DVR/HTPC. I can understand how something so simple can be confusing...
Wow, can't wait till I'm all growed up like you! Then I too can be a jackass.
I watched the MS presentation live, and have followed many of the MS follow-up discussions about Xbox One. They've mentioned working with cable companies like Comcast for TV overlay, but oddly enough they haven't mentioned working with HTPC's running their own Windows software. Then again maybe they too have a hard time figuring all this out. Please post your phone number or email so MS can contact you and you can set them straight.
They've already showed the next gen PS Eye and it was announced back in February that it would be shipping with every PS4 (just like Kinect) ... Sony needs better marketers lol.
All of you who are trying to use the "Sony will ship with the PS Eye" thing are missing a very big point.
The PS Eye IS NOT required for ANY functionality. That's a big difference there, big man.
I personally don't give a damn about the Eye or Kinect; I didn't use either of them last gen and I won't use either now. If you are going to try to make an argument, you should state ALL the facts, not just the ones that you think will validate it.
Memory bandwidth is going to be a huge issue, no? I mean, we've all seen the benchmarks where you take an AMD APU-based x86 PC, change the speed of the RAM, and see a 30-40% difference in FPS, and that's on a PC with all its glorious overhead. In games the GPU is the deciding factor, and probably still will be for some time; not to mention the PS4 will likely have a smaller RAM reservation for the OS to boot.
What this is going to mean this gen, IMO, is that Sony's first-party titles will probably look better, and third-party games, the 'ports' if you will, will be the reverse of last generation. I remember one of the main reasons I got a 360 was that it was the better console for third-party titles: they ran better, with less texture popping and fewer FPS dips and tearing than on the PS3 at the time. It will likely be the reverse this generation, seeing as games should be easier to get running optimally on the PS4, simply because it has more GPU. 50% more shaders is nothing to sneeze at; to put it in PC performance terms, it's probably about the relative difference between a 660 Ti and a 680.
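One way to make the bandwidth argument concrete is to look at the budget per frame. The figures in this quick sketch are the commonly cited peaks (68 GB/s for the Xbox One's DDR3, 176 GB/s for the PS4's GDDR5); real sustained bandwidth is lower on both machines, and the eSRAM is deliberately left out, so treat it as an illustration only.

```python
# Rough per-frame bandwidth budget from the commonly cited peak figures.
# Sustained bandwidth is lower than peak on both consoles, and the Xbox
# One's 32 MB eSRAM is intentionally excluded from this simple comparison.

PEAK_GBPS = {"Xbox One DDR3": 68.0, "PS4 GDDR5": 176.0}

for fps in (30, 60):
    for name, gbps in PEAK_GBPS.items():
        print(f"{name}: ~{gbps / fps:.2f} GB of traffic per frame at {fps} fps")
```

At 60 fps that works out to roughly 1.1 GB of main-memory traffic per frame on the Xbox One versus roughly 2.9 GB on the PS4, which is the gap the eSRAM is meant to paper over.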
You might want to revisit how the internals will work.
GDDR5 is great at bandwidth, something a video card needs because it's moving large amounts of predictable data; latency is not a real issue there. DDR3 is used in PCs because it's cheaper, but also because it has lower latency in general.
Offload one of the main things a GPU spends bandwidth on, the framebuffer, and GDDR5 becomes an expensive memory module.
Don't forget, the X1 is carrying DDR3-2133; what you are talking about is people going from DDR3-1333 to 1800 or 2133. Good thing the X1 already has 2133.
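To put the "framebuffer in eSRAM" idea in perspective, here is a rough sizing of typical 1080p render targets against the 32 MB buffer. The byte-per-pixel choices are common formats, not anything mandated by either console, so this is only an illustration of why 32 MB is enough for a traditional pipeline but tight for fat deferred-rendering G-buffers.

```python
# Rough render-target sizing against the Xbox One's 32 MB of eSRAM.
# Bytes-per-pixel values are typical format choices, not console requirements.

WIDTH, HEIGHT = 1920, 1080
ESRAM_MB = 32

def target_mb(bytes_per_pixel):
    return WIDTH * HEIGHT * bytes_per_pixel / (1024 * 1024)

color_rgba8 = target_mb(4)    # ~7.9 MB
depth_24_8 = target_mb(4)     # ~7.9 MB
hdr_rgba16f = target_mb(8)    # ~15.8 MB

print(f"1080p RGBA8 color target:  {color_rgba8:.1f} MB")
print(f"1080p depth/stencil:       {depth_24_8:.1f} MB")
print(f"1080p RGBA16F HDR target:  {hdr_rgba16f:.1f} MB")
print(f"eSRAM left after color+depth: {ESRAM_MB - color_rgba8 - depth_24_8:.1f} MB")
```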
Sony is the winner here. The architecture is the same, so developers can easily tune some numbers (higher quality shaders, more particles, higher FPS, higher resolution etc) to get noticeably better results on the PS4. I would prefer PS4 as my next gaming device. Hopefully Sony will not screw up developer tools.
I prefer the Xbox One too! I've enjoyed playing Xbox 360 games and have even read many articles about them from the Aneesoft Xbox 360 column. Now I will choose the Xbox as my favorite game console.
The GPU is the most important factor in choosing a console, and the PS4 holds the advantage here. Unless they change the Xbox One's GPU to something similar to the PS4's, I will not opt for it. Beyond that, the TV and Internet integration is not necessary for most gamers. Xbox should still change the GPU; otherwise it will lose.
Everywhere it's mentioned as about 32% higher GPU power; I don't think a GTX 660 Ti and a GTX 680 are equal. For sure the PS4 holds the advantage: fewer shaders and lower specs across the board compared to the PS4, and DDR3 in the Xbox One versus GDDR5 in the PS4. As for the eSRAM, I will tell you something: you can have an SSD and 32 GB of RAM, but it cannot make up for a weaker GPU.
In some ways this is the opposite of the previous generation. The 360 screamed games (at least its original dashboard), whereas the PS3 had all the potential media support (though the XMB interface let it down) as well as being an excellent Blu-ray player (which is the whole reason I got mine).
This time around MS has gone all-out entertainment that can also do games, whereas Sony seems to have gone games first. I'm imagining that physically the PS4 will be flashier too, like the PS3 and 360 were... game devices, not family entertainment boxes.
Personally I'm keeping the 360 for my games library, and the One will likely replace the PS3.
One of my biggest concerns with the new system is the Kinect requirement. I have my Xbox and other electronics in a rack in the closet. I would need to extend the USB 3.0 (and I am assuming this time around, the Kinect is using a standard USB connector on all models) over 40 feet to get the wire from my closet to the location beneath or above my wall mounted TV. With the existing Kinect for the 360, I never bothered with it, but you COULD buy a fairly expensive USB over cat5 extender (Gefen makes one of the more reliable models, but it's $499!). I know of no such adapter for USB 3.0, and since Kinect HAS to be used for the console to operate, this means I won't be buying an Xbox One! Does anyone know of a product that will extend USB 3.0 over a cat5 or cat6 cable? Or any solution?
Is it just me, or are these new-gen consoles seriously lacking in CPU performance? According to the benchmarks of the A4-5000, of which you could say the consoles have two, the CPU power is not even going to come close to an i5, or maybe even an i3.
Considering that they are running the x86 platform this time, which probably is not the most efficient for running games (probably the reason why consoles in the past never used x86), and that they run lots of secondary applications next to the game (which leaves maybe 6 of the 8 cores for the game on average), I think CPU performance is seriously lacking. CPU-intensive games will be a no-no on consoles this generation.
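As a very rough sanity check of the "two A4-5000s" framing: the A4-5000 is a 4-core Jaguar part at 1.5 GHz, while both consoles use 8 Jaguar cores at roughly 1.6 GHz, with some cores and cycles reserved for the OS. The sketch below simply scales by core count and clock under the assumption of perfect scaling, which is optimistic, and it says nothing about per-core speed against an i3 or i5.

```python
# Ballpark aggregate Jaguar throughput relative to one A4-5000.
# Assumes perfect scaling with cores and clock (optimistic) and that
# roughly 6 of the 8 console cores remain available to the game.

A4_5000 = {"cores": 4, "clock_ghz": 1.5}
CONSOLE = {"cores": 8, "clock_ghz": 1.6}
GAME_CORES = 6  # commonly assumed share after the OS reservation

def relative_throughput(cores, clock_ghz, base=A4_5000):
    return (cores * clock_ghz) / (base["cores"] * base["clock_ghz"])

print(f"Whole console vs. one A4-5000:  {relative_throughput(**CONSOLE):.2f}x")
print(f"Game-visible cores vs. A4-5000: {relative_throughput(GAME_CORES, CONSOLE['clock_ghz']):.2f}x")
```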
The first Xbox used an x86 CPU. Cost was the main reason not many consoles used x86 in the past: unlike IBM Power and ARM, x86 isn't licensed out for just any company to build its own CPU. But this time they probably see the benefit as outweighing the cost (or even lowering it) with the x86 APU design from AMD: good performance per dollar and per watt for both the CPU and GPU. I am not sure Power today can reach that kind of performance per dollar and per watt for a CPU, or that ARM has the CPU performance to run high-end games. Also bear in mind that consoles spend fewer CPU cycles to run games than PCs do.
"Differences in the memory subsytems also gives us some insight into each approach to the next-gen consoles. Microsoft opted for embedded SRAM + DDR3, while Sony went for a very fast GDDR5 memory interface. Sony’s approach (especially when combined with a beefier GPU) is exactly what you’d build if you wanted to give game developers the fastest hardware. Microsoft’s approach on the other hand looks a little more broad. The Xbox One still gives game developers a significant performance boost over the previous generation, but also attempts to widen the audience for the console."
I don't quite understand how their choice of memory is going to "widen the audience for the console". Unless it's going to cause the XBox One to truly be cheaper, which I doubt. Or if you are referring to the entire package with Kinect, though it didn't seem so in the context of the statement.
It's my understanding (following an AMD statement during a phone conference about the 8000M announcement) that ZeroCore has been enhanced for graceful fallback, powering down individual GPU segments rather than just the entire GPU. If this is employed, we could see the PS4 drawing power as needed (not sure what control they'll have over GDDR5 clocks, if any), and potentially not power hungry unless it needs to be. Perhaps this warrants further investigation?
I agree with the article that, if used appropriately, the 32 MB SRAM buffer could compensate for limited bandwidth, but only in a traditional pipeline; it could severely limit GPGPU potential, as there's limited back-and-forth bandwidth between the CPU and GPU, and a buffer won't help there.
For clarity, the new Kinect uses a time-of-flight depth sensor, completely different technology to the previous Kinect. This offers superior depth resolution and fps but the XY resolution is actually something like 500x500 (or some combination that adds up to 250,000 pixels).
I'm curious to see what feature sets each of these GPUs has. These are not the run-of-the-mill APUs that you can buy at the store. These are both custom SoCs, and it's my understanding that they may even be from different generations (7000 vs. 8000), similar to how the PS3's RSX was from the 7900 era while Xenos was around the R600 era (unified architecture), although the difference here would be much smaller given the same make (AMD) and similar architecture (GCN).
In the end, simply counting CUs may not be enough to determine the true power/quality of the two GPUs.
We may never know, as I highly doubt they will easily divulge this info for fear of the outcry (especially from M$'s standpoint).
"The move away from PowerPC to 64-bit x86 cores means the One breaks backwards compatibility with all Xbox 360 titles. Microsoft won’t be pursuing any sort of a backwards compatibility strategy, although if a game developer wanted to it could port an older title to the new console. Interestingly enough, the first Xbox was also an x86 design - from a hardware/ISA standpoint the new Xbox One is backwards compatible with its grandfather, although Microsoft would have to enable that as a feature in software - something that’s quite unlikely."
I would disagree. You won't see compatibility with existing Xbox discs, but I very much expect a line of original Xbox titles to be offered as download purchases on the new machine. If Nintendo thinks $5 for an NES game running on the Wii U is reasonable, Microsoft should be able to make some good coin on a core set of two or three dozen Xbox titles at $10 each.
As for the 360 library, those should start turning up in HD (well, HD-ier) remakes in about four years as the market ripens for bringing those items back into circulation. This has worked very well for adding value to the PS3 with HD remake collections of PS2 hits. Given the right tools, reworking old IP can be very cost effective.
Some of the best original Xbox titles might get native remakes. We've already had Halo Anniversary and I wouldn't be surprised to see a Halo 2 Anniversary turn up for Xbox One. Jade Empire and the Knights of the Old Republic games may be worth the investment.
Anand Lal Shimpi, why did you not mention anything about the four Move engines in the Xbox One, or the capability of the cloud to quadruple the Xbox One's power?
IIRC the PS4 has similar hardware blocks to the Move engines, just without the fancy branding. And the cloud compute thing is a future theoretical; I'll factor it in when it's actually shown to work well. It can't be used for any latency-sensitive calculations, of course.
The Xbox One (aka PS4 Mini, aka PS4 Lite) sure is a colossal disappointment. Microsoft is trying to cut costs and save money in order to create the biggest gap they can between selling price and production cost. In other words, the Apple approach: gouge your customers. Kudos to Sony for 1152 cores and GDDR5.
I won't buy any console that needs an internet connection. It is a huge privacy risk to have a console with a camera that connects to the internet. A console that connects to the internet once per day or once per week has the same privacy risk as a console with an always-on connection.
Gamers should boycott the Xbox One so the console manufacturers get the message that we won't accept a required internet connection. If a physical disc is inserted in the console, no internet connection should be needed to prevent piracy; the console manufacturers just have to develop a proprietary disc format that can't be copied by Windows, Mac, or Linux. It would be fine if gamers who don't want to put a physical disc in the console to prove they own the game were required to have an internet connection. That way, a gamer who wants to prevent game companies from spying on them would just swap discs when switching games. A gamer who uses LIVE, or who wants the convenience of not needing to swap discs, would provide an internet connection.
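The policy being proposed above is simple enough to sketch: a physical disc acts as the license and works fully offline, while disc-free play falls back to an online entitlement check. This is purely an illustration of the commenter's suggestion, not how either console actually works; the function and parameter names are made up.

```python
# Sketch of the proposed policy: a physical disc is proof of ownership and
# never needs the network; disc-free (digital) play requires an online check.
# Purely illustrative; names and checks are hypothetical.

def may_start_game(disc_inserted: bool, online: bool,
                   online_license_ok: bool) -> bool:
    if disc_inserted:
        # The disc itself is the license; anti-piracy lives in the disc
        # format, so no phone-home is required.
        return True
    # No disc: fall back to an online entitlement check.
    return online and online_license_ok

assert may_start_game(disc_inserted=True, online=False, online_license_ok=False)
assert not may_start_game(disc_inserted=False, online=False, online_license_ok=False)
```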
The PS4's PS3 games are allegedly coming via Gaikai. I'm curious what MS will do for its old stable of games. I wonder if it would be too much to implement VMs for the Xbox and the Xbox 360, though a VM emulating PPC on x86 is likely to suffer severe penalties. It's either that or gaming from the cloud.
Alternatively, developers will recompile some of the "best hits" on the 360 and re-release for the Xbox One. I wonder how that would work with the Halo series, but having Gears of War on a faster machine might be fun.
Xbox One: everything we don't want in a video game console, except the controller. PS4: everything we do want in a video game console, except the controller.
Can anyone tell me if the following statement is correct or incorrect?
PC games will be ports of the games made for consoles. Both consoles (Xbox One and PS4) will have 5 GB of VRAM available to their GPUs. So that means the system requirements for PC games, as early as December when they start porting games over from the consoles to PC, will require a PC gamer to have a video card with at least 5 GB of VRAM, or more, just to run the game.
Before the hardware is released and analysed, we have no idea how much of the PS4's GDDR5 RAM is going to be shared and/or dedicated for GPU use, and how much will be available for user data. It is anyone's guess at this stage. But the improvements in the hUMA design, with a frame buffer accessible to both GPU and CPU, make it a rather quick GPU by PC standards. Since only one game is loaded at a time, a shared-memory reconfiguration can happen just before the game loads, so it can depend on the game and how much RAM it can grab. The CPU counts for very little in the process, which is why it can be clocked at 1.6 GHz rather than storming along at 3.6 GHz as in Trinity chips. Still, with a faster GPU and globs of RAM, there is certainly greater leeway in the development and optimization process for game developers. One can assume that at 3x the Trinity GPU core count, the PS4 must be at least 2.5x the speed of Trinity GPUs, since those ran at 900 MHz. With good cooling, the PS4 could well clock its GPU cores at 1.2 GHz, since Intel is going to 1.3 GHz on the GT3 core.
This has never been an easier choice: Microsoft doesn't let you buy games, Sony does, and their system is 50% more powerful and more focused on games, while Microsoft is off doing yet more Kinect.
Yup, and as a cripple, what good is flailing my arms about and hopping around going to do me? Kinect is about the dumbest thing I've seen people use. Except for workout stuff and kids' stuff, sure, it makes sense. But then they give those in dial-up and cellular internet locations the finger and say "stick with the 360," when they know damn well developers won't make games for it within a year... Morons. I'm done with M$. If I do get a new console, it will be the PS4. Besides, I've always loved the Kingdom Hearts series more than any other...
I am glad that these consoles have finally seen the light of day. Though a bit underpowered compared to an average mid-range rig, at least game developers will be forced to utilize each and every available core at such low clock rates. Heavily threaded games will finally be the norm and not just a mere exception. If the playing field no longer relies heavily on IPC advantages, will AMD's "more cores but cheaper" strategy finally catch Intel's superior IPC? Will AMD finally reclaim the low-to-mid-range market? No, not likely. But one can hope. I yearn for the good old Core 2 Duo days when Intel was forced to pull out all the stops.
Who gives a shit about heat and power consumption in a console? Both machines are miserly, and they're not notebooks, for God's sake. Looks like MS simply cheaped out to me. Letting them off the hook by pointing out the tiny heat/power savings as a "benefit" is a real reach. By this logic, why not just cut the compute power even more?
Hmmm... a custom CPU (6 operations per clock compared to the 4 of the PS4), now overclocked. A GPU, now overclocked. eSRAM (ultra-fast memory with extremely low access times; we will see its real function soon). DDR3 (memory with very fast access times). Maybe this combination will become a nightmare for PS4 owners? xD Yes, I really think YES.
And please don't forget the new impulse triggers (apparently fantastic and a must-have for a completely new experience).
Shadow, have you seen the cooling system? It's giant. Have you seen the case? It's giant (a lot of fresh air inside ;-). Have you seen that the Xbox One will detect heat and power down to avoid meltdown? http://www.vg247.com/2013/08/13/xbox-one-will-dete... And the very hot power supply is outside...
A perfect system for overclocking... obviously, to me...
Ah, for my first message here, a reply to the PS4 team made directly by Albert Penello (Microsoft's Director of Product Planning):
"******************************************************************************************* I see my statements the other day caused more of a stir than I had intended. I saw threads locking down as fast as they pop up, so I apologize for the delayed response.
I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility.
So I thought I would add more detail to what I said the other day, that perhaps people can debate those individual merits instead of making personal attacks. This should hopefully dismiss the notion I'm simply creating FUD or spin.
I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch or what they have said. But I do need to draw comparisons since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront I have nothing but respect for those guys, but I'm not a fan of the mis-information about our performance.
So, here are couple of points about some of the individual parts for people to consider:
• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.
• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.
• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.
Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.
I still I believe that we get little credit for the fact that, as a SW company, the people designing our system are some of the smartest graphics engineers around – they understand how to architect and balance a system for graphics performance. Each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes.
Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible then I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.
Thanks again for letting my participate. Hope this gives people more background on my claims. "*****************************************************************************
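For readers trying to follow the bandwidth arithmetic in that quoted statement: the headline numbers fall out of bus width times transfer rate. The sketch below reproduces the commonly cited peak figures under those assumptions; the 204 GB/s eSRAM number in particular counts read and write on the same cycle, which is exactly the part critics dispute.

```python
# Where the headline peak-bandwidth numbers come from (on paper only).
# bus width in bits / 8 gives bytes per transfer; multiply by transfer rate.

def peak_gb_per_s(bus_width_bits, mega_transfers_per_s):
    return (bus_width_bits / 8) * mega_transfers_per_s / 1000

xbox_ddr3 = peak_gb_per_s(256, 2133)   # ~68 GB/s
ps4_gddr5 = peak_gb_per_s(256, 5500)   # ~176 GB/s
esram_one_way = 109.0                  # GB/s, reported after the GPU upclock
esram_claimed = 204.0                  # GB/s, the figure counting read+write cycles

print(f"Xbox One DDR3-2133 on a 256-bit bus: {xbox_ddr3:.1f} GB/s")
print(f"PS4 GDDR5 at 5.5 GT/s on a 256-bit bus: {ps4_gddr5:.1f} GB/s")
print(f"Xbox One combined claim: {xbox_ddr3 + esram_claimed:.0f} GB/s vs. PS4 {ps4_gddr5:.0f} GB/s")
```

Note that the DDR3 and eSRAM pools cannot simply be added for any single buffer; the combined figure only applies when traffic is split across both, which is why the comparison remains contested.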
Once again I would like to warn the PS4 fans: every time Sony has announced a new console, Sony has publicized it as the most powerful... and every time the Xbox does the job better...
I got my info from Beyond3D but I went to dig into whitepapers from Micron and Hynix and it seems that my info was wrong.Micron's DDR3 PC2133 has a CL14 read latency specification but possibly set as low as CL11 on the XBox. Hynix' GDDR5 (I don't know which brand GDDR5 the PS4 will use but they'll all be more or less the same) has a CL18 up to CL20 for GDDR5-5500.
So even though this doesn't give actual latency information since that depends a lot on the memory controller, it probably won't be worse than 2x.
tipoo - Monday, May 27, 2013 - link
Nowhere near as bad as I thought GDDR5 would be given what everyone is saying about it to defend DDR3, and given that it runs at such a high clock rate the CL effect will be reduced even more (that's measured in clock cycles, right?).Riseer - Sunday, June 23, 2013 - link
For game performance,GDDR5 has no equal atm.Their is a reason why it's used in Gpu's.MS is building a media center,while Sony is building a gaming console.Sony won't need to worry so much about latency for a console that puts games first and everything else second.Overall Ps4 will play games better then Xbone.Also ESram isn't a good thing,the only reason why Sony didn't use it is because it would complicate things more then they should be.This is why Sony went with GDDR5 it's a much simpler design that will streamline everything.This time around it will be MS with the more complicated console.Riseer - Sunday, June 23, 2013 - link
Also lets not forget you only have 32mb worth of ESRAM.At 1080p devs will push for more demanding effects.On Ps4 they have 8 gigs of ram that has around 70GB's more bandwidth.Since DDR3 isn't good for doing graphics,that only leaves 32mb of true Vram.That said Xbone can use the DDR3 ram for graphics,issue being DDR3 has low bandwidth.MS had no choice but to use ESRam to claw back some performance.CyanLite - Sunday, May 26, 2013 - link
I've been a long-term Xbox fan, but the silly Kinect requirement scares me. It's only a matter of time before somebody hacks that. And I'm a casual sit-down kind of gamer. Who wants to stand up and wave arm motions playing Call of Duty? Or shout multiple voice commands that are never recognized the first time around?If PS4 eliminates the camera requirement, get rids of the phone-home Internet connections, and lets me buy used games then I'm willing to reconsider my console loyalty.
piiman - Saturday, June 22, 2013 - link
Are you required to use the Kinect controller or can you set what controller you want? I'm betting you can set the one you want to use and not be made to use Kinect?It wouldn't surprise me to see a later/lower cost version sold without Kinect though.
Riseer - Sunday, June 23, 2013 - link
MS had no choice but to use ESram,DDR3 has 1/3 the bandwidth of GDDR5.It's like using a band-aid on a sinking ship.Sony made Ps4 like a PC,down to the way the GPU uses GDDR5.Latency wise it won't be overly bad for the Cpu.tipoo - Wednesday, May 22, 2013 - link
On the size and cooling, it's a 100 watt APU, they said in their technical talk with Engadget. Much lower than what the 360 debuted at. The reason it's as large as the launch 360 is that the PSU is internal this time.Noobslayer - Wednesday, May 22, 2013 - link
No it isn't, psu is still external.tipoo - Wednesday, May 22, 2013 - link
I don't think either was confirmed at this point, I did see the prototype power brick but who knows with retail hardware. If the One is that big PLUS a huge external brick, wtf.epobirs - Saturday, May 25, 2013 - link
An MS exec in one interview indicated they prefer an external PSU for simplifying shipping the product into different regions. The core box remains identical with the external brick handling the power localization. It also gets a big heat source out of the box. They're likely to be intent on low cooling noise after the valid criticism the first couple iterations of the 360 received.There was a certain amount of rush to the 360's development due to the need to end production of the original Xbox and not leave a huge time gap in the shift to the new platform. The success of the 360, after those early debacles, meant they could take their time on this design and do a lot more testing for industrial factors like noise suppression.
Riseer - Sunday, June 23, 2013 - link
Well now it has been confirmed it will have a power brick.tyler31763 - Thursday, May 23, 2013 - link
http://asset0.cbsistatic.com/cnwk.1d/i/tim2/2013/0...This picture basically confirms an external psu.
RedavutstuvadeR - Saturday, May 25, 2013 - link
i only see a space for the powercordtipoo - Monday, May 27, 2013 - link
That could plug into a power cord or PSU.tipoo - Wednesday, May 22, 2013 - link
If it's a basic Jaguar in there, what do you think Microsoft meant by saying it can do 6 operations per core per clock? Jaguar would be 4. Unless they meant load and store as the two extra.mczak - Wednesday, May 22, 2013 - link
Yes exactly, load/store count separately. 2 Int + 2 FP + 2 LS making it 6.The more important number is probably dispatch/retire/decode width anyway (which is 2 for all of these).
nathanddrews - Wednesday, May 22, 2013 - link
It will be interesting to see how long it takes before XO (or PS4, for that matter) software is running on regular PCs.JPForums - Wednesday, May 22, 2013 - link
@Anand: "We already have the Windows kernel running on phones, tablets, PCs and the Xbox, now we just need the Xbox OS across all platforms as well."Agreed, this would be a nice selling point for a future Surface Pro device. It would be a relatively cheap way to bring the added value necessary to justify the higher cost in the eyes of the mass market.
Arsynic - Wednesday, May 22, 2013 - link
So the Surface would have to run a hypervisor which would require like 8+ GB of memory if the user wanted a traditional Windows experience along with Xbox games.Spunjji - Wednesday, May 22, 2013 - link
2 options there:1) Dump memory to flash if you *really* feel the need to run both simultaneously at full-tilt (daft)
2) 16GB RAM, which will cost a whole lot of not much by the time this idea becomes feasible.
lmcd - Wednesday, May 22, 2013 - link
Already doesn't cost that much anyway, especially off-die like Surface Pro.RollingCamel - Wednesday, May 22, 2013 - link
I think that MS is deliberately holding the performance down so tablets and smartphones can quickly catch up and be added to the ecosystem.tipoo - Wednesday, May 22, 2013 - link
I think it's more so that they can bundle Kinect at a competitive cost.Spunjji - Wednesday, May 22, 2013 - link
Bit of both, plus being burned by high-spec high-heat parts last gen?tipoo - Wednesday, May 22, 2013 - link
Why would they want tablets and smartphones to take away their viable market?plcn - Wednesday, May 22, 2013 - link
there are a lot more ipads than xboxes out there... msft wants the surface to have its place there, and well, that will probably make them more money than simply winning this generation's 'console war' - makes some sense. wasn't messaged like that at all by the company, but gotta appreciate it's potential merit nonethelesstipoo - Wednesday, May 22, 2013 - link
Meh. If they kill the box and make tablets and mobile their gaming focus, that gives the iPad and Android an ample opportunity to become their prime competitors, and the iPad already has a lot of headway on mobile gaming.Jaybus - Wednesday, May 22, 2013 - link
I don't think we can connect 2 or more controllers plus LCD TV to our smartphones just yet. And neither a smartphone nor a tablet can play high end games, certainly not at anything close to reasonable frame rates. Those who think a smartphone is going to replace consoles, PCs, and NASA supercomputers in the next 8 years all live in fantasy land.Gigaplex - Thursday, May 23, 2013 - link
Sure we can. There's a fair few phones with HDMI connectors, and it wouldn't take much in the way of software tweaks to pair multiple bluetooth controllers (assuming it's not already supported).Dedhed - Wednesday, June 5, 2013 - link
Actually I have a note 2 and attached to the smartdock and a couple of powered usb hubs i can connect 4 controllers to it and watch it on my 28 monitor or 55 led panel, hook up my external drives and play my bluray movies as well as surfing and netflix and the rest of it while listening to my stereo bluetooth headset. Convergence is happening already. Take off your blinders and look around!RollingCamel - Thursday, May 23, 2013 - link
They wouldn't. First you have SmartGlass which makes the smartphones and tablets into controller peripherals as a 1st step of integration.Then these peripherals would evolve more into an accessory gaming controller paired with the Xbox system.
After that the Xbox system grows from the living room into tablets and smartphones where games are played cross-platform. Every device does it job and can't replace the other. If ppl argue that we may hookup the phone to the TV instead of the console, it would be possible but imagine getting a phone mid game.
This may increase MS sales if they can improve ground they built now. Every product they made has excellent potential and just needs to be smoothed out...hopefully.
Sony has already spoken about integrating different devices into same ecosystem via Android. I don't think Apple can compete with it unless they use their tv box as a console too.
jaydee - Wednesday, May 22, 2013 - link
What's with all these "One" monikers?Since PS4 and XBO are all x86 architecture, here's hoping EA Sports will starting porting its games (notably Madden) to PC again.
nathanddrews - Wednesday, May 22, 2013 - link
I believe that Microsoft secured exclusive rights to all EA Sports titles for the XO, at least FIFA, Madden, NBA Live and UFC so far. Then there's COD:G which is exclusive as well. Whether these are temporal exclusives, DLC exclusives, or something else, I don't know.plcn - Wednesday, May 22, 2013 - link
sorry to say, but they're not exclusive at all. they might have some exclusive features ('ignite'?) or early DLC, but the games are confirmed for PS4, too.THizzle7XU - Wednesday, May 22, 2013 - link
Ignite is only the name of their next gen game engine.cbrownx88 - Wednesday, May 22, 2013 - link
Negative - they have an exclusive "launch on xbox FIRST" deal... No way EA would allow those franchises to be exclusiveFriendly0Fire - Wednesday, May 22, 2013 - link
Ghosts only has timed exclusivity on DLC. The game will see simultaneous release on all platforms.lmcd - Wednesday, May 22, 2013 - link
What are you smoking? It's news but not surprising that EA is dumping the Wii U, but the PS4? That would be insane. And in fact, such a thing has not happened.Same with COD:G. AQll of the ones you have mentioned are cross-platform.
blacks329 - Wednesday, May 22, 2013 - link
Here is what is exclusive regarding EA Sports titles and CoD Ghosts.X1 will receive the Ultimate Teams feature exclusively. They haven't mentioned what that is. But the games themselves will release simultaneously on 360, PS3, X1, PS4 (assuming the last two arrive before the sports title is released).
X1 will receive the CoD Ghosts DLC pack first, before it is released on PS4. Similarly to how it has been with every release of COD this gen for the past few years. CoD Ghosts will release simultaneously on 360, PS3, X1 and PS4.
Inteli - Wednesday, May 22, 2013 - link
I'm personally waiting to see someone strip Xbox OS from this and stick plain-old Windows on it. This might make a good HTPC without having to deal with any sort of specialized OSgeniekid - Wednesday, May 22, 2013 - link
That was one of the main draws of the PS3 until they removed the ability to run Linux on it.lmcd - Wednesday, May 22, 2013 - link
Already has the Metro part of plain Windows.epobirs - Saturday, May 25, 2013 - link
Why bother? You can build a mini-ITX HTPC using an AMD APU right now for less than the XO will likely cost at launch. I'm building a mini-ITX system right now that is thus far not breaking the bank. (To be fair, I got the Core i7-3770K for an unusually low price but I'd have settled for a low-power A10 model for about the same price and lower cost on the motherboard.) In fact, by taking it slow and gathering parts as deals come along, it has totaled remarkably little so far.I suspect Newegg has my phone bugged. Whenever I mention in a conversation not being able to find a good price on a particular needed item, I seem to get an email within hours with a sale on that item.
Chad Boga - Wednesday, May 22, 2013 - link
Why didn't Microsoft simply match the PS4's GPU capabilities? If they had, surely they then could have finished off Sony for good.Now Sony has a chance to once again become the premier gaming console.
Microsoft has got so much wrong in the last decade and it looks like this is just a continuation of these stuff ups.
tipoo - Wednesday, May 22, 2013 - link
I think it's so that they can bundle Kinect at a competitive cost.geniekid - Wednesday, May 22, 2013 - link
As alluded to in the article, only PS4 exclusives are likely to take advantage of the additional processing power. Most developers will probably use the same textures/lighting/etc. on both platforms to lower porting costs so you'd never see an improvement.I think they were correct to focus more on the Kinect 2.
dysonlu - Wednesday, May 22, 2013 - link
It's not difficult at all to include different levels of textures and ligthing. As we all know, the PC games makers have been doing that for years. And these news consoles are nothing but PCs.Flunk - Wednesday, May 22, 2013 - link
Frankly, even if they don't program for it, it means that everything will run just a little smoother on the PS4. I'm now leaning toward the PS4 to replace my 360. If they go the same pay for multiplayer route they did this generation it will cement my decision.Voldenuit - Wednesday, May 22, 2013 - link
The reality is that games are never fully optimized for any hardware configuration, so even if PS4 users never see higher res textures or higher poly models, having 50% (!!!) more GPU power means they will see smoother framerates with less dips.I'd take that over some Big Brother contraption in my living room (Kinect) that will be broken into by creepy hackers trying to spy on teenage girls. Or I would, if I were buying a console, which still hasn't been decided (cost/affordability rather than any ideological divides).
lmcd - Wednesday, May 22, 2013 - link
Exactly.Of course Move wasn't better given that a camera was used for that too...
Ramon Zarat - Tuesday, May 28, 2013 - link
Sony's cam is not required to be plugged in for the rest of the console to work . XB1, yes. No Kinect, no console, period.Sony's cam are not hooked to an always-on console. I could be offline forever if I want and the console would still work, and it would be impossible to hack if it's not online. If your XB1 is off the net for more than 24H, no console, period
Sony's cam can actually be turned off, and I mean completely off. XB1, no. It's on, even when everything else is off. Just in case you are too lazy to just get out of the couch and press power on on your console, your Kinect is always on to accept your voice command.
Always-online console + always-on cam and mic and no way to shut any of those thing off = sooner than later some Chinese hackers WILL record you f*cking your wife on that couch and blackmail you for some money threatening you to post that video on YouTube.
I just seriously can't way for this to happen! :)
blacks329 - Wednesday, May 22, 2013 - link
FYI The new PS Eye (Kinect like Camera) will be included in every PS4 box. This was confirmed in February. Whether it is required to be plugged in like the X1 remains to be seen, but I wouldn't be surprised.piiman - Saturday, June 22, 2013 - link
"I'd take that over some Big Brother contraption in my living room (Kinect) that will be broken into by creepy hackers trying to spy on teenage girls"Paranoid much?
Just place something in front of the camera if you’re really that worried.
lmcd - Wednesday, May 22, 2013 - link
Rather the opposite -- any engine-licensed game will take advantage of additional processing power and/or have way better framerates.Ramon Zarat - Tuesday, May 28, 2013 - link
Seriously? LOD, rendering distance, anti-aliasing level, texture filtering level, post processing level, tessellation, shadows and every other conceivable GFX parameters can be turned up or down on the fly by ANY modern game engine. Even fluid and particles are now fully visualized and not fixed, prerendered object and so their level of complexity can be easily adjusted up of down.PS4 would be displaying GFX on high or very high and XB1 only on medium. Same identical textures, same number of polygons etc... The foundation would remain the same, but PS4 would display more complex scene at higher quality. You don't need 2 different sets of games to produce drastically different results, it's all built in the engine already. It's only a matter of processing power and the PS4 GPU and GDDR5 are just that, more powerful.
The only real inherent limitation would be the number of items/accessories simultaneously on screen (trees, cars, spectators, chair, cup, book etc...) in a given scene that would fixed by the game developers and even then, they could build some flexibility into it to allow the more powerful system to display more stuff.
Majeed Belle - Sunday, September 8, 2013 - link
I don't think it will be just exclusives perse. If we think about what types of games both systems will get then we will see likely see better performance on the PS4 for games that are graphic intensive.Think about RTSs for example that tend to have tons of stuff happening on screen or something like Dungeon Defenders.
Skyrim, Dragon's Age, Dragon's Dogma etc also all have tons of lighting and graphical effects going on at the same time that should see benefits from the difference in ram.
Anyone who has played Dragon's Dogma on the PS3 and uses a Mystic Knight with 3 great cannons firing all at the same time has seen slowdown happen and that's because of the lower amount of ram. This should be a problem with the PS4.
Majeed Belle - Sunday, September 8, 2013 - link
-editThis SHOULDN'T be a problem with the PS4
jeffkibuule - Wednesday, May 22, 2013 - link
"Why didn't they just match X?" Because the design of these chips to have them tested, validated, and shipping on time is on the order of years, not weeks.dysonlu - Wednesday, May 22, 2013 - link
"Finish off Sony" is a big overstatement considering that PS3 surpassed Xbox360 in some recent reports.FearfulSPARTAN - Wednesday, May 22, 2013 - link
I hate it when websites get info wrong or interpret it badly, sony matched shipped numbers not sales numbers. I fyou go to vgchartz you will see the xbox was still up around a million units last time I checked.blacks329 - Wednesday, May 22, 2013 - link
I think when both systems are hovering around the same numbers and are off by 1 million. It's fairly negligible in the grand schemes of things. It's impressive for Sony how close they are, considering how colossally Sony screwed up early in this gen and started a year after the 360.beuwolf - Wednesday, May 22, 2013 - link
You checked a long time ago then: http://www.vgchartz.com/ - they are equal. And that's despite Xbox having 1 extra year...If you look at the global year to date, then PS3 is outselling 360 by more than a million this last year.
c1979h - Thursday, May 23, 2013 - link
Kind of sorry for Sony, it took them that long to get even, remember they had Asia all to themselves. It also happens to be the biggest continent in the world.blacks329 - Thursday, May 23, 2013 - link
Biggest continent in the world doesn't mean anything. Their purchasing power pales in comparison to the US and Europe.Although their purchasing power is increasing, which is more beneficial for Sony in the long term, but even then it still pales in comparison to what the average North American can purchase.
xaml - Thursday, May 23, 2013 - link
If every third Xbox 360 user had to get at least one repaired and after that died, bought a new one until finally salvaged by the 'Slim'...Niabureth - Wednesday, May 29, 2013 - link
And just how do you expect them to do that? Decisions on what hardware to use was made a lot earlier than Sony's PS4 presentation, meaning that train has already left the station. I'm guessing AMD is massproducing the hardware by now. Mircosoft: Oh we saw that Sony is going for a much more powerful architecture and we don't want any of the million of APU's u've just produced for us!JDG1980 - Wednesday, May 22, 2013 - link
If AMD is using Jaguar here, isn't that basically an admission that Bulldozer/Piledriver is junk, at least for gaming/desktop usage? Why don't they use a scaled-up Jaguar in their desktop APUs instead of Piledriver? The only thing Bulldozer/Piledriver seems to be good for is very heavily threaded loads - i.e. servers. Most desktop users are well served by even 4 cores, and it looks like they've already scaled Jaguar to 8. And AMD is getting absolutely killed on the IPC front on the desktop - if Jaguar is a step in the right direction then by all means it should be taken. BD/PD is a sunk cost, it should be written off, or restricted to Opterons only.tipoo - Wednesday, May 22, 2013 - link
Too big.Slaimus - Wednesday, May 22, 2013 - link
Bulldozer/Piledriver needs SOI. Steamroller is not ready yet, and it is not portable outside of Globalfoundries gate-first 28nm architecture. Jaguar is bulk 28nm and gate-last, which can be made by TSMC in large quantities at lower cost per wafer.JDG1980 - Wednesday, May 22, 2013 - link
All the more reason for AMD to switch to Jaguar in their mass-market CPUs and APUs.I'd be willing to bet money that a 4-core Jaguar clocked up to 3 GHz would handily beat a 4-module ("8-core") Piledriver clocked to 4 GHz. BD/PD is AMD's Netburst, a total FAIL of an architecture that needs to be dropped before it takes the whole company down with it.
Exophase - Wednesday, May 22, 2013 - link
Jaguar can't be clocked at 3GHz - 2GHz is closer to the hard limit as far as we currently know. It's clock limited by design, just look at the clock latency of FPU operations. IPC is at best similar to Piledriver (in practice probably a little worse), so in tasks heavily limited by single threaded performance Jaguar will do much worse. Consoles can bear limited single threaded performance to some extent but PCs can't.
Spunjji - Wednesday, May 22, 2013 - link
It's effectively a low-power optimised Athlon 64 with added bits, so it's not going to scale any higher than Phenom did. That already ran out of steam on the desktop. Bulldozer/Piledriver may not have been the knockout blow AMD needed but they're scaling better than die-shrinking the same architecture yet again would have.
JDG1980 - Wednesday, May 22, 2013 - link
Bobcat/Jaguar is a new architecture specifically designed for low-power usage. It's not the same as the K10 design, though it wouldn't surprise me if they did share some parts. And even just keeping K10 with tweaks and die-shrinks would have worked better on the desktop than the Faildozer series. Phenom II X6 1100T was made on an outdated 45nm process, and still beat the top 32nm Bulldozer in most benchmarks. A die-shrink to 28nm would not only be much cheaper to manufacture per chip than Bulldozer/Piledriver, but would perform better as well. It's only pride and the refusal to admit sunk costs that has kept AMD on their trail of fail.
kyuu - Wednesday, May 22, 2013 - link
That's a nice bit of FUD there. K10 had pretty much been pushed as far as it was going to go. Die-shrinking and tweaking it was not going to cut it. AMD needed a new architecture. Piledriver already handily surpasses K10 in every metric, including single-threaded performance.
JDG1980 - Wednesday, May 22, 2013 - link
In terms of single-threaded performance *per clock*, Thuban > Piledriver. Sure, if you crank up the clock rate *and the heat and power consumption* on Piledriver, you can barely edge out Deneb and Thuban on single-threaded benchmarks. But if you clock them the same, the Thuban uses less power, generates less heat, and performs better. Tom's Hardware once ran a similar test with Netburst vs Pentium M, and his conclusion was quite blunt: the test called into question the P4's "right to exist". The same is true of the Bulldozer/Piledriver line. And I don't buy the argument that K10 is too old to be fixable. Remember that Ivy Bridge and Haswell are part of a line stretching all the way back to the original Pentium Pro. The one time Intel tried a clean break with the past (Netburst) it was an utter fail. The same is true of AMD's excavation equipment line and for the same reason - IPC is terrible so the only way to get acceptable performance is to crank up clock rate, power, noise, and thermals.
silverblue - Wednesday, May 22, 2013 - link
It's true that K10 is generally more effective per clock, but look at it this way - AMD believed that the third AGU was unnecessary as it was barely used, much like when VLIW4 took over from VLIW5 as the average slot utilisation within a streaming processor was 3.4 at any given time. Put simply, they made trade-offs where it made sense to make them. Additionally, K10 was most likely hampered by its 3-issue front end, but it also lacked a whole load of ISAs - SSE4.1 and 4.2 are good examples. Thuban compares well with the FX-8150 in most cases and favourably so when we're considering lighter workloads. The work done to rectify some of Bulldozer's ills shows that Piledriver is not only about 7% faster per clock, but can clock higher within the same power envelope. AMD was obviously aiming for more performance within a given TDP. The FX-83xx series is out of reach of Thuban in terms of performance.
The 6300 compares with the 1100T BE as such:
http://www.cpu-world.com/Compare/316/AMD_FX-Series...
Oddly, one of your arguments for having a Thuban in the first place was power consumption. The very reason a Thuban isn't clocked as high as the top X4s is to keep power consumption in check. Those six cores perform very admirably against even a 2600K in some circumstances, and generally with Bulldozer and Piledriver you'd look to the FX-8xxx CPUs if comparing with Thuban, however I expect the FX-6350 will be just enough to edge the 1100T BE in pretty much any area:
http://www.cpu-world.com/Compare/321/AMD_FX-Series...
The two main issues with the current "excavation equipment line", as you put it, are a lack of single-threaded power, plus the inherent inability to switch between threads more than once per clock - clocking Bulldozer high may offset the latter in some way but at the expense of power usage. The very idea that Steamroller fixes the latter with some work done to help the former, and that Excavator improves IPC whilst (supposedly) significantly reducing power consumption should be evidence enough that whilst it started off bad, AMD truly believes it will get better. In any case, how much juice does anybody expect eight cores to use at 4GHz with a shedload of cache? Does anybody remember how hungry Nehalem was, let alone P4?
I doubt that Jaguar could come anywhere near even a downclocked A10-4600M. The latter has a high-speed dual channel architecture and a 4-issue front end; to be perfectly honest, I think that even with its faults, it would easily beat Jaguar at the same clock speed.
Tacking bits onto K10 is a lost cause. AMD doesn't have the money, and even if it did, Bulldozer isn't actually a bad idea. Give them a chance - how much faster was Phenom II over the original Phenom once AMD worked on the problem for a year?
Shadowmaster625 - Wednesday, May 22, 2013 - link
Yeah but AMD would not have stood still with K10. Look at how much faster Regor is compared to the previous Athlon: http://www.anandtech.com/bench/Product/121?vs=27
The previous Athlon had a higher clock speed and the same amount of cache, but Regor crushes it by almost 30% in Far Cry 2. It is 10% faster across the board despite being lower clocked and consuming far less power. Had they continued with Thuban, it is possible they would have continued to squeeze 10% per year out of it as well as reduce power consumption by 15%, which, if you do the math, leaves us with something relatively competitive today. Not to mention they would have saved a LOT of money. They could have easily added AVX or any other extensions to it.
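To put rough numbers on the "do the math" point above, here is a minimal sketch in Python. The 10%-per-year performance gain and 15%-per-year power reduction are the commenter's hypothetical rates, and the four-year span (roughly Regor/Thuban to 2013) is an assumption, not measured data.

```python
# Hypothetical compounding of the yearly gains suggested above.
# The rates and the 4-year span are assumptions taken from the comment.
perf_gain_per_year = 1.10   # +10% performance per year
power_cut_per_year = 0.85   # -15% power per year
years = 4                   # roughly 2009 (Regor/Thuban era) to 2013

perf = perf_gain_per_year ** years
power = power_cut_per_year ** years

print(f"Relative performance after {years} years: {perf:.2f}x")  # ~1.46x
print(f"Relative power after {years} years: {power:.2f}x")       # ~0.52x
```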
Hubb1e - Wednesday, May 22, 2013 - link
Per clock Thuban > Piledriver, but power consumption favors Piledriver. Compare two chips of similar performance. The PhII 965 is a 125W CPU and the FX4300 is a 95W CPU, and they perform similarly, with the FX4300 actually beating the PhII by a small margin.
kyuu - Wednesday, May 22, 2013 - link
... Lol? You can't simply clock a low-power architecture up to 4GHz. Even if you could, a 4GHz Jaguar-based CPU would still be slower than a 4GHz Piledriver-based one. Jaguar is a low-power architecture. It's not able to (or meant to) compete with full-power CPUs in raw processing power. It's being used in the Xbox One and PS4 for two reasons: power efficiency, and cost. It's not because of its processing power (although it's still a big step up from the CPUs in the 360/PS3).
plcn - Wednesday, May 22, 2013 - link
BD/PD have plenty of viability in big power envelope, big/liquid cooler, desktop PC arrangements. consoles aspire to be much quieter, cooler, energy efficient - thus the sensible jaguar selection. even the best ITX gaming builds out there are still quite massive and relatively unsightly vs what seems achievable with jaguar... now for laptops on the other hand, a dual jaguar 'netbook' could be very very interesting. you can probably cook your eggs on it, too, but still interesting..
lmcd - Wednesday, May 22, 2013 - link
It isn't a step in the right direction in IPC. Piledriver is 40% faster than Jaguar at the same clocks and also clocks higher. Stop spreading the FUD about Piledriver -- my A8-4500m is a very solid processor with very strong graphics performance and excellent CPU performance for all but the most taxing tasks.
lightsout565 - Wednesday, May 22, 2013 - link
Pardon my ignorance, but what is the "Embedded Memory" used for?
tipoo - Wednesday, May 22, 2013 - link
It's a fast memory pool for the GPU. It could help by holding the framebuffer or caching textures etc.
BSMonitor - Wednesday, May 22, 2013 - link
Embedded memory latency is MUCH closer to L1/L2 cache latency than system memory. System memory is Brian and Stewie taking the airline to Vegas vs the Teleporter to Vegas that would be cache/embedded memory...
tipoo - Wednesday, May 22, 2013 - link
That's one of the more interesting analogies for cache I've heard, lol. I've always used tomatoes in a store vs tomatoes in your kitchen.
BSMonitor - Wednesday, May 22, 2013 - link
Sunday night season finale... Stewie rules!
MrPete123 - Wednesday, May 22, 2013 - link
Curious about the "3 operating system" approach. You have Windows, XBox OS.....and Windows? Is the custom version of Hyper-V also running on Windows?
Spunjji - Wednesday, May 22, 2013 - link
I think they're counting the hypervisor as an OS for explanatory purposes.
jeffkibuule - Wednesday, May 22, 2013 - link
1) XBOX OS - plays games
2) App OS - runs apps
3) Hypervisor OS - based on Windows NT kernel, manages resources split between the two.
Why 3? Why not just one? Likely because splitting them up improves portability of code between the XBOX ONE and Windows proper (Windows 8 and its future versions).
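For illustration of how such a split might look, here is a minimal Python sketch of a hypervisor statically partitioning memory and cores between a game OS and an app OS. The 5 GB / 3 GB and 6-core / 2-core figures are assumed example values for the sketch, not confirmed Xbox One numbers.

```python
# Purely illustrative: a static resource partition enforced by a hypervisor.
# All figures are assumed example values, not official specs.
PARTITIONS = {
    "game_os": {"ram_gb": 5, "cpu_cores": 6},  # exclusive, predictable budget for games
    "app_os":  {"ram_gb": 3, "cpu_cores": 2},  # apps, TV overlay, system services
}

def can_allocate(partition: str, ram_gb: int, cores: int) -> bool:
    """A request never spills into the other partition's budget."""
    budget = PARTITIONS[partition]
    return ram_gb <= budget["ram_gb"] and cores <= budget["cpu_cores"]

print(can_allocate("game_os", ram_gb=4, cores=6))  # True
print(can_allocate("app_os", ram_gb=4, cores=1))   # False - over the app OS RAM budget
```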
MrPete123 - Wednesday, May 22, 2013 - link
How do you know this?
takeship - Wednesday, May 22, 2013 - link
The reason, AFAIK, is that there are fundamental design issues with x86 that would allow third parties, if given access to a direct OS - bare metal switch, the ability to hack the console in trivial amounts of time. XBMC, and the hacking that enabled it, is the primary reason why Microsoft went to PowerPC for the 360, so it only makes sense that, coming back to x86 for the One, they need a "supervisor" of some sort that prevents exploits on the bare-metal x86 side from being carried back over to the "full-fat" OS side.
Kevin G - Wednesday, May 22, 2013 - link
The PPC in the Xbox 360 supports virtualization. This isn't a reason to ditch PPC in favor of x86. The big reason is due to AMD's HSA support with their GPUs.
Jaybus - Wednesday, May 22, 2013 - link
No. Both Windows and the Xbox OS run on Hyper-V. Hyper-V is the only OS (known as a hypervisor) running on the bare metal.
BSMonitor - Wednesday, May 22, 2013 - link
@Anand ~ In regard to x86 and backward game compatibility... All XBox Live game content currently available would be incompatible with Xbox One?? I'm mainly talking about the downloadable stuff, i.e. Castle Crashers or the TMNT Arcade game??
Razorbak86 - Wednesday, May 22, 2013 - link
Microsoft confirmed yesterday that there is no backwards compatibility with X360 software. That includes both disc and downloadable. Sad really, and a missed opportunity, IMO. Since Sony was transitioning from Cell architecture to x86, and had already announced no backwards compatibility, Microsoft could have gained a significant strategic advantage by integrating backwards compatibility, even at a slight cost premium.
tipoo - Wednesday, May 22, 2013 - link
I think you are underestimating the "slight". The chips still are not trivially cheap to make.
jeffkibuule - Wednesday, May 22, 2013 - link
Even if the chips were free, you'd still need to stick them inside the console and provide cooling for them, making the system even bigger. It's a major pain and not worth the cost.
tipoo - Wednesday, May 22, 2013 - link
Indeed. I'm happy with the cost going elsewhere.
takeship - Wednesday, May 22, 2013 - link
Chalk it up to time delays. It sucks that there's no BC out of the box, but I'm hopeful with MS having a straight up hypervisor (VM) in the box, that we'll see it down the road.
Stuka87 - Wednesday, May 22, 2013 - link
There is no way the x86 CPU in the XBO is going to be able to emulate the PPC based chip in the XBox 360. So it is not going to happen.
Kevin G - Wednesday, May 22, 2013 - link
MS could have added the Xbox 360 SoC to the One to sidestep emulation entirely. It would have been nice to see a high-end model with this, even if it was a limited production run.
Mistake Not ... - Thursday, May 23, 2013 - link
*Cough* Rosetta *Cough*
BSMonitor - Wednesday, May 22, 2013 - link
Curious, door open for a "Steam-like" app for Xbox One?? Whereby the app handles the "driver" issues related to running legacy games, etc.. I think Anand mentioned this concept in one of the podcasts.. We have Windows OS...
BSMonitor - Wednesday, May 22, 2013 - link
running lol
Shinobisan - Wednesday, May 22, 2013 - link
This may actually HELP the PC market quite a bit.
Two things are stalling PC development:
1 No one is making an operating system that requires more power
Win 8 has the same basic system requirements as Vista, which is 6 years old.
At that age, compute power has doubled 3 times.
So a PC COULD be 2x2x2 = 8 times as powerful... but no one is pushing the boundaries.
Think about it.. some are still using Crysis to validate hardware!
(Crysis is as old as Vista... imagine that!)
2 Game developers design to the lowest common denominator (consoles)
While PC compute power has doubled 3 times, Consoles have been stagnant.
The XBox 360 is 8 years old. Again, about as old as Vista!
We need a shake-up. We need someone to stand up and make an operating system and software that uses what we have. A PC system that is 8 times as powerful as anything on the market currently demands. Give us that... and no one will be talking about the demise of the PC anymore.
xTRICKYxx - Wednesday, May 22, 2013 - link
I've heard the complaints about how stagnant visuals have been and I'm sick of it! Sure, graphics haven't advanced as much as we thought, but look how far we've come with animation and creating extremely fluid sprites on screen. I would easily take the faces and animations from Halo 4 (console game) over Metro: Last Light because of how well animated and human the characters look in Halo 4. The textures are so much more complex in Metro, but lack compelling animation of facial features. I believe with this new console generation we will see awesome visual increases across the board with more PC games on the way.
Shinobisan - Wednesday, May 22, 2013 - link
Games have been on an increasing visual detail trend, which I enjoy. But... think about the visuals in a basic PC, tablet or phone device.
They are not just stagnated... they are trending backwards.
(And that goes for Apple, Google, and Microsoft)
If anyone else remembers the days of the original windows where Microsoft battled it out with Amiga and Commodore, you remember that each company had their own GUI interface. They were all blocky and 8-bit. And "metro" in Win 8 reminds me of that era. Why do I have an interface that looks like it was designed in 1983? That's 30 years old!
When I start up my PC, I should be greeted with stunning visuals, real time updates of weather and news in novel graphic ways, and a file system that is fun and intuitive in a graphically artistic fashion.
Yes, I know the article was about consoles... forgive me... I'm rattling on about PCs. Carry on then.
bji - Wednesday, May 22, 2013 - link
And when I turn my PC on all I really want is a bash prompt. To each their own :)
BSMonitor - Wednesday, May 22, 2013 - link
PC is held back because of the complexity of designing games for an infinite combination of hardware platforms/OS.. Yeah, you can make a game with great visuals, but from a profitability standpoint there is no way to measure what % of your gamers can actually benefit. Could put a lot of $$ into a game that only a handful of people can enjoy. Simply risky on the PC side to spend a lot of R&D/Game development dollars. The console provides stability and predictability on the hardware side. Day 1, you know the install base is X million users for Xbox or PS3. And everyone has the same hardware.
Shadowmaster625 - Wednesday, May 22, 2013 - link
If it is so obvious then why isn't AMD doing it? Why would they instead opt for a PS4-style design for their own next-gen APU? Either way it is still an epic fail. They're either throwing their competitor a huge bone, or throwing themselves on the floor. Take your pick.
tipoo - Wednesday, May 22, 2013 - link
Or they are just managing costs, since Kinect 2 will be bundled.
WaltC - Wednesday, May 22, 2013 - link
Well, between the two, Sony has nailed the 3d-game end of the console business this go-around, imo. Intel, of course, has nothing powerful enough in the igp department to garner this business, so it's no wonder both companies selected essentially the same architecture from AMD. The CoD snippet run at the end of yesterday's demonstration, announced as running in real-time on an xBox one, was extremely telling I thought. First, it did not appear to me to be running @1080P--but possibly @ 480P: the on-screen imagery was definitely low-res, exhibited noticeable pixel aliasing (I was surprised to see it), and seemed to generate a good deal of pixellation that was very noticeable in the scene transitions. It also looked like nobody wanted to show off XB1 rendering in longer scenes where you could really see the frame-rate and get a solid feel for the performance of the game--the whole demo for CoD consisted of one rapid scene transition after another. The rendering problems I observed could all have been caused by a streaming bottleneck--or else by the limits of the hardware (I *hope* it was the immediate streaming because if not then I think Microsoft is going to have some problems with this design.) It was easy to see why the CoD real-time demo was saved for last and was so very brief...;) But, now that consoles are going x86, there's no earthly reason why either Microsoft or Sony could not update the hardware every couple of years or so when new tech hits the price/performance marks they require. Since we're talking x86, there would never be a question of backwards compatibility for their games as it would always be 100%. I think the days of 8-10 year frozen console designs are over. I think that's great news for console customers.
However, depending on whether Sony handles it correctly, the PS4 could walk away with practically everything as Microsoft is building in some fairly heavy DRM restrictions that involve the basic operation of the device--"storage in the cloud," etc. Involuntary storage, it would appear. If Sony comes out with a gaming console that is not only more capable in terms of the standard hardware, but one which is also customer-friendly in that it allows the customer to control his software environment--I think Sony will walk away with it. The people who will wind up buying the xb1 will be the people who aren't buying it as a game console. To be honest, though, set-top boxes are as common as dirt these days, etc. It should be very interesting to watch as this all shakes out...It's great, though--we've got some competition! (I'm not a console customer, but this is always fun to watch!)
hemmy - Wednesday, May 22, 2013 - link
Most of that first paragraph seems like rubbish to me. Sony made a game console, Microsoft made an all-in-one media device. It was well known before the announcement that Microsoft would be showing very little in the way of games yesterday, and they were saving that for E3. 360 games already render @ higher than 480p.
jamyryals - Wednesday, May 22, 2013 - link
So you are saying you saw artifacts in a demo through a live stream? Tell me you are joking... As for Sony/Microsoft upgrading console hardware during the current generation, I mean anything's possible, but they would be leaving a lot of customers behind on older hardware. Developers would have to make sacrifices in framerate or quality to achieve compatibility. This places a lot of demands on game developers for testing more environments. Additionally, there's nothing about x86 which makes this upgrade more achievable than on PowerPC architecture. They could have released upgraded consoles if they saw a benefit.
bji - Wednesday, May 22, 2013 - link
I like the 8 year or longer console cycle. It means that I can focus on enjoying games more than upgrading every couple of years to the latest and greatest that isn't really any more fun to play, just has more eye candy.
Hrel - Wednesday, May 22, 2013 - link
"We already have the Windows kernel running on phones, tablets, PCs and the Xbox, now we just need the Xbox OS across all platforms as well." That, 100 infinity BAGILLION times that! I'd actually like to see Nintendo release a console in time for X-mas 2014 with comparable hardware performance. Just because otherwise I don't see how that company will survive and I really don't want Nintendo to go away. I don't know if that's within their realm of possibility but they need to do something because the wiiU is pretty terrible.
tipoo - Wednesday, May 22, 2013 - link
But they won't; as laughable as Wii U sales are, that would still anger however many bought one, likely their core base. They'll survive anyways, see their cash reserves, plus any platform Mario and Zelda et al come to will be fine. Nintendo survives on first party, always has.
tipoo - Wednesday, May 22, 2013 - link
That said, yes, I would have preferred if they had just kept MotionPlus controls and the cost of the tablet controller instead went to an APU.
skatendo - Friday, May 24, 2013 - link
Seriously? Have you even tried the GamePad for an extended period of time? The thing is incredible and very useful. Also, the Wii U CPU/GPU is very customized and tailored for gaming. It's smart Nintendo didn't release tech specs because most everybody wouldn't understand how it would perform in real time. Custom silicon is a magnificent thing. Heck, look at the 4-core CPU and 12-core GPU of the Tegra 3 and pit that against a paltry-looking dual-core CPU/GPU Apple A5 and you wouldn't have any competition, right? (on paper at least) And who would have thought that the A5, with A FOURTH THE CORES and much slower clockspeed, delivered about twice the game performance of the "superior" Tegra 3.
marc1000 - Wednesday, May 22, 2013 - link
A great thing MS could do is find a way to put Windows Phone 8 inside the Nvidia Shield - and then have the option to stream your game from the Xbox One to the Shield. That would be awesome: family could be watching TV in the living room and you could have high-quality gaming anywhere - even if it would not be possible to play on the console AND Shield at the same time, of course.
Streaming games from X1 to Shield (full control scheme) or any WP8 phone/tablet (simpler games) would be that killer app that MS needs so badly to boost its phone business.
nforce4max - Wednesday, May 22, 2013 - link
Interesting read, and I'm excited that GCN is being used, but on the CPU side of things I have to wonder how it will actually perform. Bobcat was fairly weak (think of a Pentium 4) and was terribly starved of memory bandwidth, but the worst part was that the L2 cache only worked at half the core clock. If the Jaguar cores still have the same sluggish L2 cache then even 8 of them are going to be painful, but I suppose devs are going to offload what they can onto the GPU (OpenCL). As for the 32MB of on-die memory, as stated in the article it all comes down to how it is used. If used for the frame buffer it will limit resolution but make for a huge fps boost, as the ROPs and TMUs are bandwidth hogs GPU-wise, but leave the rest for the CPU and shaders. The CPU being weak as it is won't need much provided the GPU doesn't hog too much of the bandwidth. If used as a cache it will make up for the weak L2 cache and provide a unified pool for all 8 cores; if software-only then we might have to wait to find out what for.
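To give a feel for how tight 32 MB is as a frame-buffer pool, here is a rough back-of-the-envelope sketch in Python. It assumes one 32-bit color target plus one 32-bit depth/stencil target per frame; MSAA, HDR formats, or extra G-buffer targets would grow these numbers quickly.

```python
# Rough render-target footprint vs. 32 MB of eSRAM.
# Assumes 4 bytes/pixel color + 4 bytes/pixel depth/stencil (an assumption).
ESRAM_MB = 32

def render_target_mb(width, height, bytes_per_pixel=4, targets=2):
    # targets=2 -> one color buffer + one depth buffer
    return width * height * bytes_per_pixel * targets / (1024 ** 2)

for w, h in [(1280, 720), (1920, 1080), (3840, 2160)]:
    mb = render_target_mb(w, h)
    fits = "fits" if mb <= ESRAM_MB else "does not fit"
    print(f"{w}x{h}: ~{mb:.1f} MB ({fits} in eSRAM)")
```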
Overall this is good news for the PC, no more games like GTA4 :D
Arsynic - Wednesday, May 22, 2013 - link
Listening to Sony PR and Sony fanboys you'd think PS4 had every advantage under the sun and will retail for $400.
MooseMuffin - Wednesday, May 22, 2013 - link
Nice article Anand. Rare to get level-headed analysis at the start of a console cycle.
highend - Wednesday, May 22, 2013 - link
Amiga CDTV in 1990 looked better than XBOX One, see for yourself: www.amigahistory.co.uk/cdtv.jpg Also, M$ copied the look & design from Commodore.
Stuka87 - Wednesday, May 22, 2013 - link
I notice Anand says the XBox is 28nm. But everything I have been seeing says the XBox is 40nm, while the PS4 is 28nm.
tipoo - Wednesday, May 22, 2013 - link
28 was confirmed. See the Engadget tech talk with Microsoft.
tipoo - Wednesday, May 22, 2013 - link
5 billion transistors on 40nm would be something...
dragonsqrrl - Wednesday, May 22, 2013 - link
Anand, on the 'Memory Subsystem' page, when you say..."less area efficient but lower latency and doesn't need refreshing"
Are you referring to eDRAM or eSRAM? I got a little confused there.
tipoo - Wednesday, May 22, 2013 - link
eSRAM. eDRAM needs refreshing.
tipoo - Wednesday, May 22, 2013 - link
eDRAM also takes a third the size.
jabber - Wednesday, May 22, 2013 - link
"The funny thing about game consoles is that it's usually the lowest common denominator that determines the bulk of the experience across all platforms."
This is the key point that Microsoft have realised. I bet the developers too told both MS and Sony not to bother going crazy as they will develop to the minimum standard.
This is not the age of the games console anymore. It's the age of the Media/Entertainment center.
When I got my 360 back in 2006 it was mainly used for games; now, seven years later, after a whole lot of bandwidth upgrades and a media explosion, I use it mainly for...well...media. Gaming takes very much a backseat on my 360. It's a very convenient media portal that happens to play games as well.
The world has moved on and I can imagine that gamers might feel left behind, but there is a whole load more out there to occupy people's time than there was in 2005. Microsoft has to capture that.
bengildenstein - Wednesday, May 22, 2013 - link
Interestingly, this seems to be a core strategy for the PS4 as well (as it was for the PS3), as mentioned in their unveil presentation.
sfrocks - Wednesday, May 22, 2013 - link
Anand, in terms of the product positioning, I agree with your assessment, but I also think Microsoft would be better off by creating a disruptive (rather than sustaining) product. It would be even better if they launched one in parallel with the Xbox One. It will surely cannibalize the sales, but that's the price for solving the innovator's dilemma. Moreover, it's not Sony or Nintendo that MSFT should be very afraid of, but rather Apple and Google. Apple will surely enter the market from the low end of the value chain. More details here -- https://www.facebook.com/notes/itvale-the-blog/xbo... , would love to hear your thoughts.
cjs150 - Wednesday, May 22, 2013 - link
Looking at the screenshot of the TV part of the software: at least we now know where the Windows Media Center team went. I am confused: is this an HTPC with a gaming facility attached or a games system with HTPC capabilities attached?
jeffkibuule - Wednesday, May 22, 2013 - link
Both were built to work alongside each other. If it were the former, you'd expect to run Office on it (an HTPC is still a PC), and if it were the latter you'd need to quit a game before running any media apps (since the game would demand to use all system RAM available). As such, it really is neither of the scenarios you presented.
Littleluk - Wednesday, May 22, 2013 - link
I think most of the hardware choices are designed with the display device in mind. There is a huge market of 1080p devices and the price point for those is well established. Most households now have one or will have one. Locking in hardware and performance at a set resolution is good for console costs and game developers. (It does worry me a bit whether advances in PC display technology will equate to higher graphics displays in PC games if the rest of the market is set at 1080p... why develop higher res models etc.) Sony going for a bit higher graphics performance could be an advantage someday if display technology changes to utilize the headroom, but Microsoft has solid hardware for their target resolution. The hypervisor approach is particularly interesting to me as it might be a window into the future of where MS may take OS development. Virtual machines optimized for particular tasks can give you faster spreadsheets and higher game fps on the same box by selecting which OS module is running. Is there a plan somewhere to put Office 365 on the Xbox One? Microsoft would like nothing better than to be selling software suites that use MS cloud services across multiple platforms to each and every one of us.
tipoo - Wednesday, May 22, 2013 - link
More power can be advantageous at the same resolution... My Radeon X1650 could run games in 1920x1080; that doesn't mean it can do everything a GTX 680 can at the same res.
bengildenstein - Wednesday, May 22, 2013 - link
A visual comparison between the games demonstrated in the PS4 and XB1 presentations clearly gives the graphical edge to the PS4. PS4 games look distinctly next-gen, and are approaching CGI in fidelity. These are early days, and this comparison is hardly scientific, but it seems to corroborate the stronger, easier-to-develop-for hardware in the PS4. But I think the underestimated feature for the PS4 is the 'share' button on the controller. Game spectating is a big deal, and this gives the PS4 a fan-based advertising engine. Due to the simplicity of sharing video, expect a flood of high-quality PS4 videos to be uploaded to the web, making the PS4 and PS4 games much more visible online. This turns regular players into advertisers for the system which should significantly help its popularity with cool 'look what I did' videos, walkthroughs, and competitions.
I am also very interested to see how Sony uses the second, low-power, always-on processor in the PS4. Certainly it would be possible to include voice-commands ala XB1, but I think that this can open up interesting new uses to keep the system competitive over the coming years.
Flunk - Wednesday, May 22, 2013 - link
I think you're reading too much into presentations that could very well have been 100% pre-processed CGI. I expect that the final games will look quite similar on both.
senecarr - Wednesday, May 22, 2013 - link
You might want to check the Xbox One presentation - one of the things they mention is that gameplay sharing is easy for developers to include because of all the connections to Azure cloud computing. So that just leaves a share button, but that is actually horrible compared to Kinect, which will be on every Xbox One. Instead of hitting a button in the middle of your controller and losing your momentum in the game, on Xbox One you should just be able to yell, "Xbox, start recording game play."
BPB - Wednesday, May 22, 2013 - link
I think the PS4 will share to more people. I expect the Xbox One sharing to be either Xbox Live only or the MS universe only. I think Sony's sharing won't be as limited. At least that is the impression I got from the presentations.
blacks329 - Thursday, May 23, 2013 - link
PS Eye (Sony's Kinect) will be included with every console as well; this was announced back in February.
jabber - Wednesday, May 22, 2013 - link
If these new boxes are more media than gaming orientated going forward it could mean far shorter life-cycles for them. We could be going to a 3-4 year cycle rather than the current 8 year trend.
HisDivineOrder - Wednesday, May 22, 2013 - link
"The day Microsoft treats Xbox as a platform and not a console is the day that Apple and Google have a much more formidable competitor."
I'd say the reverse. The day that Apple and Google decide to become competitors to Xbox is the day that Xbox (and Playstation) go extinct. Right now, MS and Sony are getting by because the HDTV efforts by Apple and Google are "experiments" and not taken seriously. Imagine an AppleTV where Apple allows app installations and a GoogleTV that's focused on gaming with decent hardware.
And imagine how little that GoogleTV (for Games) would cost. Imagine it opens up Android and, just like that, bajillions of apps descend upon it.
Hell, it's debatable if they even need to bother making more than a streaming device to receive the image from your tablet and/or smartphone to do just that. Really, all Google needs is an AppleTV-like Airplay connection. You can already plug in whatever USB/bluetooth controller you like.
Within a few generations of Google taking HDTV gaming seriously, they could walk all over Sony and MS because while consoles sit and languish for longer and longer periods of time, tablets are constantly evolving year after year, iterating upward in specs at an impressive rate.
How long before even the Xbox One isn't pushing out graphics far enough ahead of a Nexus tablet that people just go with the $100-$200 tablet with the free to $1 games instead?
BSMonitor - Wednesday, May 22, 2013 - link
Nexus tablet doesn't have CoD for free.
elitewolverine - Thursday, May 23, 2013 - link
no one will make a $1 game with the visuals of CoD, BF2, halo, the list goes on. They would make 0 money. google taking hdtv gaming seriously? They make all their money on ads, you honestly think people constantly want ads in a video game? And not product placement...ads. Before you matchmake, just watch this 30sec video about vagisil...yea right...
Also, what is a few generations? A few is more than 2, 3 generations ago we were at the ps1. 14yrs ago.
You're telling me that it's going to take 19yrs for a tablet to have today's xbox1 graphics? By that time what the hell will the ps5 have, or the x5....
The biggest thing the x1 has going for it, that everyone is forgetting...cloud/azure.
This is huge, so huge that time will show just how little the x1 will need to compute locally in multiplayer games
Majeed Belle - Sunday, September 8, 2013 - link
I think you are putting way too much stake in the cloud, especially when we are talking about computing anything, graphics or otherwise. People can barely download music on a steady connection right now. Consoles can't even get you solid updates in a timely manner and you are talking about offloading real work over the internet? OK.
Mathos - Wednesday, May 22, 2013 - link
After reading a lot of articles about these two consoles and their SoCs, there are some things we can extrapolate from this info. Both systems are based on the same 8-core x86 amd64 CPU, which means the main logic and memory controllers in the APUs are exactly the same. The comment about the PS4 being married to GDDR5 may not be true, as we all know that the GPUs can also run on DDR3, plus it may be possible that the CPU memory controller is also capable of running GDDR5 or DDR3 in either system.
Both systems are using a 256-bit memory bus. Since these are x86 AMD CPUs, that likely points to Jaguar using a quad-channel memory controller (64+64+64+64 = 256), which could be good news when they hit the desktop, if they retain said quad-channel controller. It would also be nice to see that in AMD's mainstream chips as well.
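As a quick sanity check on what that 256-bit bus buys each console, here is a short Python sketch. The 2133 MT/s (DDR3) and 5500 MT/s (GDDR5) data rates are the widely reported figures and are taken here as assumptions.

```python
# Peak theoretical bandwidth = bus width (bytes) x data rate (transfers/s).
def peak_bandwidth_gbs(bus_bits, megatransfers_per_s):
    return (bus_bits / 8) * megatransfers_per_s * 1e6 / 1e9

print(f"Xbox One, DDR3-2133 on 256-bit: {peak_bandwidth_gbs(256, 2133):.1f} GB/s")  # ~68.3
print(f"PS4, GDDR5-5500 on 256-bit:     {peak_bandwidth_gbs(256, 5500):.1f} GB/s")  # ~176.0
```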
tipoo - Wednesday, May 22, 2013 - link
GDDR5 is on the PS4 official spec sheet.
Kevin G - Wednesday, May 22, 2013 - link
Going with eSRAM is an odd choice. I would have thought capacity would have been more important than absolute latency. By merit of being on-die, using eDRAM would have lower latency than external DDR3. If they had chosen eDRAM, they could have had 128 MB on die. That is enough for three 32-bit, 4K resolution buffers. In such a case, I'd have that 128 MB of eDRAM directly accessible and not a cache. Sure, software would need to be aware of the two different memory pools for optimizations, but most of that would be handled by API calls (i.e. a DirectX function call would set up a frame buffer in the eDRAM for the programmer). The bandwidth figures for the eSRAM seem to be a bit on the low side too. The Xbox 360 had 256 GB/s of bandwidth between the ROPs and eDRAM. With high clock speeds and a wider bus, I would have thought the Xbox One had ~820 GB/s bandwidth there.
I'm also puzzled by MS using DDR3 for main memory. While lower power than GDDR5, for a console plugged into a wall, the bandwidth benefits would outweigh the power savings in my judgement. There is also another option: DDR4. Going for a 3.2 GHz effective clock on DDR4 should be feasible as long as MS could get a manufacturer to start producing those chips this year. (DDR4 is ready for manufacture now but they're holding off volume production until a CPU with an on-die DDR4 memory controller becomes available.) With 3.2 GHz DDR4, bandwidth would move to 102.4 GB/s. Still less than what the PS4 has but not drastically so. At the end of the Xbox One's life span, I'd see DDR4 being cheaper than acquiring DDR3.
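Rough arithmetic behind two of the figures in this comment - the size of a 32-bit 4K buffer and the peak bandwidth of hypothetical DDR4-3200 on the same 256-bit bus - sketched in Python as a sanity check only:

```python
# One 32-bit (4 bytes/pixel) 4K frame buffer, and three of them,
# versus the hypothetical 128 MB eDRAM pool suggested above.
buffer_mb = 3840 * 2160 * 4 / (1024 ** 2)
print(f"One 4K buffer: ~{buffer_mb:.1f} MB, three: ~{3 * buffer_mb:.1f} MB")  # ~31.6 / ~94.9

# Peak bandwidth of DDR4-3200 on a 256-bit bus (assumed configuration).
ddr4_gbs = (256 / 8) * 3200e6 / 1e9
print(f"DDR4-3200, 256-bit: ~{ddr4_gbs:.1f} GB/s peak")  # ~102.4
```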
As far as the XBox One's AV capabilities, I'd personally have released two consoles. One with the basic HDMI switching and another with Cable card + tuner + DVR. And for good measure, the model with Cable card + tuner + DVR would also have an Xbox 360 SoC to provide backwards compatibility and run the DVR software while the x86 CPU's handle gaming and the basic apps. If MS is going to go in this direction, might as well go all the way.
Good to see 802.11n and dual band support. With HDMI in and out, I'd have also included HDMI+Ethernet support there as well. Basically the Xbox One would have a nice embedded router between the Gigabit Ethernet port, the two HDMI ports and the 802.11n wireless.
jabber - Wednesday, May 22, 2013 - link
Remember though that the DDR3 in the Xbox will be hardwired directly with no legacy or other PC-related stuff getting in the way. This will be optimised DDR3 and not working exactly how it's standardised in our PCs.
Kevin G - Wednesday, May 22, 2013 - link
The only advantage DDR3 in the Xbox One has over a PC is that it is soldered. This allows for marginally better signaling without the edge connector of a DIMM.
kamil - Wednesday, May 22, 2013 - link
That was surprisingly fair, considering a lot of what I've seen since yesterday. Sony tried hard to do what it thought would "improve the gaming experience" and ended up with a lot of social integration and considerably more aggressive hardware. Microsoft didn't really add much to actually playing games (though they do have some cloud-based stuff including game streaming) but has made a play for becoming a full living room experience, with games, live and recorded television, no-hassle cable integration, and seemingly several exclusive partnerships. I'm not convinced that core gamers will see much use for those options (though most of the people I know in that group were already PC-focused if not exclusive) or the social things with the PS4, but the raw power would be a nice draw, assuming Sony doesn't accidentally pull a 360 and overheat with any noteworthy extended use. Of course, if the rumors of Microsoft pushing developers toward always-online DRM, including on-console mandatory check-in every 24 hours, fees for pre-owned or shared games, forced hard drive installs, etc. all pan out, a lot of people are going to boycott on principle even if they don't buy used games and have great internet.
blacks329 - Thursday, May 23, 2013 - link
I fall into that category of not buying used games and having decent internet (but capped - damn you Canadian duopoly!!), but I definitely won't be picking up the X1 if this holds (at least early on). Additionally, I hate paying for XBL and have no intention of doing it going forward; hopefully Sony doesn't follow this route and maintains PS+ as value added and not a requirement for playing games online.
bplewis24 - Wednesday, May 22, 2013 - link
"the Xbox One is MORE about consuming media THAN it is about playing games."FTFY
twotwotwo - Wednesday, May 22, 2013 - link
I hope it's just weak marketing, but what worries me is that the non-gaming extras don't sound all that new or interesting. A compelling, if unlikely, possibility would be to sell an upgrade that sticks Pro-compatible Windows 8 on the non-Xbox side. MS is already selling a portable computer; why not sell a desktop/HTPC, too?
SymphonyX7 - Wednesday, May 22, 2013 - link
If this was Top Gear, Jeremy Clarkson would be busy saying unpleasant things about the Xbox One right now. "MORE SPEED and POWER!!!"
Oddly enough, I dreamt the night before the Xbox One launch that the new Xbox had 16 GB of DDR4 RAM and a shader count equivalent to Tahiti @ 1 GHz. I hoped that Microsoft, with their virtually bottomless pockets, could somehow improve on the leaked Durango and Orbis specs, which didn't bode too well for Durango. I mean, Sony doubled the RAM from 4 to 8 GB. And it's GDDR5 to boot!
Sigh. One can only dream -- no pun intended.
nafhan - Wednesday, May 22, 2013 - link
Don't think anyone has mentioned this yet, BUT another reason why MS can have slightly lower specs than the PS4: XBO and PS4 are close enough that cross-platform games are likely going to target the weaker platform. Sony will be spending money on more powerful hardware that will only be utilized in first party/exclusive games. Side by side comparisons will (in most cases) show minimal - if any - differences.
inherendo - Wednesday, May 22, 2013 - link
Can someone explain to me what the Blu-ray DSP is? Curious as to what Anand meant but googling shows no info.
Kiste - Wednesday, May 22, 2013 - link
That always-on Kinect thing is really creeping the hell out of me. This THING is always watching. Always. Unblinking. And it can see you in the dark. It can hear you. It can even measure your heart rate just by looking at you! It has a huge HAL9000 eye staring at you.
It even LOOKS evil.
I really don't want this thing... looking at me.
tipoo - Wednesday, May 22, 2013 - link
It's silicon. It's anonymized. I don't get what everyone has a problem with. It's not like this is letting the evil gub'ment spy on you.
scottwilkins - Wednesday, May 22, 2013 - link
One thing I think that the article failed to point out was that even though there is a 33% difference in hardware on the GPU, there is NOT a 33% difference in performance. As the GPU grows in parts, the gains go up logarithmically, not linearly like this article suggests. I'd say there is likely less than a 10% performance gain on the PS4 part. Yet it will use almost twice the power and be much hotter, requiring louder cooling. Yuck! Add to that the performance of the Windows kernel for processing, something Sony could never be able to match, and I'd bet that they are probably almost equal in the end.
scottwilkins - Wednesday, May 22, 2013 - link
I believe I should have said "reverse logarithmically", meaning more parts equal less gain. I'm sure you guys get the point. Sony is betting on tech specs and market power rather than real processing power.
tipoo - Wednesday, May 22, 2013 - link
In the absence of another bottleneck like memory bandwidth, within GPUs adding shader cores is actually a pretty good indicator of performance. And it's not just 12 vs 18 CUs, the PS4 also has double the ROPs and 172GB/s for its entire pool of memory.
elitewolverine - Thursday, May 23, 2013 - link
With the eSRAM, 172 means nothing... you have every process fighting for attention on one higher-latency pool of bandwidth.
There is a reason PCs are still using DDR3, and it's not just because of memory controllers; GDDR5 has been in use for many many years.
Heck, with AMD producing the memory controllers for their own video cards, one could simply conclude: just dump a GDDR5 controller on the APU and have them go to GDDR5 on the desktop.
But cost is not the only factor; bandwidth only goes so far.
The X1 will be saving costs and reaping the benefits with eSRAM; Sony's bandwidth advantage starts to go out the window.
Put in cloud computing, and it becomes even more moot to have GDDR5
UNCjigga - Wednesday, May 22, 2013 - link
Given how similar the X1 and PS4 are, any chance developers would be able to ship a single disc with support for both platforms? Not that they would...just wondering if it's technically possible to have shared assets/textures etc. and separate binaries on a single disc that could be read by both machines.
tipoo - Wednesday, May 22, 2013 - link
I'm sure there are security descriptors for each console that would stop that.
juhatus - Thursday, May 23, 2013 - link
Double layer blu-ray disc? One layer for X1 and the other for PS4.. mmhhh
rangerdavid - Wednesday, May 22, 2013 - link
Assuming that white Xbox logo glows on both the box and the Kinect, I can see some black electrical tape coming in very handy....
marc1000 - Wednesday, May 22, 2013 - link
lol, shut its eyes closed! I bet the logo on Kinect has some kind of IR emitter below it!
trip1ex - Wednesday, May 22, 2013 - link
Can I turn it into a Windows computer? That would be a selling point.
Can I turn it into a DVR? Selling point.
I would question not the choices here, but the underlying principle of wanting to check stats etc on your tv in a side bar using Kinect, game controller or Smartglass instead of just using your phone or tablet directly.
I would question the attraction of this to someone not interested in games. GTV hasn't made a case for a box-on-top-of-a-box TV product. And it is hundreds cheaper than the nextbox will be. It doesn't seem like this market will open up until, at the least, you don't need that second box.
I suppose though that this stuff is a value add to convince mom or dad to buy it or someone on the fence with their gaming interest to buy it. Or someone only interested in one or two franchises.
blacks329 - Thursday, May 23, 2013 - link
I think the scenario arises for having ESPN or a news site pinned to the side when you have multiple people in the room watching. Whereas a tablet would provide the same info, it's just a personal experience. But say you have the guys over for a playoff game while another important game is going on at the same time. Instead of having everyone individually looking down at their phones/tablets or switching channels, you can have one game full time and the other with its box score pinned to the side, so everyone can see everything without having to look away from the screen. Or have a news or Twitter feed going on the side, which depending on the circumstances could be really interesting. The example they showed with buying tickets for a movie, while watching a movie, was such a stupid example; all of that is a personal experience and can be done on a phone or tablet anyways, especially since everything on the TV had to be manipulated by a phone to begin with.
I honestly find this really compelling and potentially awesome, but all the gamer (or anti-gamer) things they've mentioned so far as well as the XBL gold still being required for playing online are really dissuading me from thinking about getting one any time soon.
elitewolverine - Thursday, May 23, 2013 - link
300,000 servers are not free... Live this year will slowly start to show why people are paying
ncsaephanh - Wednesday, May 22, 2013 - link
In the introduction: "This last round was much longer that it ever should have been, so the Xbox One arrives to a very welcoming crowd." Change "that" to "than".
Krysto - Wednesday, May 22, 2013 - link
Seems like Sony has a winner here in terms of hardware and developer support, too. Any idea if PS4 does indeed use the full OpenGL 4.3 as rumored? And is it based on some custom Linux OS?
powerarmour - Wednesday, May 22, 2013 - link
PS4 will likely run some form of libGCM similar to the PS3, this is another area the PS4 may easily have the upper hand in, software overheads...
Death666Angel - Wednesday, May 22, 2013 - link
All pretty much expected on the hardware side of things. And I like it as well, as a non-console gamer (I bought an Xbox 360 for the Kinect to introduce my wife and larger family to gaming... it's mostly left unused, since our living room is on the small side and we need to rearrange a lot to play). The current high end PC should be able to run all the games developed for the consoles very well, which is good. There is less chance of getting rubbish ports since all 3 consoles are so similar. All positives for me as a PC gamer. For the software/entertainment side, that doesn't interest me in the least. The Xbox could never be my media hub because I have an A/V receiver for that, as I expect many people do. And all the software stuff, streaming etc. I have my doubts about everything trickling down to the German market as they envision it. Plus, I'm not a TV guy or a rent guy. I like to own my stuff as much as possible.
Silma - Wednesday, May 22, 2013 - link
It seems to me that the Microsoft approach is superior to the Sony one if One + Kinect is priced competitively or at not too big a premium compared with the PS4. Sony has a history of hacker-friendly internet platforms and it is often in denial. Would I trust the Playstation Network? No way.
If consoles become the center of the TV/entertainment center, if people will really use the console as a Skype terminal or, as today, use the Kinect to exercise, then I think the Microsoft offering is the more attractive one and its features will interest buyers more than knowing that one console has more shader units than the other.
In addition, it would not be difficult for Microsoft to extend the Platform with byod features such as Skydrive access, Office 365 and so forth.
jonjonjonj - Wednesday, May 22, 2013 - link
this annoys the hell out of me. who cares about power. i want performance. i want my console or pc so power hungry it trips a breaker. its obvious everyone wants a kinect instead of a better gpu. right? tv on xbox is useless. so i dont have to hit the input button on my remote? seriously? its not like you get the service from ms. you have to pay for xbox live gold and be a paying cable customer. it offers absolutely nothing. i have 2 360's but i have no plan on buying this garbage cable box, i already have a dvr. ill take that $500+ and buy a 27" 1440p monitor and another video card for sli. ms already proved to me over the past 3 years they don't care about the core gamers anymore and this just reaffirms it. i hope this console fails miserably.
elitewolverine - Thursday, May 23, 2013 - link
because everyone wants a ps4 eyetoy in their ps4 right? instead of a better gpu right? Oh wait...ps4 did the same thing.
Everyone wants a camera with their laptop instead of a better cpu/gpu right? Oh wait...
In my household finding the remote is an issue. Not because we forget, but with 8 people in a home, each one grabbing at it at different times, this becomes an issue.
You have a dvr? great, the xbox is not designed to replace it; in fact it's already been stated it works alongside...any cable box.
Why do people pay for TiVo? there is a service to be had, and it's much better imo than comcast's dvr or the offering by Direct. I was skeptical about TiVo till my damn ex got me hooked on it.
Not to mention how many times I have to tell my parents the channels of anything other than local news. Now they just have to say...hgtv, animal, history. They don't live with me, but they will be getting one. This is also a plus for me, as my kids don't have to worry about memorizing the channels, even though they do currently.
Your mentality is what the losers of the tablet wars are facing...
Why would people want a tablet? I have a laptop, pc, tv, ipod...the list goes on...
Convenience and ease. This is what x1 is offering, along with cloud computing, dedicated servers, and games games games games
jonjonjonj - Tuesday, June 4, 2013 - link
really? sony hasn't announced the pseye is going to be bundled with all ps4's. in fact there are links to sony videos that say "Playstation4 camera may be required and is sold separately". even if it is, the ps4 has a 30% faster gpu than the xbox, so ms should have spent that kinect money on making the gpu more competitive. the 360 already showed kinect was a failure for gaming. unless you are a girl who wants to play dance central there is no reason to get a kinect. this is why bundling the kinect boggles my mind. it shows me ms either doesn't get it or doesn't care if the xbox is a gaming console. the best defense of xbox tv you can come up with is you constantly lose your remote? my remote is always in the room with the tv. not very hard to find.
i know it's not going to replace my dvr. that doesn't change the fact that i see zero benefit to using xbox and i still have to pay for my dvr. i forgot, you can use voice commands and wave your hands around like an idiot. no thanks, i'll pass and stick to my remote.
"games games games games"? i must have missed that part of the conference. when they announce their exclusives at e3 expect them to mostly be kinect and XBLA games not the non kinect AAA games everyone wants. just like they have over the last 3 years.
i have an ipad and never use it. the only thing it's good for is quickly checking email or watching a youtube video. beyond that my laptop or desktop is 10x faster and easier to use. people are buying tablets because it's the cool thing to do and everyone already has multiple pc's and laptops. once everyone has a tablet sales will slow down.
BPB - Wednesday, May 22, 2013 - link
It appears MS has dumped the WMC extender abilities. It's not a surprise, but I don't give two hoots about the DVR overlay. With my HTPC and extenders I save a lot of money not having a DVR. MS wants me to go back to a DVR, and I won't. Given all else it looks like PS4 for me with continued 360 gaming (which MS said will also get updated in some sort of way, won't know till E3 what that means). But pricing and other factors will be part of that equation too.
elitewolverine - Thursday, May 23, 2013 - link
i'm surprised you hooked up an htpc, considering you have a hard time understanding that it works alongside your cable/dvr via infrared. Meaning when you tell it to change the channel it will do so, at your cable box/dvr/htpc. I can understand how something so simple can be confusing...
BPB - Monday, May 27, 2013 - link
Wow, can't wait till I'm all growed up like you! Then I too can be a jackass. I watched the MS presentation live, and have followed many of the MS follow-up discussions about Xbox One. They've mentioned working with cable companies like Comcast for TV overlay, but oddly enough they haven't mentioned working with HTPCs running their own Windows software. Then again maybe they too have a hard time figuring all this out. Please post your phone number or email so MS can contact you and you can set them straight.
Rogatti - Wednesday, May 22, 2013 - link
If Sony allows a Linux OS... PS4, GO! Next generation of EyeToy... E3 2013?
blacks329 - Thursday, May 23, 2013 - link
They've already shown the next gen PS Eye and it was announced back in February that it would be shipping with every PS4 (just like Kinect)... Sony needs better marketers lol.
Majeed Belle - Sunday, September 8, 2013 - link
All of you who are trying to use the "Sony will ship with the PSeye" thing are missing a very big point. The PSeye IS NOT required for ANY functionality. That's a big difference there, big man.
I personally don't give a damn about the eye or kinect. I didn't use either of them last gen and I won't use either now. If you are going to try and make a point you should try to state ALL the facts. Not just the ones that you think will validate your argument.
rasatouche - Thursday, May 23, 2013 - link
Memory bandwidth is going to be a huge issue, no? I mean, we've all seen the benchmarks when you get an AMD APU-based x86 PC, and then you change the speed of the RAM and you see a 30-40% difference in FPS, on a PC with all its glorious overhead. In games GPUs are the deciding factor, and probably still will be for some time, not to mention the PS4 will likely have a less intensive RAM overhead for the OS to boot. What this is going to mean this gen, IMO, is that Sony's first party titles will probably look better, and third party games, the 'ports' if you will, will be the reverse of last generation. I remember one of the main reasons I got a 360 was that it was the better console for 3rd party titles; they ran better, with less texture popping & FPS dips / tearing, than on the PS3 at the time. It will likely be the reverse this generation, seeing as games should be easier to get running optimally on the PS4, simply because it has more GPU. 50% more shaders in the GPU is nothing to sneeze at; to put it in PC performance terms, it's probably about the relative difference between a 660 Ti & a 680.
elitewolverine - Thursday, May 23, 2013 - link
you might want to re-understand how the internals will work. GDDR5 is great at bandwidth, something a video card needs; because it's sharing large amounts of predetermined data, latency is not a real issue. DDR3 is used in PCs because it's cheaper, but also because it has lower latency in general.
You offload one of the main things a GPU does, framebuffering, and GDDR5 becomes a highly priced memory module.
Don't forget, the x1 is carrying a set of 2133 DDR3; what you are talking about is people going from 1333 DDR3 to 1800 or 2133. Good thing the x1 already has 2133
mayankleoboy1 - Thursday, May 23, 2013 - link
Could it be that 2 years down the line, MS or Sony could overclock the CPU part to something like 1.8GHz through a firmware update?
tipoo - Thursday, May 23, 2013 - link
Why would they not launch at that speed then, since every single shipped PS4 would have to be validated for the higher speed first?
WoodyPWX - Thursday, May 23, 2013 - link
Sony is the winner here. The architecture is the same, so developers can easily tune some numbers (higher quality shaders, more particles, higher FPS, higher resolution etc) to get noticeably better results on the PS4. I would prefer PS4 as my next gaming device. Hopefully Sony will not screw up developer tools.
kensa99 - Thursday, May 23, 2013 - link
I prefer the Xbox One too! I've always liked playing Xbox 360 games and have even read many articles about them from the Aneesoft Xbox 360 column. And now I will choose the Xbox as my favorite game console.
alex@1234 - Thursday, May 23, 2013 - link
The GPU is the most important factor in choosing a console, and the PS4 holds the advantage here. Unless they change the Xbox One's GPU to something similar to the PS4's, I will not opt for it. Other than that, the integration of TV and Internet is not necessary for most gamers. Xbox should still change the GPU, otherwise it will lose.
elitewolverine - Thursday, May 23, 2013 - link
It's the same GPU at heart; sure, the shader count is lower because of the eSRAM. You might want to rethink how the internals work. The advantage will be very minimal.
alex@1234 - Friday, May 24, 2013 - link
Everywhere it's mentioned there's 32% higher GPU power, and I don't think a GTX 660 Ti and a GTX 680 are equal. For sure the PS4 holds the advantage: fewer shaders and lower specs in everything compared to the PS4, DDR3 in the Xbox One vs GDDR5 in the PS4. As for the eSRAM, I will tell you something: you can have an SSD and 32 GB of RAM, but it cannot make up for a better GPU.
cjb110 - Thursday, May 23, 2013 - link
In some ways this is the opposite of the previous generation. The 360 screamed games (at least its original dashboard), whereas the PS3 had all the potential media support (though the XMB interface let it down) as well as being an excellent Blu-ray player (which is the whole reason I got mine). This time around MS has gone all-out entertainment that can also do games, whereas Sony seems to have gone games first. I'm imagining that physically the PS4 is more flashy too, like the PS3 and 360 were... game devices, not family entertainment boxes.
Personally I'm keeping the 360 for my games library, and the One will likely replace the PS3.
Tuvok86 - Thursday, May 23, 2013 - link
Xbox One ~ HD 7770 GHz Edition
PS4 ~ HD 7850
jnemesh - Thursday, May 23, 2013 - link
One of my biggest concerns with the new system is the Kinect requirement. I have my Xbox and other electronics in a rack in the closet. I would need to extend the USB 3.0 (and I am assuming this time around, the Kinect is using a standard USB connector on all models) over 40 feet to get the wire from my closet to the location beneath or above my wall-mounted TV. With the existing Kinect for the 360, I never bothered with it, but you COULD buy a fairly expensive USB-over-Cat5 extender (Gefen makes one of the more reliable models, but it's $499!). I know of no such adapter for USB 3.0, and since Kinect HAS to be used for the console to operate, this means I won't be buying an Xbox One! Does anyone know of a product that will extend USB 3.0 over a Cat5 or Cat6 cable? Or any solution?
epobirs - Saturday, May 25, 2013 - link
There are USB 3.0 over fiber solutions available but frankly, I doubt anyone at MS is losing sleep over those few homes with such odd arrangements.
Panzerknacker - Thursday, May 23, 2013 - link
Is it just me or are these new-gen consoles seriously lacking in CPU performance? According to the benchmarks of the A4-5000, of which you could say the consoles have two, the CPU power is not even going to come close to any i5 or maybe even i3 chip.
Considering the fact they are running the x86 platform this time, which is probably not the most efficient for running games (probably the reason why consoles in the past never used x86), and the fact that they run lots of secondary applications next to the game (which leaves maybe 6 of 8 cores for the game on average), I think CPU performance is seriously lacking. CPU-intensive games will be a no-no on this next gen of consoles.
Th-z - Saturday, May 25, 2013 - link
The first Xbox used an x86 CPU. Cost was the main reason not many consoles used x86 CPUs in the past: unlike IBM Power and ARM, x86 isn't licensed out to whatever company wants to make its own CPU. But this time they probably see that the benefit has outweighed the cost (or the cost is even lower) with an x86 APU design from AMD - good performance per dollar and per watt for both CPU and GPU. I am not sure if Power today can reach this kind of performance per dollar/per watt for a CPU, or whether ARM has the CPU performance to run high-end games. Also bear in mind that consoles use fewer CPU cycles to run games than a PC.
hfm - Thursday, May 23, 2013 - link
"Differences in the memory subsytems also gives us some insight into each approach to the next-gen consoles. Microsoft opted for embedded SRAM + DDR3, while Sony went for a very fast GDDR5 memory interface. Sony’s approach (especially when combined with a beefier GPU) is exactly what you’d build if you wanted to give game developers the fastest hardware. Microsoft’s approach on the other hand looks a little more broad. The Xbox One still gives game developers a significant performance boost over the previous generation, but also attempts to widen the audience for the console."I don't quite understand how their choice of memory is going to "widen the audience for the console". Unless it's going to cause the XBox One to truly be cheaper, which I doubt. Or if you are referring to the entire package with Kinect, though it didn't seem so in the context of the statement.
FloppySnake - Friday, May 24, 2013 - link
It's my understanding (following an AMD statement during a phone conference over the 8000M announcement) that ZeroCore has been enhanced for graceful fall-back, powering down individual GPU segments rather than just the entire GPU. If this is employed we could see the PS4 delivering power as needed (not sure what control they'll have over GDDR5 clocks, if any), but potentially not power hungry unless it needs to be. Perhaps this warrants further investigation?
I agree with the article that, if used appropriately, the 32MB SRAM buffer could compensate for limited bandwidth, but only in a traditional pipeline; it could severely limit GPGPU potential, as there's limited back-and-forth bandwidth between the CPU and GPU, and a buffer won't help there.
For clarity, the new Kinect uses a time-of-flight depth sensor, a completely different technology from the previous Kinect. This offers superior depth resolution and FPS, but the XY resolution is actually something like 500x500 (or some combination that adds up to 250,000 pixels).
novastar78 - Saturday, May 25, 2013 - link
I'm curious to see what feature sets each of these GPUs has. These are not the run-of-the-mill APUs that you can buy at the store. These are both custom SoCs, and it's my understanding that they may even be from different generations (7000 vs. 8000), similar to how the PS3's RSX was from the 7900 era and Xenos was around the R600 era (unified architecture), although the difference here would be much smaller, being the same make (AMD) and similar model (GCN).
In the end, simply counting CUs may not be enough to determine the true power/quality of the two GPUs.
We may never know, as I highly doubt they will easily divulge this info for fear of the outcry (especially from M$'s standpoint).
Still, I'm very curious...
epobirs - Saturday, May 25, 2013 - link
"The move away from PowerPC to 64-bit x86 cores means the One breaks backwards compatibility with all Xbox 360 titles. Microsoft won’t be pursuing any sort of a backwards compatibility strategy, although if a game developer wanted to it could port an older title to the new console. Interestingly enough, the first Xbox was also an x86 design - from a hardware/ISA standpoint the new Xbox One is backwards compatible with its grandfather, although Microsoft would have to enable that as a feature in software - something that’s quite unlikely."I would disagree. You won't see compatibility with existing Xbox discs but I very much expect a line of original Xbox titles to be offered as download purchases on the new machine. If Nintendo thinks $5 for an NES game running ont he Wii U is reasonable, Microsoft should able to make some good coin on a core set of two or three dozen Xbox titles at $10 each.
As for the 360 library, those should start turning up in HD (well, HD-ier) remakes in about four years as the market ripens for bringing those item back into circulation. This has worked very well for adding value to the PS3 with HD remake collections of PS2 hits. Given the right tools, reworking old IP can be very cost effective.
Some of the best original Xbox titles might get native remakes. We've already had Halo Anniversary and I wouldn't be surprised to see a Halo 2 Anniversary turn up for Xbox One. Jade Empire and the Knights of the Old Republic games may be worth the investment.
RedavutstuvadeR - Saturday, May 25, 2013 - link
Anand Lal Shimpi, why did you not mention anything about the four Move engines in the Xbox One, or the capabilities of the cloud quadrupling the Xbox One's power?
RedavutstuvadeR - Saturday, May 25, 2013 - link
A link on the cloud power of the XB1:
news.softpedia.com/news/Xbox-One-Cloud-Makes-the-Console-Four-Times-More-Powerful-355818.shtml
tipoo - Monday, May 27, 2013 - link
IIRC the PS4 has similar hardware blocks to the Move engines, just with no fancy branding? And the cloud compute thing is a future theoretical; I'll factor it in when it's actually shown to work well. It can't be used for any latency-sensitive calculations, of course.
slickr - Saturday, May 25, 2013 - link
http://i.imgur.com/5WXh32l.jpg
jmr99 - Saturday, May 25, 2013 - link
The Xbox 1 (aka PS4 Mini aka PS4 Lite) sure is a colossal disappointment. Microsoft are trying to cut costs and save money in order to create the biggest gap they can between selling price and production cost. In other words, the Apple approach: gouge your customers. Kudos to Sony for 1152 cores and GDDR5.
croc123 - Sunday, May 26, 2013 - link
Interesting article in my 'local' rag this AM...
http://www.smh.com.au/digital-life/games/how-micro...
JimF2 - Wednesday, May 29, 2013 - link
I won't buy any console that needs an internet connection. It is a huge privacy risk to have a console with a camera that connects to the internet. A console that connects to the internet once per day or once per week has the same privacy risk as a console with an always-on connection.
Gamers should boycott the Xbox One so the console manufacturers get the message that we won't accept a required internet connection. If a physical disc is inserted in the console, no internet connection should be needed to prevent piracy. The console manufacturers just have to develop a proprietary disc format that can't be copied by Windows, Mac or Linux. It would be fine if gamers who don't want to put a physical disc in the console to prove they own the game are required to have an internet connection. That way, if a gamer wants to prevent game companies from spying on them, they would just swap discs when switching games. If a gamer uses LIVE or wants the convenience of not needing to swap discs, they would provide an internet connection.
TheEvilBlight - Wednesday, May 29, 2013 - link
The PS4's PS3 games are allegedly coming via Gaikai. I'm curious what MS will do for the old stable of games. I wonder if it would be too much to implement VMs for the original Xbox and the Xbox 360, though a VM running PPC code on an x86 is likely to suffer severe penalties. It's either that or gaming from the cloud.
Alternatively, developers will recompile some of the "best hits" from the 360 and re-release them for the Xbox One. I wonder how that would work with the Halo series, but having Gears of War on a faster machine might be fun.
Thermalzeal - Wednesday, May 29, 2013 - link
Anand, any information on whether the Xbox One will utilize HMA (Hybrid Memory Access) in comparison to the PS4?
tipoo - Wednesday, May 29, 2013 - link
Do you mean hUMA by any chance? Yes, both would have that.
Buccomatic - Friday, May 31, 2013 - link
Xbox One - everything we don't want in a video game console, except the controller.
PS3 - everything we do want in a video game console, except the controller.
That's how I see it.
Buccomatic - Friday, May 31, 2013 - link
Can anyone tell me if the following statement is correct or incorrect?
PC games will be ports of the games made for consoles. Both consoles (Xbox One and PS4) will have 5GB of VRAM available to their GPUs. So that means the system requirements for PC games, as early as December when they start porting games over from the consoles to PC, will require a PC gamer to have a video card with at least 5GB of VRAM, or more, just to run the game.
?
Yes or no, and why?
fteoath64 - Monday, June 10, 2013 - link
Before the hardware is released and analysed, we have no idea how much of the PS4's GDDR5 RAM is going to be shared and/or dedicated to GPU use, and how much is going to be available for user data. It is anyone's guess at this stage. But the improvements of the hUMA design, with a dual-ported framebuffer for the GPU and CPU, make it a rather quick GPU by PC standards. Since only one game is loaded at a time, there can be shared-memory reconfiguration going on just before the game loads, so how much RAM a game can grab can depend on the game. The CPU counts for very little in the process, which is why it can be clocked at 1.6GHz rather than storming along at 3.6GHz as in Trinity chips. Still, with a faster GPU and globs of RAM now, there is certainly greater leeway in the development and optimization process for game developers. One can assume that at 3x the Trinity GPU core count, the PS4 must be at least 2.5x the speed of Trinity GPUs, since those ran at 900MHz. With good cooling, the PS4 could well clock its GPU cores at 1.2GHz, since Intel is going to 1.3GHz on the GT3 core.
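For what it's worth, here's a rough peak-FLOPS comparison along those lines. The shader counts and clocks are the commonly reported figures, taken here as assumptions (the desktop A10-5800K's GPU is usually listed at 800MHz rather than 900MHz, and VLIW4 vs. GCN peak numbers aren't directly comparable in practice):

    #include <stdio.h>

    /* Peak single-precision throughput in GFLOPS = shaders * 2 FLOPs (one FMA per clock) * clock in GHz.
       Shader counts and clocks below are commonly reported figures, used here as assumptions. */
    static double gflops(int shaders, double ghz) {
        return shaders * 2.0 * ghz;
    }

    int main(void) {
        printf("PS4 GPU (18 CUs, 1152 shaders @ 0.800 GHz):        %.0f GFLOPS\n", gflops(1152, 0.800));
        printf("Xbox One GPU (12 CUs, 768 shaders @ 0.853 GHz):    %.0f GFLOPS\n", gflops(768, 0.853));
        printf("Trinity A10-5800K (384 VLIW4 shaders @ 0.800 GHz): %.0f GFLOPS\n", gflops(384, 0.800));
        return 0;
    }

On paper that puts the PS4 at roughly 3x a desktop Trinity APU's graphics throughput, which is in the same ballpark as the estimate above.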
SnOOziie - Sunday, June 2, 2013 - link
Looking at the motherboard, they have used solder balls to BGA-mount the CPU; it's going to RROD.
This has never been an easier choice: Microsoft doesn't let you buy games, Sony does, and their system is 50% more powerful and more focused on games, while Microsoft is off doing yet more Kinect.
SirGCal - Thursday, June 13, 2013 - link
YUP, and as a cripple, what good is flailing my arms about and hopping around like an idiot going to do me? Kinect is about the dumbest thing I've seen people use. Except for workout stuff and kids' stuff, sure, it makes sense. But then they give those in dial-up and cellular internet locations the finger and say 'stick with the 360' when they know damn well developers won't make games for it within a year... Morons. I'm done with M$. If I do get a new console, it will be the PS4. Besides, I've always loved the Kingdom Hearts series more than any others...
NoKidding - Monday, June 24, 2013 - link
I am glad that these consoles have finally seen the light of day. Though a bit underpowered compared to an average mid-range rig, at least game developers will be forced to utilize each and every available core at such low clock rates on these consoles. Heavily threaded games will finally be the norm and not just a mere exception. If the playing field no longer relies heavily on IPC advantages, will AMD's "more cores but cheaper" strategy finally catch up to Intel's superior IPC? Will AMD finally reclaim the low-to-mid-range market? No, not likely. But one can hope. I yearn for the good old C2D days when Intel was forced to pull out all the stops.
kondor999 - Tuesday, July 16, 2013 - link
Who gives a shit about heat and power consumption in a console? Both machines are miserly, and they're not notebooks, for God's sake. Looks like MS simply cheaped out to me. Letting them off the hook by pointing out the tiny heat/power savings as a "benefit" is a real reach. By this logic, why not just cut the compute power even more?
No thanks.
Shawn74 - Tuesday, September 10, 2013 - link
Mmmmm....
Custom CPU (6 operations per clock compared to the PS4's 4), and now overclocked.
GPU (now overclocked).
eSRAM (ultra-fast memory with extremely low access time; we will see its real function soon).
DDR3 (memory with extremely fast access time).
Maybe this combination may become a nightmare for PS4 owners?? xD
Yes, I really think YES.
And please don't forget the new impulse triggers (apparently fantastic and a must-have for a completely new experience).
YES, my final decision is for the ONE.
Shad0w59 - Wednesday, September 11, 2013 - link
I don't really trust Microsoft with all that overclocking after the Xbox 360's high failure rate.
Shawn74 - Wednesday, September 11, 2013 - link
Shadow, have you seen the cooling system? It's giant.. Have you seen the case? It's giant.. (a lot of fresh air inside ;-)
Have you seen that the Xbox One will detect heat and power down to avoid a meltdown? http://www.vg247.com/2013/08/13/xbox-one-will-dete...
And the very hot power supply is outside.....
A perfect system for overclocking.... obviously, for me....
Ah, and for my first message here, a reply to the PS4 team made directly by Albert Penello (Microsoft Director of Product Planning):
"*******************************************************************************************
I see my statements the other day caused more of a stir than I had intended. I saw threads locking down as fast as they pop up, so I apologize for the delayed response.
I was hoping my comments would lead the discussion to be more about the games (and the fact that games on both systems look great) as a sign of my point about performance, but unfortunately I saw more discussion of my credibility.
So I thought I would add more detail to what I said the other day, that perhaps people can debate those individual merits instead of making personal attacks. This should hopefully dismiss the notion I'm simply creating FUD or spin.
I do want to be super clear: I'm not disparaging Sony. I'm not trying to diminish them, or their launch or what they have said. But I do need to draw comparisons since I am trying to explain that the way people are calculating the differences between the two machines isn't completely accurate. I think I've been upfront I have nothing but respect for those guys, but I'm not a fan of the mis-information about our performance.
So, here are couple of points about some of the individual parts for people to consider:
• 18 CU's vs. 12 CU's =/= 50% more performance. Multi-core processors have inherent inefficiency with more CU's, so it's simply incorrect to say 50% more GPU.
• Adding to that, each of our CU's is running 6% faster. It's not simply a 6% clock speed increase overall.
• We have more memory bandwidth. 176gb/sec is peak on paper for GDDR5. Our peak on paper is 272gb/sec. (68gb/sec DDR3 + 204gb/sec on ESRAM). ESRAM can do read/write cycles simultaneously so I see this number mis-quoted.
• We have at least 10% more CPU. Not only a faster processor, but a better audio chip also offloading CPU cycles.
• We understand GPGPU and its importance very well. Microsoft invented Direct Compute, and have been using GPGPU in a shipping product since 2010 - it's called Kinect.
• Speaking of GPGPU - we have 3X the coherent bandwidth for GPGPU at 30gb/sec which significantly improves our ability for the CPU to efficiently read data generated by the GPU.
Hopefully with some of those more specific points people will understand where we have reduced bottlenecks in the system. I'm sure this will get debated endlessly but at least you can see I'm backing up my points.
I still I believe that we get little credit for the fact that, as a SW company, the people designing our system are some of the smartest graphics engineers around – they understand how to architect and balance a system for graphics performance. Each company has their strengths, and I feel that our strength is overlooked when evaluating both boxes.
Given this continued belief of a significant gap, we're working with our most senior graphics and silicon engineers to get into more depth on this topic. They will be more credible then I am, and can talk in detail about some of the benchmarking we've done and how we balanced our system.
Thanks again for letting my participate. Hope this gives people more background on my claims.
"*****************************************************************************
Once again I would like to warn PS4 fans........ every time Sony has announced a new console, Sony has publicized it as the most powerful.... and every time the Xbox does the job better, in my opinion.
P.S. Sorry for my bad English, I'm Italian.
Shawn74 - Wednesday, September 11, 2013 - link
Penello's post is here: http://67.227.255.239/forum/showthread.php?p=80951...
tipoo - Saturday, September 21, 2013 - link
Regarding the eSRAM, it's now known not to be an automatically managed cache, based on developer comments about having to code specifically to use it.
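For anyone wondering what "not automatically managed" means in practice, here's a purely illustrative sketch (none of these names are real console APIs; it just shows the idea of a small pool that the programmer has to allocate into and fill explicitly, as opposed to a transparent cache the hardware manages for you):

    #include <stdint.h>
    #include <stdio.h>
    #include <string.h>

    /* Illustrative only: esram[] stands in for a 32MB scratchpad. With a real cache the
       hardware decides what lives in fast memory; with an explicitly managed pool the
       programmer chooses, and copies data in and out by hand (or via DMA engines). */
    enum { ESRAM_BYTES = 32 * 1024 * 1024 };
    static uint8_t esram[ESRAM_BYTES];

    /* Trivial bump allocator: no eviction, no fallback. If something doesn't fit,
       that's the developer's problem to solve. */
    static size_t alloc_in_esram(size_t *cursor, size_t bytes) {
        size_t offset = *cursor;
        *cursor += bytes;
        return offset;
    }

    int main(void) {
        size_t cursor = 0;
        size_t color_bytes = 1920u * 1080u * 4u;   /* ~7.9 MB colour target */
        size_t depth_bytes = 1920u * 1080u * 4u;   /* ~7.9 MB depth target  */

        size_t color = alloc_in_esram(&cursor, color_bytes);
        size_t depth = alloc_in_esram(&cursor, depth_bytes);

        memset(&esram[color], 0x00, color_bytes);  /* "render" into the fast pool */
        memset(&esram[depth], 0xFF, depth_bytes);

        printf("eSRAM used: %zu of %d bytes\n", cursor, ESRAM_BYTES);
        return 0;
    }

The point is just that nothing here happens automatically: deciding which buffers live in the 32MB pool, and when to move them, is explicit work for the developer, which matches those developer comments.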