The NVIDIA GeForce RTX 2080 Ti & RTX 2080 Founders Edition Review: Foundations For A Ray Traced Future
by Nate Oh on September 19, 2018 5:15 PM EST - Posted in
- GPUs
- Raytrace
- GeForce
- NVIDIA
- DirectX Raytracing
- Turing
- GeForce RTX
Meet The GeForce RTX 2080 Ti & RTX 2080 Founders Edition Cards
Moving on to the design of the cards, we've already mentioned the biggest change: a new open air cooler design. Along with the Founders Edition specification changes, the cards might be considered 'reference' in that they remain first-party video cards sold direct by NVIDIA, but strictly speaking they are not, because they no longer carry reference specifications.
Otherwise, NVIDIA's industrial design language prevails, and the RTX cards bring a sleek, flattened aesthetic in place of the polygonal shroud of the 10 series. The silver shroud now encapsulates an integrated backplate, and in keeping with the presentation, the NVLink SLI connectors have a removable cover.
Internally, the dual 13-blade fans accompany a full-length vapor chamber and component baseplate, connected to a dual-slot aluminum fin stack. With an eye toward efficiency and more granular power control, the 260W RTX 2080 Ti Founders Edition features a 13-phase iMON DrMOS power subsystem with a dedicated 3-phase system for the 14 Gbps GDDR6, while the 225W RTX 2080 Founders Edition weighs in with 8 main phases and 2 memory phases.
As is typical with higher quality designs, NVIDIA is pushing overclocking, and for the 2080 Ti that means a dual 8-pin PCIe power configuration; on paper, this puts the maximum draw at 375W, though on the specifications side the TDP of the 2080 Ti Founders Edition is only 10W higher than that of the 1080 Ti Founders Edition. The RTX 2080 Founders Edition sees the more drastic jump, however, with 8+6 pins and a 45W increase over the 1080's lone 8-pin connector and 180W TDP. Ultimately, it's a steady increase from the power-sipping GTX 980's 165W.
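For reference, those "maximum draw" figures are simply the sum of the spec-rated budgets of the PCIe slot and the auxiliary power connectors. The short Python sketch below illustrates that arithmetic; the per-connector limits come from the PCI Express specification, and the function and variable names are purely illustrative.

# A minimal sketch of the PCIe power-budget math behind the maximum-draw figures above.
# Per-connector limits come from the PCI Express spec: 75W from the x16 slot,
# 75W per 6-pin plug, and 150W per 8-pin plug. Names here are illustrative only.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def max_board_power(aux_connectors):
    # Sum the slot budget plus the budget of each auxiliary power connector.
    return CONNECTOR_WATTS["slot"] + sum(CONNECTOR_WATTS[c] for c in aux_connectors)

print(max_board_power(["8-pin", "8-pin"]))  # RTX 2080 Ti FE: 75 + 150 + 150 = 375W
print(max_board_power(["8-pin", "6-pin"]))  # RTX 2080 FE: 75 + 150 + 75 = 300W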
One of the more understated changes comes with the display outputs, which thanks to Turing's new display controller now feature DisplayPort 1.4 and DSC support, the latter of which is part of the DP1.4 spec. The eye-catching addition is the VR-centric USB-C VirtualLink port, which also carries an associated 30W power budget that is not included in the overall TDP.
Something to note is that this change in reference design, combined with the seemingly inherent low-volume nature of the Turing GPUs, cuts into an often overlooked but highly important aspect of GPU sales: big OEMs in the desktop and mobile space. Boutique system integrators will happily incorporate the pricier higher-end parts, but from the OEMs' perspective, the GeForce RTX cards are not just priced into a new range beyond existing ones; they also bring higher TDPs and are no longer equipped with blower-style coolers in their 'reference' implementation.
Given that OEMs often rely on a blower to make the video card fully self-exhausting, this would certainly preclude a lot of drop-in replacements or upgrades – at least without further testing. The cards would be hard to slot into the standard OEM product cycle at the necessary prices, not to mention the added difficulty in marketing. In that respect, there is definitely more to the GeForce RTX 20 series story, and it's somewhat hard to see OEMs offering GeForce RTX cards – or even the RT Cores themselves existing below the RTX 2070, simply on the basis of the raw performance needed for real-time ray tracing effects at reasonable resolutions and playable framerates. So it will be very interesting to see how the rest of NVIDIA's product stack unfolds.
337 Comments
ESR323 - Wednesday, September 19, 2018 - link
I agree with the conclusion that these cards aren't a good buy for 1080ti owners. My 1080ti overclocks very nicely and I'll be happy to stick with it until the next generation in 7 nm. By then we might have a decent selection of games that make use of ray tracing and the performance increase will be more appealing.
imaheadcase - Wednesday, September 19, 2018 - link
Yah I agree, especially since it's only a 20-25fps increase on average. While many might think that's great, considering the price increase over the 1080TI and the fact that many 1080TIs can overclock to close that gap even more, the features don't justify the cost. However, it could be that lots of performance gets unlocked via driver updates.. we really don't know how the tensor cores could increase performance till games get updated to use them. Also, while it's a super expensive option... how does the new SLI increase performance? Let's see a comparison of 1080TI SLI against the newer 2080TI SLI.. maybe it's easier to put into games? So many what-ifs with this product.
I feel this product should have been delayed till more games/software already had feature sets available to see.
Aybi - Thursday, September 20, 2018 - link
There won't be driver & optimization support for the 1000 series. They will focus on the 2000 series and with that the gap is going to increase a lot. If you remember the 980ti and 1080ti, it was the same case when the 1080ti was announced, and then you know what happened.
Vayra - Friday, September 21, 2018 - link
Actually I don't, and there is also no data to back up what you're saying. The 980ti still competes with the 1070 as it did at Pascal launch. Don't spread BS.
Matthmaroo - Sunday, September 23, 2018 - link
Dude that’s not true at all
Nvidia will fully support the 10 series for the next 5-10 years
They all use the same CUDA cores
Don’t just make crap up to justify your purchase
SanX - Thursday, September 20, 2018 - link
What a useless job the reviewer is doing, comparing only to the latest generation cards. Add at least the 980Ti and 780Ti.
MrSpadge - Thursday, September 20, 2018 - link
Ever heard of their benchmark database?
Ryan Smith - Thursday, September 20, 2018 - link
You'll be glad to hear then that we'll be backfilling cards. There was a very limited amount of time ahead of this review, and we thought it to be more important to focus on things like clocking the cards at their actual reference clocks (rather than NVIDIA's factory overclocks).
dad_at - Sunday, September 23, 2018 - link
Many thanks for that, I think it is useful work; people are still using Maxwell (or even older) generation GPUs in 2018. And when could we expect Maxwell (980/980ti) results to appear in the GPU 2018 bench? Could you also please add the GeForce GTX Titan X (Maxwell) to GPU 2018?
StevoLincolnite - Sunday, September 23, 2018 - link
Hopefully you back-fill a substantial amount; the GPU bench this year has been a bit lacking... especially in regards to mid-range and older parts. The whole point of it is so that you can see how the latest and greatest compares to your old and crusty card.