The NVIDIA GeForce GTX 780 Ti Review
by Ryan Smith on November 7, 2013 9:01 AM EST

Crysis: Warhead
Up next is our legacy title for 2013/2014, Crysis: Warhead. A stand-alone expansion to 2007's Crysis, Warhead is now over 5 years old and can still beat most systems down. Crysis was intended to be future-looking as far as performance and visual quality go, and it has clearly achieved that. Only now have single-GPU cards arrived that can hit 60fps at 1920 with 4xAA, never mind 2560 and beyond.
Whereas Battlefield 3 is a game that traditionally favors NVIDIA, Crysis: Warhead has generally favored AMD in this generation, making it an uphill battle for NVIDIA. The 290X was able to beat GTX Titan here at 2560, but with the additional performance offered by the GTX 780 Ti, NVIDIA is once again on top, though only by a margin of under 2fps (or 2%). Compared to NVIDIA's other cards, Crysis: Warhead is another consistent game for the GTX 780 Ti, with NVIDIA's latest card beating GTX Titan and GTX 780 by 9% and 18% respectively.
Moving on, even when we double up on cards the GTX 780 Ti and 290X remain close. At 2560 it’s a virtual tie at 87fps apiece, while at 4K the GTX 780 Ti SLI takes a slight lead.
As for our minimum framerates under Crysis: Warhead, NVIDIA does break the deadlock here: the GTX 780 Ti beats the 290X by several percent, pushing its minimum framerate above 40fps.
302 Comments
Wreckage - Thursday, November 7, 2013 - link
The 290X = Bulldozer. Hot, loud, power hungry and unable to compete with an older architecture. Kepler is still king even after being out for over a year.
trolledboat - Thursday, November 7, 2013 - link
Hey look, it's a comment from a user permanently banned from this website for trolling, posted before anyone could have even read the first page.

Back in reality: very nice card, but sorely overpriced for such a meagre gain over the 780. It's also slower than the cheaper 290X in some cases.
Nvidia needs more price cuts right now. 780 and 780ti are both badly overpriced in the face of 290 and 290x
neils58 - Thursday, November 7, 2013 - link
I think Nvidia probably have the right strategy. G-Sync is around the corner, and it's a game changer that justifies the premium for their brand - AMD's only answer to it at this time is going CrossFire to try and ensure >60FPS at all times for V-Sync. Nvidia are basically offering a single-card solution that, even with the brand premium and G-Sync monitors, comes out less expensive than CrossFire. 780 Ti for 1440p gamers, 780 for 1920p gamers.

Kamus - Thursday, November 7, 2013 - link
I agree that G-Sync is a gamechanger, but just what do you mean AMD's only answer is crossfire? Mantle is right up there with G-Sync in terms of importance. And from the looks of it, a good deal of AAA developers will be supporting Mantle.

As a user, it kind of sucks, because I'd love to take advantage of both.
That said, we still don't know just how much performance we'll get by using mantle, and it's only limited to games that support it, as opposed to G-Sync, which will work with every game right out of the box.
But on the flip side, you need a new monitor for G-Sync, and at least at first, we know it will only be implemented on 120Hz TN panels. And not everybody is willing to trade their beautiful-looking IPS monitor for a TN monitor, especially since they will retail at $400+ for 23" 1080p.
Wreckage - Thursday, November 7, 2013 - link
Gsync will work with every game, past and present. So far Mantle is only confirmed for one game. That's a huge difference.

Basstrip - Thursday, November 7, 2013 - link
TLDR: When considering Gsync as a competitive advantage, add the cost of a new monitor. When considering Mantle support, think multiplatform and think next-gen consoles having AMD GPUs. Another plus for NVidia is Shadowplay and SHIELD (but again, added costs if you consider SHIELD).

Gsync is not such a game changer, as you have yet to see both a monitor with Gsync AND its pricing. The fact that I would have to upgrade my monitor, and that the Gsync branding will add another few $$$ to the price tag, is something you guys have to consider.
So to consider Gsync as a competitive advantage when considering a card, add the cost of a monitor to that. Perfect for those that are going to upgrade soon but for those that won't, Gsync is moot.
Mantle, on its plus side, will be used on consoles and PC (as both the PS4 and Xbox One have AMD processors, game developers will most probably be using it). You might not care about consoles, but they are part of the gaming ecosystem, and sadly we PC users tend to get shafted by developers because of consoles. I remember Frankieonpc mentioning he used to play tons of COD back in the COD4 days, and saying that development has shifted towards consoles, so the tuning was a bit more off for PC (paraphrasing slightly).
I'm in the market for both a new monitor and maybe a new card so I'm a bit on the fence...
Wreckage - Thursday, November 7, 2013 - link
Mantle will not be used on consoles. AMD already confirmed this.

althaz - Thursday, November 7, 2013 - link
Mantle is not used on consoles... because the consoles already have something very similar.

Kamus - Thursday, November 7, 2013 - link
You are right, consoles use their own API for GCN. Guess what Mantle is used for? *spoiler alert* GCN
EJS1980 - Thursday, November 7, 2013 - link
Mantle is irrefutably NOT coming to consoles, so do your due diligence before trying to make a point. :)