
Thread: Nvidia unveils the GeForce GTX 780 Ti

  1. #1
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by GoldenTiger View Post
    FXAA is what most people use
    I enable Post AA (any level) (or leave it enabled as the default setting): 33 votes (26.40%)
    I disable Post AA: 92 votes (73.60%)
    http://hardforum.com/showthread.php?t=1788367
    Last edited by Final8ty; 10-31-2013 at 11:40 AM.

  2. #2
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    So we go from 290x is better than Titan, fact.
    To "we can't compare 290x to Titan because GTX780 is basically Titan for cheap."
    To "even footing 290x doesn't beat Titan."

    Make up your mind and please stop shifting the goalposts.


    Fermi doesn't pull only 235w. It can pull over 300w in gaming situations.

    290x isn't competing against a card that has a max power consumption of 180w.
    GTX480 was consistently pulling ~100w more in gaming situations to give ~10% more performance.
    290x is using ~30-40w more to beat the competition by ~15% on average.

  3. #3
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by LordEC911 View Post
    So we go from 290x is better than Titan, fact.
    To "we can't compare 290x to Titan because GTX780 is basically Titan for cheap."
    To "even footing 290x doesn't beat Titan."

    Make up your mind and please stop shifting the goalposts.


    Fermi doesn't pull only 235w. It can pull over 300w in gaming situations.

    290x isn't competing against a card that has a max power consumption of 180w.
    GTX480 was consistently pulling ~100w more in gaming situations to give ~10% more performance.
    290x is using ~30-40w more to beat the competition by ~15% on average.
    I already pointed you to sites that showed the 290X can pull as much as, or more than, a GTX 480 in the same scenarios.

    The 290X is a complex product because it tries to offer two modes at once: a quiet mode in which it is a hair slower than Titan, and a second mode in which it is a bit faster than a GTX Titan. It's the nature of the beast; no other card has this quiet mode.

    Maybe 10 percent faster on average against a GTX 780, not 15 percent more on average, unless perhaps uber mode is turned on.

    The GTX 780 is so neutered and underclocked that it might as well be a small chip like the 5870. It is Nvidia's attempt to sell heavily cut-down GK110 without any binning. Considering how mangled it is and how hard AMD is pushing its own chip (which is also fully enabled), it should be no surprise that it gets beaten, particularly with more and more Gaming Evolved titles being used in test suites.

    Fermi may not have shown its worth in games relative to its size, but it more than made up for that in its primary purpose, the professional market.

    It was the best professional compute card of its time; it made trash of everything else available, spanking AMD's professional cards by being often twice and sometimes four times as fast. Even today, AMD's W9000 cards have a hard time competing against it, and that is against a cut-down chip with very low clocks (see the HotHardware and Tom's Hardware reviews). So even though it had high power consumption, it justified that consumption with capabilities that have yet to be seen on the 290X.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  4. #4
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by LordEC911 View Post
    So we go from 290x is better than Titan, fact.
    To "we can't compare 290x to Titan because GTX780 is basically Titan for cheap."
    To "even footing 290x doesn't beat Titan."
    Did you overlook where I said "on even ground"? I'll reiterate for you:
    quiet mode (stock setting) vs. Titan stock setting
    OR
    uber mode vs Titan@maxed targets

    Titan and the 290X are equally fast if you value a fair comparison - that's a fact. Don't fall for most of the English-speaking reviews - they are biased towards AMD (maybe not even intentionally) by not pre-heating their cards (or not doing so properly) and by testing uber mode against Titan/780 stock settings with a temperature target of 80C.

    Read the review links I posted if you don't believe me. And don't quote selectively, quote whole sentences.

    Quote Originally Posted by LordEC911 View Post
    Fermi doesn't pull only 235w. It can pull over 300w in gaming situations.
    290x isn't competing against a card that has a max power consumption of 180w.
    GTX480 was consistently pulling ~100w more in gaming situations to give ~10% more performance.
    290x is using ~30-40w more to beat the competition by ~15% on average.
    Source for your 300+W claim?
    The 290X certainly can, 306.88W:
    http://ht4u.net/reviews/2013/amd_rad...ew/index21.php

    You do understand how an average works, right? In some games a card may pull more than the average, in others less. You won't find better power consumption data than what I posted.
    And maybe you also overlooked where I said that the 290X wasn't as awful in perf/W compared to the competition as Fermi was.

    And you are the one who should make up your mind. What is it now - Titan killer? Then compare it to Titan, which it doesn't beat. Or GTX 780 killer? Okay, if you want to call 10-15% a "killer", fine. But then also use the GTX 780 power numbers instead of Titan's: 189W (GTX 780) vs 239W (290X quiet mode). That is a hefty 50W more.
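    As a quick sanity check of the arithmetic being argued here, a minimal sketch (Python) using the figures quoted in this post - 189W for the GTX 780, 239W for the 290X in quiet mode, and an assumed 10-15% performance lead for the 290X. These are the numbers claimed in the thread, not independently verified measurements:

    Code:
    # Back-of-the-envelope perf/W comparison using the figures quoted above.
    # The inputs are the numbers claimed in this thread, not verified measurements.

    gtx780_power_w = 189.0        # average gaming power, GTX 780 (as quoted)
    r290x_power_w = 239.0         # average gaming power, 290X quiet mode (as quoted)
    gtx780_perf = 1.0             # normalize GTX 780 performance to 1.0

    for lead in (0.10, 0.15):     # assumed 290X performance lead of 10% and 15%
        r290x_perf = gtx780_perf * (1.0 + lead)
        ratio = (r290x_perf / r290x_power_w) / (gtx780_perf / gtx780_power_w)
        print(f"+{lead:.0%} lead: 290X delivers {ratio:.2f}x the perf/W of the GTX 780")

    With those inputs the 290X comes out at roughly 0.87-0.91x the GTX 780's perf/W: faster overall, but spending the extra ~50W less efficiently, which is the trade-off being debated.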
    Last edited by boxleitnerb; 10-30-2013 at 11:16 PM.

  5. #5
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by boxleitnerb View Post
    Did you overlook where I said "on even ground"? I'll reiterate for you:
    quiet mode (stock setting) vs. Titan stock setting
    OR
    uber mode vs Titan@maxed targets

    Titan and the 290X are equally fast if you value a fair comparison - that's a fact. Don't fall for most of the English-speaking reviews - they are biased towards AMD (maybe not even intentionally) by not pre-heating their cards (or not doing so properly) and by testing uber mode against Titan/780 stock settings with a temperature target of 80C.

    Read the review links I posted if you don't believe me. And don't quote selectively, quote whole sentences.
    Ok... Where was all this "even footing" and "we need to change our testing/benching methodology" talk the past 2-3 years?
    Why do you need to shift the parameters of the test? Why can't you test stock vs stock?
    That's right because the results don't show what you want.


    Quote Originally Posted by boxleitnerb View Post
    Source for your 300+W claim?
    The 290X certainly can, 306.88W:
    http://ht4u.net/reviews/2013/amd_rad...ew/index21.php

    You do understand how an average works, right? In some games a card may pull more than the average, in others less. You won't find better power consumption data than what I posted.
    And maybe you also overlooked where I said that the 290X wasn't as awful in perf/W compared to the competition as Fermi was.

    And you are the one who should make up your mind. What is it now - Titan killer? Then compare it to Titan, which it doesn't beat. Or GTX 780 killer? Okay, if you want to call 10-15% a "killer", fine. But then also use the GTX 780 power numbers instead of Titan's: 189W (GTX 780) vs 239W (290X quiet mode). That is a hefty 50W more.
    Look at any GTX480 review.
    No I don't know what average means... please explain.

    I bet I can because those results look extremely strange. GTX780 and Titan pulling less than 200w?
    There is something screwy going on there because those results don't reflect what other sites have found.

    Quote Originally Posted by tajoh111 View Post
    More evidence for 290x power drinking ways.
    *snip*
    Ahhh... good old scientific method, let's just go ahead and throw you out the window, we don't need you.

    All I hear is "ifs" "ands" and "buts," don't worry about comparing what is, we can only discuss what could be.
    Last edited by LordEC911; 10-30-2013 at 11:35 PM.

  6. #6
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by LordEC911 View Post
    Ok... Where was all this "even footing" and "we need to change our testing/benching methodology" talk the past 2-3 years?
    Why do you need to shift the parameters of the test? Why can't you test stock vs stock?
    That's right because the results don't show what you want.




    Look at any GTX480 review.
    No I don't know what average means... please explain.

    I bet I can because those results look extremely strange. GTX780 and Titan pulling less than 200w?
    There is something screwy going on there because those results don't reflect what other sites have found.


    Ahhh... good old scientific method, let's just go ahead and throw you out the window, we don't need you.

    All I hear is "ifs" "ands" and "buts," don't worry about comparing what is, we can only discuss what could be.
    And let's discard anything that might paint AMD in a bad light. Those graphs are as concrete as you can get.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  7. #7
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by LordEC911 View Post
    Ok... Where was all these "even footing" "we need to change our testing/benching methodology" the past 2-3 years?
    Why do you need to shift the parameters of the test? Why can't you test stock vs stock?
    That's right because the results don't show what you want.
    I don't know where it was in other reviews, but hardware.fr, PCGH, ht4u and Computerbase have been preheating Kepler cards from day one. Now, finally, several other sites are beginning to do it, among them hardwarecanucks, techreport and Anandtech iirc.
    And it IS a stock vs stock comparison. Quiet mode is the stock setting, uber mode isn't.

    Quote Originally Posted by LordEC911 View Post
    Look at any GTX480 review.
    No I don't know what average means... please explain.

    I bet I can because those results look extremely strange. GTX780 and Titan pulling less than 200w?
    There is something screwy going on there because those results don't reflect what other sites have found.
    The power consumption numbers there are not strange for one very simple reason. 3DCenter compiles values mostly from realistic gaming scenarios, i.e. with pre-heating (at least HT4U, PCGH and hardware.fr). This leads to all Nvidia Boost 2.0 cards clocking lower due to the 80C temperature target being hit, thus they also consume less power.
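    To illustrate the effect described above with a toy model (all constants here are invented for illustration; this is not Nvidia's actual Boost 2.0 algorithm or real card data): a cold card boosts high until it reaches the temperature target, after which the governor pulls clocks, and therefore power, down, so a short cold measurement reads higher than a pre-heated one.

    Code:
    # Toy model of a temperature-targeted boost governor. Invented constants,
    # purely to illustrate why pre-heated measurements read lower.

    TEMP_TARGET_C = 80.0   # temperature target mentioned above
    AMBIENT_C = 26.0

    def avg_power(seconds, clock_mhz=1000.0):
        """Average board power over a run of the given length, starting from a cold card."""
        temp, total = AMBIENT_C, 0.0
        for _ in range(seconds):
            power_w = 0.20 * clock_mhz                           # toy power model
            temp += 0.015 * power_w - 0.04 * (temp - AMBIENT_C)  # toy thermal model
            if temp >= TEMP_TARGET_C:
                clock_mhz = max(clock_mhz - 6.0, 700.0)          # back off at the target
            else:
                clock_mhz = min(clock_mhz + 2.0, 1000.0)         # creep back up when cool
            total += power_w
        return total / seconds

    print(f"cold card, 60 s run:     {avg_power(60):.0f} W average")
    print(f"pre-heated, 10 min run:  {avg_power(600):.0f} W average")

    In this toy run the short cold measurement comes out noticeably higher than the pre-heated one, purely because the governor has not yet pulled the clock down - which is exactly the methodological difference being argued about.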
    I don't need to look at "any" GTX 480 review, because this compilation of measurements is far better. Most reviews test the whole system which is very prone to errors. Where else on the web do you have 5-6 measurements of the consumption of the card itself? Nowhere.

    Btw still waiting on proof on your claim that Fermi could use 300+W under gaming loads...
    Last edited by boxleitnerb; 10-31-2013 at 01:01 AM.

  8. #8
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by boxleitnerb View Post
    I don't know where it was in other reviews, but hardware.fr, PCGH, ht4u and Computerbase have been preheating Kepler cards from day one. Now, finally, several other sites are beginning to do it, among them hardwarecanucks, techreport and Anandtech iirc.
    And it IS a stock vs stock comparison. Quiet mode is the stock setting, uber mode isn't.
    I don't know about the rest of them, which I doubt, but hardware.fr definitely did not. If they did, they didn't mention it.
    Uber is stock. Maybe you meant to say Uber isn't default?



    Quote Originally Posted by boxleitnerb View Post
    The power consumption numbers there are not strange for one very simple reason. 3DCenter compiles values mostly from realistic gaming scenarios, i.e. with pre-heating (at least HT4U, PCGH and hardware.fr). This leads to all Nvidia Boost 2.0 cards clocking lower due to the 80C temperature target being hit, thus they also consume less power.
    I don't need to look at "any" GTX 480 review, because this compilation of measurements is far better. Most reviews test the whole system which is very prone to errors. Where else on the web do you have 5-6 measurements of the consumption of the card itself? Nowhere.

    Btw still waiting on proof on your claim that Fermi could use 300+W under gaming loads...
    I didn't see them mention the list of games they used for power consumption.
    I don't see them testing the card by itself...

    I thought you didn't need to look at GTX480 reviews. You supposedly know it all.
    I don't recall you asking for 5870 to be clocked up so that it could be at the same temperatures as GTX480... or the same power consumption.

    Shifting the parameters. Tsk tsk.

  9. #9
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by LordEC911 View Post
    I don't know about the rest of them, which I doubt, but hardware.fr definitely did not. If they did, they didn't mention it.
    Uber is stock. Maybe you meant to say Uber isn't default?
    I meant Boost 2.0 Kepler cards since those are the ones mainly affected by temperature. Boost 1.0 cards like the GTX 680 are barely affected. Sorry if that wasn't clear.
    Yes, default. Stock is an unmodified card, I guess. Like no bios flash, no watercooling etc.

    Quote Originally Posted by LordEC911 View Post
    I didn't see them mention the list of games they used for power consumption.
    I don't see them testing the card by itself...
    Well, you need to go to the respective reviews of course.
    TPU tests with Crysis 2
    PCGH tests with Battlefield Bad Company 2
    hardware.fr tests with Anno 2070 and Battlefield 3
    HT4U tests with Tom Clancy's HawX
    I forgot what Heise tests with, I'll write them an email.

    It is obvious from the values that these are measurements of the cards alone. The values are way too low to be the power consumption of the whole system. It is also well known that these sites do test that way.

    Quote Originally Posted by LordEC911 View Post
    I thought you didn't need to look at GTX480 reviews. You supposedly know it all.
    I don't recall you asking for 5870 to be clocked up so that it could be at the same temperatures as GTX480... or the same power consumption
    What? The GTX 480 was measured as well by the sites I mentioned. What do I need other reviews for, especially with inferior testing methodologies?
    And what's up with your second comment? Relevance?
    The point is, quiet mode of the 290X is the default setting. Either you test both cards at default or not. You cannot have it both ways.

  10. #10
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Quote Originally Posted by LordEC911 View Post
    I don't know about the rest of them, which I doubt, but hardware.fr definitely did not. If they did, they didn't mention it.
    In fact they did, and they mentioned it:

    Note finally that, given the influence of temperature on the results, and the fact that we measure performance on an open bench table while letting the temperature/frequency of the different cards stabilize, the room temperature was controlled and fixed at 26°C for all tests.
    http://www.hardware.fr/articles/910-...cole-test.html

    They wait for temperatures/frequencies to stabilize before launching the benchmark. Room temperature is also fixed at 26 degrees Celsius, in order to keep the results fair and accurate.
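    For what it's worth, the stabilization step hardware.fr describes can be approximated with a simple warm-up loop: keep the card under load and only record results once the temperature stops climbing. A minimal sketch, where read_gpu_temperature() and run_benchmark_pass() are hypothetical placeholders (in practice they would be a vendor-specific sensor query and the actual game workload):

    Code:
    def read_gpu_temperature():
        """Hypothetical helper: return the current GPU temperature in Celsius
        (in practice, a vendor tool or sensor library would be queried here)."""
        raise NotImplementedError

    def run_benchmark_pass():
        """Hypothetical helper: run one pass of the gaming workload and return its score."""
        raise NotImplementedError

    def stabilized_benchmark(max_warmup_passes=20, tolerance_c=1.0):
        """Warm the card up under load until its temperature stops climbing, then measure."""
        last_temp = read_gpu_temperature()
        for _ in range(max_warmup_passes):
            run_benchmark_pass()                     # keep the card (and boost clocks) hot
            temp = read_gpu_temperature()
            if abs(temp - last_temp) <= tolerance_c:
                break                                # temperature, and hence clocks, settled
            last_temp = temp
        return run_benchmark_pass()                  # only this pass is recorded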
