
Thread: Nvidia unveils the GeForce GTX 780 Ti

  1. #126
    Moderator
    Join Date
    Oct 2007
    Location
    Oregon - USA
    Posts
    830
    If the power circuitry has been modified, there is a chance the PCB is physically different as well, unless the 780 or Titan PCB was not fully utilized to begin with.
    Asus Rampage IV Extreme
    4930k @4.875
    G.Skill Trident X 2666 Cl10
    Gtx 780 SC
    1600w Lepa Gold
    Samsung 840 Pro 256GB


  2. #127
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by AliG View Post
    Mind explaining the difference and how it relates to the gtx 780 ti?
    Aside from shader count, it's double-precision performance: 1/3 of the FP32 rate vs 1/24.
    I expect the 780 Ti to stick to 1/24.
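    For reference, a rough back-of-the-envelope sketch (Python) of what those ratios mean for peak throughput. The Titan and GTX 780 core counts and clocks are the published figures; the 780 Ti line uses the rumored 2880 cores with an assumed 1/24 rate, so treat it as illustrative only.

    # Peak throughput estimate: FLOPS = cores * 2 (one FMA = 2 ops) * clock.
    # The 780 Ti entry is the rumored 2880 cores with an *assumed* 1/24 DP rate.
    def peak_gflops(cores, clock_mhz, dp_ratio):
        fp32 = cores * 2 * clock_mhz / 1000.0   # GFLOPS
        return fp32, fp32 * dp_ratio            # (FP32, FP64)

    for name, cores, mhz, ratio in [
            ("GTX Titan (1/3 DP)",             2688, 837, 1 / 3),
            ("GTX 780 (1/24 DP)",              2304, 863, 1 / 24),
            ("GTX 780 Ti, rumored (1/24 DP?)", 2880, 876, 1 / 24)]:
        fp32, fp64 = peak_gflops(cores, mhz, ratio)
        print(f"{name}: {fp32:.0f} GFLOPS FP32, {fp64:.0f} GFLOPS FP64")

    That works out to roughly 4.5 TFLOPS FP32 / 1.5 TFLOPS FP64 for Titan versus about 4.0 / 0.17 for the GTX 780, which is why Titan still matters for compute even if a 780 Ti matches it in games.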
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  3. #128
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by zalbard View Post
    Aside from shader count, it's double-precision performance: 1/3 of the FP32 rate vs 1/24.
    I expect the 780 Ti to stick to 1/24.
    Ah, that actually makes perfect sense; wasn't the Titan designed for GPGPU purposes?
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  4. #129
    Registered User
    Join Date
    Feb 2010
    Location
    Cebu, Philippines
    Posts
    59
    GeForce GTX 780 GHz Edition clocked at 1006/1046 MHz.

    beats 290X and Titan

    The 780 Ti will do even better...

    more here: http://videocardz.com/47420/nvidia-u...80-ghz-edition
    original source: http://www.expreview.com/29089.html

    Gigabyte GA-X38-DQ6
    Core 2 Quad Q9450 @ 3.4Ghz (Zalman CNPS9700 LED)
    Corsair Twin2X4096-6400C4DHX @ DDR2-1066
    RIP GeForce 9800 GX2 715/1720/1050
    2 x 500GB WD Caviar SE (RAID 0)
    Corsair HX-620W
    ACER P243WAID 1920 x 1200

  5. #130
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    I think this is all great, regardless of who wins the performance crown. We haven't seen this kind of back and forth between AMD and Nvidia since the days of the 9800 GTX vs the HD 4870.

  6. #131
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Hahaha... beats the 290X... Good one.
    FXAA, minimal AA settings, max tessellation, mainly synthetics.

    Quote Originally Posted by tajoh111 View Post
    Is the power consumption lower than Thermi part 2, AKA the 290X? If these are the clocks of a stock card with just the regular fan, imagine the GTX Lightning versions of the 780 Ti: 1050 MHz base and 1150 MHz boost clocks.

    Look at the Galaxy GTX 780 HOF edition and you will see B1 silicon is already being used on some cards.
    The Galaxy HOF is still using A1...
    Thermi part 2? Wow... stretching reality a bit, aren't you?

    Quote Originally Posted by Tim View Post
    The big Hawaii show was basically cut short by a simple price drop and the launch of a Ti. Man.....
    Not at all. It just means Nvidia is trying to stay competitive.

    Quote Originally Posted by tajoh111 View Post
    Hawaii could still do well and still do Nvidia a lot of damage if they get their stock situation sorted out. And from the looks of it, they didn't even deliver that entire 8,000-unit run of BF4 editions; on HardOCP, new stock coming in is still shipping with BF4, and that was as of yesterday. So that 8,000 quantity was not just pre-order editions, as people who ordered on Wednesday were getting BF4 editions.

    So much for tens of thousands.
    Where do you see anything different?
    The 290X is clearly going in and out of stock multiple times a day.

    Quote Originally Posted by AliG View Post
    Mind explaining the difference and how it relates to the gtx 780 ti?
    Different bins of silicon. They are saying the 780 Ti is a metal respin, which is extremely unlikely, though not impossible.
    It would mean there was some inherent architectural flaw with GK110 that they need to fix.
    Last edited by LordEC911; 10-30-2013 at 05:45 PM.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  7. #132
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    FXAA is what most people use, plus maybe 2x MSAA (4x on older games) at high resolutions like 2560x1440 or better; anything more isn't performant enough to be worth using (frame rates nobody would actually want to play at, and basically useless for visual quality in motion anyway). I game at 2560x1440 at 110 Hz; many use 1080p 144 Hz or 2560x1440, many of them at high refresh rates now that 27" PLS/IPS panels have gotten so inexpensive.

    I have to agree that Nvidia has pretty much cut Hawaii's show short. Who on Earth is going to buy a $550 card that runs at 95°C (consistently hotter/louder than Fermi), has little OC headroom, and is loud, when you can buy a $500 one with lots of OC headroom, generally the same performance once you account for pre-heated benchmarks rather than short sprints before the 290X throttles (which happens during normal gaming; clocks drop to 800-900 MHz, not 1000 MHz+), and a quiet heatsink (or custom designs for $510)?

    The 290X has had tiny numbers of units come into stock and sell out; that doesn't mean they are selling well, just that the quantities are minute. As he said, so much for tens of thousands...

    Competition is good, but unless you have a huge favoritism towards one brand there's a clear and obvious choice: the post-price-drop GTX 780, at $500 ($510-520 with custom coolers), barely any slower (if at all) during actual gaming except at 4K, a market so tiny as to be nonexistent so far, while being quiet and cool.
    Last edited by GoldenTiger; 10-30-2013 at 06:07 PM.

  8. #133
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by GoldenTiger View Post
    FXAA is what most people use, plus maybe 2x MSAA (4x on older games) at high resolutions like 2560x1440 or better; anything more isn't performant enough to be worth using (frame rates nobody would actually want to play at, and basically useless for visual quality in motion anyway).

    I have to agree that Nvidia has pretty much cut Hawaii's show short. Who on Earth is going to buy a $550 card that runs at 95°C, has little OC headroom, and is loud, when you can buy a $500 one with lots of OC headroom, generally the same performance once you account for pre-heated benchmarks rather than short sprints before the 290X throttles (which happens during normal gaming; clocks drop to 800-900 MHz, not 1000 MHz+), and a quiet heatsink (or custom designs for $510)?

    The 290X has had tiny numbers of units come into stock and sell out; that doesn't mean they are selling well, just that the quantities are minute. As he said, so much for tens of thousands...

    Competition is good, but unless you have a huge favoritism towards one brand there's a clear and obvious choice: the post-price-drop GTX 780, at $500 ($510-520 with custom coolers), barely any slower (if at all) during actual gaming except at 4K, a market so tiny as to be nonexistent so far, while being quiet and cool.
    I'm honestly perfectly fine with all that. Sure, it'll be disappointing for AMD if they can't pull as much profit as they would have expected from Hawaii, but from my perspective they've done their job. We wouldn't be seeing a GTX 780 GHz Edition for $550 if it weren't for the 290X, and I wouldn't be surprised if AMD sweetens their bundle with a bunch of games like the 7970 GHz packages.

    Competition is not just good, it is the best thing we could ask for. I really don't care who manufactures my hardware as long as I'm getting good value for my money, and that normally won't happen if a company can rest on its laurels (just look at Intel; for what I do, my 2500K really isn't worth upgrading).

  9. #134
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    @AliG absolutely, I'm VERY glad to see fierce competition here: better pricing and performance for everyone, regardless of who the "winner" is. I'd hate to see what's happened with CPUs happen with GPUs... talk about bad for consumers. The only thing better than the current GPU fight would be if AMD had a good card in the 20nm generation too (and I'd be cheering its release) and we saw this kind of competition there!

  10. #135
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by GoldenTiger View Post
    FXAA is what most people use, plus maybe 2x MSAA (4x on older games) at high resolutions like 2560x1440 or better; anything more isn't performant enough to be worth using (frame rates nobody would actually want to play at, and basically useless for visual quality in motion anyway). I game at 2560x1440 at 110 Hz; many use 1080p 144 Hz or 2560x1440, many of them at high refresh rates now that 27" PLS/IPS panels have gotten so inexpensive.

    I have to agree that Nvidia has pretty much cut Hawaii's show short. Who on Earth is going to buy a $550 card that runs at 95°C (consistently hotter/louder than Fermi), has little OC headroom, and is loud, when you can buy a $500 one with lots of OC headroom, generally the same performance once you account for pre-heated benchmarks rather than short sprints before the 290X throttles (which happens during normal gaming; clocks drop to 800-900 MHz, not 1000 MHz+), and a quiet heatsink (or custom designs for $510)?

    The 290X has had tiny numbers of units come into stock and sell out; that doesn't mean they are selling well, just that the quantities are minute. As he said, so much for tens of thousands...

    Competition is good, but unless you have a huge favoritism towards one brand there's a clear and obvious choice: the post-price-drop GTX 780, at $500 ($510-520 with custom coolers), barely any slower (if at all) during actual gaming except at 4K, a market so tiny as to be nonexistent so far, while being quiet and cool.
    You don't buy a $500+ card to run FXAA at 1080p...

    Hotter and louder than Fermi? Completely false... It actually pulls less power than Fermi.

    It boosts over 900 MHz consistently in gaming situations, so that is false.

    There have been tens of thousands shipped by partners.
    Tiny amounts? You do realize how many are on a pallet that gets shipped to the e-tailers, right?

    Sorry, Nvidia just evened the playing field, they didn't take any sort of advantage.

  11. #136
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by LordEC911 View Post
    You don't buy a $500+ card to run FXAA at 1080p...

    Hotter and louder than Fermi? Completely false... It actually pulls less power than Fermi.

    It boosts over 900 MHz consistently in gaming situations, so that is false.

    There have been tens of thousands shipped by partners.
    Tiny amounts? You do realize how many are on a pallet that gets shipped to the e-tailers, right?

    Sorry, Nvidia just evened the playing field, they didn't take any sort of advantage.
    You do if you don't like alpha and shader aliasing, which is far worse than edge aliasing; most people know this. Most people running 1080p don't run 60 Hz nowadays, or they'd be on an IPS panel instead, whether 1080p 60 Hz or 2560x1440 60-120 Hz.

    Hotter/louder indeed; power consumption is a different thing.

    It boosts that high on some non-pre-heated runs, but when pre-heated or in actual gaming it often falls to 800-900 MHz, as I said, rather than 1 GHz.

    Tens of thousands shipped? I doubt it, since Newegg seems to receive 5-10 at a time and thus sells out quickly due to the tiny quantity.

    $500 plus $100 or more worth of games (ignoring the Shield coupon) versus $550 plus no games... with similar performance but better acoustics/thermals, most people will go for the better-priced option. Don't get me wrong, I am VERY glad we're finally seeing good competition; it's only a good thing for us as end-users!
    Last edited by GoldenTiger; 10-30-2013 at 06:42 PM.

  12. #137
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by GoldenTiger View Post
    @AliG absolutely, I'm VERY glad to see fierce competition here: better pricing and performance for everyone, regardless of who the "winner" is. I'd hate to see what's happened with CPUs happen with GPUs... talk about bad for consumers. The only thing better than the current GPU fight would be if AMD had a good card in the 20nm generation too (and I'd be cheering its release) and we saw this kind of competition there!
    Yup, once TSMC finally rolls out their 20nm process, things will get real.

  13. #138
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    But speaking of stagnant CPU innovation, a lot of people have completely overlooked just how far peripherals have quietly come along in the background. I just got a Corsair K95, and holy is a mechanical keyboard a massive upgrade over my previous ergonomic one. Interestingly, it's almost more comfortable to type on: even though I sacrificed the wrist support, the keys are just so damn responsive. Or you could look at the Korean 1440p monitors, and how SSDs are finally affordable. In general, I think there's a lot of nifty stuff that's been rolling out without much noise.

  14. #139
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by LordEC911 View Post
    You don't buy a $500+ card to run FXAA at 1080p...

    Hotter and louder than Fermi? Completely false... It actually pulls less power than Fermi.

    It boosts over 900 MHz consistently in gaming situations, so that is false.

    There have been tens of thousands shipped by partners.
    Tiny amounts? You do realize how many are on a pallet that gets shipped to the e-tailers, right?

    Sorry, Nvidia just evened the playing field, they didn't take any sort of advantage.
    At Guru3D and TechPowerUp, in gaming loads, it pulls similar amounts to or more than a GTX 480. Maybe only by 20 watts at times, but that is still more. So it's a pretty safe educated guess that the 290X consumes as much power as Fermi.

    In addition, they both operate at 95 degrees, and at that point the 290X still continued to throttle in HardOCP's testing. HardOCP had to set the fan beyond the uber limits to stop it from throttling so it could maintain 1000 MHz.

    At that point it actually beats Titan significantly (maybe 10%), but the noise goes up beyond Fermi levels, beyond dual cards.

    So it is very much a Fermi, but Nvidia had a better excuse for it: that was first-generation silicon, and an even bigger chip than the 290X. AMD has lots of experience with 28nm, and they are designing a refinement of GCN rather than a whole new architecture like Fermi was.

    The amount of heat it can generate is spectacular.

    http://hardforum.com/showpost.php?p=...10&postcount=2

    An overclocked 290X can get into the 80s C even with water cooling; I have never seen a card do that.

    Add in the overvolting and overclocking study at TechPowerUp and the 290X is indeed piping hot.
    Last edited by tajoh111; 10-30-2013 at 07:57 PM.
    Core i7 920@ 4.66ghz(H2O)
    6gb OCZ platinum
    4870x2 + 4890 in Trifire
    2*640 WD Blacks
    750GB Seagate.

  15. #140
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by tajoh111 View Post
    At Guru3D and TechPowerUp, in gaming loads, it pulls similar amounts to or more than a GTX 480. Maybe only by 20 watts at times, but that is still more. So it's a pretty safe educated guess that the 290X consumes as much power as Fermi.

    In addition, they both operate at 95 degrees, and at that point the 290X still continued to throttle in HardOCP's testing. HardOCP had to set the fan beyond the uber limits to stop it from throttling so it could maintain 1000 MHz.

    At that point it actually beats Titan significantly (maybe 10%), but the noise goes up beyond Fermi levels, beyond dual cards.

    So it is very much a Fermi, but Nvidia had a better excuse for it: that was first-generation silicon, and an even bigger chip than the 290X. AMD has lots of experience with 28nm, and they are designing a refinement of GCN rather than a whole new architecture like Fermi was.

    The amount of heat it can generate is spectacular.

    http://hardforum.com/showpost.php?p=...10&postcount=2

    An overclocked 290X can get into the 80s C even with water cooling; I have never seen a card do that.

    Add in the overvolting and overclocking study at TechPowerUp and the 290X is indeed piping hot.
    So similar to Fermi, not worse. The Fermi fan does a good 65-70 dB at stock; I didn't see the 290X going over 60 dB.
    Nvidia's excuse was that they had a broken architecture that could barely beat a chip almost half its size. Nvidia simply didn't have a choice.
    AMD made a decision to set the limits where they did in order to balance the design.

    That 80C is false. If you believe that, well then I can't help you.

  16. #141
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by AliG View Post
    Mind explaining the difference and how it relates to the gtx 780 ti?
    Sure.
    PedantOne said the 780 Ti is GK110-x00-B2.
    I asked what B1 was (and A1 for that matter).

    If the 780 Ti has indeed 2880 cores, it is not GK110-400-xx but GK110-500 (or 600 or whatever)-xx.
    Since the first fully enabled GK110 GPUs entered the market only very recently, or are not even released yet (Quadro K6000, Tesla K40), the 780 Ti will almost certainly not use different silicon, but the same. Thus there are no previous 2880-core GPUs that could have been A1 or B1 in order for a B2 to exist.

    Unless the stepping is independent of the number of enabled SMX clusters. But even then, why not B1 for the 780 Ti?

    Edit:
    tajoh111 is right. The 290X pulls more power than Fermi with the uber BIOS, and significantly so:
    http://www.3dcenter.org/artikel/laun...9-290x-seite-2
    http://www.3dcenter.org/artikel/eine...tromverbrauchs
    279 W on average for the 290X in uber mode
    235 W on average for the GTX 480
    Both values are averages of multiple measurements of the card alone, so it doesn't get more accurate than that.

    • The 290X is late, just like Fermi was.
    • It is hot and loud, just like Fermi was.
    • It doesn't beat the competition on even ground across the board (quiet BIOS vs Titan stock, or uber BIOS vs Titan at maximized targets), just like Fermi. In 4K it wins slightly, by under 10%, but loses at 1080p or with SGSSAA:
      http://ht4u.net/reviews/2013/amd_rad...ew/index46.php
      https://www.computerbase.de/artikel/...90x-im-test/5/
    • It uses more power than the competition in either comparison (see above), and the same as or more than Fermi depending on the BIOS mode.



    Now the positives:
    The 290X's perf/W doesn't fall as far behind Titan's as the GTX 480's did behind the 5870's.
    The GPU is smaller, a great feat of engineering.
    Price.
    Last edited by boxleitnerb; 10-30-2013 at 10:05 PM.

  17. #142
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by LordEC911 View Post
    So similar to Fermi, not worse. The Fermi fan does a good 65-70 dB at stock; I didn't see the 290X going over 60 dB.
    Nvidia's excuse was that they had a broken architecture that could barely beat a chip almost half its size. Nvidia simply didn't have a choice.
    AMD made a decision to set the limits where they did in order to balance the design.

    That 80C is false. If you believe that, well then I can't help you.
    Considering the power consumption scaling we have seen, I could see it getting that hot. W1zzard was already there after applying just a bit more voltage and overclocking to 1190 MHz: his system consumed 650 watts (originally 400), which meant the card was consuming some 550 watts. This was at 1.26 volts. An overvolted GTX 780 Lightning consumed less than 400 watts for the whole system (around 300 watts for the card) in the same test, and it was overvolted even more and clocked higher as well. I have seen people run a 290X at 1.4 volts under air, so if someone wants to push water even further, and anything else is in the loop, I could see the loop being overwhelmed and reaching those temps. Some of Intel's hotter chips are capable of reaching 80 under water when sufficiently clocked, and from what I have seen Fermi could push near 70 in a water-cooling loop when stressed, and I don't think it is capable of drinking as much power as an overclocked 290X. I would be scared to see what kind of voltage this thing can drink down at greater than 1.4 volts. I don't think we have anything that uses as much power as a 290X in terms of how fast power consumption goes up as we overclock further.

    Also, according to TechPowerUp, in uber mode the card is just as loud as a GTX 480; both cards are 9 decibels above a GTX 580.
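    For context on what a 9 dB gap means: a decibel difference converts to a sound-power ratio of 10^(dB/10), and the usual rule of thumb is that +10 dB is perceived as roughly twice as loud. A quick check:

    # Convert a decibel difference into approximate ratios.
    db_delta = 9
    power_ratio = 10 ** (db_delta / 10)     # ~7.9x the sound power
    loudness_ratio = 2 ** (db_delta / 10)   # ~1.9x perceived loudness (rule of thumb)
    print(f"+{db_delta} dB = {power_ratio:.1f}x sound power, ~{loudness_ratio:.1f}x as loud")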

    You might hate to admit it, but Thermi and the 290X have a lot in common as far as thermals and temps go. It's almost striking.
    Last edited by tajoh111; 10-30-2013 at 09:45 PM.

  18. #143
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by GoldenTiger View Post
    FXAA is what most people use
    Poll results:
    I enable Post AA (any level) (or leave it enabled as the default setting): 33 votes (26.40%)
    I disable Post AA: 92 votes (73.60%)
    http://hardforum.com/showthread.php?t=1788367
    Last edited by Final8ty; 10-31-2013 at 11:40 AM.

  19. #144
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    So we go from "the 290X is better than Titan, fact."
    To "we can't compare the 290X to Titan because the GTX 780 is basically Titan for cheap."
    To "on even footing the 290X doesn't beat Titan."

    Make up your mind and please stop shifting the goalposts.


    Fermi doesn't pull only 235 W; it can pull over 300 W in gaming situations.

    The 290X isn't competing against a card that has a max power consumption of 180 W.
    The GTX 480 was consistently pulling ~100 W more in gaming situations to give ~10% more performance.
    The 290X is using ~30-40 W more to beat the competition by ~15% on average.
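    Taking those deltas at face value, here is a quick perf-per-watt sanity check. The baseline wattages below are assumptions purely for illustration; only the +100 W / ~10% and +30-40 W / ~15% figures come from the post.

    # Perf/W of the faster card relative to its competitor (1.0 = identical).
    # Baseline wattages are illustrative assumptions, not measured values.
    def relative_perf_per_watt(base_watts, extra_watts, perf_gain):
        return (1.0 + perf_gain) * base_watts / (base_watts + extra_watts)

    # GTX 480 vs HD 5870: assume ~150 W baseline, +100 W for ~10% more performance
    print("GTX 480 vs 5870:", round(relative_perf_per_watt(150, 100, 0.10), 2))
    # R9 290X vs GTX 780: assume ~200 W baseline, +35 W for ~15% more performance
    print("290X vs GTX 780:", round(relative_perf_per_watt(200, 35, 0.15), 2))

    On those assumptions the GTX 480 lands at roughly 0.66x its competitor's perf/W while the 290X lands around 0.98x, which is the gap boxleitnerb described above.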

  20. #145
    NooB MOD
    Join Date
    Jan 2006
    Location
    South Africa
    Posts
    5,799
    Quote Originally Posted by zalbard View Post
    Aside from shader count, it's double-precision performance: 1/3 of the FP32 rate vs 1/24.
    I expect the 780 Ti to stick to 1/24.
    ^^ This. There is no longer a desktop card with insane floating-point power. If the 780 Ti were 1/3, it would be a $1500+ card so as not to hurt workstation sales.
    Xtreme SUPERCOMPUTER
    Nov 1 - Nov 8 Join Now!


    Quote Originally Posted by Jowy Atreides View Post
    Intel is about to get athlon'd
    Athlon64 3700+ KACAE 0605APAW @ 3455MHz 314x11 1.92v/Vapochill || Core 2 Duo E8500 Q807 @ 6060MHz 638x9.5 1.95v LN2 @ -120'c || Athlon64 FX-55 CABCE 0516WPMW @ 3916MHz 261x15 1.802v/LN2 @ -40c || DFI LP UT CFX3200-DR || DFI LP UT NF4 SLI-DR || DFI LP UT NF4 Ultra D || Sapphire X1950XT || 2x256MB Kingston HyperX BH-5 @ 290MHz 2-2-2-5 3.94v || 2x256MB G.Skill TCCD @ 350MHz 3-4-4-8 3.1v || 2x256MB Kingston HyperX BH-5 @ 294MHz 2-2-2-5 3.94v

  21. #146
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by LordEC911 View Post
    So we go from "the 290X is better than Titan, fact."
    To "we can't compare the 290X to Titan because the GTX 780 is basically Titan for cheap."
    To "on even footing the 290X doesn't beat Titan."

    Make up your mind and please stop shifting the goalposts.


    Fermi doesn't pull only 235 W; it can pull over 300 W in gaming situations.

    The 290X isn't competing against a card that has a max power consumption of 180 W.
    The GTX 480 was consistently pulling ~100 W more in gaming situations to give ~10% more performance.
    The 290X is using ~30-40 W more to beat the competition by ~15% on average.
    I already pointed you to sites showing the 290X can pull as much as or more than a GTX 480 in the same scenarios.

    The 290X is a complex product because it tries to offer two modes at once: one where it is a hair slower than Titan (quiet mode).

    And a second mode where it is a bit faster than a GTX Titan. It's the nature of the beast. No other cards have this quiet mode.

    Maybe 10 percent more on average against a GTX 780, not 15 percent, unless perhaps we turn on uber mode.

    The GTX 780 is so neutered and underclocked that it might as well be a small chip like the 5870. It's Nvidia's attempt to sell heavily cut-down GK110 dies with no binning. If we consider how mangled it is and how hard AMD is pushing their own chip (which is also fully enabled), it should be no surprise it gets beaten, particularly with more and more Gaming Evolved titles being used.

    Fermi might not have shown its worth in games for its size, but it more than made up for it in its primary purpose, the professional market.

    It was the best professional compute card ever at the time; it literally made trash of anything else available then. It spanked AMD's professional cards, often being twice as fast and sometimes four times as fast. Even today, AMD's W9000 cards have a hard time competing against it, and that is against a cut-down chip with very low clocks (HotHardware and Tom's Hardware reviews). So even though it had high power consumption, it justified it with other capabilities that are yet to be seen on the 290X.

  22. #147
    Xtreme Enthusiast
    Join Date
    Dec 2010
    Posts
    594
    Quote Originally Posted by LordEC911 View Post
    So we go from "the 290X is better than Titan, fact."
    To "we can't compare the 290X to Titan because the GTX 780 is basically Titan for cheap."
    To "on even footing the 290X doesn't beat Titan."
    Did you overlook where I said "on even ground"? I'll reiterate for you:
    quiet mode (stock setting) vs. Titan stock setting
    OR
    uber mode vs Titan@maxed targets

    Titan and the 290X are equally fast if you value a fair comparison - that's a fact. Don't fall for most of the English-speaking reviews - they are biased towards AMD (maybe not even intentionally) by not pre-heating their cards properly, or at all, and by testing uber mode against Titan/780 stock settings with a temp target of 80°C.

    Read the review links I posted if you don't believe me. And don't quote selectively, quote whole sentences.
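    For what it's worth, "properly pre-heating" just means running the card under load until it reaches thermal steady state (and any boost throttling has kicked in) before recording numbers. A minimal sketch of the idea; the timings and the render_frame() callable are placeholders, not any review site's actual methodology:

    import time

    WARMUP_SECONDS = 600     # e.g. ~10 minutes of load so clocks and temps settle
    MEASURE_SECONDS = 60

    def preheated_benchmark(render_frame):
        # Warm-up phase: keep the GPU under load, discard the results.
        start = time.time()
        while time.time() - start < WARMUP_SECONDS:
            render_frame()
        # Measurement phase: count frames only after steady state is reached.
        frames, start = 0, time.time()
        while time.time() - start < MEASURE_SECONDS:
            render_frame()
            frames += 1
        return frames / MEASURE_SECONDS   # average FPS at sustained clocks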

    Quote Originally Posted by LordEC911 View Post
    Fermi doesn't pull only 235 W; it can pull over 300 W in gaming situations.
    The 290X isn't competing against a card that has a max power consumption of 180 W.
    The GTX 480 was consistently pulling ~100 W more in gaming situations to give ~10% more performance.
    The 290X is using ~30-40 W more to beat the competition by ~15% on average.
    Source for your 300+ W claim?
    The 290X certainly can, at 306.88 W:
    http://ht4u.net/reviews/2013/amd_rad...ew/index21.php

    You do understand how an average works, right? In some games a card may pull more than the average, in others less. You won't find better power consumption data than what I posted.
    And maybe you also overlooked where I said that the 290X wasn't as awful in perf/W compared to the competition as Fermi was.

    And you are the one who should make up your mind. What is it now - a Titan killer? Then compare it to Titan, which it doesn't beat. Or a GTX 780 killer? Okay, if you want to call 10-15% a "killer", fine. But then also use the GTX 780 power numbers instead of Titan's: 189 W (GTX 780) vs 239 W (290X quiet mode). That is a hefty 50 W more.
    Last edited by boxleitnerb; 10-30-2013 at 11:16 PM.

  23. #148
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    More evidence of the 290X's power-drinking ways.

    http://www.anandtech.com/show/2977/n...rth-the-wait-/

    http://www.anandtech.com/show/7457/t...9-290x-review/

    Guess what these reviews have in common.

    There is a 5870 in both of them.

    And there are cards in them that use about 100 more watts than a 5870..... guess which cards those are.

    The GTX 480 and the 290X (in quiet mode). With the card in uber mode this grows to about 130 watts, which would make it the biggest power-drinking single-GPU card of all time by a comfortable margin.
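    The comparison being made here is simply a delta between whole-system readings taken on the same test rig; a trivial sketch of that reasoning, with placeholder wattages rather than Anandtech's actual numbers:

    # With an identical test system, the difference in wall power between two cards
    # approximates the card-to-card power delta (ignoring PSU efficiency and any
    # change in CPU load). The wattages below are placeholders, not review data.
    def card_power_delta(system_watts_a, system_watts_b):
        return system_watts_a - system_watts_b

    system_with_290x = 390   # hypothetical whole-system draw under a game load
    system_with_5870 = 290   # hypothetical draw with the HD 5870, same test
    print(card_power_delta(system_with_290x, system_with_5870), "W more than the 5870")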

    If Nvidia hadn't cut down the GTX 780 so badly and clocked it so conservatively, AMD wouldn't even have that lead. The GTX 780 can be clocked quite a bit higher without increasing power consumption much, if we look at some reviews of overclocked models at HardOCP or TechPowerUp.
    Last edited by tajoh111; 10-30-2013 at 11:29 PM.

  24. #149
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by boxleitnerb View Post
    Did you overlook where I said "on even ground"? I'll reiterate for you:
    quiet mode (stock setting) vs. Titan stock setting
    OR
    uber mode vs Titan@maxed targets

    Titan and the 290X are equally fast if you value a fair comparison - that's a fact. Don't fall for most of the English-speaking reviews - they are biased towards AMD (maybe not even intentionally) by not pre-heating their cards properly, or at all, and by testing uber mode against Titan/780 stock settings with a temp target of 80°C.

    Read the review links I posted if you don't believe me. And don't quote selectively, quote whole sentences.
    OK... Where was all this "even footing," "we need to change our testing/benching methodology" talk the past 2-3 years?
    Why do you need to shift the parameters of the test? Why can't you test stock vs stock?
    That's right, because the results don't show what you want.


    Quote Originally Posted by boxleitnerb View Post
    Source for your 300+ W claim?
    The 290X certainly can, at 306.88 W:
    http://ht4u.net/reviews/2013/amd_rad...ew/index21.php

    You do understand how an average works, right? In some games a card may pull more than the average, in others less. You won't find better power consumption data than what I posted.
    And maybe you also overlooked where I said that the 290X wasn't as awful in perf/W compared to the competition as Fermi was.

    And you are the one who should make up your mind. What is it now - a Titan killer? Then compare it to Titan, which it doesn't beat. Or a GTX 780 killer? Okay, if you want to call 10-15% a "killer", fine. But then also use the GTX 780 power numbers instead of Titan's: 189 W (GTX 780) vs 239 W (290X quiet mode). That is a hefty 50 W more.
    Look at any GTX 480 review.
    No, I don't know what an average means... please explain.

    I bet I can, because those results look extremely strange. The GTX 780 and Titan pulling less than 200 W?
    There is something screwy going on there, because those results don't reflect what other sites have found.

    Quote Originally Posted by tajoh111 View Post
    More evidence for 290x power drinking ways.
    *snip*
    Ahhh... good old scientific method, let's just go ahead and throw you out the window, we don't need you.

    All I hear is "ifs," "ands," and "buts"; don't worry about comparing what is, we can only discuss what could be.
    Last edited by LordEC911; 10-30-2013 at 11:35 PM.

  25. #150
    Xtreme Addict
    Join Date
    Mar 2006
    Location
    Saskatchewan, Canada
    Posts
    2,207
    Quote Originally Posted by LordEC911 View Post
    OK... Where was all this "even footing," "we need to change our testing/benching methodology" talk the past 2-3 years?
    Why do you need to shift the parameters of the test? Why can't you test stock vs stock?
    That's right, because the results don't show what you want.




    Look at any GTX 480 review.
    No, I don't know what an average means... please explain.

    I bet I can, because those results look extremely strange. The GTX 780 and Titan pulling less than 200 W?
    There is something screwy going on there, because those results don't reflect what other sites have found.


    Ahhh... good old scientific method, let's just go ahead and throw you out the window, we don't need you.

    All I hear is "ifs," "ands," and "buts"; don't worry about comparing what is, we can only discuss what could be.
    And let's discard anything that might paint AMD in a bad light. Those graphs are as concrete as you can get.
