
Thread: GeForce GTX 280 is Three Times Faster in F@H than HD3870

  1. #51
    Xtreme Member
    Join Date
    Apr 2007
    Posts
    280
    Does anyone know if this GTX 280 will have aftermarket air cooling, or will it be like the 9800GX2, which can only be cooled with water?

  2. #52
    Xtreme Enthusiast
    Join Date
    Jul 2007
    Posts
    668
    Quote Originally Posted by Isaac MM View Post
    Does anyone know if this GTX 280 will have aftermarket air cooling, or will it be like the 9800GX2, which can only be cooled with water?
    You'll probably be able to use the HR-03 Plus if the mounting distance is the same. It can be cooled by air coolers, but they will have to be very powerful to dissipate 240W.

  3. #53
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by trinibwoy View Post
    True, they really should've waited for AMD to send them a few early 4870 samples to use in their presentation. I know I would have.....

    And it is really lame of them to show numbers from their upcoming products at a conference about their upcoming products. How dastardly!


    For those who didn't get it, the point was:
    - They should have thrown in some 3870X2 numbers (which really doubles the single 3870 performance on F@H, since you can run two clients at the same time).
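    For reference, here is a minimal sketch of the two-client approach described above: one GPU console client per GPU, each launched from its own working directory so their work units and logs do not collide. The client executable name and the "-gpu"/"-local" switches are assumptions about the 2008-era GPU console client, not confirmed details, and a later post in this thread (#69) links a foldingforum.org report that a second client on the 3870X2's second GPU may not work at all.

    ```python
    # Hedged sketch: launch two Folding@home GPU console clients, one per GPU index,
    # each in its own working directory. The executable name and the "-gpu <index>" /
    # "-local" switches are assumptions -- check your client's -help output.
    import os
    import subprocess

    CLIENT_EXE = r"C:\FAH\Folding@home-Win32-GPU.exe"  # assumed install path/name

    procs = []
    for gpu_index in (0, 1):                      # the two GPUs on a 3870X2
        workdir = os.path.join(r"C:\FAH", f"gpu{gpu_index}")
        os.makedirs(workdir, exist_ok=True)       # separate folder per client instance
        procs.append(subprocess.Popen(
            [CLIENT_EXE, "-gpu", str(gpu_index), "-local"],
            cwd=workdir,                          # keep config/work local to this folder
        ))

    for p in procs:
        p.wait()                                  # block until both clients exit
    ```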

  4. #54
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by ToTTenTranz View Post
    For those who didn't get it, the point was:
    - They should have thrown in some 3870X2 numbers (which really doubles the single 3870 performance on F@H, since you can run two clients at the same time).
    Exactly. The 3870X2 is still a multi-GPU solution requiring two separate instances of F@H. So I'm not sure why you think it should have been included in that graph, or what benefit it would have been to the audience.

  5. #55
    Xtreme Legend
    Join Date
    Mar 2005
    Location
    Australia
    Posts
    17,242
    Quote Originally Posted by NH|Delph1 View Post
    That's my distorted picture!

    The original can be found over at a certain site which reported on the NVIDIA F@H client but fumbled with the pictures. I didn't want to rat them out, but it's still interesting news...

    http://www.nordichardware.com/news,7777.html



    //Andreas
    Damn dude, you were there? Shame I didn't realise at the time.
    Team.AU
    Got tube?
    GIGABYTE Australia
    Need a GIGABYTE bios or support?



  6. #56
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by metro.cl View Post
    You honestly think AMD would have sent NVIDIA a card that is yet to be released, and not high end, to be put against the high end?
    Heh, I thought I had laid the sarcasm on pretty thick there....guess it still wasn't enough.

  7. #57
    Xtreme Member
    Join Date
    Dec 2006
    Location
    Edmonton,Alberta
    Posts
    182
    Does the GTX 280 CUDA client beat the HD3870 CAL client by 300%? I believe it would.

    I want to see what CUDA would do against CTM results.

  8. #58
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    Quote Originally Posted by metro.cl View Post
    Not sure how they measured it, but this GeForce is looking like a monster.



    You honestly think AMD would have sent NVIDIA a card that is yet to be released, and not high end, to be put against the high end?



    Maybe because NVIDIA didn't support F@H before? Plus, it shows the power of the new card.



    How can you claim that? Have you got all 3 cards and tested them?

    If you could read and understand the whole sentence, you would have realized that if these numbers had been translated into gaming frames per second, it would have had similar performance compared to 9800GX2, which is a card that will lose to 4870X2.

  9. #59
    Xtreme Enthusiast
    Join Date
    May 2006
    Location
    Austria
    Posts
    532
    Quote Originally Posted by Nuker_ View Post
    If you could read and understand the whole sentence, you would have realized that if these numbers had been translated into gaming frames per second, it would have had similar performance compared to 9800GX2, which is a card that will lose to 4870X2.
    And you are Charlie Demerjian? Or the Oracle of Delphi?

    Am I the only one who thinks it would be pathetic if a dual-card setup from AMD could not obliterate a single NVIDIA card? Besides, NV can release a dual card themselves, so it is pretty irrelevant whether a 4870X2 is superior.

    Dunno, it seems impressive, and it probably is, as Dinos gave us a clear hint.
    Quote Originally Posted by freecableguy
    the idiots out number us 10,000:1

  10. #60
    Banned
    Join Date
    May 2005
    Location
    Belgium, Dendermonde
    Posts
    1,292
    Quote Originally Posted by Jacky View Post
    And you are Charlie Demerjian? Or the Oracle of Delphi?

    Am I the only one who thinks it would be pathetic if a dual-card setup from AMD could not obliterate a single NVIDIA card? Besides, NV can release a dual card themselves, so it is pretty irrelevant whether a 4870X2 is superior.

    Dunno, it seems impressive, and it probably is, as Dinos gave us a clear hint.
    A GT200 has a TDP of 238W; do you really want a dual-die, single-PCB card with that?

  11. #61
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    GTX 280

    4xxx series.

    More revenue for the 'checkout-chick' e-tailers.
    i7 3610QM 1.2-3.2GHz

  12. #62
    Xtreme Enthusiast
    Join Date
    May 2006
    Location
    Austria
    Posts
    532
    Quote Originally Posted by GoThr3k View Post
    A GT200 has a TDP of 238W; do you really want a dual-die, single-PCB card with that?
    Well, normally they'd use an underclocked GTS variant; whether it would be feasible in any way, I have no clue.
    Quote Originally Posted by freecableguy
    the idiots out number us 10,000:1

  13. #63
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    Quote Originally Posted by Jacky View Post
    And you are Charlie Demerjian? Or the Oracle of Delphi?

    Am I the only one who thinks it would be pathetic if a dual-card setup from AMD could not obliterate a single NVIDIA card? Besides, NV can release a dual card themselves, so it is pretty irrelevant whether a 4870X2 is superior.

    Dunno, it seems impressive, and it probably is, as Dinos gave us a clear hint.
    No, but AMD would be in a very bad position should the 4870X2 lose to the 9800GX2. I do, however, find that very unlikely. If nV were to release a GTX 280 GX2, those 1 kW+ power supplies would finally become useful.
    Last edited by Nuker_; 05-25-2008 at 08:03 AM.

  14. #64
    Xtreme Member
    Join Date
    Apr 2007
    Posts
    280
    Is there any confirmation which one is the fastest? GTX 280 or 4870X2?

  15. #65
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Isaac MM View Post
    Is there any confirmation which one is the fastest? GTX 280 or 4870X2?
    How can there be a confirmation if nothing but rumors are available for both cards?

    Wait till the 18th of June to find out.

  16. #66
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    Quote Originally Posted by Hornet331 View Post
    How can there be a confirmation if nothing but rumors are available for both cards?

    Wait till the 18th of June to find out.
    That can't be true. Is there an NDA till release day? We have always seen benches some weeks before launch.

  17. #67
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Kilkenny, Ireland
    Posts
    259
    NDA is June 17th

  18. #68
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    That's just folding and just pre-marketing. Remember the old news where the 2900XT was supposed to be n times faster than the 8800GTX; it even had monster specifications, but the results were, how should I put it: lame! Now nVidia is doing the same thing. I doubt the GTX 280 will be even 2x faster than the HD 3870 in games, even with its 6+8 power connectors... maybe 3 times the power consumption, now that I might believe.

  19. #69
    Xtreme Member
    Join Date
    Feb 2007
    Location
    Dallas, TX
    Posts
    311
    Quote Originally Posted by ToTTenTranz View Post
    For those who didn't get it, the point was:
    - They should have thrown in some 3870X2 numbers (which really doubles the single 3870 performance on F@H, since you can run two clients at the same time).
    If you can link/post a guide for doing this, it would be much appreciated. According to the folding forum, running a second GPU client on the second core of the 3870X2 doesn't work. See, for example:

    http://foldingforum.org/viewtopic.php?f=42&t=2145
    2600K | Maximus IV Formula | 12G Corsair 1600 C8 | 2x 6950 | Coolermaster Scout

  20. #70
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by XSAlliN View Post
    That's just folding and just pre-marketing. Remember the old news where the 2900XT was supposed to be n times faster than the 8800GTX; it even had monster specifications, but the results were, how should I put it: lame! Now nVidia is doing the same thing. I doubt the GTX 280 will be even 2x faster than the HD 3870 in games, even with its 6+8 power connectors... maybe 3 times the power consumption, now that I might believe.
    The 2900 is way faster at Maya, CAD, and Call of Warez (I think it's valid since it's sponsored by both ATI and NV), but in gaming, rendering takes a back seat to assembling frames, and it's not like low shader optimization and DX9 textures with DX10 shading/overlays help ATI out (don't flame, I'm saying NV = game, ATI = workstation).


    Quote Originally Posted by ToTTenTranz View Post
    For those who didn't get it, the point was:
    - They should have thrown in some 3870X2 numbers (which really doubles the single 3870 performance on F@H, since you can run two clients at the same time).
    There is a huge limit from slot bandwidth and from the CPU queuing up work for the GPU, and the client can't handle it (it will only touch 1 core).
    Last edited by zanzabar; 05-26-2008 at 10:45 PM.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  21. #71
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    I don't know if anyone has noticed, but this is actually a win for ATI.

    The HD 3870 costs 3.75X less than the $600 GTX 280: http://www.newegg.com/Product/Produc...82E16814161218

    So technically you get more performance per dollar with the 3870 than you do with the GTX 280.
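    To put rough numbers on that claim (a sketch using only the figures in this thread: ~3x the F@H throughput for the $600 GTX 280, and an HD 3870 at 3.75x less, i.e. about $160):

    ```python
    # Back-of-the-envelope perf-per-dollar from this thread's figures only
    # (claimed 3x F@H throughput, $600 GTX 280, HD 3870 at 3.75x less ~= $160).
    gtx280_price, gtx280_perf = 600.0, 3.0          # relative F@H throughput
    hd3870_price, hd3870_perf = 600.0 / 3.75, 1.0   # ~$160

    print(gtx280_perf / gtx280_price)   # ~0.0050 perf per dollar
    print(hd3870_perf / hd3870_price)   # ~0.00625 perf per dollar, ~25% more than the GTX 280
    ```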

    Perkam

  22. #72
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by perkam View Post
    I don't know if anyone has noticed, but this is actually a win for ATI.

    The HD 3870 costs 3.75X less than the $600 GTX 280: http://www.newegg.com/Product/Produc...82E16814161218

    So technically you get more performance per dollar with the 3870 than you do with the GTX 280.

    Perkam
    And instead of having a 240W TDP you get a fairly decent 330W TDP, much better indeed
    Are we there yet?

  23. #73
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by Luka_Aveiro View Post
    And instead of having a 240W TDP you get a fairly decent 330W TDP, much better indeed
    Meow?

    That plus the price difference means Nvidia cannot claim F@H crunching superiority with this demonstration alone. Frankly I'm surprised this thread was not debunked much earlier.

    Perkam

  24. #74
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    Quote Originally Posted by Luka_Aveiro View Post
    And instead of having a 240W TDP you get a fairly decent 330W TDP, much better indeed

    Says who? I own one of those, and I can assure you that the 330W TDP is fake; stressed at max with FurMark, it never got beyond 200W.

  25. #75
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by perkam View Post
    Meow?

    That plus the price difference means Nvidia cannot claim F@H crunching superiority with this demonstration alone. Frankly I'm surprised this thread was not debunked much earlier.

    Perkam
    TechReport is not quite right with those figures. Here are the figures from the HD 3870 X2.

    Absolutely no consistency. Also, I think he wanted you to scale AMD to the same performance. And price/performance on high-end parts vs low-end parts is... not right. It's like comparing a $100 2.4GHz Core 2 with a $1499 3.2GHz one.
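    To illustrate the "scale AMD to the same performance" point with the same claimed figures (a sketch, not measured data): matching one GTX 280's claimed F@H throughput would take roughly three HD 3870s, ignoring the multi-client scaling problems discussed earlier in the thread.

    ```python
    # Rough iso-performance cost from this thread's claimed figures only.
    import math

    hd3870_price = 600 / 3.75            # ~$160 per card
    cards_needed = math.ceil(3.0)        # GTX 280 claimed at ~3x one HD 3870 in F@H
    print(cards_needed * hd3870_price)   # ~$480 of HD 3870s versus one $600 GTX 280
    ```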
    Last edited by Shintai; 05-27-2008 at 05:27 AM.
    Crunching for Comrades and the Common good of the People.
