
Thread: Real Power Consumption - 4870 X2 & GTX295 out of Spec!


  1. #1
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    It's hardly a surprise, I guess. Pretty sad, though. GPU makers have been truly horrible for a long time. They push things too hard. We get short-lived products, and the consumer ends up paying twice in terms of the power bill, not to mention the dustbuster effect. (Rough cost math at the end of this post.)

    I mean, we sit here and talk about "uhh, this CPU uses 20-30W more. BOO, BAD CPU!" while the 295 and 4870X2 both draw over 300W, one of them almost 400W. Honestly, both AMD and nVidia can go screw themselves. One thing is sure: my next GFX card will be sub-75W.

    Also, there is hardly any progress; they just keep pumping more power in. Triple-slot coolers can't be far away.

    Just imagine a retail 300 or 400W C2Q, i7, PhII, etc. If Intel or AMD shipped that, everyone here would talk about removing people from the gene pool.

    I guess the reference designs everyone uses are a nice way to hide the consumption behind fake TDPs.
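
    Back-of-the-envelope on the "paying twice" bit; the $0.15/kWh rate and 3 hours/day of load are assumptions on my part, not figures from any review:

        # Rough annual electricity cost of a GPU's load power draw.
        # Assumptions (not from the thread): $0.15/kWh, 3 hours of load per day.
        RATE_USD_PER_KWH = 0.15
        HOURS_PER_DAY = 3

        def annual_cost(load_watts):
            kwh_per_year = load_watts / 1000 * HOURS_PER_DAY * 365
            return kwh_per_year * RATE_USD_PER_KWH

        for watts in (75, 300, 400):
            print(f"{watts}W card: ${annual_cost(watts):.2f}/year")
        # roughly $12, $49, and $66 a year respectively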
    Last edited by Shintai; 02-03-2009 at 03:41 PM.
    Crunching for Comrades and the Common good of the People.

  2. #2
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by Shintai View Post
    It's hardly a surprise, I guess. Pretty sad, though. GPU makers have been truly horrible for a long time. They push things too hard. We get short-lived products, and the consumer ends up paying twice in terms of the power bill, not to mention the dustbuster effect.

    I mean, we sit here and talk about "uhh, this CPU uses 20-30W more. BOO, BAD CPU!" while the 295 and 4870X2 both draw over 300W, one of them almost 400W. Honestly, both AMD and nVidia can go screw themselves. One thing is sure: my next GFX card will be sub-75W.

    Also, there is hardly any progress; they just keep pumping more power in. Triple-slot coolers can't be far away.

    Just imagine a retail 300 or 400W C2Q, i7, PhII, etc. If Intel or AMD shipped that, everyone here would talk about removing people from the gene pool.
    Couldn't agree more.

  3. #3
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Shintai View Post
    It's hardly a surprise, I guess. Pretty sad, though. GPU makers have been truly horrible for a long time. They push things too hard. We get short-lived products, and the consumer ends up paying twice in terms of the power bill, not to mention the dustbuster effect.

    I mean, we sit here and talk about "uhh, this CPU uses 20-30W more. BOO, BAD CPU!" while the 295 and 4870X2 both draw over 300W, one of them almost 400W. Honestly, both AMD and nVidia can go screw themselves. One thing is sure: my next GFX card will be sub-75W.

    Also, there is hardly any progress; they just keep pumping more power in. Triple-slot coolers can't be far away.

    Just imagine a retail 300 or 400W C2Q, i7, PhII, etc. If Intel or AMD shipped that, everyone here would talk about removing people from the gene pool.

    I guess the reference designs everyone uses are a nice way to hide the consumption behind fake TDPs.
    Hum?
    FurMark is useless. Real-world tests are the way to go, not crap benchmarks.

    And if you want a GFX card under 75W, there are plenty of them on the market.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  4. #4
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    Quote Originally Posted by v_rr View Post
    Hum?
    Furmark is useless. Real world tests are the way to go, not crap benchmarks.

    And if you want GFX <75W you have plently of them on the market.
    lol, by that reasoning, should we all just be using Crysis/GTA4 plus some Firefox tabs to test our overclock stability? They count as real-world tests, so that means they're better, right?
    --lapped Q9650 #L828A446 @ 4.608, 1.45V bios, 1.425V load.
    -- NH-D14 2x Delta AFB1212SHE push/pull and 110 cfm fan -- Coollaboratory Liquid PRO
    -- Gigabyte EP45-UD3P ( F10 ) - G.Skill 4x2Gb 9600 PI @ 1221 5-5-5-15, PL8, 2.1V
    - GTX 480 ( 875/1750/928)
    - HAF 932 - Antec TPQ 1200 -- Crucial C300 128Gb boot --
    Primary Monitor - Samsung T260

  5. #5
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by LiquidReactor View Post
    lol, by that reasoning, should we all just be using Crysis/GTA4 plus some Firefox tabs to test our overclock stability? They count as real-world tests, so that means they're better, right?
    If your overclock doesn't fail in any program, game, or circumstance in which you use your PC, then yes, it is stable for its use.

    Some people only want enough stability for a single 3DMark run; others want it for 24h use. It depends on the person = real-world tests.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  6. #6
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    I am not sure I understand the validity of this article. Did AMD and Nvidia base their TDPs on FurMark? Did the author of said article contact both Nvidia and AMD to find out what was used to determine TDP, and use his results as a litmus test for their claims?

    After reading this article I find myself asking more questions than it was supposed to answer. Another concern is that using FurMark in this fashion can give the impression that those results are what you get in games and other programs. The truth is that you don't draw nearly as much power or put out as much heat in games and programs as you do with FurMark (at least based on my own examination of this). A quick sketch of that gap follows below.

    Therefore, IMO this type of review should be clearly framed as a worst-case test that pushes power consumption to the max, not as typical draw.
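
    To put rough numbers on that gap, here's a minimal sketch; every wattage below is an invented placeholder, not a measurement from the article:

        # Compare a FurMark (power-virus) reading against in-game readings.
        # All wattages are invented placeholders for illustration only.
        furmark_watts = 320
        game_watts = {"Crysis": 240, "Far Cry 2": 225, "3DMark06": 250}

        for name, watts in game_watts.items():
            print(f"{name}: {watts}W ({watts / furmark_watts:.0%} of the FurMark draw)")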

  7. #7
    Xtreme Addict
    Join Date
    Nov 2003
    Location
    NYC
    Posts
    1,592
    Quote Originally Posted by v_rr View Post
    And if you want a GFX card under 75W, there are plenty of them on the market.
    +1. Take your mitts off the mid-to-high end; people already pay through the nose for these cards compared to the flagship cards of olde.

  8. #8
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Birmingham AL.
    Posts
    1,079
    Quote Originally Posted by Shintai View Post

    Just imagine a retail 300 or 400W C2Q, i7, PhII, etc. If Intel or AMD shipped that, everyone here would talk about removing people from the gene pool.

    I guess the reference designs everyone uses are a nice way to hide the consumption behind fake TDPs.
    Not if it could run SuperPi
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

  9. #9
    Registered User
    Join Date
    Aug 2007
    Location
    Romania
    Posts
    92
    Quote Originally Posted by Shintai View Post
    Triple-slot coolers can't be far away.
    what do we have here?
    Attached thumbnail: gainward-rampage700-radeon-hd-4870-x2-glh.jpg

  10. #10
    Xtreme Member
    Join Date
    Dec 2008
    Posts
    177
    As long as performance per watt increases between generations, I think progress is being made...

    Just look at the jump from the 2900XT to the 3850 to the 4850: roughly the same performance at about half the power going from the 2900XT to the 3850, then roughly 100% more performance for about 50% more power going from the 3850 to the 4850.

    These are rough estimates, but you get the picture (sketched below).
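
    A quick sketch of that arithmetic; the perf and watt figures are hypothetical index numbers chosen to match the rough ratios above, so only the ratios matter:

        # Perf/watt progression using the rough ratios above; the absolute
        # perf and watt values are hypothetical index numbers.
        cards = {
            "2900XT": {"perf": 100, "watts": 200},  # baseline
            "3850":   {"perf": 100, "watts": 100},  # same perf, ~half the power
            "4850":   {"perf": 200, "watts": 150},  # ~2x perf, ~1.5x the power
        }

        base = cards["2900XT"]["perf"] / cards["2900XT"]["watts"]
        for name, c in cards.items():
            print(f"{name}: {(c['perf'] / c['watts']) / base:.2f}x baseline perf/watt")
        # -> 2900XT: 1.00x, 3850: 2.00x, 4850: 2.67x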

  11. #11
    Xtreme Addict
    Join Date
    May 2005
    Location
    Sugar Land, TX
    Posts
    1,418
    Quote Originally Posted by Shintai View Post
    It's hardly a surprise, I guess. Pretty sad, though. GPU makers have been truly horrible for a long time. They push things too hard. We get short-lived products, and the consumer ends up paying twice in terms of the power bill, not to mention the dustbuster effect.

    I mean, we sit here and talk about "uhh, this CPU uses 20-30W more. BOO, BAD CPU!" while the 295 and 4870X2 both draw over 300W, one of them almost 400W. Honestly, both AMD and nVidia can go screw themselves. One thing is sure: my next GFX card will be sub-75W.

    Actually, this generation is VERY power efficient, with the way parts get shut off when not in use. Not sure about the 295, but the 285 drops down to something like 20W at idle, and even 35W for Blu-ray playback. No thank you to a sub-75W card; it would perform like crap. A well-coded game with vsync on would do MUCH better than something like Crysis, which sucks up every resource you can give it. (Rough idle-vs-load math below.)
    Last edited by ewitte; 02-15-2009 at 07:06 AM.
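
    For what it's worth, a rough sketch of what a ~20W idle (the figure from the post above) does to average draw; the 180W load number and the 2 hours/day gaming split are assumptions:

        # Average daily draw for a card idling at ~20W (figure from the post);
        # the 180W load figure and 2h/day of gaming are assumptions.
        IDLE_W, LOAD_W = 20, 180
        LOAD_HOURS = 2

        avg_w = (LOAD_W * LOAD_HOURS + IDLE_W * (24 - LOAD_HOURS)) / 24
        print(f"average draw: {avg_w:.0f}W")  # about 33W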
