Page 16 of 33 FirstFirst ... 61314151617181926 ... LastLast
Results 376 to 400 of 812

Thread: ATI HD4800 Review Thread

  1. #376
    Xtreme Addict
    Join Date
    Mar 2008
    Location
Kawasaki, Japan
    Posts
    2,076
    Quote Originally Posted by serialkilla277 View Post
    damn! my 750w isn't going to be enough for my oc plus 4870 crossfire is it? looks like gtx280 .... nooooo!
No, it's more than enough, provided it's a quality PSU and not one with made-up ratings. Look at the AnandTech review, for example: the complete system uses nowhere near 750W. Even if you factor in an extra 100W for the OC, a quality 750W PSU would still be plenty.
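As a rough sanity check, a back-of-the-envelope budget shows how much room a 750W unit leaves. All component figures below are generic ballpark guesses for illustration, not measured values:

```python
# Assumed worst-case draws in watts -- generic ballpark figures, not measurements:
loads = {
    "overclocked quad-core CPU": 180,
    "HD4870 CrossFire (two cards)": 320,
    "board, RAM, drives, fans, pumps": 120,
}
total = sum(loads.values())
print(total, "W total,", 750 - total, "W headroom")  # 620 W total, 130 W headroom
```

Even with padded numbers the system stays well under 750W, which matches what the full-system wall measurements in the reviews suggest.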

  2. #377
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Quote Originally Posted by Clairvoyant129 View Post
    Show me this "review" that shows 2x GTX 280s in SLI drawing only 20W more than 2x HD4870s under load.

    I don't think he was talking about SLI/CF, dude.

  3. #378
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Clairvoyant129 View Post
    Show me this "review" that shows 2x GTX 280s in SLI drawing only 20W more than 2x HD4870s under load.
    eh, care to read my post?

I said 280GTX (one) vs 4870 (one).

I don't care about SLI/CF power consumption, since there's only a slim chance that I'll ever use that.
    Last edited by Hornet331; 06-26-2008 at 01:03 AM.

  4. #379
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    nice!
Attached thumbnail: 4870shops.jpg (173.3 KB)

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  5. #380
    Xtreme Member
    Join Date
    Nov 2005
    Posts
    143
This whole time I was waiting for the GTX 280 to come out, but it seems the HD4870 is the real hit. It is less than half the price, and in certain games at lower resolutions it even manages to beat the GTX 280. Even when it loses to the GTX 280, it is not by much.

  6. #381
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by biohead View Post
    nice!
yeah, prices are nice, though I want to see the 1GB version.

  7. #382
    Registered User
    Join Date
    Mar 2008
    Posts
    22
Good to hear that 750W should be enough, and yes, it's not a bad one: a Cooler Master Real Power 750. I might add that I will be running two pumps, two HDDs, and 10 fans + 4 lights.

  8. #383
    Xtreme X.I.P.
    Join Date
    Jun 2004
    Location
    France
    Posts
    1,368


    OCM Member / IXTREMTEK Admin !!



    DDR1 2*256 BH5 Adata @324.7Mhz 1.5/2/2/5 1T at 4v @318.6Mhz Benchs
    DDR2 1*512 Kingston PC8500 @702MHz 5/5/5/18 at 2.42V actual
    DDR2 2*1024 Cell Shock PC8000C4 @534MHz 3/3/3/8 at 3.5V actual

    Cooling : XP90C , Big Typhoon , waterchiller R507 , LN2 ....

  9. #384
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by boblemagnifique View Post


    hmm, what board again supports 4 dual slot cards?

    keep us updated on this.

  10. #385
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    is 16xCSAA the same as 16xQ?
    Are we there yet?

  11. #386
    Xtreme X.I.P.
    Join Date
    Jun 2004
    Location
    France
    Posts
    1,368
    Quote Originally Posted by Hornet331 View Post
    hmm, what board again supports 4 dual slot cards?

    keep us updated on this.

Yes, I have a P5E64 WS Evo, but there's no place for 4 cards (just 3).
    OCM Member / IXTREMTEK Admin !!



    DDR1 2*256 BH5 Adata @324.7Mhz 1.5/2/2/5 1T at 4v @318.6Mhz Benchs
    DDR2 1*512 Kingston PC8500 @702MHz 5/5/5/18 at 2.42V actual
    DDR2 2*1024 Cell Shock PC8000C4 @534MHz 3/3/3/8 at 3.5V actual

    Cooling : XP90C , Big Typhoon , waterchiller R507 , LN2 ....

  12. #387
    Xtreme Member
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    120
    Quote Originally Posted by boblemagnifique View Post
    yes i have a P5E64 WS Evo but not place for 4 cards (Just 3)
I have the Evo and was thinking the exact same thing. It fits 3 of them? But the heat will be too much with the stock coolers; will you exchange the coolers?

    regards
    LianLi PC7 rev3 + PCPower&Coolling Quad 750w blue
    Asus WS64 Evolution
    E8400 + Noctua U12P vortex
    4x1GB OCZ DDR3 platinum
    XFX GTX 275 OC
    2x WD 320GB RE3 raid0 + WD 640GB
    LG L226WA-WN HDMI + PS3 + Xbox360

  13. #388
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
Seems 16xCSAA performs the same as 4xMSAA, but with slightly better quality??

    http://developer.nvidia.com/object/c...ampled-aa.html

    Below are images from the SDK 10 CSAA sample highlighting the difference in quality between standard 4x MSAA and 16x CSAA modes. Observe the improvement in image quality for 16x, and note the fact that 16x CSAA typically performs similarly to 4x MSAA.
    And how can AnandTech say 16xCSAA is 4xMSAA and 8xMSAA? I don't get it

    Well, I must say, 8xAA Edge detect looks perfect.
    Last edited by Luka_Aveiro; 06-26-2008 at 02:58 AM.
    Are we there yet?

  14. #389
    Xtreme Cruncher
    Join Date
    Oct 2006
    Location
    1000 Elysian Park Ave
    Posts
    2,669
I just bought an HD3870 like a month ago. The HD4850 will be a nice bump, plus I can throw in Bioshock. The HD4870 looks like a power hog, and I've never spent so much on a video card; I'm sure it's like $350 after tax + shipping, but all that horsepowazz!!!! I'm sure it'll handle games for years to come and get insane PPD in Folding@Home. I game at 16x10 and have a Corsair 520HX. Decisions, decisions. The HIS looks clean, simple and tidy.
    i3-8100 | GTX 970
    Ryzen 5 1600 | RX 580
    Assume nothing; Question everything

  15. #390
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    What would anyone need TWO 4870x2's for?

    I guess everyone has 30 inch monitors and won't step down from 8xAA at 2560 resolution.

Honestly, except for some instances in Crysis, I don't see where a single 4870 wouldn't be enough at resolutions around 1680. Let alone FOUR of them. All games are console ports. Even an 8800GT I found to be quite enough, if you are content to play at low AA levels.

    Buy one 4870 and enjoy it. It's more than enough for everything.

  16. #391
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by gojirasan View Post
    Clearly you are an AMD stockholder. Uttering speculation like its fact. Where are you pulling that out of? The GTX280 has at least as much overclocking headroom as a HD4870. A single HD4870 isn't going to beat it no matter how much you cool it. But maybe if you repeat your speculations enough some people will believe you. It might beat a GTX260 though. Although I doubt it because it looks like the GTX260 has a lot more overclocking headroom than the HD4870. I think if you overclock both cards you will find that the GTX260 clearly beats the HD4870 in most benchmarks due to the additional headroom. And for only $50 more than a 1 gig HD4870. Although that is just MSRP. Post rebate street prices will tell the real story. At stock the two cards are about equal, but fully overclocked (and maybe volt modded) on water I think the GTX260 would waste the HD4870. Of course that's just more speculation.


Who said anything about pure overclocking...? I was simply saying that for a lot less you can have an HD4870 running on par with a GTX280...!

Are people forgetting that the GeForce GTX280 is $650...? I can easily afford such things, but I just see no need. ATI has proven to me that the HD4870 is a better VALUE <--------

And I'm talking about the GTX280, not the GTX260, which is already bested by the 512MB 4870... I fail to see your point!

  17. #392
    Xtreme Enthusiast Kai Robinson's Avatar
    Join Date
    Oct 2007
    Location
    East Sussex
    Posts
    831
The MSI K9A2 Platinum is a board that supports 4 dual-slot cards... that's what they're using for that crunching rig - they use 9800GX2s though...

    Main Rig

    Intel Core i7-2600K (SLB8W, E0 Stepping) @ 4.6Ghz (4.6x100), Corsair H80i AIO Cooler
    MSI Z77A GD-65 Gaming (MS-7551), v25 BIOS
    Kingston HyperX 16GB (2x8GB) PC3-19200 Kit (HX24C11BRK2/16-OC) @ 1.5v, 11-13-13-30 Timings (1:8 Ratio)
    8GB MSI Radeon R9 390X (1080 Mhz Core, 6000 Mhz Memory)
    NZXT H440 Case with NZXT Hue+ Installed
    24" Dell U2412HM (1920x1200, e-IPS panel)
    1 x 500GB Samsung 850 EVO (Boot & Install)
    1 x 2Tb Hitachi 7K2000 in External Enclosure (Scratch Disk)


    Entertainment Setup

    Samsung Series 6 37" 1080p TV
    Gigabyte GA-J1800N-D2H based media PC, Mini ITX Case, Blu-Ray Drive
    Netgear ReadyNAS104 w/4x2TB Toshiba DTACA200's for 5.8TB Volume size

    I refuse to participate in any debate with creationists because doing so would give them the "oxygen of respectability" that they want.
    Creationists don't mind being beaten in an argument. What matters to them is that I give them recognition by bothering to argue with them in public.

  18. #393
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by ceevee View Post
    The name calling and vitriol in this thread is unbelievable. Really, really nasty people on XS now, wasn't like that when I joined.

    Anyway the main difference between triSLI and 3xCrossfire is that one of them almost always works and the other usually performs worse than a single card. Yeah thats a good value for your money.


    My friend... did you read ANY of the HD4870 reviews...?

ATI is not really betting on CrossFire to push them forward. Their multi-GPU cards have yet to be tested, but their new architecture hints at a new internal bridge that will be a huge boon for cards with dual GPUs (X2). Possibly similar to what dual-core CPUs utilize.

    Since their cores are so much cheaper, it is easily conceivable to see this new configuration for under $499... the scaling should be superior to a SLI/Crossfire configuration. AMD was hinting at this when they acquired ATI.

    Needless to say, you've just spent $1300 on a graphics subsystem that has microstutter...

  19. #394
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Man, this thread has gone downhill fast.

Regarding all of these "supposed" benchmarks showing the HD4870 being close to the GTX 280 under load, I say bollocks. Either the testing procedures are way out of whack ("hey, let's run 3DMark at default settings with the system plugged into a Kill A Watt!") or they aren't reporting it properly.

    The reality of the situation is that under load the GTX 280 consumes up to 20% (around 70W) more power than the HD4870 while being disgustingly loud. That is the truth whether people accept it or not.

  20. #395
    Xtreme Addict
    Join Date
    Dec 2005
    Location
    UK
    Posts
    1,713
    Quote Originally Posted by SKYMTL View Post
    Man, this thread has gone downhill fast.

    Regarding all of these "supposed" benchmarks showing the HD4870 being close to the GTX 280 under load, I say bollocks. Either the testing procedures are way out of wack ("hey, let's run 3DMark at default settings with the system plugged into a Kill A Watt!") or they aren't reporting it properly.

    The reality of the situation is that under load the GTX 280 consumes up to 20% (around 70W) more power than the HD4870 while being disgustingly loud. That is the truth whether people accept it or not.
It's the latest plague called fanboyism, spreading even to the far reaches of XS. Whatever you do, please do not try to have a constructive argument with the ones that are already infected, or your common sense and logic will suffer a painful death.

    Once you become one of them people will always
    TAMGc5: PhII X4 945, Gigabyte GA-MA790X-UD3P, 2x Kingston PC2-6400 HyperX CL4 2GB, 2x ASUS HD 5770 CUcore Xfire, Razer Barracuda AC1, Win8 Pro x64 (Current)

    TAMGc6: AMD FX, Gigabyte GA-xxxx-UDx, 8GB/16GB DDR3, Nvidia 680 GTX, ASUS Xonar, 2x 120/160GB SSD, 1x WD Caviar Black 1TB SATA 6Gb/s, Win8 Pro x64 (Planned)

  21. #396
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by SKYMTL View Post
    The reality of the situation is that under load the GTX 280 consumes up to 20% (around 70W) more power than the HD4870 while being disgustingly loud. That is the truth whether people accept it or not.

what reality?

    there are currently no users out there that own both and can make an independent statement.

    So I have to go by the data that is available, and the available data shows the power consumption is 20-50W more for a 280GTX, if you look around the web.

  22. #397
    Registered User
    Join Date
    Jun 2008
    Posts
    4
Can someone with an HD4870 test the performance in Age of Conan? Also report your settings (resolution, AA, bloom etc.) and zone (wild, ot), ATI driver, and hardware setup.
    I'd really appreciate it, and you can give yourself a hug ;p

  23. #398
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Hornet331 View Post
    what reality?
While I can't go into every one of the details, I trust my own results since we are using the proper equipment and tests to properly determine power consumption of the full system (line conditioner, UPM power meter, etc.). In addition, since I personally spent weeks determining the 3D application which would put the most constant load on the GPU, I am confident in any numbers we produce.

    Many tests you see are highly influenced by the CPU power consumption fluctuating up and down. The trick to properly determining power consumption is to first see which application will properly load the GPU while letting the CPU sit as idle as possible. Then make sure that the program can put an almost constant load on the GPU and its memory, with minimal load times, in order to get an even testing field.

    Finally, the test MUST be run for AT LEAST 45 minutes in order to determine a peak power consumption number. This is because many power meters have a sampling rate of about 250ms to 750ms, which means that peaks in the power draw may not be logged. That is why you need a very long test under constant load; so the power meter can pick up the peaks in consumption even if it misses them the first, second, third time and so on.

    I also have to say that it is extremely important to use a line conditioner for power consumption tests. As many of us know, input voltage can fluctuate quite a bit from a household power outlet. This voltage fluctuation can have a pretty large impact on the efficiency numbers of power supplies, which in turn would influence the numbers generated by any GPU power consumption test.

    That is my story, that is how I tested, and I am sticking behind my 70W difference statement 100%.
    Last edited by SKYMTL; 06-26-2008 at 05:08 AM.
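SKYMTL's sampling-rate point can be illustrated with a toy simulation. The meter period, spike timing, and wattages below are made-up illustrative numbers, not measurements from the thread:

```python
def sampled_peak(duration_ms, sample_ms=500, spike_period_ms=770,
                 spike_width_ms=50, base_w=400, peak_w=470):
    """Highest wattage a slow meter reports over a run.

    The GPU's draw spikes to peak_w for spike_width_ms every
    spike_period_ms, but the meter only reads the instantaneous draw
    every sample_ms, so a short run can miss every spike.
    """
    seen = base_w
    for t in range(0, duration_ms, sample_ms):
        if (t - 120) % spike_period_ms < spike_width_ms:  # sample lands inside a spike?
            seen = max(seen, peak_w)
    return seen

print(sampled_peak(2_000))   # short run: every sample misses the spikes
print(sampled_peak(60_000))  # longer run: a sample eventually lands on one
```

Over a 45-minute run the meter samples thousands of times, so even a misaligned sampling interval eventually lands inside a spike window, which is the reason behind the long constant-load test.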

  24. #399
    Xtreme Addict
    Join Date
    May 2003
    Location
    Hopatcong, NJ
    Posts
    1,078
    Quote Originally Posted by n3m3sis View Post
    Can someone with HD4870 test the performance in Age of Conan? Also report what settings (resolution, AA, bloom etc) and zone (wild, ot), ati driver and hardware setup.
    I'd relly appriciate it, and you can give yourself a hug ;p
    HardOCP tested AoC with the 4870

    'Gaming' AMD FX-6300 @ 4.5GHz | Asus M5A97 | 16GB DDR3 2133MHz | GTX760 2GB + Antec Kuhler620 mod | Crucial m4 64GB + WD Blue 2x1TB Str
    'HTPC' AMD A8-3820 @ 3.5GHz | Biostar TA75A+ | 4GB DDR3 | Momentus XT 500GB | Radeon 7950 3GB
    'Twitch' AMD 720BE @ 3.5GHz | Gigabyte GA-78LMT-S2P | 4GB DDR3 | Avermedia Game Broadcaster

    Desktop Audio: Optical Out > Matrix mini DAC > Virtue Audio ONE.2 > Tannoy Reveal Monitors + Energy Encore 8 Sub
    HTPC: Optoma HD131XE Projector + Yamaha RX-V463 + 3.2 Speaker Setup

  25. #400
    Xtreme Addict
    Join Date
    May 2003
    Location
    Hopatcong, NJ
    Posts
    1,078
    Quote Originally Posted by SKYMTL View Post
While I can't go into every one of the details, I trust my own results since we are using the proper equipment and tests to properly determine power consumption of the full system (line conditioner, UPM power meter, etc.). In addition, since I personally spent weeks determining the 3D application which would put the most constant load on the GPU, I am confident in any numbers we produce.

    Many tests you see are highly influenced by the CPU power consumption fluctuating up and down. The trick to properly determining power consumption is to first see which application will properly load the GPU while letting the CPU sit as idle as possible. Then make sure that the program can put an almost constant load on the GPU and its memory, with minimal load times, in order to get an even testing field.

    Finally, the test MUST be run for AT LEAST 45 minutes in order to determine a peak power consumption number. This is because many power meters have a sampling rate of about 250ms to 750ms, which means that peaks in the power draw may not be logged. That is why you need a very long test under constant load; so the power meter can pick up the peaks in consumption even if it misses them the first, second, third time and so on.

    I also have to say that it is extremely important to use a line conditioner for power consumption tests. As many of us know, input voltage can fluctuate quite a bit from a household power outlet. This voltage fluctuation can have a pretty large impact on the efficiency numbers of power supplies, which in turn would influence the numbers generated by any GPU power consumption test.

    That is my story, that is how I tested, and I am sticking behind my 70W difference statement 100%.
Just curious, but which application did you find best tests power consumption, Sky?

    When I tested the power consumption of my RV670, I used Prime95 Small FFT to find the load of the CPU only. Afterwards I let it run on a single thread, then used the free core to run the ATITool artifact scanner, which seems to put the highest temperatures on my video card. It's single-threaded, maxes out the CPU, and maxes out the GPU.

    Then I subtract the difference between (Prime95(singlethread)+ATITool) and Prime95(dualthread) to get the power consumption of just my GPU. I'm using a Kill A Watt plugged into a Belkin PureAV line conditioner. Care to comment on whether this sounds like an accurate way to test just GPU consumption? I use Small FFTs because system memory does not get loaded much.
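The subtraction described above can be written out explicitly. The wattages are invented for illustration, and the method rests on one assumption: that ATITool's render thread loads its core about as hard as a Prime95 thread does, so CPU draw is roughly equal in both runs:

```python
# Hypothetical Kill A Watt readings at the wall, in watts (illustrative only):
p_prime_dual = 230.0      # Prime95 Small FFT on both cores, GPU near idle
p_prime_plus_ati = 335.0  # Prime95 on one core + ATITool on the other

# If CPU draw is roughly the same in both runs, the difference is the
# GPU's extra draw under load -- still measured at the wall, so it
# includes PSU conversion losses on top of the card's actual DC draw.
gpu_draw_at_wall = p_prime_plus_ati - p_prime_dual
print(gpu_draw_at_wall)  # 105.0
```

One caveat: because the reading is taken at the wall, the result overstates the card's own draw by the PSU's conversion loss on that extra load.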

    'Gaming' AMD FX-6300 @ 4.5GHz | Asus M5A97 | 16GB DDR3 2133MHz | GTX760 2GB + Antec Kuhler620 mod | Crucial m4 64GB + WD Blue 2x1TB Str
    'HTPC' AMD A8-3820 @ 3.5GHz | Biostar TA75A+ | 4GB DDR3 | Momentus XT 500GB | Radeon 7950 3GB
    'Twitch' AMD 720BE @ 3.5GHz | Gigabyte GA-78LMT-S2P | 4GB DDR3 | Avermedia Game Broadcaster

    Desktop Audio: Optical Out > Matrix mini DAC > Virtue Audio ONE.2 > Tannoy Reveal Monitors + Energy Encore 8 Sub
    HTPC: Optoma HD131XE Projector + Yamaha RX-V463 + 3.2 Speaker Setup

