Page 20 of 33
Results 476 to 500 of 812

Thread: ATI HD4800 Review Thread

  1. #476
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by Kai Robinson View Post
    AliG - 160GB/sec? Sorry but you've been on the funny sauce again. It's a 48 lane PCIe 2.0 Bridge chip from either PLX or IDT (more than likely PLX). Thats 16 lanes for each GPU, and 16 lanes for the bus connection, which means at most, 32GB/sec bandwidth between GPU's.
    that's what the ATI representative said, don't shoot the messenger. I'll dig up the link. The only thing I don't get is why ATI needs a fancy bridge chip when the 9800GX2 clearly did fine without one; all it does is add power consumption.

    And you said I'm on the funny sauce again; when was the first time?
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans
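As a sanity check on the numbers being argued over above: using the published PCIe 2.0 figures (5 GT/s per lane, 8b/10b encoding), a quick back-of-envelope calculation shows why 160GB/sec is implausible for a 48-lane bridge chip.

```python
# Back-of-envelope PCIe 2.0 bandwidth check (decimal GB = 10^9 bytes).
# Spec values: 5 GT/s per lane, 8b/10b encoding (80% efficiency).

def pcie2_lane_gbs() -> float:
    """Usable bandwidth of one PCIe 2.0 lane, one direction, in GB/s."""
    return 5.0 * 0.8 / 8  # GT/s * encoding efficiency, bits -> bytes

per_lane = pcie2_lane_gbs()        # 0.5 GB/s
x16_one_way = 16 * per_lane        # 8 GB/s per direction
x16_both_ways = 2 * x16_one_way    # 16 GB/s aggregate per x16 link
# Even counting both GPU-facing x16 links in both directions:
bridge_total = 2 * x16_both_ways   # 32 GB/s, far below 160

print(per_lane, x16_one_way, x16_both_ways, bridge_total)
```

So the 32GB/sec figure lines up with the spec; the 160GB/sec number, if it was really said, would have to refer to something else entirely.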

  2. #477
    Xtreme Enthusiast
    Join Date
    Dec 2005
    Location
    Peoples Republic of Berkeley (PRB), USA
    Posts
    928
    Quote Originally Posted by Hornet331 View Post
    hmm maybe needs further investigation.

    Did you oc your card already?

    Also can you see any physical damage?
    No physical damage, nothing different than when I plugged the card in in the first place. Of course I OC'd it, duh. No voltage bumps, nothing out of the ordinary except routine catalyst overdrive changes.
    core i7 920, gigabyte EX58-UD5, 4870x2

    DDC3.2 > PA120.3 > DD4870x2fc > Fuzion v1 > PA120.2 > EK Res

  3. #478
    Xtreme Enthusiast
    Join Date
    Dec 2004
    Location
    Bucharest, Romania
    Posts
    656
    http://www.hardforum.com/showthread.php?t=1319753

    2x4870 real life comparison against GTX 280 from an owner. Nicely written IMO.
    Intel i7 920 d0 @ 4410MHz @ 1.36v :: Prolimatech Mega Shadow :: Gigabyte EX58-UD5 F9K :: 6GB Mushkin XP3-15000 :: HIS 5870 :: Corsair 1000W :: HannsG 27.5" :: Lian Li V1010B

  4. #479
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by AliG View Post
    that's what the ATI representative said, don't shoot the messenger. I'll dig up the link. The only thing I don't get is why ATI needs a fancy bridge chip when the 9800GX2 clearly did fine without one; all it does is add power consumption.

    And you said I'm on the funny sauce again; when was the first time?
    X2 cards aren't regular CrossFire, and the GX2 isn't CrossFire either; CrossFire works in a different way than SLI does. Does this help out?

  5. #480
    Xtreme Enthusiast
    Join Date
    Dec 2004
    Location
    Bucharest, Romania
    Posts
    656
    Quote Originally Posted by halo112358 View Post
    No physical damage, nothing different than when I plugged the card in in the first place. Of course I OC'd it, duh. No voltage bumps, nothing out of the ordinary except routine catalyst overdrive changes.
    By OC do you mean through a BIOS update with modded frequencies, or through ATI Catalyst?
    Intel i7 920 d0 @ 4410MHz @ 1.36v :: Prolimatech Mega Shadow :: Gigabyte EX58-UD5 F9K :: 6GB Mushkin XP3-15000 :: HIS 5870 :: Corsair 1000W :: HannsG 27.5" :: Lian Li V1010B

  6. #481
    Banned
    Join Date
    Apr 2008
    Location
    Brisbane, Australia
    Posts
    3,601
    What I can't believe is the power consumption of the 4870, especially in CF. It's just nuts.

  7. #482
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by Calmatory View Post
    X2 cards aren't regular CrossFire, and the GX2 isn't CrossFire either; CrossFire works in a different way than SLI does. Does this help out?
    I'm serious. I believe the only reason they have the bridge chip is to increase the bandwidth between the two GPUs for a faster interconnect, as they've claimed no micro-stuttering with both the 3870X2 and 4870X2. Of course, later on they admitted the micro-stuttering on the 3870X2 was horrible because the bridge chip didn't provide enough bandwidth (8 GB/s) and the extra latency made it worse than regular CrossFire.
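The 8 GB/s figure quoted for the 3870X2's bridge is consistent with a plain PCIe 1.1 x16 link counted in both directions; a PCIe 2.0 bridge doubles it. A small sketch using the published per-generation signalling rates:

```python
# Aggregate (both-direction) bandwidth of an x16 link per PCIe generation.
# Both PCIe 1.1 and 2.0 use 8b/10b encoding (80% efficiency).

def x16_bandwidth_gbs(transfer_rate_gt: float) -> float:
    """Total two-way bandwidth of an x16 link, in decimal GB/s."""
    per_lane_one_way = transfer_rate_gt * 0.8 / 8  # bits -> bytes
    return 16 * per_lane_one_way * 2

gen1 = x16_bandwidth_gbs(2.5)  # PCIe 1.1: the 8 GB/s 3870X2 bridge figure
gen2 = x16_bandwidth_gbs(5.0)  # PCIe 2.0: double that
print(gen1, gen2)
```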

  8. #483
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    Quote Originally Posted by AriciU View Post
    http://www.hardforum.com/showthread.php?t=1319753

    2x4870 real life comparison against GTX 280 from an owner. Nicely written IMO.
    That was interesting info for me:

    The GTX 200 series power saving features do NOT work on an Intel platform. That's right. I thought I was the odd one out when I saw countless other websites posting about the down-clocks and down-voltages, whereby the GPU settles at 300MHz, memory to 100MHz, etc. when no 3D app is running. What some of them failed to state was that they only achieved this on an Nvidia platform, while ALL the reviews out there failed to warn consumers that this does NOT work on an Intel platform at all. The write-ups out there further confuse you by stating the downclocks are achieved in drivers, while only the full hybrid power features, i.e., shutting down the GPU and offloading 2D to the onboard IGP, require an Nvidia platform. They just don't know what they are talking about.

    This was confirmed after a 30 minute phone conversation with an XFX engineer, the manufacturer of my GTX 280.

    What this translates to is that all the idle power consumption graphs you saw in reviews of the GTX 200 series do NOT apply to an Intel platform. Again, out of the 15-odd reviews out there, not one, I repeat, NOT A SINGLE ONE, bothered to state this clearly. What's the user footprint in the real world with an Nvidia platform? 0.1% of the PC population?

    The GPU temp does drop back to the 50s while idling in Windows, but the clocks do not scale back, which leads me to believe there are no real power savings.
    Can anyone confirm this from personal experience?
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SDD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingstone HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  9. #484
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Lightman View Post
    That was interesting info for me:



    Can anyone confirm this from personal experience?
    Well, I know my GTX 280 downclocks fine on an X38 and P35 chipset. The power consumption figures I have seen follow the trend of it working when at idle.

  10. #485
    Xtreme Member
    Join Date
    Jan 2007
    Location
    WI
    Posts
    316
    If you mean clocking down when not in 3D, that works just fine for me on an Intel chipset.

  11. #486
    Xtreme Addict
    Join Date
    Jul 2005
    Location
    ATX
    Posts
    1,004
    Edit:
    There's been more than one response saying the downclock works. I stand corrected; it must have been due solely to my copy of the XFX card. The engineer I spoke to may not have enough experience with the GTX 280 to identify whether it's the card or the series at fault.
    Guess it works.
    That guy sounds like an AMD/ATI infomercial though! It almost sounds too good to be true, but I'm rooting for the underdog here... hope it is true

  12. #487
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by m0da View Post
    Guess it works.
    That guy sounds like an AMD/ATI infomercial though! It almost sounds too good to be true, but I'm rooting for the underdog here... hope it is true
    What an opportunity... he can discredit Intel and NV in one blow, yet he forgot to mention that ATI's own power saving is not working right now.

  13. #488
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Hornet331 View Post
    What an opportunity... he can discredit Intel and NV in one blow, yet he forgot to mention that ATI's own power saving is not working right now.
    Actually, it is in some cases and will soon be working with ALL cards. Only some of the first batch of cards had it disabled which should be fixed in a future driver release. Other cards had it working (most notably the MSI cards) without a problem.

  14. #489
    Registered User
    Join Date
    Mar 2008
    Location
    Finland
    Posts
    35
    Quote Originally Posted by AriciU View Post
    http://www.hardforum.com/showthread.php?t=1319753

    2x4870 real life comparison against GTX 280 from an owner. Nicely written IMO.
    "Sorry for the wall of text, I hope you found the above a little more useful than a bunch of graphs from some teenage “me-too” review sites."

    What bad have hardware sites ever done to this guy??
    Case: NZXT Lexa
    PSU: Corsair HX620
    Memory: G.Skill F2-8000CL5D-4GBPQ (2GB x 2)
    CPU cooler: ThermalRight Ultra-120 EXTREME with Noctua NF-P12
    Motherboard: Asus Rampage Formula (BIOS 219, made in china )
    Processor: E2160
    Graphics card: Club3d 9800 GTX

    Input: Logitech G15 v2 + Logitech G9

  15. #490
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    I stopped reading when he started attacking reviewers' credibility and then proceeded to use his "trusty right foot" for temperature testing.

    That, along with many other flaws/contradictions.

  16. #491
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by Tonucci View Post
    I stopped reading when he started attacking reviewers' credibility and then proceeded to use his "trusty right foot" for temperature testing.

    That, along with many other flaws/contradictions.
    What flaws/contradictions did you observe?
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  17. #492
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by STaRGaZeR View Post
    What flaws/contradictions did you observe?
    Sorry, I'm not going through that again. I would have to re-read it, taking notes.

  18. #493
    I am Xtreme
    Join Date
    Jul 2007
    Location
    The Sacred birth place of Watercooling
    Posts
    4,689
    Would an 850W PSU be enough for 4870 1GB CF?
    Quote Originally Posted by skinnee View Post
    No, I think he had a date tonight...

    He and his EK Supreme are out for a night on the town!

  19. #494
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    North USA
    Posts
    670
    Quote Originally Posted by disruptfam View Post
    Would an 850W PSU be enough for 4870 1GB CF?
    Name brand, yes... assuming the load of the rest of your system is "average".
    Asus P6T-DLX V2 1104 & i7 920 @ 4116 1.32v(Windows Reported) 1.3375v (BIOS Set) 196x20(1) HT OFF
    6GB OCZ Platinum DDR3 1600 3x2GB@ 7-7-7-24, 1.66v, 1568Mhz
    Sapphire 5870 @ 985/1245 1.2v
    X-Fi "Fatal1ty" & Klipsch ProMedia Ultra 5.1 Speaks/Beyerdynamic DT-880 Pro (2005 Model) and a mini3 amp
    WD 150GB Raptor (Games) & 2x WD 640GB (System)
    PC Power & Cooling 750w
    Homebrew watercooling on CPU and GPU
    and the best monitor ever made + a Samsung 226CW + Dell P2210 for eyefinity
    Windows 7 Utimate x64
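For a rough idea of why a quality 850W unit has headroom for 4870 CrossFire: the wattage figures below are illustrative assumptions (not measurements from this thread), but they show the shape of the budget.

```python
# Rough PSU headroom estimate for a 4870 1GB CrossFire build.
# All wattage figures are illustrative assumptions, not measurements.

components = {
    "HD 4870 1GB (each, peak)": 160,
    "quad-core CPU (overclocked)": 150,
    "motherboard + RAM": 60,
    "drives, fans, pumps": 60,
}

# Two cards, plus everything that isn't a 4870.
total = components["HD 4870 1GB (each, peak)"] * 2 + sum(
    w for name, w in components.items() if "4870" not in name
)
headroom = 850 - total
print(total, headroom)  # estimated draw vs. the 850 W budget
```

Even with generous peak numbers, the estimate lands well under 850W, which matches the "name brand, yes" answer.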

  20. #495
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    Well, his personal review confirms for me that CF 4870s best the GTX 280 in AoC, and that was my only remaining concern. Good to know, as AoC is one of the more demanding games currently out, and if the 4870s don't choke with high levels of AA in it, that's awesome.
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  21. #496
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by AliG
    Of course later on they admitted the micro-stuttering on the 3870x2 was horrible because the bridge chip didn't provide enough bandwidth (8 GB/s) and the extra latency made it worse than regular xfire.
    Who is "they"? AMD? Did they really admit that? Since when do these large corporations ever admit to a flaw in their product? I had heard that both Nvidia and AMD are doing very well at the moment by not admitting to the existence of microstuttering or even the lack of multi-GPU driver support.

  22. #497
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    They did indeed admit that they should have used the PCIe 2.0 bridge chip and not the PCIe 1.1 one, though they didn't say why. Either way, they did admit to using the wrong chip.

  23. #498
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by zerazax View Post
    They did indeed admit that they should have used the PCIe 2.0 bridge chip and not the PCIe 1.1 one, though they didn't say why. Either way, they did admit to using the wrong chip.
    Uh, you'll see that has absolutely nothing to do with microstuttering though.

  24. #499
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Note they never said anything about microstuttering; they just admitted they made a mistake with the type of bridge chip used. As to the reasons why, I don't know. I'm just re-affirming the fact that they did indeed state it.

  25. #500
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by zerazax View Post
    Note they never said anything about microstuttering; they just admitted they made a mistake with the type of bridge chip used. As to the reasons why, I don't know. I'm just re-affirming the fact that they did indeed state it.
    More like with the performance output.

    And the cards can always be faster and better designed.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

