
Thread: ATI HD4800 Review Thread


  1. #1
    Xtreme Enthusiast Kai Robinson's Avatar
    Join Date
    Oct 2007
    Location
    East Sussex
    Posts
    831
    AliG - 160GB/sec? Sorry, but you've been on the funny sauce again. It's a 48-lane PCIe 2.0 bridge chip from either PLX or IDT (more than likely PLX). That's 16 lanes for each GPU and 16 lanes for the bus connection, which means at most 32GB/sec of bandwidth between the GPUs.
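    [Editor's note: the arithmetic behind that 32 GB/s ceiling can be sketched quickly. The lane counts are from the post above; the per-lane rate is the PCIe 2.0 spec figure (5 GT/s with 8b/10b encoding), and the 32 GB/s total assumes both directions of both x16 GPU links are summed.]

    ```python
    # PCIe 2.0 runs each lane at 5 GT/s with 8b/10b line coding,
    # so only 8 of every 10 transferred bits are payload.
    GT_PER_S = 5.0
    ENCODING_EFFICIENCY = 8 / 10

    per_lane_gbit_s = GT_PER_S * ENCODING_EFFICIENCY        # 4 Gbit/s per lane, per direction
    per_lane_mb_s = per_lane_gbit_s * 1000 / 8              # 500 MB/s per lane, per direction

    lanes_per_gpu = 16
    per_link_gb_s = per_lane_mb_s * lanes_per_gpu / 1000    # x16 link: 8 GB/s per direction
    bidirectional_per_link = per_link_gb_s * 2              # 16 GB/s counting both directions
    both_gpu_links = bidirectional_per_link * 2             # 32 GB/s across the two GPU links

    print(per_link_gb_s, bidirectional_per_link, both_gpu_links)  # 8.0 16.0 32.0
    ```

    So 160 GB/s is indeed not achievable through the bridge; 32 GB/s is the most generous way to count it.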

    Main Rig

    Intel Core i7-2600K (SLB8W, E0 Stepping) @ 4.6Ghz (4.6x100), Corsair H80i AIO Cooler
    MSI Z77A GD-65 Gaming (MS-7551), v25 BIOS
    Kingston HyperX 16GB (2x8GB) PC3-19200 Kit (HX24C11BRK2/16-OC) @ 1.5v, 11-13-13-30 Timings (1:8 Ratio)
    8GB MSI Radeon R9 390X (1080 Mhz Core, 6000 Mhz Memory)
    NZXT H440 Case with NZXT Hue+ Installed
    24" Dell U2412HM (1920x1200, e-IPS panel)
    1 x 500GB Samsung 850 EVO (Boot & Install)
    1 x 2Tb Hitachi 7K2000 in External Enclosure (Scratch Disk)


    Entertainment Setup

    Samsung Series 6 37" 1080p TV
    Gigabyte GA-J1800N-D2H based media PC, Mini ITX Case, Blu-Ray Drive
    Netgear ReadyNAS104 w/4x2TB Toshiba DTACA200's for 5.8TB Volume size

    I refuse to participate in any debate with creationists because doing so would give them the "oxygen of respectability" that they want.
    Creationists don't mind being beaten in an argument. What matters to them is that I give them recognition by bothering to argue with them in public.

  2. #2
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Kai Robinson View Post
    AliG - 160GB/sec? Sorry, but you've been on the funny sauce again. It's a 48-lane PCIe 2.0 bridge chip from either PLX or IDT (more than likely PLX). That's 16 lanes for each GPU and 16 lanes for the bus connection, which means at most 32GB/sec of bandwidth between the GPUs.
    Don't forget the "magic" CrossFireX sideport.

  3. #3
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by Kai Robinson View Post
    AliG - 160GB/sec? Sorry, but you've been on the funny sauce again. It's a 48-lane PCIe 2.0 bridge chip from either PLX or IDT (more than likely PLX). That's 16 lanes for each GPU and 16 lanes for the bus connection, which means at most 32GB/sec of bandwidth between the GPUs.
    That's what the ATI representative said; don't shoot the messenger. I'll dig up the link. The only thing I don't get is why ATI needs a fancy bridge chip when the 9800GX2 clearly did fine without one; all it does is add power consumption.

    And you said I'm on the funny sauce again; when was the first time?
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  4. #4
    Xtreme Enthusiast
    Join Date
    Dec 2004
    Location
    Bucharest, Romania
    Posts
    656
    http://www.hardforum.com/showthread.php?t=1319753

    2x4870 real life comparison against GTX 280 from an owner. Nicely written IMO.
    Intel i7 920 d0 @ 4410MHz @ 1.36v :: Prolimatech Mega Shadow :: Gigabyte EX58-UD5 F9K :: 6GB Mushkin XP3-15000 :: HIS 5870 :: Corsair 1000W :: HannsG 27.5" :: Lian Li V1010B

  5. #5
    Xtreme Mentor
    Join Date
    Nov 2005
    Location
    Devon
    Posts
    3,437
    Quote Originally Posted by AriciU View Post
    http://www.hardforum.com/showthread.php?t=1319753

    2x4870 real life comparison against GTX 280 from an owner. Nicely written IMO.
    That was interesting info for me:

    The GTX 200 series power saving features do NOT work on an Intel platform. That's right. I thought I was the odd one out when I saw countless other websites posting about the down-clocks and down-volts, whereby the core settles at 300MHz and the memory at 100MHz when no 3D app is running. What some of them failed to state was that they only achieved this on an Nvidia platform, while ALL the reviews out there failed to warn consumers that this does NOT work on an Intel platform at all. The write-ups out there further confuse you by stating that the downclocks are achieved in drivers, while only the full HybridPower features, i.e. shutting down the GPU and offloading 2D to the onboard IGP, require an Nvidia platform. They just don't know what they are talking about.

    This was confirmed after a 30 minute phone conversation with an XFX engineer, the manufacturer of my GTX 280.

    What this translates to is that none of the idle power consumption graphs you saw in reviews of the GTX 200 series apply to an Intel platform. Again, out of the 15-odd reviews out there, not one, I repeat, NOT A SINGLE ONE, bothered to state this clearly. What's the user footprint in the real world with an Nvidia platform? 0.1% of the PC population?

    The GPU temp does drop back to the 50s while idling in Windows, but the clocks do not scale back, which leads me to believe there are no real power savings.
    Can anyone confirm this from personal experience?
    RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SDD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W

    RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingstone HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU

    SmartPhone Samsung Galaxy S7 EDGE
    XBONE paired with 55'' Samsung LED 3D TV

  6. #6
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Lightman View Post
    That was interesting info for me:



    Can anyone confirm this from personal experience?
    Well, I know my GTX 280 downclocks fine on an X38 and P35 chipset. The power consumption figures I have seen follow the trend of it working when at idle.

  7. #7
    Xtreme Member
    Join Date
    Jan 2007
    Location
    WI
    Posts
    316
    If you mean clocking down when not in 3D, that works just fine for me on an Intel chipset.

  8. #8
    Registered User
    Join Date
    Mar 2008
    Location
    Finland
    Posts
    35
    Quote Originally Posted by AriciU View Post
    http://www.hardforum.com/showthread.php?t=1319753

    2x4870 real life comparison against GTX 280 from an owner. Nicely written IMO.
    "Sorry for the wall of text, I hope you found the above a little more useful than a bunch of graphs from some teenage “me-too” review sites."

    What have hardware sites ever done to this guy??
    Case: NZXT Lexa
    PSU: Corsair HX620
    Memory: G.Skill F2-8000CL5D-4GBPQ (2GB x 2)
    CPU cooler: ThermalRight Ultra-120 EXTREME with Noctua NF-P12
    Motherboard: Asus Rampage Formula (BIOS 219, made in china )
    Processor: E2160
    Graphics card: Club3d 9800 GTX

    Input: Logitech G15 v2 + Logitech G9

  9. #9
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    I stopped reading when he attacked reviewers' credibility and then proceeded to use his "trusty right foot" for temperature testing.

    That, along with many other flaws/contradictions.

  10. #10
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by Tonucci View Post
    I stopped reading when he attacked reviewers' credibility and then proceeded to use his "trusty right foot" for temperature testing.

    That, along with many other flaws/contradictions.
    What flaws/contradictions did you observe?
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  11. #11
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Quote Originally Posted by STaRGaZeR View Post
    What flaws/contradictions did you observe?
    Sorry, I'm not going through that again. I'd have to re-read it, taking notes.

  12. #12
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by AliG View Post
    that's what the ATI representative said, don't shoot the messenger. I'll dig up the link. Only thing I don't get is why does ati need a fancy bridge chip when the 9800gx2 clearly did fine without one, all it does is add power consumption.

    And you said I'm on the funny sauce again, when was the first time?
    X2 cards aren't regular CrossFire, the GX2 isn't CrossFire at all, and CrossFire works in a different way than SLI does. Does this help?

  13. #13
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by Calmatory View Post
    X2 cards aren't regular CrossFire, the GX2 isn't CrossFire at all, and CrossFire works in a different way than SLI does. Does this help?
    I'm serious. I believe the only reason they have the bridge chip is to increase the bandwidth between the two GPUs for a faster interconnect, as they've claimed no micro-stuttering with both the 3870 X2 and 4870 X2. Of course, they later admitted the micro-stuttering on the 3870 X2 was horrible because the bridge chip didn't provide enough bandwidth (8 GB/s) and the extra latency made it worse than regular CrossFire.
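    [Editor's note: the 8 GB/s figure quoted here is consistent with the 3870 X2's bridge being a PCIe 1.x part: a x16 link at 2.5 GT/s, counted bidirectionally, comes to exactly 8 GB/s, and moving to a PCIe 2.0 bridge doubles it. A quick sketch, assuming 8b/10b encoding on both generations:]

    ```python
    def x16_bidirectional_gb_s(gt_per_s):
        """Aggregate (both directions) bandwidth of a x16 link in GB/s.

        8b/10b line coding means 8 payload bits per 10 transferred bits.
        """
        per_lane_gb_s = gt_per_s * (8 / 10) / 8   # GB/s per lane, per direction
        return per_lane_gb_s * 16 * 2             # 16 lanes, both directions

    pcie11 = x16_bidirectional_gb_s(2.5)  # PCIe 1.x bridge (3870 X2)
    pcie20 = x16_bidirectional_gb_s(5.0)  # PCIe 2.0 bridge (4870 X2)
    print(pcie11, pcie20)  # 8.0 16.0
    ```

    That doubling alone would explain why AMD swapped in a PCIe 2.0 bridge chip for the 4870 X2.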
