
Thread: Official HD4870X2 (R700) Review Thread


  1. #1
    Xtreme Member
    Join Date
    Mar 2008
    Posts
    332
    Quote Originally Posted by SKYMTL View Post
    CPU bottleneck is DEFINITELY an issue this time around, there is no doubt about that. If one card is bottlenecked by a QX9770 at 3.85GHz at some resolutions, I don't think there is a CPU on this planet that is 24/7 stable which can take advantage of two of those cards.
    What about one card?

    This card is incredible; well, it would be if it were a Crysis killer.

    Anyway, would a Q9450 @ 3.60GHz bottleneck the card @ 1920x1200?

    I'm finding it hard to believe two of these cards could be bottlenecked @ 1600p with a 4GHz dual.

  2. #2
    Xtreme Member
    Join Date
    Feb 2008
    Posts
    192
    Seriously guys, Crysis is very playable with this card at 22" monitor size @ DX10 Very High:

    (benchmark screenshot not preserved)

    At this fps it is totally smooth and playable. It will be interesting to see what the performance is like once the Sideport is enabled in the drivers, and whether or not it would help a title like Crysis. EDIT -

    Just seen this:

    http://www.anandtech.com/video/showdoc.aspx?i=3372&p=3

    The CrossFire Sideport is simply another high-bandwidth link between the GPUs. Data can be sent between them via a PCIe switch on the board, or via the Sideport. The two aren't mutually exclusive; using the Sideport doubles the amount of GPU-to-GPU bandwidth on a single Radeon HD 4870 X2. So why disable it?

    According to AMD the performance impact is negligible: while average frame rates don't see a gain, every now and then you'll see a boost in minimum frame rates. There's also an issue where power consumption could go up enough that you'd run out of power on the two PCIe power connectors on the board. Board manufacturers also have to lay out the additional lanes on the graphics card connecting the two GPUs, which does increase board costs (although ever so slightly).

    AMD decided that since there's effectively no performance increase, yet there's an increase in power consumption and board costs, it would make more sense to leave the feature disabled.

    The reference 4870 X2 design includes hardware support for the CrossFire Sideport, assuming AMD would ever want to enable it via a software update. However, there's no hardware requirement that the GPU-to-GPU connection is included on partner designs. My concern is that in an effort to reduce costs we'll see some X2s ship without the Sideport traces laid out on the PCB, and then if AMD happens to enable the feature in its drivers later on some X2 users will be left in the dark.

    I pushed AMD for a firm commitment on how it was going to handle future support for Sideport and honestly, right now, it's looking like the feature will never get enabled. AMD should have never mentioned that it ever existed, especially if there was a good chance that it wouldn't be enabled. AMD (or more specifically ATI) does have a history of making a big deal of GPU features that never get used (Truform anyone?), so it's not too unexpected but still annoying.
    Looks like it might be a long wait!!

    EDIT - Just saw that you posted this news earlier in the thread Fornowagain, sorry mate, my bad.
    Last edited by Mumid; 08-12-2008 at 05:13 AM. Reason: additional info
    E8400 Q15A @ 8.5x500=4250Mhz with 1.28 vcore -- TRUE 120 with Scythe Ultra Kaze in push pull
    ASUS P5Q Deluxe 1702 Bios -- PC Power & Cooling Silencer QUAD Crossfire 750w
    2x1GB Crucial Ballistix PC6400 (16FD5) & 2x1GB Crucial Ballistix PC8500 (16FD5) @ 1000mhz 4-4-4-12 PL9 @ 2.17v real with Corsair Dominator Active Cooling
    ATI HD4870x2 Cat 9.1 - Custom modded ASUS TOP Bios 832/1000
    2 x Samsung F1 1TB + 1 x Samsung F1 320GB -- X-Fi Xtreme Music -- Logitech G15 v2 + Razer Lachesis
    Dual Boot XP 32 SP2 + Vista 64 Ultimate SP1 -- Samsung T220 22" Monitor (Samsung Panel) + Samsung LEA656 40" 1080p TV

  3. #3
    Xtreme Addict
    Join Date
    Jul 2002
    Location
    [M] - Belgium
    Posts
    1,744
    Quote Originally Posted by Mumid View Post
    Seriously guys, Crysis is very playable with this card at 22" Monitor size
    The HD4870X2 is definitely meant for use with 1920x1200 and preferably 2560x1600; so that's 24"~27" and 30" TFT screens.

    At 22" 1680x1050 you are better off buying an HD4870/GTX 260, which will offer a better price/performance point; unless you really, really want to use 24xAA/16xAF in all your games (except Crysis) on your 22".


    Belgium's #1 Hardware Review Site and OC-Team!

  4. #4
    Xtreme Member
    Join Date
    Feb 2008
    Posts
    192
    Quote Originally Posted by jmke View Post
    The HD4870X2 is definitely meant for use with 1920x1200 and preferably 2560x1600; so that's 24"~27" and 30" TFT screens.

    At 22" 1680x1050 you are better off buying an HD4870/GTX 260, which will offer a better price/performance point; unless you really, really want to use 24xAA/16xAF in all your games (except Crysis) on your 22".
    Oh yeah, defo mate, if price/performance is your main objective; however, that isn't the case for all of us!

    Besides, I also play many PC games @ 1080p on my 40" HDTV.

  5. #5
    Xtreme Addict
    Join Date
    Jul 2002
    Location
    [M] - Belgium
    Posts
    1,744
    Quote Originally Posted by Mumid View Post
    Oh yeah, defo mate, if price/performance is your main objective; however, that isn't the case for all of us!

    Besides, I also play many PC games @ 1080p on my 40" HDTV.
    you didn't mention that last part ;-)
    how far away are you from that 40" HDTV? Because at 3+ metres the difference between 1080p and 720p is nil, and you can just run at 1280x720 with all the AA/AF turned up.
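
    The distance argument is easy to sanity-check with a little geometry. A rough sketch in Python follows; the 16:9 panel shape and the ~1 arcminute figure for normal visual acuity are my assumptions, not numbers from the thread:

    ```python
    import math

    def pixel_arcmin(diagonal_in, horizontal_px, distance_m):
        """Angular size of one pixel, in arcminutes, at a given viewing distance.

        Assumes a 16:9 panel; diagonal is in inches, distance in metres.
        """
        width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)  # panel width
        pitch_m = width_m / horizontal_px                        # one pixel's width
        return math.degrees(math.atan2(pitch_m, distance_m)) * 60

    # On a 40" 1080p panel, a pixel spans well over 1 arcmin at ~1 m
    # (individual pixels are resolvable, so 1080p vs 720p is visible),
    # but drops under 1 arcmin at 3 m, where the two look much the same.
    near = pixel_arcmin(40, 1920, 1.0)
    far = pixel_arcmin(40, 1920, 3.0)
    ```

    So both posters can be right: at a metre from a 40" panel the extra resolution is visible, at sofa distance it mostly isn't.
    
    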



  6. #6
    Xtreme Member
    Join Date
    Feb 2008
    Posts
    192
    Quote Originally Posted by jmke View Post
    you didn't mention that last part ;-)
    how far away are you from that 40" HDTV? Because at 3+ metres the difference between 1080p and 720p is nil, and you can just run at 1280x720 with all the AA/AF turned up.
    About a metre away when I'm playing games on it. Sure beats the graphics a 360 puts out; everything rendered in true 1080p instead of upscaled, lol. Makes me laugh when the 360 and PS3 advertise 'full HD' as one of their features.

  7. #7
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by y2kbos View Post
    What about one card?

    This card is incredible; well, it would be if it were a Crysis killer.

    Anyway, would a Q9450 @ 3.60GHz bottleneck the card @ 1920x1200?

    I'm finding it hard to believe two of these cards could be bottlenecked @ 1600p with a 4GHz dual.
    First of all, trust me...they are bottlenecked, which is why I refused to add any CrossFireX testing to my review. I did a few short tests on my system at my nominal 3.85GHz and a somewhat quick and unstable overclock to 4.2GHz, and I saw some very significant increases in framerates with a pair of these cards. We are talking about increases above and beyond what an additional 400MHz overclock can achieve.

    As for a quad @ 3.60GHz bottlenecking it @ 24" resolution: if you start jacking up the AA and AF then no, but if you run without higher IQ presets, then there is that chance. However, even IF it gets bottlenecked you are still pretty much guaranteed high performance.
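
    The bottleneck effect described above can be illustrated with a toy frame-time model (the numbers below are purely illustrative, not from the review): if CPU and GPU work overlap, each frame takes as long as the slower of the two stages, so once the CPU stage dominates, a second GPU adds nothing while a faster CPU still lifts the cap.

    ```python
    def fps(cpu_ms, gpu_ms, num_gpus=1):
        """Frames per second in a toy pipeline where CPU and GPU stages
        overlap perfectly; the slower stage sets the pace."""
        frame_ms = max(cpu_ms, gpu_ms / num_gpus)  # bottleneck stage
        return 1000.0 / frame_ms

    # Hypothetical per-frame costs: 10 ms of CPU work, 12 ms of GPU work.
    single = fps(10.0, 12.0, num_gpus=1)  # GPU-bound: ~83 fps
    dual = fps(10.0, 12.0, num_gpus=2)    # now CPU-bound: capped at 100 fps
    overclocked = fps(10.0 / 1.1, 12.0, num_gpus=2)  # ~10% faster CPU: 110 fps
    ```

    In this model, adding a second GPU only gains 17 fps while a modest CPU overclock gains another 10, which matches the pattern of CrossFireX scaling better with CPU clocks than with GPU count.
    
    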

  8. #8
    Xtreme Member
    Join Date
    Oct 2005
    Posts
    462
    Quote Originally Posted by SKYMTL View Post
    First of all, trust me...they are bottlenecked, which is why I refused to add any CrossFireX testing to my review. I did a few short tests on my system at my nominal 3.85GHz and a somewhat quick and unstable overclock to 4.2GHz, and I saw some very significant increases in framerates with a pair of these cards. We are talking about increases above and beyond what an additional 400MHz overclock can achieve.

    As for a quad @ 3.60GHz bottlenecking it @ 24" resolution: if you start jacking up the AA and AF then no, but if you run without higher IQ presets, then there is that chance. However, even IF it gets bottlenecked you are still pretty much guaranteed high performance.
    I agree completely. My graph seems to confirm some serious CPU bottleneck action going on.

    New EOCF SuperPi thread! Post your scores here
    PCProfile ClubOC ClubNBOC
