
Thread: Radeon HD 5870 Six with 6 display outputs, prices, performance

  1. #126
    Xtreme Member
    Join Date
    Jan 2009
    Posts
    185

    Radeon HD 5870 has 1600 shaders

    http://www.fudzilla.com/content/view/15436/1/

    We finally figured out the final specification of the chip that we called RV870, and the fact that AMD plans to call it Radeon HD 5870 doesn't come as a big surprise. The chip works at 825MHz and has 1600 shaders, twice as many as the RV770, which on paper suggests the chip is two times faster than the year-old RV770.

    The chip has as many as 2.1 billion transistors, more than twice the 956 million the RV770 packs. The card uses GDDR5 memory clocked at 1.3GHz (5.2GHz effective in quad data rate mode) and can provide more than 150GB/s of bandwidth. Board power tops out at 180W, while in idle it drops down to 27W, roughly a third of the 90W the 4870 draws.

    By a rough specification-based estimate, the Radeon HD 5870 could end up two times faster than the Radeon HD 4870 but realistically, you should expect the new card to be faster by about 60 percent across the board.
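
    Quick back-of-the-envelope check of those numbers (a sketch only: the 256-bit bus and the usual 2 FLOPs per shader per clock are my assumptions, not something the article states):

    Code:
    # Rough sanity check of the rumoured HD 5870 figures above.
    # Assumptions: 256-bit memory bus (as on the HD 4870), 2 FLOPs per shader per clock (MAD).

    def mem_bandwidth_gbs(mem_clock_ghz, data_rate, bus_width_bits):
        """Effective memory bandwidth in GB/s."""
        return mem_clock_ghz * data_rate * bus_width_bits / 8

    def peak_tflops(shaders, core_clock_ghz, flops_per_shader=2):
        """Peak single-precision throughput in TFLOPS."""
        return shaders * flops_per_shader * core_clock_ghz / 1000

    print(mem_bandwidth_gbs(1.3, 4, 256))   # ~166 GB/s, consistent with "more than 150GB/s"
    print(peak_tflops(1600, 0.825))         # ~2.64 TFLOPS for the rumoured HD 5870
    print(peak_tflops(800, 0.750))          # ~1.20 TFLOPS for the RV770 in the HD 4870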

  2. #127
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    800
    Quote Originally Posted by SoulsCollective View Post
    As I said - DisplayPort.

    HDMI has more than adequate "bandwidth" to carry digital data at resolutions even higher than what we consider "extreme" now (ie. 2560), and I didn't think anyone would call the ability to carry full 7.1 audio data over the same cable "useless" (DisplayPort has the same functionality anyway - would you call it useless in that context?). Furthermore, HDMI supports xvYCC, which we should all be using anyway.

    But this isn't about HDMI><DP, this is about DP itself - the supporters are advocating it as "HDMI for your PC", which sounds neat, but the problem is the DRM - the number of consumer-level displays which are HDCP certified is tiny compared to the number of otherwise perfectly good displays that will be needlessly and pointlessly rendered useless if DP gains wide acceptance. If Big Content (R) and Big Beige Boxes (TM) get together and enforce this standard on us, we all lose.

    The point is, it isn't needed, it isn't even desirable when ordinary DVI works just fine for even the highest resolutions available today, and really it has no purpose other than to restrict consumer rights and choice.
    If it is an open standard, adding another port wouldn't cost much. Like largon said in the thread you made, there are adaptors, which would nullify your point bolded in the quote. They're not electrically compatible but adaptors still exist.

    http://www.monoprice.com/products/su...04&cp_id=10428

    xvYCC support: once we have monitors capable of it, I'd expect DisplayPort to support it as well.

    Well, here's my conclusion: DisplayPort will not affect anything that exists today, so why not embrace it?

    http://en.wikipedia.org/wiki/Display...tages_over_DVI

    Here are a few advantages. It really needs a citation link, though.

    http://www.edn.com/article/CA6594089.html

    The only problem I see is HDMI/DVI to DisplayPort, which isn't mentioned. Then again, if that ever happens, it means all TVs and computer monitors by that time will be using DisplayPort.

    Quote Originally Posted by jaredpace View Post
    First rumored benchmark is Crysis:

    HD5870 Crysis Benchmark Score
    CPU: AMD Phenom II X4 955BE
    Win 7 RTM
    VGA: HD5870 1GB

    Crysis 1900x1200 4AA +16 AF DX10 Very High
    min: 30 .**
    avg: 43 .**
    max: 54 .**



    http://74.125.159.132/translate_c?hl...ivsFsnZPqEd56Q



    Well, looks like it is faster than 4890 CrossFire and the GTX295

    Crysis in full detail isn't a dream anymore

    Did their memory management get more efficient? IIRC, 4xAA kills memory (though I admit, I've never tried 4xAA on Crysis myself).
    Last edited by blindbox; 09-10-2009 at 12:42 AM. Reason: Grammatical errors. trying not to get Off Topic

  3. #128
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Get the out of here! If those numbers are true I'm picking one of these up as soon as possible, regardless of price. That, and it'd be 9700 Pro part 2. It might just be abnormally good at Crysis, though.

    Damn, I hope those numbers are true.

  4. #129
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    The card rocks.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  5. #130
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    another comparison from that thread:


    PhII 955 + HD5870 1GB: 43FPS

    It appears it is literally two perfectly scaling 4890's in a dual core. woah.

    However, on the other hand, here is another perspective:


    edit: here is one more from THG:


    According to this, 4870x2 < GTX295 < GTX280 sli < 5870 1GB OC < GTX280 tri-sli < 4870x2 Quadfire < GTX295 quad sli
    Last edited by jaredpace; 09-10-2009 at 02:52 PM.
    Bring... bring the amber lamps.

  6. #131
    L-l-look at you, hacker.
    Join Date
    Jun 2007
    Location
    Perth, Western Australia
    Posts
    4,644
    Quote Originally Posted by blindbox View Post
    If it is an open standard, adding another port wouldn't cost much. Like largon said in the thread you made, there are adaptors, which would nullify your point bolded in the quote. They're not electrically compatible but adaptors still exist.

    xvYCC support: once we have monitors capable of it, I'd expect DisplayPort to support it as well.

    Well, here's my conclusion: DisplayPort will not affect anything that exists today, so why not embrace it?

    http://en.wikipedia.org/wiki/Display...tages_over_DVI

    Here are a few advantages. It really needs a citation link, though.

    http://www.edn.com/article/CA6594089.html

    The only problem I see is HDMI/DVI to DisplayPort, which isn't mentioned. Then again, if that ever happens, it means all TVs and computer monitors by that time will be using DisplayPort.
    I think you're missing the point. You can convert DP or MiniDP to DVI, sure, meaning you can connect a monitor to it no probs. But because of the DRM, you can't play back HD content on that monitor, even if the monitor is perfectly capable of doing so. If DP wasn't an option, HD content marketed at PC users would have to be DRM-free, because our current standards of DVI and VGA do not support HDCP. As for colour space, I don't think you can just add support at a later stage, but I could be wrong on that point, not being an expert on such things.

    DP has no advantages for the current market and will continue to have no advantages for the next few years at least. Pushing DP is all about content control and rights restriction, nothing more - it gives the end-user no benefit and there is no compelling reason for us to switch to it until HD media is regularly produced and available in resolutions above 2560*1600. And by that point I sincerely hope we've all abandoned this ridiculous screw-the-consumer DRM bullcrap.
    Rig specs
    CPU: i7 5960X Mobo: Asus X99 Deluxe RAM: 4x4GB G.Skill DDR4-2400 CAS-15 VGA: 2x eVGA GTX680 Superclock PSU: Corsair AX1200

    Foundational Falsehoods of Creationism



  7. #132
    Xtreme Addict
    Join Date
    Nov 2006
    Posts
    1,402
    super vga was the best. better than dual dvi link ...

  8. #133
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Will you guys take it somewhere else with the displayport crap?

  9. #134
    L-l-look at you, hacker.
    Join Date
    Jun 2007
    Location
    Perth, Western Australia
    Posts
    4,644
    Quote Originally Posted by madcho View Post
    super vga was the best. better than dual dvi link ...
    Er...no. Unless you mean WQUXGA, which is just silly and was to the best of my knowledge never actually implemented.
    Rig specs
    CPU: i7 5960X Mobo: Asus X99 Deluxe RAM: 4x4GB G.Skill DDR4-2400 CAS-15 VGA: 2x eVGA GTX680 Superclock PSU: Corsair AX1200

    Foundational Falsehoods of Creationism



  10. #135
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by jaredpace View Post
    another comparison from that thread:


    PhII 955 + HD5870 1GB: 43FPS

    It appears it is literally two perfectly scaling 4890's in a dual core. woah.

    However, on the other hand, here is another perspective:
    And also, the 2GB version of Cypress might put up some even higher numbers.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  11. #136
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    So 5850 is 1440SP + 725MHz = 2.08 TFlops... and 5870 is 1600SP + 850 (or 825?) MHz = 2.6-2.7TFlops?

    Wow, I can't even imagine what a 5870X2 would do.
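
    For what it's worth, that's just the usual shaders x 2 FLOPs x clock rule of thumb; here's the quick math, with the X2 line simply doubling the 5870 figure (ideal scaling, so treat it as an upper bound):

    Code:
    # peak TFLOPS = shaders * 2 FLOPs/clock * clock(MHz) / 1e6 -- a paper number, not a benchmark
    for name, sp, mhz in [("HD 5850", 1440, 725), ("HD 5870", 1600, 850), ("HD 5870 @ 825MHz", 1600, 825)]:
        print(name, round(sp * 2 * mhz / 1e6, 2), "TFLOPS")
    print("HD 5870 X2 (ideal 2x)", round(2 * 1600 * 2 * 850 / 1e6, 2), "TFLOPS")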

    As far as memory management goes... it appears that ATI has been better at memory management than the Nvidia equivalent. Take a look at benches of G92 chip cards with increased AA or increased memory vs. RV770 cards...

  12. #137
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Quote Originally Posted by SoulsCollective View Post
    Er...no. Unless you mean WQUXGA, which is just silly and was to the best of my knowledge never actually implemented.
    I believe it's called sarcasm!

  13. #138
    Xtreme Addict
    Join Date
    Jul 2002
    Location
    [M] - Belgium
    Posts
    1,744
    Crysis or Crysis Warhead. Not the same benchmark, not the same results, and between maps there is a huge difference; those numbers

    Crysis 1900x1200 4AA +16 AF DX10 Very High
    min: 30 .**
    avg: 43 .**
    max: 54 .**
    don't mean anything without more info about what benchmark was used; the fact that they refer to "DX10 Very High" means it's Crysis, not Crysis Warhead, so comparing the results to Crysis Warhead is apples vs oranges.
    Without info on what map it was; if it was a fly-by or not etc... no conclusion can be drawn from this; an average of 43 fps at 1920x1200 4xAA/16xAF can be had with a slightly OC'ed HD 4890 in Crysis... depending on the map/benchmark.


    Belgium's #1 Hardware Review Site and OC-Team!

  14. #139
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by jmke View Post
    Crysis or Crysis Warhead. Not the same benchmark, not the same results, and between maps there is a huge difference; those numbers



    don't mean anything without more info about what benchmark was used; the fact that they refer to "DX10 Very High" means it's Crysis, not Crysis Warhead, so comparing the results to Crysis Warhead is apples vs oranges.
    Without info on what map it was; if it was a fly-by or not etc... no conclusion can be drawn from this; an average of 43 fps at 1920x1200 4xAA/16xAF can be had with a slightly OC'ed HD 4890 in Crysis... depending on the map/benchmark.
    Well yes, we understand this. That's why I've posted random 1920 4xAA benchmarks from 4 different sites. There's no telling the map, or if it's regular/Warhead, or if it's 32/64-bit, 2GB Walmart RAM, 12GB Dominators, etc. etc. It's just to give you a relative idea, since there's nothing to compare it to...
    Bring... bring the amber lamps.

  15. #140
    Xtreme Addict
    Join Date
    Jul 2002
    Location
    [M] - Belgium
    Posts
    1,744
    no, you posted random Crysis Warhead benchmarks. not the same

    if a standard "demo" was used for that 43 avg, it would mean the HD 5870 will end up close to 2x as fast as GTX 280 and HD 4870 X2.


    Belgium's #1 Hardware Review Site and OC-Team!

  16. #141
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by jmke View Post
    no, you posted random Crysis Warhead benchmarks. not the same

    if a standard "demo" was used for that 43 avg, it would mean the HD 5870 will end up close to 2x as fast as GTX 280 and HD 4870 X2.
    no, w1zzard's are not warhead....

    Also, 4870x2 & gtx280 aren't the same speed in the first place. And yes, it looks to be 2x the speed of a gtx280.
    Last edited by jaredpace; 09-10-2009 at 01:29 AM.
    Bring... bring the amber lamps.

  17. #142
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    107
    If those numbers are from the first level and the ice level, then this card is golden. The flyby intro I think is the most brutal part of Crysis and even dual GPU can't maintain > 30fps. With a pair of SSC GTX285s, i7 @ 3.8ghz and X25-E SSD, during the flyby I was hitting lows of 27-28fps.

    As far as the beginning of the thread, if it's true, I hope the 25-40% increased performance number does not refer to the 4890 though, because then you are talking about the same speed as an overclocked GTX285.

    Lol, even if this card does conquer Crysis, it's been what, 3 years? Crysis holds the title of the longest-running game to totally devastate high-end PC gaming, lol. Crysis Warhead and Crysis are different though; IIRC the original Crysis is slightly more demanding, by ~5%.
    Last edited by astrallite; 09-10-2009 at 01:34 AM.

  18. #143
    Xtreme Mentor
    Join Date
    Feb 2007
    Location
    Oxford, England
    Posts
    3,433
    Quote Originally Posted by blindbox View Post
    Crysis in full detail isn't a dream anymore

    Did their memory management get more efficient? IIRC, 4xAA kills memory (though I admit, I've never tried 4xAA on Crysis myself).
    IIRC they've changed the way they do AA, so it could just be that the way it calculates it is better and more efficient =)

    Plus, I'm sure memory management will get tweaks
    "Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
    //James

  19. #144
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  20. #145
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    800
    Quote Originally Posted by SoulsCollective View Post
    I think you're missing the point. You can convert DP or MiniDP to DVI, sure, meaning you can connect a monitor to it no probs. But because of the DRM, you can't play back HD content on that monitor, even if the monitor is perfectly capable of doing so. If DP wasn't an option, HD content marketed at PC users would have to be DRM-free, because our current standards of DVI and VGA do not support HDCP. As for colour space, I don't think you can just add support at a later stage, but I could be wrong on that point, not being an expert on such things.

    DP has no advantages for the current market and will continue to have no advantages for the next few years at least. Pushing DP is all about content control and rights restriction, nothing more - it gives the end-user no benefit and there is no compelling reason for us to switch to it until HD media is regularly produced and available in resolutions above 2560*1600. And by that point I sincerely hope we've all abandoned this ridiculous screw-the-consumer DRM bullcrap.
    You can convert DP to HDMI. From what I see, it uses a packet-based delivery system, which would probably allow adding an extended colour gamut fairly easily (I'm no expert, though).

    I don't get it: DP can implement HDCP (and it does), so what incompatibility is there? They even mentioned backward compatibility as one of their main/important/strong points.
    Last edited by blindbox; 09-10-2009 at 02:04 AM. Reason: main/important/strong

  21. #146
    Registered User
    Join Date
    Feb 2009
    Posts
    32

    Techradar AMD Eyefinity Preview


  22. #147
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Dang, need more money to get screens

    Finally gaming as it was meant to be played on ati cards!
    Last edited by flopper; 09-10-2009 at 02:19 AM.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  23. #148
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    800
    I wish they'd post a close-up view of the card running. It's right there! Not in a case, c'mon, just a little closer

  24. #149
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Dang. Did no one bring a mirror-shot and a tele-zoom to one of the presentations? I'm getting sick and tired of these blurred shots taken with mobiles...
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  25. #150
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Here is a repost, since everyone is saying "dual-core" regarding the 5870:
    Dual core / MCM: this generation supports a new "hard" method of dual-GPU rendering that differs from the previous CrossFire implementation we see in R800 and in RV870 CrossFire situations. The Chinese translation is "dual core" or split-frame hardware. It has something to do with the way the shaders/ALUs etc. operate inside an individual GPU (a method of simultaneous operation in the hardware, similar to a dual core, yet not actually two dies in one package a la MCM). Perhaps there is no more real-time compiler in the driver and it's all handled at the hardware level by the scheduler.

    Because the core of the chip is so modular and scalable, with certain areas sharing parts of the die (ROPs + memory controller logic), you are able to divide the specs in half (1600/256/80/32 to 800/128/40/16) and have two parts (RV870/RV830), and RV870 appears as two RV830s, yet it is still only one die. Hence the term "dual-core" - more like "modular". Some people object that all GPUs are multi-core anyway, because each of the shaders is like a core by itself; well, here we are dealing with two large arrays of 800 shaders, along with other standalone logic that communicates with either one of the two identical arrays.

    More specifically, one RV870 die is composed of two RV830-like 160x5 clusters/shader arrays (like two current-gen revamped RV770s in one die) sharing certain features but connected via "internal CrossFire" working in unison, and the entire design is a continuation of the R600 architecture. It is load balanced, efficient, and requires no CrossFire software optimization (because it is hardware-level communication); it works via SFR (8x8 up to 32x32 tiles), and is bandwidth intensive. The board uses next-gen 5GHz GDDR5 to provide the required bandwidth.

    So, apparently they've slapped together two 40nm RV770s... and it's easy to see where the "dual-core" confusion comes from. Cypress is like a 40nm 4890 X2 in one die - like a "native dual core" CPU. Now that specs are known for Cypress and Juniper (RV870 and RV830), you can expect the remaining parts, Cedar and Redwood (RV840 and RV810), to be 3/4 Cypress and 1/4 Cypress respectively, and Hemlock (R800) to be 2x Cypress in the same fashion as the HD 4870 X2 on a single PCB. So Cypress is like a Larrabee, except that it uses 2 RV770-style cores, whereas Larrabee uses several P54C cores.

    (The image posted earlier isn't an actual RV870 die shot, rather an artist's rendition. When you do see an actual one, it will look more like the first two images, i.e. a symmetrical reflection of two identical core areas over a center axis, rather than the third picture, which is an actual die shot of RV770. Notice in the 3rd pic that RV770 is asymmetrical by design, not resembling a dual-core architecture.)
    http://www.xtremesystems.org/forums/...&postcount=174
    I wrote this a few weeks ago about the RV870 core, based on what I was reading about it at the time. In the bolded part, the report I read said that it was "200% efficient" in scaling between the two cores because of zero overhead.
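
    If the SFR-between-two-arrays idea above is right, a toy model of it would look something like this (purely illustrative: the 32x32 tile size comes from the post, but the checkerboard assignment and everything else is my guess, nothing confirmed about Cypress):

    Code:
    # Toy illustration of SFR-style work splitting between two identical shader arrays.
    # Tile size per the post (8x8 up to 32x32); the checkerboard assignment is made up.

    TILE = 32

    def split_frame(width, height, tile=TILE):
        """Assign each tile of a width x height frame to shader array 0 or 1."""
        assignment = {}
        for ty in range(0, height, tile):
            for tx in range(0, width, tile):
                assignment[(tx, ty)] = ((tx // tile) + (ty // tile)) % 2
        return assignment

    tiles = split_frame(1920, 1200)
    per_array = [sum(1 for a in tiles.values() if a == i) for i in (0, 1)]
    print(per_array)   # [1140, 1140] -- an even split is what "perfect scaling" would need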
    Bring... bring the amber lamps.
