
Thread: HD5800 series and HD5700 series not yet at full performance

  1. #51
    Xtreme Mentor
    Join Date
    Feb 2008
    Posts
    2,671
    Quote Originally Posted by flopper View Post
    However, we might see a beneficial boost using DX11, which is why I got the card and not a 4800 series one.
    This is not a 'might' but a definite 'will'.

    http://www.driverheaven.net/articles...d=140&pageid=4

  2. #52
    Xtreme Member
    Join Date
    Mar 2005
    Location
    TX, USA
    Posts
    308
    Quote Originally Posted by 96redformula7 View Post
    I expect gains from more mature drivers, but minimal ones at most. I remember the days of the X1800 and the supposed "miracle" driver that never came.
    Actually, IIRC, when the X1800XT first launched it had performance on par with the 256MB 7800GTX. 2-3 months later, it had a nice lead over the GTX, to the tune of 10-20%. It isn't really far-fetched to think we haven't seen the full potential of the 5000 series cards yet. One can hope anyway.
    2600k@4.9Ghz 1.42V | Asrock Ext 3 Gen 3 | Custom H20 | 16GB PC3 1600 | 7970@1375/1600

  4. #54
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    It isn't anything new that NVIDIA cards come out on top in UE3 games.

  5. #55
    Visitor
    Join Date
    May 2008
    Posts
    676
    That's just an example where there will probably be a big performance boost once the drivers are sorted out. I remember something rather bad with the 4870 in WAW and Far Cry 2 as well; those games also showed some good improvements after driver updates.

  6. #56
    Xtreme Member
    Join Date
    Oct 2009
    Posts
    335
    Quote Originally Posted by cx-ray View Post
    That's just an example where there will probably be a big performance boost once the drivers are sorted out. I remember something rather bad with the 4870 in WAW and Far Cry 2 as well; those games also showed some good improvements after driver updates.

    I'm thinking the "capacity" and "monster" wording from the source is hinting at some type of major breakthrough in the actual performance of the hardware, one that would benefit all rendering. I imagine they figured out an algorithm for maximizing power control for improved overclocking.

    It's all speculation at this point in time. Nothing is certain about this.

  7. #57
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Helloworld_98 View Post
    It isn't anything new that NVIDIA cards come out on top in UE3 games.
    deferred rendering + warps = fast

    This would apply to CryEngine 3 and several other powerful 3D engines.
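
    To put some flesh on that one-liner: a deferred renderer's lighting pass is a loop over lights per pixel, and most of each iteration is a short dependent chain of scalar math. Here's a minimal C sketch of that shape (my own illustration; the names and numbers are made up, not from the thread) and why it suits scalar-issue warps better than 5-wide VLIW:

    ```c
    #include <math.h>

    typedef struct { float x, y, z; } Vec3;

    /* Hypothetical G-buffer sample for one screen pixel. */
    typedef struct { Vec3 pos, normal, albedo; } GBufferPixel;
    typedef struct { Vec3 pos, color; float radius; } Light;

    static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    /* Deferred lighting: loop over lights for one pixel. Each iteration
     * is a short chain -- subtract, dot, rsqrt, dot, multiply-add -- where
     * most steps depend on the previous one. Scalar SIMT hardware (warps)
     * issues one such op per thread per cycle at full rate; a 5-wide VLIW
     * core only stays full if the compiler finds ~5 independent ops, and
     * the vec3 component math here offers about 3. */
    Vec3 shade_pixel(const GBufferPixel *g, const Light *lights, int n)
    {
        Vec3 out = {0.0f, 0.0f, 0.0f};
        for (int i = 0; i < n; i++) {
            Vec3 L = { lights[i].pos.x - g->pos.x,   /* 3 independent subs */
                       lights[i].pos.y - g->pos.y,
                       lights[i].pos.z - g->pos.z };
            float d2  = dot3(L, L);                  /* depends on L  */
            float inv = 1.0f / sqrtf(d2);            /* depends on d2 */
            Vec3 Ln = { L.x * inv, L.y * inv, L.z * inv };
            float ndotl = dot3(g->normal, Ln);       /* depends on Ln */
            if (ndotl > 0.0f && d2 < lights[i].radius * lights[i].radius) {
                out.x += g->albedo.x * lights[i].color.x * ndotl;
                out.y += g->albedo.y * lights[i].color.y * ndotl;
                out.z += g->albedo.z * lights[i].color.z * ndotl;
            }
        }
        return out;
    }
    ```

    The vec3 component math fills roughly three of five VLIW slots and the normalize chain fills one, while scalar SIMT hardware pays no packing tax at all.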

  8. #58
    Xtreme Member
    Join Date
    Aug 2009
    Posts
    278
    For Borderlands you need to get into the config files to set things up properly for widescreen; the FOV should be around 105-110 for 16:10. There are also a bunch of other features, left out even when you use max settings, that you can enable for better eye candy, just like in UT3.
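
    For reference, the usual widescreen fix looks roughly like this; the file path, section name, and key binding below are from memory and may differ by game version, so treat them as an illustration rather than gospel:

    ```ini
    ; Documents\My Games\Borderlands\WillowGame\Config\WillowInput.ini
    ; (file path, section, and key name from memory -- verify locally)
    [WillowGame.WillowPlayerInput]
    ; Bind a key that forces the field of view; 105-110 suits 16:10.
    Bindings=(Name="F10",Command="FOV 110")
    ```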

  9. #59
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    TWIMTBP...

  10. #60
    Registered User
    Join Date
    Jan 2009
    Posts
    26
    Interesting if true... OK guys, correct me if I'm wrong, but here's my hypothesis.

    What if the 4800 series, and now the 5800 series, haven't been utilizing their shader cores (SPs) to their full potential since release? A lot of people said Crysis wasn't optimized for ATI, but the 4870 has 160x5 shaders. What if it was only using 160x1 of them, because (I'm not sure) most games weren't written to use more groups (types) of shaders, whether through "TWIMTBP" or because programmers only targeted one shader "type"? The Radeon might then only be using 20% of its shader power. That's a difference from NVIDIA, whose SPs are not "multi-unit" and come in higher numbers (x1) at higher clocks. It could also all have been in the drivers, though. It could "explain" why the 5870 is "underperforming" (according to users who have the card; I don't have one myself).

    http://www.behardware.com/news/9972/...dx9-cards.html
    The programmer doesn't control how the tessellation unit itself calculates, but works with two new types of shaders, the Hull Shader and the Domain Shader, which come before and after the tessellation unit in the pipeline. This makes DirectX 11 tessellation a superset of the Radeon HDs' tessellation. Another way of putting it is that developers who use it on the Radeon HDs today will be able to reuse and extend their work with DirectX 11, which means they might finally get interested.
    Maybe DirectX 11-enabled games will now use most of the shader groups inside the ATI GPUs, and programmers will be able to offload compute shader tasks and other work onto the other SP groups...?

    Maybe ATI has a new driver that's very efficient in DX11. The SPs would be able to process physics calculations with compute shaders and throw a big middle finger at PhysX. CrossFire might also scale much better with DX11.

    I'm trying to get gurus' opinions; share your thoughts.
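
    To make the compute shader idea concrete, here's a minimal C sketch (my own illustration, not anything from ATI) of the kind of per-particle physics step a DirectX 11 compute shader could take over from the CPU, or from a vendor API like PhysX; on the GPU the loop disappears and each iteration becomes one thread:

    ```c
    /* One explicit-Euler step for a particle system: the sort of
     * embarrassingly parallel work a DX11 compute shader could run.
     * On the GPU the loop vanishes: one thread per particle. */
    typedef struct { float px, py, pz, vx, vy, vz; } Particle;

    void integrate(Particle *p, int n, float dt)
    {
        const float g = -9.81f;            /* gravity, m/s^2 */
        for (int i = 0; i < n; i++) {      /* becomes one GPU thread each */
            p[i].vy += g * dt;
            p[i].px += p[i].vx * dt;
            p[i].py += p[i].vy * dt;
            p[i].pz += p[i].vz * dt;
        }
    }
    ```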
    Last edited by grimeleven; 10-31-2009 at 09:50 AM.
    CM HAF932
    Antec TP3 650Watts
    eVGA X58 3xSLI
    Intel Core i7 920@3.5Ghz w/ TRUE X 120
    6x1GB Aeneon Xtune DDR3-1866
    Visiontek HD4870X2 2GB w/ AC Xtreme cooler
    OCZ 120GB SSD
    2TB Hitachi
    Creative X-FI Titanium FATAL1TY
    Saitek Eclipse 3 Keyboard
    Seven Ultimate 64bit
    Samsung 32inch LCD 1080p

  11. #61
    Xtreme Member
    Join Date
    Oct 2009
    Posts
    335
    Quote Originally Posted by grimeleven

    I'm trying to get gurus' opinions; share your thoughts.

    Are you sure you're not being a troll or spewing FUD?

    A lot to think about. At this point it's all speculation; we need official confirmation of the information in the source.

  12. #62
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by grimeleven View Post
    Interesting if true... OK guys, correct me if I'm wrong, but here's my hypothesis.

    What if the 4800 series, and now the 5800 series, haven't been utilizing their shader cores (SPs) to their full potential since release? A lot of people said Crysis wasn't optimized for ATI, but the 4870 has 160x5 shaders. What if it was only using 160x1 of them, because (I'm not sure) most games weren't written to use more groups (types) of shaders, whether through "TWIMTBP" or because programmers only targeted one shader "type"? The Radeon might then only be using 20% of its shader power. That's a difference from NVIDIA, whose SPs are not "multi-unit" and come in higher numbers (x1) at higher clocks. It could also all have been in the drivers, though. It could "explain" why the 5870 is "underperforming" (according to users who have the card; I don't have one myself).
    The utilization depends on the game. More complex code will not pack enough instructions; usually games will fill 3-4 slots of each VLIW core, and in bad cases it can be lower. It's not like you are going to get more ILP out of games from a magic driver update, and TWIMTBP has nothing to do with it. Remember unified shaders? NVIDIA does not use vectors, so each shader gets much higher utilization. Both ATI and NVIDIA design their architectures around what programmers are doing, and they take almost opposite approaches. You will not see a huge increase just from a driver update.
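
    To illustrate the packing point with a toy example (mine, not Chumbucket843's): a 5-wide VLIW core can issue up to five independent operations per cycle, so the compiler's job is to find independence. Per-component vector math packs well; a dependent chain does not, and no driver can invent parallelism the shader code doesn't have:

    ```c
    /* Toy view of VLIW5 packing (conceptual, not a real ISA). */

    /* Independent per-component math: the compiler can co-issue the
     * three multiplies in one bundle -> 3 of 5 slots busy. */
    void scale3(float out[3], const float in[3], float s)
    {
        out[0] = in[0] * s;   /* slot 0 */
        out[1] = in[1] * s;   /* slot 1 */
        out[2] = in[2] * s;   /* slot 2; slots 3-4 idle */
    }

    /* Dependent chain: each multiply needs the previous result, so only
     * one slot fills per cycle -> 1 of 5 busy, the "160x1" worst case.
     * A driver cannot add independence the shader code does not have. */
    float chain(float x)
    {
        float a = x * x;      /* cycle 1 */
        float b = a * x;      /* cycle 2, waits on a */
        float c = b * x;      /* cycle 3, waits on b */
        return c;
    }
    ```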

  13. #63
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by AbelJemka View Post
    TWIMTBP...
    I don't care if it's TWIMTBP or not. I want performance.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  14. #64
    Registered User
    Join Date
    Jan 2009
    Posts
    26
    Quote Originally Posted by SonDa5 View Post
    Are you sure you're not being a troll or spewing FUD?

    A lot to think about. At this point it's all speculation; we need official confirmation of the information in the source.
    Hehe, no. By the way, that's why I said "...here's my hypothesis"; a constructive post in my book :P

    I'm no engineer in this; I expected to get clarification, since this seemed intriguing to me. I agree it's all speculation, and we'll find out soon enough.
    Last edited by grimeleven; 10-31-2009 at 10:51 AM.
    CM HAF932
    Antec TP3 650Watts
    eVGA X58 3xSLI
    Intel Core i7 920@3.5Ghz w/ TRUE X 120
    6x1GB Aeneon Xtune DDR3-1866
    Visiontek HD4870X2 2GB w/ AC Xtreme cooler
    OCZ 120GB SSD
    2TB Hitachi
    Creative X-FI Titanium FATAL1TY
    Saitek Eclipse 3 Keyboard
    Seven Ultimate 64bit
    Samsung 32inch LCD 1080p

  15. #65
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by annihilat0r View Post
    I don't care if it's TWIMTBP or not. I want performance.
    You will have it, but you have to wait for the Catalyst team to get their hands on the game... like with the TWIMTBP games.

  16. #66
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by AbelJemka View Post
    You will have it, but you have to wait for the Catalyst team to get their hands on the game... like with the TWIMTBP games.
    Since I'm sure that ATI has to buy a copy at release.

  17. #67
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Sure they have.

  18. #68
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    Quote Originally Posted by BababooeyHTJ View Post
    Since I'm sure that ATI has to buy a copy at release.
    LOL, good call
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

