Page 25 of 34 FirstFirst ... 1522232425262728 ... LastLast
Results 601 to 625 of 828

Thread: AMD Radeon HD6950/6970(Cayman) Reviews

  1. #601
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Shadov View Post
Depends on the approach, I guess, and as far as I remember AMD said in an interview that they will never de-optimize games for their competitor.

    That's why proprietary standards like PhysX are bad: you can have normal physics in a game, or Nvidia can add a bit of green fairy dust (also known as money) to a title in production, and suddenly normal features like fog are implemented in a way that will only work on NV cards.

    You can call me biased, but I prefer to have standard features available to everyone.
Someone clearly forgot Call of Juarez, where AMD had the developer block MSAA support entirely, so the benchmark had to be run AMD's way or not at all, as NVidia had not yet optimized that method of AA. This artificially made the 2900XT look competitive with the 8800GTX (although two driver releases later that was nowhere near the case). To make it worse, the copy they sent NVidia was from BEFORE this change, and they then sent reviewers the benchmark with all the changes, literally not even giving NVidia a chance to acknowledge that anything had changed.

    Basically, AMD have done the same; they just can't afford to do it often, or with big releases, like NVidia can. They'll lie to you and say they won't do it, but they already have... both companies have... NVidia can just afford to do it more often, while AMD has to pick maybe one game every few years.

    As for the people saying they shouldn't include those games... you want them to alienate everyone who does play those titles? How would you feel if you bought a new $379 video card and found out it absolutely sucks in comparison to its competitors in two games you play, because the reviewers felt that it would be more fair to AMD? I know I would be outright furious if it's a game I really enjoy playing, wouldn't you?

    Fact is, AMD apparently have a weakness in their current architecture revolving around some methods of tessellation. I think it has to do with more than just drivers, or AMD would've done something about it already in the case of LP2. Also, Dead Rising 2 uses the same engine that LP2 does, so tack another title onto that list, along with any other Capcom titles that show up this round. AMD have a weakness in terms of standard DX11 features; NVidia apparently doesn't.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  2. #602
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by DilTech View Post
    AMD have a weakness in terms of standard DX11 features, NVidia apparently doesn't.
Depends on how you look at it. AMD chips have a sweet spot for tessellation performance. Using nearly non-existent or very heavy tessellation can cause lower performance. At least according to the slides they are releasing...
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  3. #603
    Xtreme Addict
    Join Date
    Jun 2005
    Location
    Rochester, NY
    Posts
    2,276
    Quote Originally Posted by zalbard View Post
    Weird, we seem to be playing very diff games, then...
It's not too weird; the card was high end two years ago.
    How much have computer graphics evolved since then? Not very much...
    For first-person shooters, graphics have improved only a little thanks to those damn consoles; I have no issue playing COD4, Bad Company 2, or World at War.
    Starcraft II runs fine, as do Dragon Age 2, NFS Hot Pursuit, Fallout: New Vegas, and Stalker.

    Some games that have given me some issues:
    Just Cause 2 I run nearly maxed out; Metro 2033 I run nearly maxed out.
    Mafia II, I can't quite remember the settings, but I think near maxed out.
    Crysis, duh...

    Those are the only games that come to mind that I have played in the last year or two. I do have the card OCed a bit, and an i7 definitely helps.
    Also, as said previously, at the resolution I use, high levels of AA are not needed.

    What card do you have and what games have given you big problems?
    Quote Originally Posted by NKrader View Post
    just start taking pics of peoples kids the parents will come talk to you shortly. if you have a big creepy van it works faster

  4. #604
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by zalbard View Post
Depends on how you look at it. AMD chips have a sweet spot for tessellation performance. Using nearly non-existent or very heavy tessellation can cause lower performance. At least according to the slides they are releasing...
Considering we've seen a few titles using this "nearly non-existent and very heavy tessellation", it looks like it may not be as non-existent as AMD make it out to be. I think we all know that, as time moves on, tessellation usage in games will only increase. Fact is, AMD still have a weakness on the tessellation side of things, one that NVidia clearly can handle. I find it ironic that AMD were the first company to support this feature, yet they happen to be weaker in this area. What AMD need to be worried about, as do all owners of the new AMD cards, is whether Crysis uses tessellation as heavily as HAWX2 does, which current rumors make it appear it might. If it does, then people are going to be extremely upset with their purchases.
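    [Editor's note: to put "very heavy tessellation" in perspective, here's a rough back-of-the-envelope model, illustrative only and not vendor data: a triangular patch tessellated at integer factor N splits into roughly N² triangles, so geometry work grows quadratically with the factor.]

    ```python
    # Rough, illustrative model of tessellation cost (not vendor data):
    # a patch at integer tessellation factor N yields roughly N^2 triangles,
    # so raising the factor from 8 to 64 multiplies the geometry work by 64x.
    def triangles_per_patch(factor: int) -> int:
        """Approximate triangle count for one patch at a given tess factor."""
        return factor * factor

    for f in (1, 8, 16, 64):
        print(f"factor {f:>2}: ~{triangles_per_patch(f)} triangles per patch")
    ```

    This is why a card tuned for moderate factors can fall off a cliff when a title cranks the factor up on everything in the scene.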

Of course, I'm never one to gamble on the "what-ifs" of the future when it comes to hardware; I look at the here-and-now scenario. AMD's big advantage recently had been being a half node ahead of NVidia at each step on the high end, which allowed them to keep prices lower in comparison. Now they've lost that advantage and will have to work around that, because when 28nm rolls around it's pretty clear both companies will be prepared to make the leap. It'll be interesting to see how AMD deal with this.

    Next year will definitely be an interesting year for hardware: NVidia with a new architecture, and AMD should seemingly have a few changes to theirs as well.
    Last edited by DilTech; 12-21-2010 at 08:27 PM.

  5. #605
    Xtreme Enthusiast
    Join Date
    Feb 2009
    Posts
    800
    Quote Originally Posted by DilTech View Post
    Considering we've seen a few titles using this "nearly non-existent and very heavy tessellation",
Name a few other than HAWX? The flat landscape was tessellated as well, by the way. AMD said they're going to fix it through drivers anyway, as complaints fell on deaf ears.

    Metro 2033: we're seeing AMD rocking it.
    Call of Pripyat: amazing performance there.
    AVP: there's not much tessellation there anyway.

    http://www.anandtech.com/show/4061/a...eon-hd-6950/15
    http://www.anandtech.com/show/4061/a...eon-hd-6950/19

  6. #606
    Xtreme Addict
    Join Date
    May 2009
    Location
    Switzerland
    Posts
    1,972
    Quote Originally Posted by Oztopher View Post
I get some graphical glitching in Metro, mostly when outside. It's something to do with the shadow/lighting system; it's all over the place. When I move the camera around, the shadows move too, consistently.

    Apart from that it looks and plays great on very high settings. I don't bother playing in DX11 (I know it's there), but I see no real visual difference and the FPS is quite a bit lower compared to DX10. That's with just tessellation enabled; with advanced DoF it halves my framerate.

    Object motion blur on the very high preset is an absolute must.
Set DX11 and just uncheck DOF + PhysX (ofc)... Tessellation is not a problem in this game; it's just DOF that kills the fps.

    For the shadow problem, sadly I have no idea; if it's a driver thing it will be solved soon.
    CPU: - I7 4930K (EK Supremacy )
    GPU: - 2x AMD HD7970 flashed GHZ bios ( EK Acetal Nickel Waterblock H2o)
    Motherboard: Asus x79 Deluxe
    RAM: G-skill Ares C9 2133mhz 16GB
    Main Storage: Samsung 840EVO 500GB / 2x Crucial RealSSD C300 Raid0

  7. #607
    Xtreme Addict
    Join Date
    Sep 2010
    Location
    Australia / Europe
    Posts
    1,310

  8. #608
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    960
    Quote Originally Posted by DilTech View Post
Someone clearly forgot Call of Juarez, where AMD had the developer block MSAA support entirely, so the benchmark had to be run AMD's way or not at all, as NVidia had not yet optimized that method of AA. This artificially made the 2900XT look competitive with the 8800GTX (although two driver releases later that was nowhere near the case). To make it worse, the copy they sent NVidia was from BEFORE this change, and they then sent reviewers the benchmark with all the changes, literally not even giving NVidia a chance to acknowledge that anything had changed.
AMD was not in charge at that time; don't blame them.

    Quote Originally Posted by DilTech View Post
How would you feel if you bought a new $379 video card and found out it absolutely sucks in comparison to its competitors in two games you play, because the reviewers felt that it would be more fair to AMD? I know I would be outright furious if it's a game I really enjoy playing, wouldn't you?
It would only be your fault for not searching for reviews that benched your favorite games.

    Quote Originally Posted by DilTech View Post
What AMD need to be worried about, as do all owners of the new AMD cards, is whether Crysis uses tessellation as heavily as HAWX2 does, which current rumors make it appear it might. If it does, then people are going to be extremely upset with their purchases.

    Of course, I'm never one to gamble on the "what-ifs" of the future when it comes to hardware; I look at the here-and-now scenario.
    Ok, just stay with your "what-ifs" on the software side
    Last edited by Piotrsama; 12-22-2010 at 01:15 AM.

  9. #609
    Xtreme Member
    Join Date
    Oct 2010
    Location
    192.168.1.1
    Posts
    221
    Quote Originally Posted by blindbox View Post

Metro 2033: we're seeing AMD rocking it.
    Call of Pripyat: amazing performance there.
    AVP: there's not much tessellation there anyway.
Metro 2033 and Call of Pripyat have no noticeable tessellation. Yeah yeah, in CoP some of the shirt pockets on some guards are no longer just textures, they are polygons, but that's about it. In the case of Metro 2033, I think it has more to do with the DOF that destroys the FPS.

    Out of the three, I'd say AVP has the most tessellation; I played a demo of it, and it had noticeable tessellation in the Alien models.

  10. #610
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by blindbox View Post
Name a few other than HAWX? The flat landscape was tessellated as well, by the way. AMD said they're going to fix it through drivers anyway, as complaints fell on deaf ears.

    Metro 2033: we're seeing AMD rocking it.
    Call of Pripyat: amazing performance there.
    AVP: there's not much tessellation there anyway.

    http://www.anandtech.com/show/4061/a...eon-hd-6950/15
    http://www.anandtech.com/show/4061/a...eon-hd-6950/19
Metro 2033 doesn't use tessellation very heavily. It's DOF that it really puts to use in DX11 mode, running it through DirectCompute. Take a look at the game with tessellation on and off and tell me how big a difference you see.

    Also, correct me if I'm wrong, but doesn't the method used in LP2 also have issues on AMD cards? This is an engine that all Capcom games will be running on. I'm pretty sure Dead Rising 2 runs on this engine as well, but shockingly enough I've never seen performance reviewed on that title. That's three games already, a little over a year since DX11 cards were released.

    I always get worried when I hear the driver statement... I'm not 100% convinced this can all be fixed via drivers; if it could, LP2 performance would be up as well, I would think. Only time will tell, but AMD have known about these results for a while now.

    Quote Originally Posted by Piotrsama View Post
AMD was not in charge at that time; don't blame them.
    ...

AMD bought ATi in 2006; October 2006 was when the deal was finalized. The final official Call of Juarez DX10 benchmark came out in June of 2007, and the change to the benchmark took place just a few days before its release. This was over six months after AMD took control.

    Yes, they still had the ATi name, but it was owned by AMD, and as such actions had to go through AMD to get the final OK.

    I'm not blaming them either. Maybe it's just me, but I don't get mad at companies for using the money I spend to make games work better on the hardware I bought; I look at that as a return on my investment. All I'm saying is, AMD claiming they'd never do such things makes me laugh... hard... as they've done it before.

It would only be your fault for not searching for reviews that benched your favorite games.
    If reviewers did what people seem to be asking for and stopped benching the games that NVidia have a clear lead in, then wouldn't that mean you couldn't search for reviews benching those games?

    Ok, just stay with your "what-ifs" on the software side
    I just stated I don't play that route. I look at the now and the near future. Hardware-wise, we know what we see: AMD competes well until extremely heavy tessellation comes into play, and apparently with some methods of using tessellation. That's not a what-if; that's what we know is fact.

    The only thing we don't know is which route the developers will take: will they continue to go easy on tessellation like some have, or will they start playing rough with it like others have?

    Realistically, my eyes are on Crytek and what they will do, considering everything else runs well enough even on mid-range cards. CryEngine 3 is likely to be licensed out a lot thanks to its ability to run on consoles (and the SDK for the engine works with all 3 at the same time), and as such its performance is something we'll be seeing a fair bit of for sure.

  11. #611
    Xtreme Enthusiast
    Join Date
    Oct 2008
    Posts
    678
    Quote Originally Posted by DilTech View Post
AMD bought ATi in 2006; October 2006 was when the deal was finalized. The final official Call of Juarez DX10 benchmark came out in June of 2007, and the change to the benchmark took place just a few days before its release. This was over six months after AMD took control.

    Yes, they still had the ATi name, but it was owned by AMD, and as such actions had to go through AMD to get the final OK.
I don't think AMD had to approve everything ATi and game developers did together, especially since AMD could hardly have integrated ATi that much at that time.
    It takes years, and I'm pretty sure it was ATi staff that managed contact with game developers. If AMD had stomped in and switched personnel or tried to micromanage ATi at that point, the entire operation of ATi would have stalled for a long time. At that point ATi was owned by AMD, but hardly integrated.

  12. #612
    L-l-look at you, hacker.
    Join Date
    Jun 2007
    Location
    Perth, Western Australia
    Posts
    4,644
    Oh god, the fanboyism in this thread. It's painful.

    It's a gorram graphics card. Both nVidia and AMD want your money, and will sometimes play dirty to get it. Both are guilty; come on folks, this isn't some kind of good-vs-evil thing going on here
    Rig specs
    CPU: i7 5960X Mobo: Asus X99 Deluxe RAM: 4x4GB G.Skill DDR4-2400 CAS-15 VGA: 2x eVGA GTX680 Superclock PSU: Corsair AX1200

    Foundational Falsehoods of Creationism



  13. #613
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    I don't mind games like HAWX2, FC2 or LP2 (all shoddy console ports that have gotten bad reviews) being added to the reviews. I don't think they're there because people play them (at least on PC), but because they are the cream of the crop when it comes to graphics. HAWX is picked for tessellation and LP2 for its (Nvidia) DX11 features. FC2 is just pretty and taxing in general, or at least it used to be. These are some of the most taxing applications for the graphics card, so it's reasonable that these are the games that are being tested. Of course they are Nvidia sponsored and favor it heavily, but it doesn't change the fact that if you like these games, you should have Nvidia.

What's important to realize when looking at aggregate average fps results from sites like TPU and HWC is that games like these (which are 50% faster on Nvidia hardware) have a noticeable impact on the result, so Nvidia cards look faster on average. And it's true, they are, as long as you play the games that were benchmarked. The lesson here is that you should not blindly look at the final average results, but look at the games you play and decide based on the results in those games.
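    [Editor's note: a quick sketch of that skew, with made-up numbers rather than real benchmark results: one outlier title where card A is 50% faster can flip the aggregate average even though card B wins every other game.]

    ```python
    # Illustrative numbers only (not real benchmarks): how a single outlier
    # title can skew an aggregate "average fps" comparison of two cards.
    fps_a = {"Game 1": 60, "Game 2": 55, "Game 3": 58, "Outlier": 120}  # card A
    fps_b = {"Game 1": 62, "Game 2": 57, "Game 3": 60, "Outlier": 80}   # card B

    avg_a = sum(fps_a.values()) / len(fps_a)
    avg_b = sum(fps_b.values()) / len(fps_b)
    print(f"card A average: {avg_a:.2f} fps")  # 73.25 — looks faster overall
    print(f"card B average: {avg_b:.2f} fps")  # 64.75

    # The per-game view tells the opposite story: card B wins 3 of 4 titles.
    for game in fps_a:
        winner = "A" if fps_a[game] > fps_b[game] else "B"
        print(f"{game}: card {winner} is faster")
    ```

    Which is exactly the point above: check the games you actually play, not just the aggregate bar at the end of a review.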
    "No, you'll warrant no villain's exposition from me."

  14. #614
    Xtreme Member
    Join Date
    Nov 2008
    Location
    London
    Posts
    300
    Quote Originally Posted by Lanek View Post
Set DX11 and just uncheck DOF + PhysX (ofc)... Tessellation is not a problem in this game; it's just DOF that kills the fps.

    For the shadow problem, sadly I have no idea; if it's a driver thing it will be solved soon.
Yep, that's what I meant I did: left tessellation on but disabled DOF. PhysX was of course off as well.

    But still, at the main menu for example, with DX11 + tessellation only, I get about 45 fps. If I disable tessellation, it goes up to 66 fps. If I set it to DX10, it goes up to about 68 fps. This gives me room to enable 16x AF as well, which IMO yields the best experience all round (on my graphics card at least).

    They should stop putting tessellation on stupid things like bowls and teddy bears and start doing what AvP and the like have done: make it a necessity to have.
    -
    Core i7 860 @ 3.80GHz, 1.28v | GA-P55A-UD4 | G.Skill Ripjaw 4GB DDR3 @ 1900MHz 7-9-8-24 1N, 1.57v | HIS HD 6950 2GB, 1536sp @ 900/1400, 1.10v | Samsung F3 500GB | Thermaltake 750W | Windows 7 64bit | Air

    Crunching away...

  15. #615
    Xtreme Enthusiast
    Join Date
    Jun 2006
    Location
    Space
    Posts
    769
    Quote Originally Posted by Pantsu View Post
    I don't mind games like HAWX2, FC2 or LP2 (all shoddy console ports that have gotten bad reviews) being added to the reviews. I don't think they're there because people play them (at least on PC), but because they are the cream of the crop when it comes to graphics. HAWX is picked for tessellation and LP2 for its (Nvidia) DX11 features. FC2 is just pretty and taxing in general, or at least it used to be. These are some of the most taxing applications for the graphics card, so it's reasonable that these are the games that are being tested. Of course they are Nvidia sponsored and favor it heavily, but it doesn't change the fact that if you like these games, you should have Nvidia.

What's important to realize when looking at aggregate average fps results from sites like TPU and HWC is that games like these (which are 50% faster on Nvidia hardware) have a noticeable impact on the result, so Nvidia cards look faster on average. And it's true, they are, as long as you play the games that were benchmarked. The lesson here is that you should not blindly look at the final average results, but look at the games you play and decide based on the results in those games.
The problem, as I see it, is that we have GPU-sponsored games in the reviews. This is akin to having McDonald's present at an educational eating-awareness review.

    Surely there are games that could be used that haven't been tainted by AMD or NVIDIA. Surely?

  16. #616
    Xtreme Member
    Join Date
    Jul 2010
    Posts
    409
    Most of the AAA games are sponsored by either company so it's hopeless to find an unbiased game. And it's important that AMD and Nvidia do help the developers to get all the performance out of their cards. Of course it's another matter entirely to pay money to include proprietary features and say the developer can't accept anything from the competitor in turn. Whether either company has really done this I don't know for sure. It sure looks like that for games like LP2 or HAWX2. At least they haven't spent a moment trying to get AMD hardware to work anything like Nvidia cards.
    "No, you'll warrant no villain's exposition from me."

  17. #617
    Xtreme Member
    Join Date
    Nov 2005
    Location
    Cape Town - South Africa
    Posts
    261
    Quote Originally Posted by Pantsu View Post
    Most of the AAA games are sponsored by either company so it's hopeless to find an unbiased game. And it's important that AMD and Nvidia do help the developers to get all the performance out of their cards. Of course it's another matter entirely to pay money to include proprietary features and say the developer can't accept anything from the competitor in turn. Whether either company has really done this I don't know for sure. It sure looks like that for games like LP2 or HAWX2. At least they haven't spent a moment trying to get AMD hardware to work anything like Nvidia cards.
And maybe ATI hardware just can't do those things properly. I love how we keep blaming the company that makes hardware that works great and defending the company that manufactures hardware that can't cope. I'm not a game programmer, but I find it hard to believe that "only" the programmers and the opposition are to blame for poor performance.

  18. #618
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Motiv View Post
The problem, as I see it, is that we have GPU-sponsored games in the reviews.
    So what? Sponsored games equate to a clear representation of the current market situation with triple-A titles.

    Plus, if AMD or NVIDIA see fit to sink tons of money into a title in order to move PC gaming technology forward, why should editors purposely avoid using such products?

  19. #619
    Xtreme Addict
    Join Date
    Nov 2007
    Posts
    1,195
lol, yeah, Far Cry 2 uses some crazy unearthly effects, that's why Nvidia cards are nearly two times faster... All these titles mentioned above run like crap on the older generation of AMD cards too, and every game that has problems comes from either Ubisoft or Capcom, which get paid heavily by Nvidia.
    Quote Originally Posted by LesGrossman View Post
    So for the last 3 months Nvidia talked about Uniengine and then Uniengine and more Uniengine and finally Uniengine. And then takes the best 5 seconds from all the benchmark run, makes a graph and then proudly shows it everywhere.

  20. #620
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,554
    Quote Originally Posted by eric66 View Post
lol, yeah, Far Cry 2 uses some crazy unearthly effects, that's why Nvidia cards are nearly two times faster... All these titles mentioned above run like crap on the older generation of AMD cards too, and every game that has problems comes from either Ubisoft or Capcom, which get paid heavily by Nvidia.
    Far Cry 2 ran better on ATI cards than it did on equivalent Nvidia cards for the longest time, IIRC.

    EDIT: Look at the HC 4890 review.
    Last edited by BababooeyHTJ; 12-22-2010 at 01:04 PM.

  21. #621
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by eric66 View Post
lol, yeah, Far Cry 2 uses some crazy unearthly effects, that's why Nvidia cards are nearly two times faster... All these titles mentioned above run like crap on the older generation of AMD cards too, and every game that has problems comes from either Ubisoft or Capcom, which get paid heavily by Nvidia.
    That's simply because AMD's driver team doesn't implement their optimizations until a later date... if at all. The same goes for NVIDIA, as evidenced by their utter lack of 3DMark 11 SLI support.

    So again: why hamstring the impartiality of an article because of slow driver development?
    Last edited by SKYMTL; 12-22-2010 at 01:00 PM.

  22. #622
    Registered User
    Join Date
    Mar 2010
    Posts
    10
    Quote Originally Posted by SKYMTL View Post
So what? Sponsored games equate to a clear representation of the current market situation with triple-A titles.

    Plus, if AMD or NVIDIA see fit to sink tons of money into a title in order to move PC gaming technology forward, why should editors purposely avoid using such products?
That's more a side effect than actual intent: tying performance and features to your hardware whilst avoiding your weak points, in order to one-up the competition and hopefully sell more cards. These same companies will tell you that something their hardware is lacking or not so strong at is either unnecessary or wasteful, depending on the situation. There is no noble intent here.

  23. #623
    Xtreme Enthusiast
    Join Date
    Dec 2002
    Posts
    758
    So what's the TL;DR version of this thread?

  24. #624
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by -Boris- View Post
I don't think AMD had to approve everything ATi and game developers did together, especially since AMD could hardly have integrated ATi that much at that time.
    It takes years, and I'm pretty sure it was ATi staff that managed contact with game developers. If AMD had stomped in and switched personnel or tried to micromanage ATi at that point, the entire operation of ATi would have stalled for a long time. At that point ATi was owned by AMD, but hardly integrated.
AMD had already done quite a bit of shuffling of the staff by then. Considering the team-up between ATi and the developers of CoJ took place after AMD bought ATi, I'm sure they had more to do with it than you would think.

    Quote Originally Posted by Motiv View Post
The problem, as I see it, is that we have GPU-sponsored games in the reviews. This is akin to having McDonald's present at an educational eating-awareness review.

    Surely there are games that could be used that haven't been tainted by AMD or NVIDIA. Surely?
Maybe there are games that neither vendor has touched, but frankly speaking, a lot of the titles they have touched ARE popular games. A journalist should always look at what is popular among their readers when writing their (hopefully impartial) article. I mean, who here cares about the performance of games they will never play?

    Quote Originally Posted by Oztopher View Post
Yep, that's what I meant I did: left tessellation on but disabled DOF. PhysX was of course off as well.

    But still, at the main menu for example, with DX11 + tessellation only, I get about 45 fps. If I disable tessellation, it goes up to 66 fps. If I set it to DX10, it goes up to about 68 fps. This gives me room to enable 16x AF as well, which IMO yields the best experience all round (on my graphics card at least).

    They should stop putting tessellation on stupid things like bowls and teddy bears and start doing what AvP and the like have done: make it a necessity to have.
Only putting tessellation on a few objects will make those that lack it stand out and look flat in comparison. We have the horsepower now to tessellate a majority of what's on screen, so why not take advantage of it? The only reason I can see not to do so is because AMD aren't as strong at it as NVidia, but that is literally holding back graphics due to one side's weakness. That is something that should never be done; I don't care which side has the weakness, and I'll buy whichever card lacks it.

    Quote Originally Posted by SKYMTL View Post
That's simply because AMD's driver team doesn't implement their optimizations until a later date... if at all. The same goes for NVIDIA, as evidenced by their utter lack of 3DMark 11 SLI support.

    So again: why hamstring the impartiality of an article because of slow driver development?
This x100. Fact is, just because one side performs obviously worse than the other in a game, that's no reason to leave the title out. That would've been like not benching Half-Life 2 or the original Far Cry back when it was still the 9800XT vs the 5950 Ultra.

    The more you post, the more I seem to agree with you. I guess even though we're in different fields of writing, journalists tend to think a lot alike.

    Quote Originally Posted by Bowtie View Post
That's more a side effect than actual intent: tying performance and features to your hardware whilst avoiding your weak points, in order to one-up the competition and hopefully sell more cards. These same companies will tell you that something their hardware is lacking or not so strong at is either unnecessary or wasteful, depending on the situation. There is no noble intent here.
    Whatever the reasoning behind it, if it means we get better-looking titles, then it's a win for the buyer of said products. If you think back, there were several games that definitely ended up looking better due to the money graphics companies spent on them; the original Far Cry and its DX9c patch would be a solid example.

    Quote Originally Posted by keiths View Post
    So what's the TL;DR version of this thread?
The 6970 is not quite as fast as people were hoping it would be. It's still competitive with the GTX 570 in a lot of titles, except a few (like HAWX2 and Lost Planet 2), but it's also more expensive. The 2GB of RAM does help out at ultra-high resolutions, though. The 6950 is the sweet spot if you feel you need a Cayman for whatever reason, and overclocked it'll match the 6970 for ~$70 less.

    That about sums it up in a nutshell.

  25. #625
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Location
    London
    Posts
    577
    Quote Originally Posted by DilTech View Post
The 6970 is not quite as fast as people were hoping it would be. It's still competitive with the GTX 570 in a lot of titles, except a few (like HAWX2 and Lost Planet 2), but it's also more expensive. The 2GB of RAM does help out at ultra-high resolutions, though. The 6950 is the sweet spot if you feel you need a Cayman for whatever reason, and overclocked it'll match the 6970 for ~$70 less.
The 6970 is about equal to, actually even slightly faster than, the 570 at 1920 and above. I feel drivers will benefit the 6970 more than the 570 too; it certainly depends on the games, though. They just made their first major architecture change in ~3 years (VLIW4), hence the brilliant performance in some games and the relatively poor showing in others, IMO. Give 'em time.

    Still, both are great cards, and the 570 is cheaper too.
    i7 920@4.34 | Rampage II GENE | 6GB OCZ Reaper 1866 | 8800GT (zzz) | Corsair AX750 | Xonar Essence ST w/ 3x LME49720 | HiFiMAN EF2 Amplifier | Shure SRH840 | EK Supreme HF | Thermochill PA 120.3 | MCP355 | XSPC Reservoir | 3/8" ID Tubing

    Phenom 9950BE @ 3400/2000 (CPU/NB) | Gigabyte MA790GP-DS4H | HD4850 | 4GB Corsair DHX @850 | Corsair TX650W | T.R.U.E Push-Pull

    E2160 @3.06 | ASUS P5K-Pro | BFG 8800GT | 4GB G.Skill @ 1040 | 600W Tt PP

    A64 3000+ @2.87 | DFI-NF4 | 7800 GTX | Patriot 1GB DDR @610 | 550W FSP

