Quote Originally Posted by -Boris- View Post
I don't think AMD had to allow everything ATi and game developers did together. Especially since AMD hardly could have integrated ATi that much at that time.
It takes years, and I'm pretty sure it was ATi staff that managed the contact with game developers. If AMD had stomped in and switched personnel or tried to micromanage ATi at that point, the entire operation of ATi would have stalled for a long time. At that point ATi was owned by AMD, but hardly integrated.
AMD had already done quite a bit of staff shuffling by then. Considering the team-up between ATi and the developers of CoJ took place after AMD bought ATi, I'm sure they had more to do with it than you would think.

Quote Originally Posted by Motiv View Post
The problem, as I see it, is that we have GPU-sponsored games in the reviews. This is akin to having McDonald's present at an educational healthy-eating review.

Surely there are games that could be used, that haven't been tainted by AMD or NVIDIA. Surely?
Maybe there are games that neither vendor has touched, but frankly speaking, a lot of the titles they have ARE popular games. A journalist should always look at what is popular amongst their readers when writing their (hopefully impartial) article. I mean, who here cares about the performance of games they will never play?

Quote Originally Posted by Oztopher View Post
Yep, that's what I meant I did: left tessellation on but disabled DOF. PhysX was of course off as well.

But still, at the main menu for example, with DX11 + tessellation only, I get about 45 fps. If I disable tessellation, it goes up to 66 fps. If I set it to DX10, it goes up to about 68 fps. This gives me room to enable 16x AF as well, which IMO yields the best experience all round (on my graphics card at least).
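For what it's worth, those fps figures are easier to compare as frame times. A quick sketch (the 45/66/68 fps numbers are the poster's own measurements; the helper name is just for illustration):

```python
def frame_time_ms(fps):
    """Convert frames per second to milliseconds per frame."""
    return 1000.0 / fps

dx11_tess = frame_time_ms(45)     # ~22.2 ms/frame, DX11 with tessellation
dx11_no_tess = frame_time_ms(66)  # ~15.2 ms/frame, DX11 without tessellation
dx10 = frame_time_ms(68)          # ~14.7 ms/frame, DX10 path

# Tessellation adds roughly 7 ms per frame here, while the DX11 path
# itself costs under half a millisecond compared with DX10.
print(round(dx11_tess - dx11_no_tess, 1))  # ~7.1
print(round(dx11_no_tess - dx10, 1))       # ~0.4
```

Seen that way, nearly all of the performance gap in this scene comes from tessellation rather than from DX11 itself.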

They should stop putting tessellation on stupid things like bowls and teddy bears and start doing what AvP and the like have done: make it a necessity to have.
Only putting tessellation on a few objects will make those which lack it stand out and look flat in comparison. We have the horsepower now to tessellate a majority of what's on screen, so why not take advantage of it? The only reason I can see not to is that AMD isn't as strong at it as NVIDIA is, but that is literally holding back graphics because of one side's weakness. That should never be done; I don't care which side has the weakness, and I'll buy whichever card lacks it.

Quote Originally Posted by SKYMTL View Post
That's simply because AMD's driver team doesn't implement their optimizations until a later date...if at all. The same thing goes for NVIDIA as evidenced by their utter lack of 3DMark 11 SLI support.

So again: why hamstring the impartiality of an article because of slow driver development?
This x100. Fact is, just because one side performs obviously worse than the other in a game, that's no reason to leave the title out. That would be like not benching Half-Life 2 or the original Far Cry back when it was still the 9800 XT vs the 5950 Ultra.

The more you post, the more I seem to agree with you. Guess even though we're in different fields of writing, journalists tend to think a lot alike.

Quote Originally Posted by Bowtie View Post
That's more a side effect than actual intent. The goal is to tie performance and features to your hardware while avoiding your weak points, in order to one-up the competition and hopefully sell more cards. These same companies will tell you that something their hardware lacks, or isn't so strong at, is either unnecessary or wasteful, depending on the situation. There is no noble intent here.
Whatever the reasoning behind it, if it means we get better-looking titles, then it's a win for the buyer of said products. If you think back, there were several games that definitely ended up looking better thanks to the money graphics companies spent on them. The original Far Cry and its DX9c patch would be a solid example.

Quote Originally Posted by keiths View Post
So what's the TL;DR version of this thread?
The 6970 is not quite as fast as people were hoping it would be. It's still competitive with the GTX 570 in a lot of titles, except a few (like HAWX 2 and Lost Planet 2), but it's also more expensive than the GTX 570. The 2 GB of RAM does help out at ultra-high resolutions, though. The 6950 is the sweet spot if you feel you need a Cayman for whatever reason, and overclocked it'll match the 6970 for ~$70 less.

That about sums it up in a nutshell.