Ubisoft: no plans to re-implement DX10.1 in Assassin's Creed



Jakko
05-30-2008, 06:49 AM
http://www.pcgameshardware.de/aid,645430/News/Ubisoft_No_Direct_3D_101_support_for_Assassins_Creed_planned/

PCGH: D3D 10.1 support in Assassin's Creed was a hidden feature. Why do you choose not to announce this groundbreaking technology?

Ubisoft: The support for DX10.1 was minimal. When investigating the DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with DX10.1 API.


PCGH: What features from Direct 3D 10.1 do you use in the release version? Why do they make Assassin's Creed faster? And why does FSAA work better on D3D 10.1 hardware?

Ubisoft: The re-usage of the depth buffer makes the game faster. However, the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.
This optimization pass is only visible when selecting anti-aliasing. Otherwise, both DX10 and DX10.1 use the same rendering pipeline.


PCGH: Why do you plan to remove the D3D 10.1 support?

Ubisoft: Unfortunately, our original implementation on DX10.1 cards was buggy and we had to remove it.


PCGH: Are there plans to implement D3D 10.1 again?

Ubisoft: There is currently no plan to re-implement support for DX10.1.
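(Side note on what "reusing an existing buffer" means in practice: with anti-aliasing on, the depth buffer is multisampled, and D3D10.0 cannot bind a multisampled depth buffer as a texture for later passes, so the depth data has to be produced again or copied; D3D10.1 can read it directly. Below is a rough C++/D3D10.1 sketch of the resource setup involved, just my own illustration of the idea, not Ubisoft's code; it assumes an already-created ID3D10Device1 and omits all error handling.)

// Illustration only: why the depth-buffer reuse needs D3D10.1 once MSAA is on.
#include <d3d10_1.h>

void CreateMsaaDepthReadableAsTexture(ID3D10Device1* device, UINT width, UINT height)
{
    // Typeless format so the same memory can be viewed both as a depth target and as a texture.
    D3D10_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    desc.SampleDesc.Count = 4;                // 4x MSAA, i.e. the "AA on" case
    desc.Usage            = D3D10_USAGE_DEFAULT;
    desc.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    ID3D10Texture2D* depthTex = NULL;
    device->CreateTexture2D(&desc, NULL, &depthTex);

    // Depth-stencil view used while rendering the scene.
    D3D10_DEPTH_STENCIL_VIEW_DESC dsv = {};
    dsv.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dsv.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    ID3D10DepthStencilView* depthDSV = NULL;
    device->CreateDepthStencilView(depthTex, &dsv, &depthDSV);

    // Shader-resource view so later passes (soft particles, etc.) can read the same depth.
    // Creating a multisampled depth resource that is also bindable as a shader resource
    // is the D3D10.1 part; on a 10.0 device the depth would have to be rendered again or
    // resolved/copied first, which is the extra work Ubisoft says they optimized away.
    D3D10_SHADER_RESOURCE_VIEW_DESC srv = {};
    srv.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    srv.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    ID3D10ShaderResourceView* depthSRV = NULL;
    device->CreateShaderResourceView(depthTex, &srv, &depthSRV);

    // Views and texture would be released by the caller; this only shows the setup.
}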


So just to sum up, this is what Ubisoft says:

1. DX 10.1 makes the game faster
2. Our implementation was sucky
3. Performance gains seen on ati hardware are inaccurate
4. We didn't fix it, we removed it
5. We will not bring it back

You would expect a game developer to be proud of making the first title that supports the latest API, but I guess it is understandable that Ubisoft is not going to spend time and resources on fixing features in their TWIMTBP title that only ATI hardware will use.

:down:

munim
05-30-2008, 07:02 AM
Wow, they didn't even give a lame excuse. They're just pulling a pokerface and sticking it to ATI.

ToTTenTranz
05-30-2008, 07:05 AM
I always feel a bit sad when a purely economical move stops technological advance.





I won't buy assassin's creed.

kryptobs2000
05-30-2008, 07:07 AM
Does anyone know how exactly it was 'broken' or 'buggy'? That could be a lame excuse if it's untrue, which I'm inclined to say it is, since they didn't specify what exactly was broken about it. With the unpatched version it runs faster and looks the same. What's buggy about that?

Who cares if it's 'implemented poorly.' Fact is it's already implemented and the game is better with than without it (for ati owners).

fcry64
05-30-2008, 07:07 AM
They said that a render pass was not used, so they removed it... could it not be fixed? Was it too time-consuming for them? After all, it could allow the DX10.1 usage and more to mess with.

Loque
05-30-2008, 07:07 AM
he should just say: "The way it's meant to be played!"

Luka_Aveiro
05-30-2008, 07:11 AM
DX10.1 Battle Ground:

nVidia 1 - ATI 0

:D

[/flame mode]

IMO, they should have fixed it completely, I would like to see what REALLY Dx10.1 brings better than DX10.

DilTech
05-30-2008, 07:17 AM
They said that a render pass was not used, so they removed it... could it not be fixed? Was it too time-consuming for them? After all, it could allow the DX10.1 usage and more to mess with.

First, it would require new shaders to be written for the particles at every level of AA (that's why in DX10.1 on the 3870 the dust and other particle effects don't show up), which would take quite a bit of time in the first place. Then they'd have to fix the lights bleeding through certain walls (think the infamous torches-bleeding-through-the-floor screenshot).

There's two biggies and I haven't even touched it in DX10.1 mode.


Does anyone know how exactly it was 'broken' or 'buggy?' That could be lame excuse if it's untrue, which I'm inclined to say it is since they didn't specify what exactly was broken about it. With the unpatched version it runs faster, looks the same. What's buggy about that?

Who cares if it's 'implemented poorly.' Fact is it's already implemented and the game is better with than without it (for ati owners).

See above. There are more than a few things wrong with DX10.1 mode on it.


he should just say: "The way it's meant to be played!"

No, he should just say "ATi hasn't supplied us with any hardware to test our game code in a DX10.1 environment. As such, we don't have the resources to implement the patch to re-enable a working DX10.1 extension to our game-code. They also refuse to work with us on this matter."

That would be a realistic statement compared to "The way it's meant to be played!". Just ask anyone who's ever worked for a development house. :yepp:

kromosto
05-30-2008, 07:20 AM
DX10.1 Battle Ground:

nVidia 1 - ATI 0 - Ubisoft "-1"

Luka_Aveiro
05-30-2008, 07:23 AM
DX10.1 Battle Ground:

nVidia 1 - ATI 0 - Ubisoft "-1"




Thanks for the correction :D

jas420221
05-30-2008, 07:33 AM
1. "DX 10.1 makes the game faster"
b/c it's broken... true.

2. "Our implementation was sucky"
See #1.

3. "Performance gains seen on ati hardware are inaccurate"
It's broken. People cried when Nvidia's drivers had a bug in Crysis that messed up some rendering and gave them a couple fps increase as well. Would you rather see it (pun intended) the way it's meant to be played (particles/dust/flames in the right spot), or have a couple more fps? If you prefer the latter, turn down the in-game details...

4. "We didn't fix it, we removed it"
5. "We will not bring it back"
It's a shame... See DilTech's post.

Jakko
05-30-2008, 07:40 AM
Jas, according to Ubisoft, DX 10.1 increases performance whether or not the implementation gets screwed up.
At least that is how I understand the interview.

Which makes it especially weird that Ubisoft later on says they know for sure the performance improvements seen on ATI hardware are due to bugs.

So 10.1 does improve performance, except on ati hardware, in which case performance improvements are only caused by a :banana::banana::banana::banana:ty implementation?
Hmmmk

jas420221
05-30-2008, 08:04 AM
Jas, according to Ubisoft, DX 10.1 increases performance whether or not the implementation gets screwed up.
At least that is how I understand the interview.

Which makes it especially weird that Ubisoft later on says they know for sure the performance improvements seen on ATI hardware are due to bugs.

I didn't see that anywhere. Now, that doesn't mean it doesn't exist. But please feel free to quote Ubi where they said that and the inverse (already done with this thread and the excerpt you posted), and you will have a point. Thanks for refreshing my failing (old) mind.

EDIT: After thinking a bit, even if it does improve performance on its own, wouldn't adding in the rendering of the dust and particles etc. just slow it down again? I mean, who knows how much it would slow it down, but more stuff on the screen generally equals lower fps.

Jakko
05-30-2008, 08:10 AM
When they talk about the reason for using 10.1 they say:

When investigating the DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with DX10.1 API. The re-usage of the depth buffer makes the game faster.

And then they say:


the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.
How do they know what performance gain was a result of the bugs and what was a result of using DX 10.1?
Are they talking :banana::banana::banana::banana:?
The clue comes in the next comment:


This optimization pass is only visible when selecting anti-aliasing. Otherwise, both DX10 and DX10.1 use the same rendering pipeline.

But benchmarks reveal that the performance gain is only there when AA is used. In other words, when using DX 10.1 and not using AA there is no performance gain, even though all the DX 10.1 bugs are still there.

This means that Ubisoft is talking out of their ass. The performance gain seen on ATI hardware is not at all a result of the buggy implementation (otherwise a gain could be seen even when not using AA), it is a result of DX 10.1.
What could be the reason for falsely condemning benchmarks that favor ATI?

jas420221
05-30-2008, 08:12 AM
OK... thanks! :)

But aren't those two separate things? It's not like you can't have one without the other... right?

STaRGaZeR
05-30-2008, 08:14 AM
:yawn2:

This is boring. It was already clear that they would not reimplement DX10.1.

DX10.1 IS faster than DX10, but only when AA is on. Some of you, please check what DX10.1 is and what it does.


b/c it's broken... true.

Yes and no. See above. In this concrete game you end up with: faster because it is faster + faster because it is broken. Two different things. If you fix the implementation it'll be slower than it is now, but still faster than DX10 (only with AA on, again).

I don't think this is soooo difficult to understand... people will not accept it until they see some proper DX10.1 implementations.

jas420221
05-30-2008, 08:18 AM
Yes, like I said in my edit in post #13....right?

It's faster on its own (10.1 w/AA), but you add in the 'broken' part (dust and particle effects) and it would slow it back down... how much, no idea.

Jakko
05-30-2008, 08:19 AM
OK... thanks! :)

But aren't those two separate things? It's not like you can't have one without the other... right?

Possible yes, but the benchmarks show that the bugs don't add any performance, but enabling AA does. (DX 10.1 works)
So why would Ubi say the boost for ati hardware is an inaccurate result?

cadaveca
05-30-2008, 08:21 AM
Because it really was. At least it was for me, using multiple cards. Since DX10.1 removal, the game plays far smoother than before, and has far less rendering errors. Boosts made by improper rendering do not equal better performance, unless maybe benching!

saaya
05-30-2008, 08:22 AM
so why do they lock this "bug" if people actually want it?
doesnt make any sense...

jas420221
05-30-2008, 08:26 AM
Because it really was. At least it was for me, using multiple cards. Since DX10.1 removal, the game plays far smoother than before, and has far less rendering errors. Boosts made by improper rendering do not equal better performance, unless maybe benching!

That was exactly my point with the Nvidia/Crysis bug reference and its improvements that people were :banana::banana::banana::banana::banana:ing about (for a good reason).

I understand on its own its does have fps improvements, but how much really when you add in everything its supposed to be rendering? :shrug:


so why do they lock this "bug" if people actually want it?
doesnt make any sense...

My guess is b/c it's missing items that were supposed to be on screen, and others that were NOT supposed to be on screen were showing. It's borked......... like they keep saying over and over, which some just refuse to believe for one reason or another.

STaRGaZeR
05-30-2008, 08:28 AM
Yes, like I said in my edit in post #13....right?

It's faster on its own (10.1 w/AA), but you add in the 'broken' part (dust and particle effects) and it would slow it back down... how much, no idea.

You edited yours while I was writing mine :p:

Exactly. Remove the broken part and voilà: DX10.1 is something good, and for all. Well, for all DX10.1 cards :eek: :rolleyes:

DilTech
05-30-2008, 09:10 AM
Possible yes, but the benchmarks show that the bugs don't add any performance, but enabling AA does. (DX 10.1 works)
So why would Ubi say the boost for ati hardware is an inaccurate result?

You haven't been paying attention, have you?

The only way DX10.1 is in use IS when AA is on! Says it in the article clear as day. As such, the bug only occurs when AA is on, and not having to render the particle effects WILL lead to higher performance. That's why the boost is an inaccurate result, because while there is a performance boost for using DX10.1, a majority of the performance boost is due to incorrect rendering, plain and simple. I've also already explained why it isn't being put back in.

Either way, with this first DX10.1 title having to take dx10.1 out of it, it's probably going to be even longer now before there's another one, especially due to the extra amount of code required to use DX10.1.

STaRGaZeR
05-30-2008, 09:20 AM
because while there is a performance boost for using DX10.1, a majority of the performance boost is due to incorrect rendering, plain and simple.

C'mon man, plain and simple is also the fact that you don't know a :banana::banana::banana::banana: (only Ubi knows it) about what part of the performance boost is due to the bugs and what is due to DX10.1 :rolleyes:

Jakko
05-30-2008, 02:47 PM
You haven't been paying attention, have you?

There's no need to be all hysterical about this now, especially since you are wrong.


The only way DX10.1 is in use IS when AA is on! Says it in the article clear as day. As such, the bug only occurs when AA is on, and not having to render the particle effects WILL lead to higher performance. That's why the boost is an inaccurate result, because while there is a performance boost for using DX10.1, a majority of the performance boost is due to incorrect rendering, plain and simple. I've also already explained why it isn't being put back in.

You are partly right, the dustbug only appears on a SP1 system when AA is enabled. The different lighting though, appears on SP1 systems that do not use AA, and also on systems that use DX 9.
I know what Ubisoft says, but since they are full of crap (as you will soon find out), using them as the only source for this kind of info is silly.

Since you didn't read the follow-up article on Rage3D, I will quote some stuff for ya:

http://rage3d.com/articles/assassinscreed-addendum/index.php?p=3


Did we find the glitches that everyone has been talking about when making reference to the 10.1 path? In a way - we do have that missing dust that qualifies for that category, albeit we don't know yet if it's simply a driver bug or something wrong with the pathway itself. Other than that, there's the more intense lighting, but that actually seems to show that there's a bug with the DX10 pathway, given the fact that DX9 has the same, more intense, lighting as 10.1, and UBi said that 9 and 10 (and by extension 10.1) should be nearly identical visually (shadowing is different between versions).


The good news is that the dust-bug affects only a few scenarios and that, after testing with a run through Acre that avoids any areas which include the troublesome dust, we've determined that the performance benefits associated with 10.1 remain: it wasn't a case of the missing effect causing the performance improvements.

So yes, Ubi is full of crap, Ati's DX 10.1 boost is not due to the bugs, and Ubi calling the dx 10.1 benchmarks "inaccurate" is shady to say the least.

One other thing, in the article I linked to, a known 3d coder makes a guess as to what's causing the dustbug.

"Interesting. I checked out the dust and it looks very much like it’s soft particles. This means they require the depth buffer for this effect. That it doesn’t work with AA makes sense too. If they simply bound the depth buffer as a texture they would need to use the Load() function rather than Sample() to fetch the depth value when it’s multisampled (any single sample in the AA depth buffer should be good enough for soft shadows, no need to average). They would also need a separate shader for each AA setting, like one for 2x, 4x and 8x. I would guess it’s broken because this wasn’t done and they are trying to sample the MSAA buffer with Sample() using the same shader as in the NoAA / plain DX10 case. Given that the effect is quite subtle it could easily have been overlooked. So fixing this (and anything else that might be broken) is probably the reason why they said they needed to redo the DX10.1 path."

If this guy is right, Ubi made a relatively simple mistake, and could fix this easily. Yet they choose not to.
Oh and I added the "another jakko thread!" tag to this thread, to save you guys some trouble. :)

zanzabar
05-30-2008, 02:50 PM
C'mon man, plain and simple is also the fact that you don't know a :banana::banana::banana::banana: (only Ubi knows it) about what part of the performance boost is due to the bugs and what is due to DX10.1 :rolleyes:

The best part is that after patch 1.2 I reinstalled to see the difference, and the Animus has stuff in front of it, and the game crashes all day plus lags when it goes to the pre-assassination videos.

And there are artifacts in the flags (not the stuff that should be there, but black squares with no white numerics).


ATI has funding now though, so we should see more ATI-unlocked games (high shader optimization + 10.1; it shouldn't lock down NV since it wouldn't work that way, and ATI has class, and it should help VIA/S3). But where is the EU probe on this? Come on, AMD files grievances over Intel all day; it's time for NV to get what's coming.

v_rr
05-30-2008, 03:17 PM

Nice post, but you know some people have astrological powers: before all this was known, they trashed DX10.1, predicted giant things about it, and invented all sorts of excuses about Assassin's Creed.

But now that all this has been explored, they keep pushing the same button over and over again and spreading false information in every thread about it (look at his signature: from one post he made five and put them in his signature). It's pushing the same wrong button over and over again, and everyone should put this in their signature just to copy his previous act.

Astrological powers are good (not), but when the truth is already known and some people keep pushing the button, it's just boring and, worse, it spreads wrong information around the forum. There are "n" reviews stating that Ubisoft is not telling the truth and zero reviews supporting Ubisoft, but in the forums the fanboys appear and what happens next is the usual....

Jakko
05-31-2008, 04:00 AM
Well in all honesty, it could have been possible that the performance improvements are due to the bugs.
But since it's been pointed out in the rage3d article that the bugs do not cause the performance increases, I don't see why anyone, especially Ubisoft, would suggest such things.

Luka_Aveiro
05-31-2008, 04:08 AM
Well in all honesty, it could have been possible that the performance improvements are due to the bugs.
But since it's been pointed out in the rage3d article that the bugs do not cause the performance increases, I don't see why anyone, especially Ubisoft, would suggest such things.

Because they are the developers and they know what they are talking about?

Because nVidia paid them to do so?

What do you think?

Macadamia
05-31-2008, 04:41 AM
Because they are the developers and they know what they are talking about?

Because nVidia payed them to do so?

What do you think?

You think official statements are the truth? :rofl:

C'mon, EVERY PR statement contains spin.

And Dil, really. You haven't even used a DX10 Radeon before, and you're being snobbish about the majority of stuff you said.

I'll patch AC to 1.2 on my friend's 2900PRO later (I might buy it from him since my XT is RMAd) and see if it's oh- MOAR STABEL or something.

jimmyz
05-31-2008, 05:02 AM
No, he should just say "ATi hasn't supplied us with any hardware to test our game code in a DX10.1 environment. As such, we don't have the resources to implement the patch to re-enable a working DX10.1 extension to our game-code. They also refuse to work with us on this matter."

That would be a realistic statement compared to "The way it's meant to be played!". Just ask anyone who's ever worked for a development house. :yepp:

They have ATI cards to test it on. Keep the fanboi crap out of it. You are just grasping at straws. Just face the fact that your beloved card maker doesn't have a DX10.1 card and apparently won't this year, as their new card doesn't have 10.1 support. It is time they catch up.

BenchZowner
05-31-2008, 05:18 AM
Guys I'm watching you fighting over this thing like somebody said his penis is bigger than yours.
After all, what do you really care about ? The game itself or the "excitement" of fighting over a stupid issue ?
How many of you here have played the game ? How many of you finished the game ? How many of you liked the game/gameplay ?

Just get over it and move on.

Extelleron
05-31-2008, 05:27 AM
Guys I'm watching you fighting over this thing like somebody said his penis is bigger than yours.
After all, what do you really care about ? The game itself or the "excitement" of fighting over a stupid issue ?
How many of you here have played the game ? How many of you finished the game ? How many of you liked the game/gameplay ?

Just get over it and move on.

Whether or not the game has DX10.1 support isn't the issue, although it would be nice.

The issue is that technology is being held back by nVidia, just because they don't have a DX10.1 card they seem to be pushing for DX10.1 to not be used. Before you say it is insignificant, there is an easy 15-20% performance boost w/ 4xAA going from DX10 to DX10.1 on AMD cards in Assassin's Creed. Going from DX9 -> DX10 doesn't even improve performance by that amount, so you could argue DX10.1 is a bigger deal than DX10 in regards to performance.

I currently own an nVidia card (8800GTS 512MB) but stuff like this makes me want to buy a 4870 instead of a GTX 260.

BenchZowner
05-31-2008, 05:35 AM
Whether or not the game has DX10.1 support isn't the issue, although it would be nice.

The issue is that technology is being held back by nVidia, just because they don't have a DX10.1 card they seem to be pushing for DX10.1 to not be used.

Flashback... nVIDIA SM3.0 ATi SM2.0b

P.S. THEY SEEM differs significantly from THEY ADMITTEDLY & SURELY DO.


Before you say it is insignificant, there is an easy 15-20% performance boost w/ 4xAA going from DX10 to DX10.1 on AMD cards in Assassin's Creed.

In my testing there was no 15 to 20% performance boost.
You're exaggerating.


I currently own an nVidia card (8800GTS 512MB) but stuff like this makes me want to buy a 4870 instead of a GTX 260.

With or without DX10.1 the 4870 will be slower than the GTX.

p.s. How many pure DX10.1 games you think we'll see before "DX11" ? ;)
p.s. Once again, how many of you liked the game ?
p.s. If DX10.1 is such a big deal, why doesn't AMD do something about it? Let's say that the very, very dark evil nVIDIA monster is preventing them with the TWIMTBP deal for Assassin's Creed; why don't they use some money and make some agreements with other software houses through their own GITG program?

STaRGaZeR
05-31-2008, 05:35 AM
How many of you here have played the game ? How many of you finished the game ? How many of you liked the game/gameplay ?

Me, me (twice) and me. ;)

Jakko
05-31-2008, 05:46 AM
Whether or not the game has DX10.1 support isn't the issue, although it would be nice.

The issue is that technology is being held back by nVidia, just because they don't have a DX10.1 card they seem to be pushing for DX10.1 to not be used. Before you say it is insignificant, there is an easy 15-20% performance boost w/ 4xAA going from DX10 to DX10.1 on AMD cards in Assassin's Creed. Going from DX9 -> DX10 doesn't even improve performance by that amount, so you could argue DX10.1 is a bigger deal than DX10 in regards to performance.

I currently own an nVidia card (8800GTS 512MB) but stuff like this makes me want to buy a 4870 instead of a GTX 260.

QFT
Although it has to be said, technology is not just being held back by Nvidia, but also by Ubisoft, as they are the ones unwilling to fix the 10.1 pathways rather than remove them, for whatever reason.

And Bench, your questions can easily be answered: money.
AMD does not have the money to pay many game developers at this point. I understand they helped on Mass Effect, which is great news I think. Looks like a much better game than AC.

Oh and I don't think it matters whether or not someone plays or likes AC. You can hate that game and still have the opinion that Ubisoft's anti-innovative decisions suck.

Extelleron
05-31-2008, 05:51 AM
Whether or not the game has DX10.1 support isn't the issue, although it would be nice.



Flashback... nVIDIA SM3.0 ATi SM2.0b

P.S. THEY SEEM differs significantly from THEY ADMITTEDLY & SURELY DO.



In my testing there was no 15 to 20% performance boost.
You're exaggerating.



With or without DX10.1 the 4870 will be slower than the GTX.

p.s. How many pure DX10.1 games you think we'll see before "DX11" ? ;)
p.s. Once again, how many of you liked the game ?
p.s. If DX10.1 is such a big deal, why doesn't AMD do something about it? Let's say that the very, very dark evil nVIDIA monster is preventing them with the TWIMTBP deal for Assassin's Creed; why don't they use some money and make some agreements with other software houses through their own GITG program?

It would have been nice to have SM3 on ATI cards in 2004, but I don't think it was necessary. ATI supported SM3, they just didn't have a card out w/ SM3 support until 2005. Considering I can't remember a clear example of a game that ever used SM3 and it had a noticeable effect (except Far Cry) until the ~2006-2007 timeframe (when games began to require it) I don't think ATI was wrong in waiting.

The difference here is nVidia seems to never be interested in supporting DX10.1, which is a bit different. If this was the same as the ATI SM3 issue, nVidia would be supporting DX10.1 a year later than ATI, in 2008 with G200. But that's not the case.

IMO from what I see, DX10.1 shouldn't take so much coding over DX10 (compared to the benefits at least). So I believe if the API had nVidia's support, we would see a large number of games supporting DX10.1 and DX10.0 being forgotten.

Yes there is a 15-20% boost, in fact the difference is 25% comparing Minimum framerates:
http://www.rage3d.com/articles/assassinscreed/index.php?p=3

And before you say it is because of SP1 not DX10.1, look on the next page and you see the HD 2900XT sees no benefit w/ 4xAA moving from Vista -> SP1.

I haven't played the game on PC, but I've played it a bit on 360 and it seemed pretty cool. I wouldn't spend $40-50 on it, but I'll pick it up when I can get it for $20 or so.

And suggesting AMD actually market something is a lost cause. :p: ATI has never had the marketing nVidia had. Why do you think nVidia did well even during the GeForce FX days? We see ATI on their knees right now because R600 wasn't competitive, but when nVidia runs into problems, they can just rely on marketing.

BenchZowner
05-31-2008, 06:20 AM

It would have been nice to have SM3 on ATI cards in 2004, but I don't think it was necessary. ATI supported SM3, they just didn't have a card out w/ SM3 support until 2005.

They didn't support or applaud it, actually they were letting slides out with "SM3 hate" if my memory serves me right ( this time I'm quite sure it does ).


Considering I can't remember a clear example of a game that ever used SM3 and it had a noticeable effect (except Far Cry) until the ~2006-2007 timeframe (when games began to require it) I don't think ATI was wrong in waiting.

Once again, 1 game back then as you say ( FarCry ), 1 game now ( Assasins Creed ).


The difference here is nVidia seems to never be interested in supporting DX10.1, which is a bit different. If this was the same as the ATI SM3 issue, nVidia would be supporting DX10.1 a year later than ATI, in 2008 with G200. But that's not the case.

And why do you think nVIDIA is trying to prevent others from using DX10.1, when they could just rework their architecture with little to no changes (hardware-wise DX10.1 isn't that much different) and make the performance gap between their products and the competitor's products even bigger? (Since nVIDIA is already in the lead, even when comparing their card with DX10 against the AMD card with DX10.1.)

By the way, how many DX10.1 games have you heard about coming ?
How many DX10 games ? ( even those are few, very few )


Yes there is a 15-20% boost, in fact the difference is 25% comparing Minimum framerates:
http://www.rage3d.com/articles/assassinscreed/index.php?p=3

I do not trust rage3d at all.
And also, if you've been following this thing from the beginning you'd already know that I've taken several measurements in DX10 & DX10.1 comparing nV & AMD cards ;)


And suggesting AMD actually market something is a lost cause. :p: ATI has never had the marketing nVidia had. Why do you think nVidia did well even during the GeForce FX days? We see ATI on their knees right now because R600 wasn't competitive, but when nVidia runs into problems, they can just rely on marketing.

That's their problem. If they can't market their products and targets right, too bad.

p.s. @ Jakko about money... well... if AMD's employees worked really hard, and finally put out a decent product, then they'd have more money and such.

With the G80 nVIDIA made a big step forward and really swept ATi (now AMD) off the floor.
The next card was also a disappointment and also failed to compete even with the 'pre'-high end G80 part.
It took AMD a full year+ to come up with something that beats the initial G80 lineup and some of their refreshes.

Like in a car race, you have to keep up...otherwise you'll be left behind, and when you're behind you need to work hard and make your way to the "opponent".

v_rr
05-31-2008, 06:31 AM
BenchZowner strikes back :rolleyes:
After so much discussion in the other thread, again the same story? Read the reviews all over the web.

You don't trust rage3d. Take [H]:
http://enthusiast.hardocp.com/article.html?art=MTQ5MywxLCxoZW50aHVzaWFzdA==

And there are plenty of websites saying the same.
Why should we trust your biased comments when we have lots of info on the web about AC DX10.1? It takes just 10 minutes to see that your comments are always Nvidia-biased. Even for the 260/280 GTX and HD 4800, cards that are not even on the market, you spread FUD in the threads saying that Nvidia is much better and all sorts of things.

Leave this DX10.1 thread and stop recycling arguments already proven to be false.

perkam
05-31-2008, 06:36 AM
We've already had an entire discussion on this... why restart the flame war?

OT: George Foreman Grill

http://www.saltoninc.com/images/products/foreman/gforeman.jpg
''The Way Its Meant To Be Cooked'' :lol:

Perkam

Extelleron
05-31-2008, 06:41 AM
That's their problem. If they can't market their products and targets right, too bad.

p.s. @ Jakko about money... well... if AMD's employees worked really hard, and finally put out a decent product, then they'd have more money and such.

With the G80 nVIDIA made a big step forward and really swept ATi (now AMD) off the floor.
The next card was also a disappointment and also failed to compete even with the 'pre'-high end G80 part.
It took AMD a full year+ to come up with something that beats the initial G80 lineup and some of their refreshes.

Like in a car race, you have to keep up...otherwise you'll be left behind, and when you're behind you need to work hard and make your way to the "opponent".

Now you are showing how much of a fanboy you are with these statements.

You talk as if ATI/AMD never made a "decent product," when that is the farthest thing from the truth.

ATI had a solid performance lead over nVidia (barring a few months time) from 2002 - 2006. The Radeon 9000 series blew the GeForce FX away, and the X800/X850 was able to outperform the GeForce 6 as well. X1800XT beat 7800GTX, X1900XTX beat the 7900GTX. In fact if you look at the performance of the X1900/7900 in 2007/2008 games, you see that even cards like the X1950 Pro are faster than the 7900GTX.

With GeForce 8, nVidia finally hit it right and AMD messed up with R600, giving us the situation we have now. It hasn't even been two years that nVidia has been in the performance lead, but you seem to be forgetting the 4+ years that ATI had the advantage. Assuming HD 4870 X2 brings back performance parity, which I think it will, nVidia's current stretch of domination will be far shorter than ATI's past one.

Why is nVidia not supporting DX10.1? Well why they originally decided not to, I have no idea, but it's not as if they can suddenly change their minds and in late 2007 decide to implement DX10.1 in G200. nVidia likely made that kind of decision back when G80 was shipping, and perhaps they underestimated the level of performance DX10.1 could bring to the table. Now they must cover themselves with marketing.

And why should I trust you over rage3d? And as v_rr posted, HardOCP has found the same results (in fact they showed an HD 3870 got a 34% boost from 2xAA, so even better than the HD 3870 X2 rage3d results).

Morgoth Bauglir
05-31-2008, 06:42 AM
I do not trust rage3d at all.
And also, if you've been following this thing from the beginning you'd already know that I've taken several measurements in DX10 & DX10.1 comparing nV & AMD cards ;)


LOL. I guess you don't trust the other sites that verified it, eh? Like [H] or PCGH? Bit-tech? Or Ubi themselves admitting that there is a performance benefit? All of those pale in comparison with your awesome investigation that took 2 weeks to produce 2 floppy shots, weeks throughout which you couldn't figure out how to enable AF on the Radeons / report that there's a bug with app-controlled AF (which was fixed in later drivers, BTW), and couldn't even see that AA quality was actually superior even in those wimpy shots.

Clearly, we are all biased idiots out to misguide the good people, and you, an obviously impartial and, most importantly, educated party, have really proved all of us wrong. In an epic way. With 2 wimpy screenshots and a lot of imagination. Good job!

STaRGaZeR
05-31-2008, 06:54 AM

QFT. Right BZ? ;)

BenchZowner
05-31-2008, 07:37 AM
Now you are showing how much of a fanboy you are with these statements.

When you don't like the truth, reject it.
Well done.


You talk as if ATI/AMD never made a "decent product," when that is the farthest thing from the truth.

You missed my post once again, specifically the part that says:

ATi was better back then with the X1950XTX both in image quality & performance.
They were in the lead again with the 9700Pro over the Ti 4600... they were also ahead in the GeForce Suck FX era with the 9800Pro/XT.

What you fail to see is that nVIDIA at the present time IS FASTER & BETTER.
Get that PRESENT TIME.


ATI had a solid performance lead over nVidia (barring a few months time) from 2002 - 2006. The Radeon 9000 series blew the GeForce FX away, and the X800/X850 was able to outperform the GeForce 6 as well.

Ok, and S3 was beating Tseng Labs back in the past, should I buy S3 when there's a faster & better product available ?


With GeForce 8, nVidia finally hit it right and AMD messed up with R600, giving us the situation we have now. It hasn't even been two years that nVidia has been in the performance lead, but you seem to be forgetting the 4+ years that ATI had the advantage. Assuming HD 4870 X2 brings back performance parity, which I think it will, nVidia's current stretch of domination will be far shorter than ATI's past one.

Now that IS fanboyism or even worse, hate towards a company ( just like you should love them... they all want the same f*ing thing, your money )

P.S. You seem to be sure about AMD taking the lead now... do you have any valid info or just daydreaming about it ?


And why should I trust you over rage3d? And as v_rr posted, HardOCP has found the same results (in fact they showed an HD 3870 got a 34% boost from 2xAA, so even better than the HD 3870 X2 rage3d results).

If you've taken a look at the pictures, on the HD card the AA wasn't applied everywhere.
And still you're making ASSUMPTIONS based on conspiracy theories about nVIDIA forcing nUbiSoft to remove DX10.1

All that, when there are several images showing various bugs & not rendered items & effects, and even objects rendered incorrectly.


LOL. I guess you don't trust the other sites that verified it, eh? Like [H] or PCGH? Bit-tech? Or Ubi themselves admitting that there is a performance benefit? All of those pale in comparison with your awesome investigation that took 2 weeks to produce 2 floppy shots, weeks throughout which you couldn't figure out how to enable AF on the Radeons / report that there's a bug with app-controlled AF (which was fixed in later drivers, BTW), and couldn't even see that AA quality was actually superior even in those wimpy shots.

First of all, you are not my employer, nor my boss, nor my time-management advisor, so drop the crybaby "it took you 2 weeks to do this and that".
Second, I intentionally left AF off for both cards.
Third, I had very limited time and did some quick tests; I wasn't, and never will be, bothered to check this sucky game again.


Clearly, we are all biased idiots out to misguide the good people, and you, an obviously impartial and, most importantly, educated party, have really proved all of us wrong. In an epic way. With 2 wimpy screenshots and a lot of imagination. Good job!

I refuse to reply to your ironic rant.


...
BASTA.

jas420221
05-31-2008, 09:41 AM
Just face the fact that your beloved card maker doesn't have a DX10.1 card and apparently won't this year, as their new card doesn't have 10.1 support. It is time they catch up.

Can you remind me what they need to catch up in... I'm having trouble trying to come up with a 'used right now' or 'will be needed' or even 'wanted' feature by the end of the year.

Does anyone remember tessellation on R600 cards? That highly touted feature hasn't panned out too well either, eh (correct me if I am wrong on that one)? Are there ANY games out that use it... I'm curious. Do these new cards still have it? :confused:

Anyway, as an enthusiast, I'm not too concerned now with what *may* be. M$, Crytek, and Nvidia all came out a few months ago and more or less said DX10.1 is useless now, and will be for at least a year. That should be the end of 2008. What are the other DX10.1 games coming out this year? If it's a handful or less and not any major titles, is this really a good feature to bring forward?

Luka_Aveiro
05-31-2008, 09:42 AM
I would love it if someone could prove to me that nVidia actually paid Ubisoft to remove DX10.1 from AC.

And I would love it if Ubisoft had implemented the DX10.1 path correctly, so we could all see the real deal about it.

Until that happens, all you can do is go by the official statements; the rest is pure speculation and fantasy, as NOTHING can be proved, considering the DX10.1 path in AC is REALLY broken, which screenshots from DX10.1 on HD cards prove.

Until something official comes out, you can fantasize anything you want, unless you can really prove what you are saying :rolleyes:

jas420221
05-31-2008, 09:49 AM
QFT. Great post.

Jakko
05-31-2008, 11:49 AM
Well Luka, in the several articles containing benchmarks and tests, several things are proven. One of them is that DirectX 10.1 really did bring a performance gain that wasn't caused by the bugs.

And comparing DirectX 10.1 to tessellation is ridiculous: one is a feature and the other is an API; one has no support whatsoever and the other already has support in a title even though 10.0 is hardly supported; and last but not least, tessellation is an ATI initiative while DirectX 10.1 is an industry initiative.

That's right, ATI didn't create DirectX 10.1, Microsoft did.
The time when you could call Nvidia's lack of DirectX 10.1 support sensible is over, as this whole AC thing proved quite convincingly that DirectX 10.1 is an important and welcome update to DirectX 10.

I am amazed at how long it takes nvidia to implement it. Are they stubborn? Is their architecture not ready for it?
I dunno, but I sure know it's stupid of them.

v_rr
05-31-2008, 11:53 AM

Read the reviews and then come back here saying something with some logic.
And the image quality is just fine in the reviews. Read [H] and others.

jas420221
05-31-2008, 12:30 PM
Did I compare DX10.1 and tessellation together?? I don't think I posted anything like that. Tessellation was an example of (another) wasted implementation, like DX10.1. I'm not comparing them in the manner which you posted. But I appreciate the completely unnecessary lesson in any case. :up:

Tell me why it's stupid for Nvidia not to have a feature that isn't and won't be used for MONTHS to come (in which time 2 sets of cards will have been released, with the latter possibly having 10.1 when it's useful)?

...I'm not worried that Nvidia doesn't have DX10.1. It's completely useless at this point in time. When there are more titles that support it, I'm certain they will come up with cards supporting that 'upgrade'. I said it before at R3D a few months ago: by the time the feature set/API is implemented by the devs, there will be other WAY better cards out there anyway. It's like equipping a car with navigation but saying you won't be able to use it at all until they get satellites up in over a year. What's the point? (OK, a car is a bad example b/c you are likely to keep it a lot longer than an enthusiast keeps a PC/vidcard... I hope you get my point though. :p:)

I still have some unanswered questions...:


Are there ANY games out that use it (tessellation)... I'm curious. Do these new cards still have it? :confused:

What are the other DX10.1 games coming out this year?

I guess there are 2 extreme sides to every story... I just don't agree with all the conspiracy BS going around. You guys need to lay off t3h w33d and stop being so paranoid, I think. ;)

Jakko
05-31-2008, 12:39 PM
Tell me why it's stupid for Nvidia not to have a feature that isn't and won't be used for MONTHS to come (in which time 2 sets of cards will have been released, with the latter possibly having 10.1 when it's useful)?



Well that answer is simple.
If I buy a card now, I want it to support DX 10.1, as I am going to use it for at least 6 months, in which many games could show up that benefit from 10.1 in the way AC did.

Of course the key word is could; no one knows what games will support 10.1, and it might take ages.
It is however, not unreasonable to expect games coded for 10.0 to also support 10.1 at some point.

20% better performance for a relatively small fix. I would do it if I were a developer.

jas420221
05-31-2008, 12:47 PM
Your answer is what I expected...

but... what games are coming out using either 10.1 or tessellation by EOY 08? How many will in fact use it by 09? Certainly WAY more than 08. But "way more" is awfully subjective, considering anything is way more than *possibly* a couple.

20% in this ONE title. Here's hoping that is linear across the board... though I see significant differences in that. :clap:

Well Jakko, again, I guess we should just agree to disagree. Fact remains, though, that the tessellation feature and 10.1 are both incredibly useless now and for MONTHS to come. Here's hoping they get it out soon, for.......ATI card owners' sake anyway. :yepp:

Luka_Aveiro
05-31-2008, 12:48 PM
Funny, I think I am not going to need DX10 at all, because I am still a WinXP user and I am very pleased with the DX9 performance of my 8800GT, and of course I haven't seen a plausible reason to move to Vista and DX10...

I found it really funny, while reading all these things about AC DX10.1 vs. DX10, to read somewhere that the AC devs were trying to make the DX10 version(s) as good as DX9. :p:

Shintai
05-31-2008, 12:57 PM
DX10.1 was FLAWED in this title. And the gains shown on a 2900XT also invalidate any numbers from rage3d.

So in short, hold your breath. Don't waste more time until a real, non-flawed DX10.1 title comes.

And before anyone defends Ubisoft, try checking a title called Pool of Radiance :p:

Oh, and be careful about uninstalling their games :rofl:

v_rr
05-31-2008, 01:13 PM
DX10.1 was FLAWED in this title. And the gains shown on a 2900XT also invalidate any numbers from rage3d.

http://enthusiast.hardocp.com/article.html?art=MTQ5MywxLCxoZW50aHVzaWFzdA==

Dig a hole and put yourself in there with your astrological powers + trolling :p:
There are no gains with the 2900XT. Recycling arguments is just silly.

Shintai
05-31-2008, 01:41 PM
http://enthusiast.hardocp.com/article.html?art=MTQ5MywxLCxoZW50aHVzaWFzdA==

Dig a hole and put yourself in there with your astrological powers + trolling :p:
There are no gains with the 2900XT. Recycling arguments is just silly.

Can you read the Rage3D part again? Or did you just want to flame and defend ATI as usual, at all costs? Knight of Rubi? :rolleyes:

And again you keep holding on to a flawed implementation to somehow justify that DX10.1 is worth it. Unless your motto is: "DX10.1, if you want lower quality" :p:

And since Ubisoft removed it from the game, I think that quite settles it.

Talk about a troll... try a mirror.

Morgoth Bauglir
05-31-2008, 01:43 PM
DX10.1 was FLAWED in this title. And the gains shown on a 2900XT also invalidate any numbers from rage3d.

Are you being dense on purpose, or just out of a calling? I can understand that you have considerable issues understanding how a percentage increase is calculated (now, that is some really advanced math there), but not being able to read the FPS numbers themselves and do a rather trivial difference should worry you a bit.

You're telling all of us that the gain of 1.25 FPS without AA and the loss of 0.07 FPS with 4X AA prove that the 2900XT benefits from SP1 and that some other magical mechanism is at play, thus rendering the numbers invalid? Really? Compared to the behavior the 10.1 cards have? Are you familiar with the term statistical significance?

perkam
05-31-2008, 02:13 PM
I don't think anyone will disagree with the fact that Ubisoft could have given ATI users the choice whether or not to enable DX 10.1, with a disclaimer saying that it is beta and currently under development. Remember, when a game company takes away choices from the user, that is when we the customers lose, especially when they stop development on a patch that ultimately Nvidia users would have benefited from down the road.

However, at the same time, Ubisoft's decision to take it out of Assassin's Creed does not negate their ability to offer DX10.1 support in future games that may support it better. I don't really get the arguments here against Nvidia, since ATI users moaned and whined when Ghost Recon allowed a settings sub-menu only for users with SM3 cards; it was wrong then, and it would've been wrong now.

As a principle, no one gamer should have an inherent graphical advantage if they choose one company's latest hardware over another's, and the inability to play DX10.1 Assassin's Creed upholds that principle. Nevertheless, no proof is necessary when saying that the multi-billion dollar games industry, and the multi-billion dollar high-end GPU industry that rests (primarily) on its ability to play those games, create an environment where game developers feel pressure from many external parties every day, and it would not be too hard to imagine Ubisoft not wanting to allocate extra resources to bring DX10.1 gameplay for a GPU company that has no interest in supporting game development the same way Nvidia does.

Perkam

Jakko
05-31-2008, 02:21 PM
As a principle, no one gamer should have an inherent graphical advantage if they choose one company's latest hardware over another's, and the inability to play DX10.1 Assassin's Creed upholds that principle.

Just because nvidia made the choice of being stubborn and not supporting 10.1, does not mean games shouldn't support 10.1 and run faster on it.
There is nothing wrong with supporting a feature that only one brand of technology offers, it's the way it always works.

Someone begins, a part of the industry follows, and eventually there is either a flop or a new industry standard.

As for Shintai, what can I say, I just hope other people ignore you like I do. You spread so much flawed information that it's starting to become impressive. :down:

perkam
05-31-2008, 02:25 PM
Read the last sentence in the entire post that you quoted. :cool:


and it would not be too hard to imagine Ubisoft not wanting to allocate extra resources to bring DX10.1 gameplay for a GPU company that has no interest in supporting game development the same way Nvidia does.

Perkam

v_rr
05-31-2008, 02:34 PM
And since Ubisoft removed it from the game. I think that quite settles it.

It's removed only if you install the patch. And no one with intelligence is going to install the patch on ATI.

Also, the patch brings lots of problems, if you look at the forums. It creates more problems than it fixes. The rush to remove DX10.1 was so big that they messed up on the really important bugs :rofl:

Luka_Aveiro
05-31-2008, 02:38 PM
Yeah, I also found it funny that the patch created a whole new buggy game.

Do you know if those problems occur when using a savegame from an unpatched version, or do the bugs appear with a clean savegame from the patched version?

SnipingWaste
05-31-2008, 02:48 PM
Why are you fighting over the HD2900XT? The HD2900XT has no DX10.1 support. Only the HD3xxx series has DX10.1 support.

Macadamia
05-31-2008, 05:42 PM
Why are you fighting over the HD2900XT? The HD2900XT has no DX10.1 support. Only the HD3xxx series has DX10.1 support.

Because Shintai doesn't care. He probably just wants to get attention, while everything non-Intel in his posts ranges from quite uninsightful to brain diarrhea.


it would not be to hard to imagine Ubisoft not wanting to allocate extra resources to bring DX10.1 gameplay for a GPU company that has no interest in supporting game development the same way Nvidia does.

Perkam

As far as I heard it, ATi coded the whole 10.1 path themselves. They always worked with Ubi on compatibility and all. Just because they didn't pull Ubi into the marketing/$ farce doesn't mean they don't help.

TWIMTBP, however, is BS upon layers of BS; so many games already had vendor ID checks to enable features like PCF, extra shader code, etc.

What about ATI? Remember TruForm? It was in Unreal Engine 2, but since UT2004 was TWIMTBP, it was quietly hidden away in an .inf. Far Cry's HDR still doesn't work properly with ATI cards, and a LOT of previous EA games are horrible in this respect too.

Krizby87
05-31-2008, 09:12 PM
Whoa, it's probably wise for any Nvidia or non-biased member not to participate in this thread, 'cause this thread is by AMD fans and for AMD fans only; it's one of those ways they pleasure themselves, I guess. Why bother defending Nvidia over a lamea$$ game in which Nvidia still dominates in performance with or without DX10.1?

Jakko
06-01-2008, 02:33 AM
Whoa, it's probably wise for any Nvidia or non-biased member not to participate in this thread, 'cause this thread is by AMD fans and for AMD fans only; it's one of those ways they pleasure themselves, I guess. Why bother defending Nvidia over a lamea$$ game in which Nvidia still dominates in performance with or without DX10.1?

What the hell?
It's posts like this that surprise me; even most Nvidia fans dislike the way Ubisoft did this. DirectX 10.1 is very useful if it does what was seen in AC, and anyone who is not completely dense agrees.

No one has to bother defending Nvidia, because Nvidia is not your mommy.
They made a stupid choice not supporting 10.1 and now Ubisoft made a stupid choice removing it from their game.

Whether or not it's a good game is not relevant in any way.
We already know Nvidia has the performance lead atm; rubbing that in my face in this thread makes you look pathetic.

Linchpin
06-01-2008, 02:56 AM
Seriously, enough is enough. I don't think we need at least 5 threads of people moaning about this, because guess what: moaning is not going to change anything. Jakko, no offence, but do you intend to keep posting a new thread about this every week for the rest of your life or something?

Jakko
06-01-2008, 03:22 AM
Seriously, enough is enough. I don't think we need at least 5 threads of people moaning about this, because guess what: moaning is not going to change anything. Jakko, no offence, but do you intend to keep posting a new thread about this every week for the rest of your life or something?

This thread was made after news came out regarding Ubisoft's decision.
If you are getting tired of this subject, stay out of this thread.
:rolleyes:

halo112358
06-01-2008, 09:31 AM
Congratulations, this issue has crossed the threshold into the realm of conspiracy theory!

Here, have a cookie.

Warboy
06-01-2008, 10:12 AM
Jakko and the other AMD fanboys really need to grow up here.

DX10.1 was permanently removed, Cry me a river, So i can use your tears to watercool my rig.... Having a flame war and Flame baiting isn't the way to go to prove your point. Maybe if you cry enough. Maybe you'll see DX10.1 in AC2. Now a Mod really needs to close this thread. Before more AMD Fanboys that are KNOW IT ALLS attack more high ranking members.

LowRun
06-01-2008, 10:54 AM
Jakko and the other AMD fanboys really need to grow up here.

DX10.1 was permanently removed, Cry me a river, So i can use your tears to watercool my rig.... Having a flame war and Flame baiting isn't the way to go to prove your point. Maybe if you cry enough. Maybe you'll see DX10.1 in AC2. Now a Mod really needs to close this thread. Before more AMD Fanboys that are KNOW IT ALLS attack more high ranking members.

:clap: You've just joined the flame fest :rolleyes:

Jakko
06-01-2008, 11:02 AM
Jakko and the other AMD fanboys really need to grow up here.

DX10.1 was permanently removed, Cry me a river, So i can use your tears to watercool my rig....
Lol!


Having a flame war and Flame baiting isn't the way to go to prove your point.

DX10.1 was permanently removed, Cry me a river, So i can use your tears to watercool my rig....
Lol!


Maybe if you cry enough. Maybe you'll see DX10.1 in AC2. Now a Mod really needs to close this thread. Before more AMD Fanboys that are KNOW IT ALLS attack more high ranking members.

Yes!
Please mods, help the poor "high ranking members" like Warboy! They are constantly being attacked by AMD fanboys and just want to have a "grown up" discussion! :ROTF:

STaRGaZeR
06-01-2008, 11:28 AM
http://img78.imageshack.us/img78/6653/dibujotm9.png

:rofl:

Shintai
06-01-2008, 11:35 AM
The only people that seem to have owned anyone are those owning themselves. It's a bit sad to see certain people hold on so tight to something that has even been removed, simply to constantly try to make something out of nothing.

1. Denial
2. Anger
3. Bargaining
4. Depression
5. Acceptance

And we are on stage 2 it seems.

Jakko
06-01-2008, 12:13 PM
Hahaha
Those tags almost made me fall off my chair laughing.
What a mess!

Maybe it's about time a mod closes this thread, everything has been said I think. Unless someone has something interesting on-topic to add?

DilTech
06-02-2008, 11:40 AM
No difference in IQ, eh? From HardOCP's own review.

http://enthusiast.hardocp.com/image.html?image=MTIwOTMyNzEzNkxiVDRZd2YyaWJfN184X2wucG5n

Take a look at the wood on the bottom left-hand corner. Care to tell me that the NVidia image isn't MUCH better than the ATi one?

Vapor
06-02-2008, 11:42 AM
Dil, this thread was locked almost 24 hours ago :p: