
Thread: Ubisoft: no plans to re-implement DX10.1 in Assassin's Creed

  1. #1
    Banned
    Join Date
    Jul 2007
    Posts
    264

    Ubisoft: no plans to re-implement DX10.1 in Assassin's Creed

    http://www.pcgameshardware.de/aid,64...Creed_planned/
    PCGH: D3D 10.1 support in Assassin's Creed was a hidden feature. Why did you choose not to announce this groundbreaking technology?

    Ubisoft: The support for DX10.1 was minimal. When investigating the DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with DX10.1 API.


    PCGH: What features from Direct 3D 10.1 do you use in the release version? Why do they make Assassin's Creed faster? And why does FSAA work better on D3D 10.1 hardware?

    Ubisoft: The re-usage of the depth buffer makes the game faster. However, the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.
    This optimization pass is only visible when selecting anti-aliasing. Otherwise, both DX10 and DX10.1 use the same rendering pipeline.
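    As a rough illustration of what "reusing an existing buffer" means here (a minimal HLSL sketch, not Ubisoft's actual shaders; names like DepthToColorPS, SceneDepthMS and NUM_SAMPLES are made up): plain Direct3D 10 cannot bind a multisampled depth buffer as a shader resource, so an engine that wants to read depth for a later effect has to spend an extra pass writing it into a colour target, while Direct3D 10.1 lets it read the MSAA depth buffer it already rendered.

    Code:
    // DX10 path with MSAA: the multisampled depth buffer can't be read back,
    // so depth is written out again in a separate pass before any effect
    // (soft particles, etc.) that needs it.
    float DepthToColorPS(float4 pos : SV_Position) : SV_Target
    {
        return pos.z;                               // re-render just to store depth
    }

    // DX10.1 path: the depth buffer that already exists can be read directly,
    // even when multisampled, so the extra pass above goes away.
    #define NUM_SAMPLES 4                           // one compiled variant per AA level
    Texture2DMS<float, NUM_SAMPLES> SceneDepthMS;   // the existing MSAA depth buffer

    float FetchSceneDepth(int2 pixel)
    {
        return SceneDepthMS.Load(pixel, 0);         // a single sample is enough here
    }

    With AA off there is no multisampled depth buffer in the picture, which is why the two pipelines only diverge once anti-aliasing is selected.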


    PCGH: Why do you plan to remove the D3D 10.1 support?

    Ubisoft: Unfortunately, our original implementation on DX10.1 cards was buggy and we had to remove it.


    PCGH: Are there plans to implement D3D 10.1 again?

    Ubisoft: There is currently no plan to re-implement support for DX10.1.
    So just to summarize, this is what Ubisoft says:

    1. DX 10.1 makes the game faster
    2. Our implementation was sucky
    3. Performance gains seen on ati hardware are inaccurate
    4. We didn't fix it, we removed it
    5. We will not bring it back

    You would expect a game developer to be proud of making the first title to support the latest API, but I guess it's understandable that Ubisoft isn't going to spend time and resources fixing features in their TWIMTBP title that only ATI hardware will use.

    Last edited by Jakko; 05-30-2008 at 06:54 AM.

  2. #2
    Xtreme Member
    Join Date
    Sep 2007
    Location
    Montreal, Canada
    Posts
    263
    Wow, they didn't even give a lame excuse. They're just keeping a poker face and sticking it to ATI.

  3. #3
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    I always feel a bit sad when a purely economic move halts technological progress.





    I won't buy assassin's creed.

  4. #4
    Xtreme Addict
    Join Date
    Jun 2004
    Location
    US, Virginia
    Posts
    1,513
    Does anyone know how exactly it was 'broken' or 'buggy'? That could be a lame excuse if it's untrue, which I'm inclined to say it is, since they didn't specify what exactly was broken about it. With the unpatched version it runs faster and looks the same. What's buggy about that?

    Who cares if it's 'implemented poorly'? Fact is, it's already implemented and the game is better with it than without it (for ATI owners).
    E8400 @ 3600mhz
    4870 @ 790/1100
    2x2GB DDR2

  5. #5
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    279
    they said that a render pass was not used so they removed it ...could it not be fixed? was it too time consuming for them? after all it could allow the dx10.1 usage and more to mess w/

  6. #6
    Xtreme Enthusiast
    Join Date
    Sep 2004
    Posts
    650
    he should just say: "The way it's meant to be played!"

  7. #7
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    DX10.1 Battle Ground:

    nVidia 1 - ATI 0



    [/flame mode]

    IMO, they should have fixed it completely; I would like to see what DX10.1 REALLY brings over DX10.

  8. #8
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by fcry64 View Post
    they said that a render pass was not used so they removed it ...could it not be fixed? was it too time consuming for them? after all it could allow the dx10.1 usage and more to mess w/
    First, it would require new shaders to be written for the particles at every level of AA (that's why in DX10.1 on the 3870 the dust and other particle effects don't show up), which would take quite a bit of time in the first place. Then they'd have to stop the lights from bleeding through certain walls (think of the infamous torches-bleeding-through-the-floor screenshot).

    Those are two biggies, and I haven't even touched it in DX10.1 mode.
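    To make the "new shaders at every level of AA" point concrete, here is an illustrative sketch (assumed names, not the game's code): in Shader Model 4.x a multisampled texture declaration bakes the sample count in at compile time, so the depth-reading particle shader needs one compiled variant per AA setting on top of the plain no-AA version.

    Code:
    // Illustrative only; MSAA_SAMPLES, SceneDepth and LoadSceneDepth are assumed names.
    // The engine would build this once for 2x, 4x and 8x, plus the no-AA variant.
    #if MSAA_SAMPLES > 1
        Texture2DMS<float, MSAA_SAMPLES> SceneDepth;     // sample count fixed at compile time
        float LoadSceneDepth(int2 p) { return SceneDepth.Load(p, 0); }
    #else
        Texture2D<float> SceneDepth;                     // existing no-AA shader keeps this
        float LoadSceneDepth(int2 p) { return SceneDepth.Load(int3(p, 0)); }
    #endif

    Multiply that by every particle shader that reads depth, plus re-testing the light bleed, and "quite a bit of time" is easy to believe.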

    Quote Originally Posted by kryptobs2000 View Post
    Does anyone know how exactly it was 'broken' or 'buggy?' That could be lame excuse if it's untrue, which I'm inclined to say it is since they didn't specify what exactly was broken about it. With the unpatched version it runs faster, looks the same. What's buggy about that?

    Who cares if it's 'implemented poorly.' Fact is it's already implemented and the game is better with than without it (for ati owners).
    See above. There are more than a few things wrong with DX10.1 mode on it.

    Quote Originally Posted by Loque View Post
    he should just say: "The way it's meant to be played!"
    No, he should just say "ATi hasn't supplied us with any hardware to test our game code in a DX10.1 environment. As such, we don't have the resources to implement the patch to re-enable a working DX10.1 extension to our game-code. They also refuse to work with us on this matter."

    That would be a realistic statement compared to "The way it's meant to be played!". Just ask anyone who's ever worked for a development house.

  9. #9
    Xtreme Mentor
    Join Date
    Apr 2003
    Location
    Ankara Turkey
    Posts
    2,631
    Quote Originally Posted by Luka_Aveiro View Post
    DX10.1 Battle Ground:

    nVidia 1 - ATI 0 - Ubisoft "-1"

    When I'm being paid I always see my job through.

  10. #10
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by kromosto View Post
    Quote Originally Posted by Luka_Aveiro View Post
    DX10.1 Battle Ground:

    nVidia 1 - ATI 0 - Ubisoft "-1"

    Thanks for the correction

  11. #11
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    1. DX 10.1 makes the game faster
    B/c it's broken... true.
    2. Our implementation was sucky
    See #1
    3. Performance gains seen on ati hardware are inaccurate
    It's broken. People cried when Nvidia's drivers had a bug in Crysis that messed up some rendering and gave them a couple of fps increase as well. Would you rather see it (pun intended) the way it's meant to be played (particles/dust/flames in the right spot), or a couple of fps faster? If you prefer the latter, turn down the in-game details...
    4. We didn't fix it, we removed it
    5. We will not bring it back
    It's a shame... See DilTech's post.

  12. #12
    Banned
    Join Date
    Jul 2007
    Posts
    264
    Jas, according to Ubisoft, DX 10.1 increases performance whether or not the implementation gets screwed up.
    At least that is how I understand the interview.

    Which makes it especially weird that Ubisoft later on says they know for sure the performance improvements seen on ATI hardware are due to bugs.

    So DX10.1 does improve performance, except on ATI hardware, where the performance improvements are only caused by the botched implementation?
    Hmmmk

  13. #13
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Quote Originally Posted by Jakko View Post
    Jas, according to Ubisoft, DX 10.1 increases performance whether or not the implementation gets screwed up.
    At least that is how I understand the interview.


    Which makes it especially weird that Ubisoft later on says they know for sure the performance improvements seen on ATI hardware are due to bugs.
    I didn't see that anywhere. Now, that doesn't mean it doesn't exist. But please feel free to quote Ubi where they said that and the inverse (already done with this thread and the excerpt you posted), and you will have a point. Thanks for refreshing my failing (old) mind.

    EDIT: After thinking a bit, even if it does improve performance on its own, wouldn't adding in the rendering of the dust and particles etc. just slow it down again? I mean, who knows how much it would slow it down, but more stuff on the screen generally equals lower fps.
    Last edited by jas420221; 05-30-2008 at 08:10 AM.

  14. #14
    Banned
    Join Date
    Jul 2007
    Posts
    264
    When they talk about the reason for using 10.1 they say:
    When investigating the DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with DX10.1 API. The re-usage of the depth buffer makes the game faster.
    And then they say:

    the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.
    How do they know what part of the performance gain was the result of the bugs and what part was the result of using DX 10.1?
    Are they just guessing?
    The clue comes in the next comment:

    This optimization pass is only visible when selecting anti-aliasing. Otherwise, both DX10 and DX10.1 use the same rendering pipeline.
    And indeed, benchmarks reveal that the performance gain is only there when AA is used. In other words, when using DX 10.1 without AA there is no performance gain, even though all the DX 10.1 bugs are still there.

    This means that Ubisoft is talking out of their ass. The performance gain seen on ATI hardware is not at all a result of the buggy implementation (otherwise a gain would be seen even when not using AA); it is a result of DX 10.1.
    What could be the reason for falsely condemning benchmarks that favor ATI?
    Last edited by Jakko; 05-30-2008 at 08:15 AM.

  15. #15
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Quote Originally Posted by Jakko View Post
    When they talk about the reason for using 10.1 they say:


    And then they say:


    How do they know what performance gain was result of the bugs and what performance gain was result of using dx 10.1?
    Are they talking ?
    The clue comes in the next comment:



    But benchmarks reveal that the performance gain is only there when AA is used. In other words, when using DX 10.1 and not using AA there is no performance gain even though all the dx 10.1 bugs are still there.

    This means that Ubisoft is talking out of their ass. The performance gain seen on ATI hardware is not at all result of the buggy implementation, it is result of DX 10.1.
    They are biased.
    OK... thanks!

    But aren't those two separate things? It's not like you can't have one without the other... right?

  16. #16
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949


    This is boring. It was already clear that they were not going to reimplement DX10.1.

    DX10.1 IS faster than DX10, but only when AA is on. Some of you, please check what DX10.1 is and what it does.

    Quote Originally Posted by jas420221 View Post
    b/c its broken..true.
    Yes and no. See above. In this particular game you end up with: faster because it is faster + faster because it is broken. Two different things. If you fix the implementation it'll be slower than it is now, but still faster than DX10 (only with AA on, again).

    I don't think this is soooo difficult to understand... people will not accept it until they see some proper DX10.1 implementations.

  17. #17
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Yes, like I said in my edit in post #13... right?

    It's faster on its own (10.1 w/AA), but add in the 'broken' part (dust and particle effects) and it would slow back down... how much, no idea.

  18. #18
    Banned
    Join Date
    Jul 2007
    Posts
    264
    Quote Originally Posted by jas420221 View Post
    OK... thanks!

    But arent those 2 seperate things? Its not like you cant have one with out the other....right?
    Possibly yes, but the benchmarks show that the bugs don't add any performance, while enabling AA does (DX 10.1 works).
    So why would Ubi say the boost for ATI hardware is an inaccurate result?

  19. #19
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Because it really was. At least it was for me, using multiple cards. Since the DX10.1 removal, the game plays far smoother than before and has far fewer rendering errors. Boosts made by improper rendering do not equal better performance, unless maybe when benching!

  20. #20
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    so why did they lock out this "bug" if people actually want it?
    doesn't make any sense...

  21. #21
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Quote Originally Posted by cadaveca View Post
    Because it really was. At least it was for me, using multiple cards. Since DX10.1 removal, the game plays far smoother than before, and has far less rendering errors. Boosts made by improper rendering do not equal better performance, unless maybe benching!
    That was exactly my point with the Nvidia/Crysis bug reference and the improvements people were complaining about (for a good reason).

    I understand that on its own it does have fps improvements, but how much really, once you add in everything it's supposed to be rendering?

    Quote Originally Posted by saaya View Post
    so why do they lock this "bug" if people actually want it?
    doesnt make any sense...
    My guess is b/c it's missing items that were supposed to be on screen, and others that were NOT supposed to be on screen were showing. It's borked... like they keep repeatedly saying, which some just refuse to believe for one reason or another.

  22. #22
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by jas420221 View Post
    Yes, like I said in my edit in post #13....right?

    Its faster on its own (10.1 w/AA), but you add in the 'broken' part (dust and particle effects), and it would slow it back down...how much, no idea.
    You edited yours while I was writing mine

    Exactly. Remove the broken part and voilà: DX10.1 is something good, and for all. Well, for all DX10.1 cards

  23. #23
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Jakko View Post
    Possible yes, but the benchmarks show that the bugs don't add any performance, but enabling AA does. (DX 10.1 works)
    So why would Ubi say the boost for ati hardware is an inaccurate result?
    You haven't been paying attention, have you?

    The only way DX10.1 is in use IS when AA is on! It says so in the article clear as day. As such, the bug only occurs when AA is on, and not having to render the particle effects WILL lead to higher performance. That's why the boost is an inaccurate result: while there is a performance boost for using DX10.1, the majority of the boost is due to incorrect rendering, plain and simple. I've also already explained why it isn't being put back in.

    Either way, with this first DX10.1 title having to take DX10.1 out, it's probably going to be even longer now before there's another one, especially given the extra amount of code required to use DX10.1.

  24. #24
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by DilTech View Post
    because while there is a performance boost for using DX10.1, a majority of the performance boost is due to incorrect rendering, plain and simple.
    C'mon man, plain and simple is also the fact that you don't know a thing (only Ubi knows it) about what part of the performance boost is due to the bugs and what is due to DX10.1.

  25. #25
    Banned
    Join Date
    Jul 2007
    Posts
    264
    Quote Originally Posted by DilTech View Post
    You haven't been paying attention, have you?
    There's no need to be all hysterical about this now, especially since you are wrong.

    The only way DX10.1 is in use IS when AA is on! Says it in the article clear as day. As such, the bug only occurs when AA is on, and not having to render the particle effects WILL lead to higher performance. That's why the boost is an inaccurate result, because while there is a performance boost for using DX10.1, a majority of the performance boost is due to incorrect rendering, plain and simple. I've also already explained why it isn't being put back in.
    You are partly right: the dust bug only appears on an SP1 system when AA is enabled. The different lighting, though, appears on SP1 systems that do not use AA, and also on systems that use DX9.
    I know what Ubisoft says, but since they are full of crap (as you will soon find out), using them as the only source for this kind of info is silly.

    Since you didn't read the follow-up article on Rage3D, I will quote some stuff for ya:

    http://rage3d.com/articles/assassins.../index.php?p=3

    Did we find the glitches that everyone has been talking about when making reference to the 10.1 path? In a way - we do have that missing dust that qualifies for that category, albeit we don't know yet if it's simply a driver bug or something wrong with the pathway itself. Other than that, there's the more intense lighting, but that actually seems to show that there's a bug with the DX10 pathway, given the fact that DX9 has the same, more intense, lighting as 10.1, and UBi said that 9 and 10 (and by extension 10.1) should be nearly identical visually (shadowing is different between versions).
    The good news is that the dust-bug affects only a few scenarios and that, after testing with a run through Acre that avoids any areas which include the troublesome dust, we've determined that the performance benefits associated with 10.1 remain: it wasn't a case of the missing effect causing the performance improvements.
    So yes, Ubi is full of crap, ATI's DX10.1 boost is not due to the bugs, and Ubi calling the DX10.1 benchmarks "inaccurate" is shady to say the least.

    One other thing: in the article I linked to, a well-known 3D coder makes a guess as to what's causing the dust bug.
    "Interesting. I checked out the dust and it looks very much like it’s soft particles. This means they require the depth buffer for this effect. That it doesn’t work with AA makes sense too. If they simply bound the depth buffer as a texture they would need to use the Load() function rather than Sample() to fetch the depth value when it’s multisampled (any single sample in the AA depth buffer should be good enough for soft shadows, no need to average). They would also need a separate shader for each AA setting, like one for 2x, 4x and 8x. I would guess it’s broken because this wasn’t done and they are trying to sample the MSAA buffer with Sample() using the same shader as in the NoAA / plain DX10 case. Given that the effect is quite subtle it could easily have been overlooked. So fixing this (and anything else that might be broken) is probably the reason why they said they needed to redo the DX10.1 path."
    If this guy is right, Ubi made a relatively simple mistake, and could fix this easily. Yet they choose not to.
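    For anyone who wants to see what that guess would look like in shader terms, here is a hedged sketch (invented names; only Ubisoft knows the real code). If the no-AA soft-particle shader were reused unchanged, its Sample() call against what is actually a multisampled depth buffer would return garbage, and a soft fade driven by a bogus depth value can easily fade the dust out completely; the fix he outlines is a per-AA-level variant that fetches one explicit sample with Load().

    Code:
    // Hedged illustration of the quoted guess; all names are invented.
    SamplerState PointClamp;

    // Suspected shipped path: the plain DX10 / no-AA shader reused as-is.
    // Sample() against a buffer that is really multisampled is invalid in D3D10.x,
    // so the depth feeding the soft fade goes wrong and the dust vanishes.
    Texture2D<float> SceneDepth;
    float GetSceneDepth_NoAA(float2 uv)
    {
        return SceneDepth.Sample(PointClamp, uv);
    }

    // Suggested fix: a dedicated variant per AA setting that uses Load() with an
    // explicit sample index; any single sample is good enough for a soft fade.
    #define MSAA_SAMPLES 4
    Texture2DMS<float, MSAA_SAMPLES> SceneDepthMS;
    float GetSceneDepth_MSAA(int2 pixel)
    {
        return SceneDepthMS.Load(pixel, 0);
    }

    If that is indeed the bug, it also explains why the dust only disappears when AA is on: without AA the old fetch still works.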
    Oh and I added the "another jakko thread!" tag to this thread, to save you guys some trouble.
    Last edited by Jakko; 05-30-2008 at 02:59 PM.
