Ubisoft: no plans to re-implement DX10.1 in Assassin's Creed
http://www.pcgameshardware.de/aid,64...Creed_planned/
Quote:
PCGH: D3D 10.1 support in Assassin's Creed was a hidden feature. Why did you choose not to announce this groundbreaking technology?
Ubisoft: The support for DX10.1 was minimal. When investigating the DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with the DX10.1 API.
PCGH: Which Direct3D 10.1 features do you use in the release version? Why do they make Assassin's Creed faster? And why does FSAA work better on D3D 10.1 hardware?
Ubisoft: The re-usage of the depth buffer makes the game faster. However, the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.
This optimization pass is only visible when selecting anti-aliasing. Otherwise, both DX10 and DX10.1 use the same rendering pipeline.
PCGH: Why do you plan to remove D3D 10.1 support?
Ubisoft: Unfortunately, our original implementation on DX10.1 cards was buggy and we had to remove it.
PCGH: Are there plans to implement D3D 10.1 again?
Ubisoft: There is currently no plan to re-implement support for DX10.1.
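For anyone wondering what "re-usage of the depth buffer" means in practice: the most likely candidate is D3D 10.1's ability to bind a multisampled depth buffer as a shader resource, which plain D3D 10 does not allow. That would also explain why the two pipelines only diverge with anti-aliasing enabled: without the 10.1 path, the engine has to produce a readable copy of the depth data in an extra pass. The sketch below is my own illustration of that API difference, not Ubisoft's code; the helper name and parameters are made up, device setup is omitted, and it is untested.

#include <d3d10_1.h>

// Hypothetical helper: create an MSAA depth buffer that can also be
// sampled in later passes. Requires an ID3D10Device1 (D3D 10.1).
HRESULT CreateMsaaDepthWithSrv(ID3D10Device1* device,
                               UINT width, UINT height, UINT samples,
                               ID3D10Texture2D** tex,
                               ID3D10DepthStencilView** dsv,
                               ID3D10ShaderResourceView1** srv)
{
    // Typeless storage so the same memory can be viewed both as a
    // depth-stencil target and as a shader resource.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width            = width;
    td.Height           = height;
    td.MipLevels        = 1;
    td.ArraySize        = 1;
    td.Format           = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = samples;   // multisampled: the interesting case
    td.Usage            = D3D10_USAGE_DEFAULT;
    td.BindFlags        = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = device->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view used during the normal geometry pass.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format        = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // Shader-resource view over the same multisampled depth data.
    // A 10.0 device rejects this combination; 10.1 accepts it, so
    // later passes can read depth without re-rendering it.
    D3D10_SHADER_RESOURCE_VIEW_DESC1 sd = {};
    sd.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView1(*tex, &sd, srv);
}

In HLSL the view would then be declared as a Texture2DMS resource and read per sample with Load(). On DX10 hardware the usual workaround is to lay depth down a second time into a sampleable render target, which is exactly the kind of extra pass the interview suggests the 10.1 path made redundant.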
So just to summarize, this is what Ubisoft says:
1. DX 10.1 makes the game faster
2. Our implementation was sucky
3. Performance gains seen on ATI hardware are inaccurate
4. We didn't fix it, we removed it
5. We will not bring it back
You would expect a game developer to be proud of shipping the first title to support the latest API, but I guess it is understandable that Ubisoft is not going to spend time and resources fixing features in their TWIMTBP title that only ATI hardware will use.
:down: