
Thread: Ubisoft: no plans to re-implement DX10.1 in Assassin's Creed

  1. #26
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by STaRGaZeR View Post
    C'mon man, plain and simple is also the fact that you don't know (only Ubi knows) what part of the performance boost is due to the bugs and what is due to DX10.1.
    The best part is that after patch 1.2 I reinstalled to see the difference, and the Animus has stuff rendering in front of it; the game crashes all day + lags when it goes into the pre-assassination videos.

    And there are artifacts in the flags (not the stuff that should be there, but black squares with no white numerals).


    ATI has funding now though, so we should see more ATI-unlocked games (heavy shader optimization + 10.1; it shouldn't lock out NV, since it wouldn't work that way, ATI has class, and it should help VIA/S3 too). But where is the EU probe on this? Come on, AMD files grievances against Intel all day; it's time for NV to get what's coming.
    Last edited by zanzabar; 05-30-2008 at 02:53 PM.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  2. #27
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Jakko View Post
    There's no need to be all hysterical about this now, especially since you are wrong.



    You are partly right: the dust bug only appears on an SP1 system when AA is enabled. The different lighting, though, appears on SP1 systems that do not use AA, and also on systems that use DX9.
    I know what Ubisoft says, but since they are full of crap (as you will soon find out), using them as the only source for this kind of info is silly.

    Since you didn't read the follow up article on Rage3d I will quote some stuff for ya:

    http://rage3d.com/articles/assassins.../index.php?p=3





    So yes, Ubi is full of crap, ATI's DX10.1 boost is not due to the bugs, and Ubi calling the DX10.1 benchmarks "inaccurate" is shady, to say the least.

    One other thing: in the article I linked to, a well-known 3D coder makes a guess at what's causing the dust bug.


    If this guy is right, Ubi made a relatively simple mistake, and could fix this easily. Yet they choose not to.
    Oh and I added the "another jakko thread!" tag to this thread, to save you guys some trouble.
    Nice post, but you know, some people have astrological powers: before any of this was known they trashed DX10.1, made grand predictions, and invented all kinds of excuses about Assassin's Creed.

    But now that all this has been explored, they keep pushing the same button over and over and spreading false information in every thread about it (look at his signature: out of one post he made five and put them into his signature). It's pushing the same wrong button over and over, and everyone should put this in their signatures just to copy his act.

    Astrological powers are nice (not), but when the truth is already known and some people keep pushing the button, it's just boring and, worse, it spreads wrong information around the forum. There are "n" reviews stating that Ubisoft is not telling the truth and zero reviews supporting Ubisoft, but fanboys show up in the forums and what happens next is the usual....
    Last edited by v_rr; 05-30-2008 at 03:21 PM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  3. #28
    Banned
    Join Date
    Jul 2007
    Posts
    264
    Well in all honesty, it could have been possible that the performance improvements are due to the bugs.
    But since it's been pointed out in the rage3d article that the bugs do not cause the performance increases, I don't see why anyone, especially Ubisoft, would suggest such things.

  4. #29
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by Jakko View Post
    Well in all honesty, it could have been possible that the performance improvements are due to the bugs.
    But since it's been pointed out in the rage3d article that the bugs do not cause the performance increases, I don't see why anyone, especially Ubisoft, would suggest such things.
    Because they are the developers and they know what they are talking about?

    Because nVidia paid them to do so?

    What do you think?
    Are we there yet?

  5. #30
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by Luka_Aveiro View Post
    Because they are the developers and they know what they are talking about?

    Because nVidia paid them to do so?

    What do you think?
    You think official statements are the truth?

    C'mon, EVERY PR statement contains spin.

    And Dil, really. You haven't even used a DX10 Radeon, yet you're being snobbish about most of what you've said.

    I'll patch AC to 1.2 on my friend's 2900PRO later (I might buy it from him since my XT is RMAd) and see if it's oh- MOAR STABEL or something.

  6. #31
    Xtreme 3D Team Member
    Join Date
    Jun 2006
    Location
    small town in Indiana
    Posts
    2,285
    Quote Originally Posted by DilTech View Post

    No, he should just say "ATi hasn't supplied us with any hardware to test our game code in a DX10.1 environment. As such, we don't have the resources to implement the patch to re-enable a working DX10.1 extension to our game-code. They also refuse to work with us on this matter."

    That would be a realistic statement compared to "The way it's meant to be played!". Just ask anyone who's ever worked for a development house.
    They have ATI cards to test it on. Keep the fanboi crap out of it; you are just grasping at straws. Just face the fact that your beloved card maker doesn't have a DX10.1 card and apparently won't this year, as their new card doesn't have 10.1 support. It is time they caught up.
    QX 9650 5ghz with 1.55v 4.8ghz with 1.5v 24/7 in a VAPOLI V-2000B+ Single stage phase cooling.
    DFI LP LT X-38 T2R
    2X HD4850's water cooled , volt modded
    Thermaltake 1KW Psu
    4x Seagate 250GB in RAID 0
    8GB crucial ballistix ram

  7. #32
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Guys I'm watching you fighting over this thing like somebody said his penis is bigger than yours.
    After all, what do you really care about ? The game itself or the "excitement" of fighting over a stupid issue ?
    How many of you here have played the game ? How many of you finished the game ? How many of you liked the game/gameplay ?

    Just get over it and move on.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  8. #33
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    290
    Quote Originally Posted by BenchZowner View Post
    Guys I'm watching you fighting over this thing like somebody said his penis is bigger than yours.
    After all, what do you really care about ? The game itself or the "excitement" of fighting over a stupid issue ?
    How many of you here have played the game ? How many of you finished the game ? How many of you liked the game/gameplay ?

    Just get over it and move on.
    Whether or not the game has DX10.1 support isn't the issue, although it would be nice.

    The issue is that technology is being held back by nVidia, just because they don't have a DX10.1 card they seem to be pushing for DX10.1 to not be used. Before you say it is insignificant, there is an easy 15-20% performance boost w/ 4xAA going from DX10 to DX10.1 on AMD cards in Assassin's Creed. Going from DX9 -> DX10 doesn't even improve performance by that amount, so you could argue DX10.1 is a bigger deal than DX10 in regards to performance.

    I currently own an nVidia card (8800GTS 512MB) but stuff like this makes me want to buy a 4870 instead of a GTX 260.
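    For context on where that AA gain comes from: the explanation cited at the time (including around the Rage3D write-up) is that Direct3D 10.1 allows a multisampled depth buffer to be bound as a shader resource, so an engine that needs scene depth for post-processing can read the depth buffer it already rendered instead of redoing a depth-only pass when MSAA is on, which would line up with the gap showing up only when AA is enabled. A minimal sketch of the resource setup this enables, assuming a valid ID3D10Device1 pointer (the 4x sample count and names here are illustrative, not Ubisoft's actual code):

```cpp
#include <d3d10_1.h>

// Sketch: create a 4x MSAA depth buffer that can also be read by shaders.
// Under plain D3D10.0 this combination (multisampled + DEPTH_STENCIL +
// SHADER_RESOURCE) is not available, which forces a second depth-only
// render pass when MSAA is enabled; D3D10.1 hardware permits it.
HRESULT CreateReadableMsaaDepth(ID3D10Device1* dev, UINT width, UINT height,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;
    td.Height = height;
    td.MipLevels = 1;
    td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;  // typeless, so the two views below can differ
    td.SampleDesc.Count = 4;                 // 4x MSAA
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;

    HRESULT hr = dev->CreateTexture2D(&td, NULL, tex);
    if (FAILED(hr)) return hr;

    // View for writing depth while rendering the scene.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = dev->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // View for reading the same depth samples back in later passes.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return dev->CreateShaderResourceView(*tex, &sd, srv);
}
```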

  9. #34
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by Extelleron View Post
    Whether or not the game has DX10.1 support isn't the issue, although it would be nice.

    The issue is that technology is being held back by nVidia, just because they don't have a DX10.1 card they seem to be pushing for DX10.1 to not be used.
    Flashback... nVIDIA SM3.0 ATi SM2.0b

    P.S. THEY SEEM differs significantly from THEY ADMITTEDLY & SURELY DO.

    Before you say it is insignificant, there is an easy 15-20% performance boost w/ 4xAA going from DX10 to DX10.1 on AMD cards in Assassin's Creed.
    In my testing there was no 15 to 20% performance boost.
    You're exaggerating.

    I currently own an nVidia card (8800GTS 512MB) but stuff like this makes me want to buy a 4870 instead of a GTX 260.
    With or without DX10.1 the 4870 will be slower than the GTX.

    p.s. How many pure DX10.1 games do you think we'll see before "DX11"?
    p.s. Once again, how many of you liked the game?
    p.s. If DX10.1 is such a big deal, why doesn't AMD do something about it? Let's say the very, very dark evil nVIDIA monster is blocking them with TWIMTBP for Assassin's Creed; why don't they spend some money and make agreements with other software houses through their own GITG program?
    Last edited by BenchZowner; 05-31-2008 at 05:37 AM.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  10. #35
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by BenchZowner View Post
    How many of you here have played the game ? How many of you finished the game ? How many of you liked the game/gameplay ?
    Me, me (twice) and me.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  11. #36
    Banned
    Join Date
    Jul 2007
    Posts
    264
    Quote Originally Posted by Extelleron View Post
    Whether or not the game has DX10.1 support isn't the issue, although it would be nice.

    The issue is that technology is being held back by nVidia, just because they don't have a DX10.1 card they seem to be pushing for DX10.1 to not be used. Before you say it is insignificant, there is an easy 15-20% performance boost w/ 4xAA going from DX10 to DX10.1 on AMD cards in Assassin's Creed. Going from DX9 -> DX10 doesn't even improve performance by that amount, so you could argue DX10.1 is a bigger deal than DX10 in regards to performance.

    I currently own an nVidia card (8800GTS 512MB) but stuff like this makes me want to buy a 4870 instead of a GTX 260.
    QFT
    Although it has to be said, technology is not just being held back by nVidia but also by Ubisoft, as they are the ones who chose to remove the 10.1 path rather than fix it, for whatever reason.

    And Bench, your questions can easily be answered: money.
    AMD does not have the money to pay many game developers at this point. I understand they helped on Mass Effect, which is great news I think. Looks like a much better game than AC.

    Oh, and I don't think it matters whether or not someone plays or likes AC. You can hate that game and still hold the opinion that Ubisoft's anti-innovation decisions suck.
    Last edited by Jakko; 05-31-2008 at 05:48 AM.

  12. #37
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    290
    Quote Originally Posted by BenchZowner View Post
    Quote Originally Posted by Extelleron View Post
    Whether or not the game has DX10.1 support isn't the issue, although it would be nice.



    Flashback... nVIDIA SM3.0 ATi SM2.0b

    P.S. THEY SEEM differs significantly from THEY ADMITTEDLY & SURELY DO.



    In my testing there was no 15 to 20% performance boost.
    You're exaggerating.



    With or without DX10.1 the 4870 will be slower than the GTX.

    p.s. How many pure DX10.1 games do you think we'll see before "DX11"?
    p.s. Once again, how many of you liked the game?
    p.s. If DX10.1 is such a big deal, why doesn't AMD do something about it? Let's say the very, very dark evil nVIDIA monster is blocking them with TWIMTBP for Assassin's Creed; why don't they spend some money and make agreements with other software houses through their own GITG program?
    It would have been nice to have SM3 on ATI cards in 2004, but I don't think it was necessary. ATI supported SM3; they just didn't have a card out with SM3 support until 2005. Considering I can't remember a clear example of a game where SM3 had a noticeable effect (except Far Cry) until the ~2006-2007 timeframe (when games began to require it), I don't think ATI was wrong in waiting.

    The difference here is that nVidia seems to have no interest in ever supporting DX10.1, which is a bit different. If this were the same as the ATI SM3 issue, nVidia would be supporting DX10.1 a year later than ATI, in 2008 with G200. But that's not the case.

    IMO, from what I see, DX10.1 shouldn't take that much extra coding over DX10 (compared to the benefits, at least). So I believe that if the API had nVidia's support, we would see a large number of games supporting DX10.1 and DX10.0 being forgotten.

    Yes, there is a 15-20% boost; in fact, the difference is 25% when comparing minimum framerates:
    http://www.rage3d.com/articles/assas.../index.php?p=3

    And before you say it's because of SP1 rather than DX10.1: look at the next page and you'll see the HD 2900XT gets no benefit with 4xAA moving from Vista -> SP1.

    I haven't played the game on PC, but I've played it a bit on 360 and it seemed pretty cool. I wouldn't spend $40-50 on it, but I'll pick it up when I can get it for $20 or so.

    And suggesting AMD actually market something is a lost cause. ATI has never had the marketing nVidia had. Why do you think nVidia did well even during the GeForce FX days? We see ATI on their knees right now because R600 wasn't competitive, but when nVidia runs into problems, they can just rely on marketing.
    Last edited by Extelleron; 05-31-2008 at 05:55 AM.

  13. #38
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by Extelleron View Post
    It would have been nice to have SM3 on ATI cards in 2004, but I don't think it was necessary. ATI supported SM3; they just didn't have a card out with SM3 support until 2005.
    They didn't support or applaud it, actually they were letting slides out with "SM3 hate" if my memory serves me right ( this time I'm quite sure it does ).

    Quote Originally Posted by Extelleron
    Considering I can't remember a clear example of a game where SM3 had a noticeable effect (except Far Cry) until the ~2006-2007 timeframe (when games began to require it), I don't think ATI was wrong in waiting.
    Once again, 1 game back then as you say ( FarCry ), 1 game now ( Assasins Creed ).

    Quote Originally Posted by Extelleron
    The difference here is that nVidia seems to have no interest in ever supporting DX10.1, which is a bit different. If this were the same as the ATI SM3 issue, nVidia would be supporting DX10.1 a year later than ATI, in 2008 with G200. But that's not the case.
    And why do you think nVIDIA would try to prevent others from using DX10.1 when they could just rework their architecture with little to no change (hardware-wise, DX10.1 isn't that much different) and make the performance gap between their products and the competitor's even bigger? (Since nVIDIA is already in the lead, even comparing their card on DX10 against the AMD card on DX10.1.)

    By the way, how many DX10.1 games have you heard about coming?
    How many DX10 games? (Even those are few, very few.)

    Quote Originally Posted by Extelleron
    Yes, there is a 15-20% boost; in fact, the difference is 25% when comparing minimum framerates:
    http://www.rage3d.com/articles/assas.../index.php?p=3
    I do not trust rage3d at all.
    And also, if you'd been following this thing from the beginning, you'd already know that I've taken several measurements in DX10 & DX10.1 comparing nV & AMD cards.

    Quote Originally Posted by Extelleron
    And suggesting AMD actually market something is a lost cause. ATI has never had the marketing nVidia had. Why do you think nVidia did well even during the GeForce FX days? We see ATI on their knees right now because R600 wasn't competitive, but when nVidia runs into problems, they can just rely on marketing.
    That's their problem. If they can't market their products and target them right, too bad.

    p.s. @ Jakko about money... well... if AMD's employees worked really hard and finally put out a decent product, they'd have more money and such.

    With the G80, nVIDIA made a big step forward and really swept ATi (now AMD) off the floor.
    AMD's next card was a disappointment and failed to compete even with the 'pre'-high-end G80 part.
    It took AMD a full year+ to come up with something that beats the initial G80 lineup and some of its refreshes.

    Like in a car race, you have to keep up... otherwise you'll be left behind, and when you're behind you have to work hard to claw your way back to your opponent.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  14. #39
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    BenchZowner strikes back
    After so much discussion in the other thread, the same story again? Read the reviews all over the web.

    You don't trust Rage3D? Take [H]:
    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==

    And there are plenty of websites saying the same.
    Why should we trust your biased comments when we have lots of info on the web about AC DX10.1? It takes just 10 minutes to see that your comments are always Nvidia-biased. Even with the 260/280 GTX and HD 4800, cards not even on the market yet, you spread FUD in the threads saying that Nvidia is much better and all sorts of things.

    Leave this DX10.1 thread and stop recycling arguments already proven false.
    Last edited by v_rr; 05-31-2008 at 06:38 AM.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  15. #40
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    We've already had an entire discussion on this... why restart the flame war?

    OT: George Foreman Grill


    ''The Way It's Meant To Be Cooked''

    Perkam
    Last edited by perkam; 05-31-2008 at 06:40 AM.

  16. #41
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    290
    Quote Originally Posted by BenchZowner View Post
    That's their problem. If they can't market their products and target them right, too bad.

    p.s. @ Jakko about money... well... if AMD's employees worked really hard and finally put out a decent product, they'd have more money and such.

    With the G80, nVIDIA made a big step forward and really swept ATi (now AMD) off the floor.
    AMD's next card was a disappointment and failed to compete even with the 'pre'-high-end G80 part.
    It took AMD a full year+ to come up with something that beats the initial G80 lineup and some of its refreshes.

    Like in a car race, you have to keep up... otherwise you'll be left behind, and when you're behind you have to work hard to claw your way back to your opponent.
    Now you are showing how much of a fanboy you are with these statements.

    You talk as if ATI/AMD never made a "decent product," when that is the farthest thing from the truth.

    ATI had a solid performance lead over nVidia (barring a few months time) from 2002 - 2006. The Radeon 9000 series blew the GeForce FX away, and the X800/X850 was able to outperform the GeForce 6 as well. X1800XT beat 7800GTX, X1900XTX beat the 7900GTX. In fact if you look at the performance of the X1900/7900 in 2007/2008 games, you see that even cards like the X1950 Pro are faster than the 7900GTX.

    With GeForce 8, nVidia finally hit it right and AMD messed up with R600, giving us the situation we have now. It hasn't even been two years that nVidia has been in the performance lead, but you seem to be forgetting the 4+ years that ATI had the advantage. Assuming HD 4870 X2 brings back performance parity, which I think it will, nVidia's current stretch of domination will be far shorter than ATI's past one.

    Why is nVidia not supporting DX10.1? Well why they originally decided not to, I have no idea, but it's not as if they can suddenly change their minds and in late 2007 decide to implement DX10.1 in G200. nVidia likely made that kind of decision back when G80 was shipping, and perhaps they underestimated the level of performance DX10.1 could bring to the table. Now they must cover themselves with marketing.

    And why should I trust you over rage3d? And as v_rr posted, HardOCP has found the same results (in fact they showed an HD 3870 getting a 34% boost with 2xAA, even better than rage3d's HD 3870 X2 results).

  17. #42
    Registered User
    Join Date
    Apr 2008
    Posts
    23
    Quote Originally Posted by BenchZowner View Post
    I do not trust rage3d at all.
    And also, if you've been following this thing from the beginning you'd already know that I've took several measurements in DX10 & DX10.1 comparing nV & AMD cards
    LOL. I guess you don't trust the other sites that verified it, eh? Like [H] or PCGH? Bit-tech? Or Ubi themselves, admitting that there is a performance benefit? All of those pale in comparison with your awesome investigation, which took two weeks to produce two floppy shots; weeks throughout which you couldn't figure out how to enable AF on the Radeons or report that there's a bug with app-controlled AF (which was fixed in later drivers, BTW), and couldn't even see that AA quality was actually superior even in those wimpy shots.

    Clearly, we are all biased idiots out to misguide the good ppl, and you, an obviously impartial and, most importantly, educated party, have really proved us all wrong. In an epic way. With two wimpy screenshots and a lot of imagination. Good job!
    Last edited by Morgoth Bauglir; 05-31-2008 at 07:20 AM. Reason: Typos

  18. #43
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by Morgoth Bauglir View Post
    LOL. I guess you don't trust the other sites that verified it, eh? Like [H] or PCGH? Bit-tech? Or Ubi themselves, admitting that there is a performance benefit? All of those pale in comparison with your awesome investigation, which took two weeks to produce two floppy shots; weeks throughout which you couldn't figure out how to enable AF on the Radeons or report that there's a bug with app-controlled AF (which was fixed in later drivers, BTW), and couldn't even see that AA quality was actually superior even in those wimpy shots.
    QFT. Right BZ?
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  19. #44
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by Extelleron View Post
    Now you are showing how much of a fanboy you are with these statements.
    When you don't like the truth, reject it.
    Well done.

    Quote Originally Posted by Extelleron
    You talk as if ATI/AMD never made a "decent product," when that is the farthest thing from the truth.
    You missed my post once again, specifically the part that says:

    ATi was better back then with the X1950XTX, both in image quality & performance.
    They were in the lead with the 9700 Pro over the Ti 4600... and they were also ahead in the GeForce Suck FX era with the 9800 Pro/XT.

    What you fail to see is that nVIDIA at the PRESENT TIME is FASTER & BETTER.
    Get that: PRESENT TIME.

    Quote Originally Posted by Extelleron
    ATI had a solid performance lead over nVidia (barring a few months time) from 2002 - 2006. The Radeon 9000 series blew the GeForce FX away, and the X800/X850 was able to outperform the GeForce 6 as well.
    OK, and S3 was beating Tseng Labs back in the day; should I buy S3 now when there's a faster & better product available?

    Quote Originally Posted by Extelleron
    With GeForce 8, nVidia finally hit it right and AMD messed up with R600, giving us the situation we have now. It hasn't even been two years that nVidia has been in the performance lead, but you seem to be forgetting the 4+ years that ATI had the advantage. Assuming HD 4870 X2 brings back performance parity, which I think it will, nVidia's current stretch of domination will be far shorter than ATI's past one.
    Now that IS fanboyism, or even worse, hate towards a company (not that you should love any of them... they all want the same f*ing thing: your money).

    P.S. You seem to be sure about AMD taking the lead now... do you have any valid info, or are you just daydreaming about it?

    Quote Originally Posted by Extelleron
    And why should I trust you over rage3d? And as v_rr posted, HardOCP has found the same results (in fact they showed an HD 3870 getting a 34% boost with 2xAA, even better than rage3d's HD 3870 X2 results).
    If you'd taken a look at the pictures, you'd have seen that on the HD card the AA wasn't applied everywhere.
    And still you're making ASSUMPTIONS based on conspiracy theories about nVIDIA forcing nUbiSoft to remove DX10.1.

    All that, when there are several images showing various bugs, items & effects not rendered at all, and even objects rendered incorrectly.

    Quote Originally Posted by Morgoth Bauglir View Post
    LOL. I guess you don't trust the other sites that verified it, eh? Like [H] or PCGH? Bit-tech? Or Ubi themselves, admitting that there is a performance benefit? All of those pale in comparison with your awesome investigation, which took two weeks to produce two floppy shots; weeks throughout which you couldn't figure out how to enable AF on the Radeons or report that there's a bug with app-controlled AF (which was fixed in later drivers, BTW), and couldn't even see that AA quality was actually superior even in those wimpy shots.
    First of all, you are not my employer, nor my boss, nor my time-management advisor, so spare me the crybaby "it took you 2 weeks to do this and that".
    Second, I intentionally left AF off for both cards.
    Third, I had very limited time and ran some quick tests; I wasn't and never will be bothered to check this sucky game again.

    Clearly, we are all biased idiots out to misguide the good ppl, and you, an obviously impartial and, most importantly, educated party, have really proved us all wrong. In an epic way. With two wimpy screenshots and a lot of imagination. Good job!
    I refuse to reply to your ironic rant.

    Quote Originally Posted by STARgazer
    ...
    BASTA.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  20. #45
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Quote Originally Posted by jimmyz View Post
    Just face the fact that your beloved card maker doesn't have a DX10.1 card and apparently won't this year, as their new card doesn't have 10.1 support. It is time they caught up.
    Can you remind me what they need to catch up on? I'm having trouble coming up with a 'used right now', 'will be needed', or even 'wanted' feature by the end of the year.

    Does anyone remember tessellation on R600 cards? That highly touted feature hasn't panned out too well either, eh (correct me if I am wrong on that one)? Are there ANY games out that use it? I'm curious. Do these new cards still have it?

    Anyway, as an enthusiast, I'm not too concerned right now with what *may* be. M$, Crytek, and Nvidia all came out a few months ago and more or less said DX10.1 is useless now and will be for at least a year. That takes us to the end of 2008. What other DX10.1 games are coming out this year? If it's a handful or less and no major titles, is this really a good feature to bring forward?

  21. #46
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    I would love for someone to prove to me that nVidia actually paid Ubisoft to remove DX10.1 from AC.

    And I would love for Ubisoft to have implemented the DX10.1 path correctly, so we could all see the real deal about it.

    Until that happens, all you can go by is the official statements; the rest is pure speculation and fantasy, as NOTHING can be proved, considering the DX10.1 path in AC is REALLY broken, which the DX10.1 screenshots from HD cards prove.

    Until something official comes out, you can fantasize about anything you want, unless you can really prove what you are saying.
    Are we there yet?

  22. #47
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Quote Originally Posted by Luka_Aveiro View Post
    I would love for someone to prove to me that nVidia actually paid Ubisoft to remove DX10.1 from AC.

    And I would love for Ubisoft to have implemented the DX10.1 path correctly, so we could all see the real deal about it.

    Until that happens, all you can go by is the official statements; the rest is pure speculation and fantasy, as NOTHING can be proved, considering the DX10.1 path in AC is REALLY broken, which the DX10.1 screenshots from HD cards prove.

    Until something official comes out, you can fantasize about anything you want, unless you can really prove what you are saying.
    QFT. Great post.

  23. #48
    Banned
    Join Date
    Jul 2007
    Posts
    264
    Well Luka, the several articles containing benchmarks and tests prove several things. One of them is that DirectX 10.1 really did bring a performance gain that wasn't caused by the bugs.

    And comparing DirectX 10.1 to tessellation is ridiculous. One is a feature, the other an API; one has no support whatsoever, while the other already has support in a title even though 10.0 itself is hardly supported; and last but not least, tessellation is an ATI initiative, while DirectX 10.1 is an industry initiative.

    That's right: ATI didn't create DirectX 10.1, Microsoft did.
    The time when you could call nVidia's lack of DirectX 10.1 support sensible is over, as this whole AC affair proved quite convincingly that DirectX 10.1 is an important and welcome update to DirectX 10.

    I am amazed at how long it is taking nVidia to implement it. Are they stubborn? Is their architecture not ready for it?
    I dunno, but I sure know it's stupid of them.
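    On that last question, whether a given card and driver actually expose DX10.1 is something you can probe directly, since D3D10.1 devices are created against an explicit feature level. A minimal sketch, assuming an SDK with the D3D10.1 headers and libraries installed (error handling trimmed):

```cpp
#include <d3d10_1.h>
#include <stdio.h>

#pragma comment(lib, "d3d10_1.lib")

// Sketch: ask for a hardware device at feature level 10.1 first; success
// means the GPU and driver expose DX10.1. Fall back to 10.0 otherwise.
int main()
{
    ID3D10Device1* dev = NULL;

    HRESULT hr = D3D10CreateDevice1(NULL,  // default adapter
                                    D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
                                    D3D10_FEATURE_LEVEL_10_1,
                                    D3D10_1_SDK_VERSION, &dev);
    if (SUCCEEDED(hr)) {
        printf("DX10.1 (feature level 10_1) available.\n");
    } else {
        hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
                                D3D10_FEATURE_LEVEL_10_0,
                                D3D10_1_SDK_VERSION, &dev);
        printf(SUCCEEDED(hr) ? "Only DX10.0 available.\n"
                             : "No DX10-class hardware device.\n");
    }

    if (dev) dev->Release();
    return 0;
}
```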

  24. #49
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Luka_Aveiro View Post
    I would love for someone to prove to me that nVidia actually paid Ubisoft to remove DX10.1 from AC.

    And I would love for Ubisoft to have implemented the DX10.1 path correctly, so we could all see the real deal about it.

    Until that happens, all you can go by is the official statements; the rest is pure speculation and fantasy, as NOTHING can be proved, considering the DX10.1 path in AC is REALLY broken, which the DX10.1 screenshots from HD cards prove.

    Until something official comes out, you can fantasize about anything you want, unless you can really prove what you are saying.
    Read the reviews and then come back here saying something with some logic to it.
    And the IQ is just fine in the reviews. Read [H] and others.
    Quote Originally Posted by Shintai View Post
    And AMD is only a CPU manufactor due to stolen technology and making clones.

  25. #50
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,386
    Quote Originally Posted by Jakko View Post
    Well Luka, the several articles containing benchmarks and tests prove several things. One of them is that DirectX 10.1 really did bring a performance gain that wasn't caused by the bugs.

    And comparing DirectX 10.1 to tessellation is ridiculous. One is a feature, the other an API; one has no support whatsoever, while the other already has support in a title even though 10.0 itself is hardly supported; and last but not least, tessellation is an ATI initiative, while DirectX 10.1 is an industry initiative.

    That's right: ATI didn't create DirectX 10.1, Microsoft did.
    The time when you could call nVidia's lack of DirectX 10.1 support sensible is over, as this whole AC affair proved quite convincingly that DirectX 10.1 is an important and welcome update to DirectX 10.

    I am amazed at how long it is taking nVidia to implement it. Are they stubborn? Is their architecture not ready for it?
    I dunno, but I sure know it's stupid of them.
    Did I compare DX10.1 and tessellation together?? I don't think I posted anything like that. Tessellation was an example of (another) wasted implementation, like DX10.1. I'm not comparing them in the manner you posted. But I appreciate the completely unnecessary lesson in any case.

    Tell me why it's stupid for nVidia not to have a feature that isn't and won't be used for MONTHS to come (in which time two sets of cards will have been released, the latter possibly having 10.1 when it's useful)?

    ...I'm not worried that nVidia doesn't have DX10.1. It's completely useless at this point in time. When there are more titles that support it, I'm certain they will come up with cards supporting that 'upgrade'. I said it before at R3D a few months ago: by the time the feature set/API gets implemented by the devs, there will be other, WAY better cards out there anyway. It's like equipping a car with navigation but saying you won't be able to use it at all until they get satellites up, in over a year. What's the point? (OK, a car is a bad example b/c you are likely to keep it a lot longer than an enthusiast keeps a PC/vidcard... I hope you get my point though.)

    I still have some unanswered questions...:

    Quote Originally Posted by jas420221 View Post
    Are there ANY games out that use it (tessellation)...Im curious. Do these new cards still have it?

    What are the other DX10.1 games coming out this year?
    I guess there are two extreme sides to every story... I just don't agree with all the conspiracy BS going around. You guys need to lay off t3h w33d and stop being so paranoid, I think.
    Last edited by jas420221; 05-31-2008 at 12:39 PM.
