Haven't you guys learnt something by now about the modern world?
The one who is an ass always wins. Examples? Nvidia and Apple would be ones that feature in tech. It applies to people too: there was a survey about a month back saying that people who were complete a-holes in the office got paid more than people who were polite.
Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
3x2048 GSkill pi Black DDR3 1600, Quadro 600
PCPower & Cooling Silencer 750, CM Stacker 810
Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
3x4096 GSkill DDR3 1600, PNY 660ti
PCPower & Cooling Silencer 750, CM Stacker 830
AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
2x2gb Patriot DDR2 800, PowerColor 4850
Corsair VX450
btw, thread title is "Nvidia responds to Batman" lol
And Batman responds:
SWEAR TO ME!!!!!!!!!
INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"
Has anyone really been far even as decided to use even go want to do look more like?
With some big, mission-critical piece of software, then yeah, debugging on every possible hardware/software config is important. But games are programmed to industry standards, tested on the most common hardware, and sent out to the world. How much more in-depth the testing goes depends on the developer. But having a feature in a game and not bothering to test it on the only other major (discrete) graphics vendor's hardware speaks of one thing: laziness. And I program in x86 assembly for fun, so you won't get much sympathy out of me for lazy developers.
The problem is that the developer takes the same amount of money for the game from NV and ATI customers, but doesn't give the latter group the same features in the game. This isn't something like physics, where they had to decide beforehand what API to use, which then excluded other hardware automatically. This is a feature that both manufacturers handle just fine (AA). But the developers decided it wasn't worth the effort to include that feature for their ATI customers.
Big difference... DX11 is an API. You see, as long as your hardware is 100% compliant with the standard, then it will work down the line.
Meanwhile, according to Eidos and Rocksteady, they implemented a different form of AA to make it work in Batman, as the engine does NOT support AA by default. It wasn't a DX-standard form of AA, and as such it would require verification to make sure it works as required.
See the difference here?
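For anyone wondering what "DX-standard AA" actually means in practice, here's a minimal sketch in Direct3D 9 terms (my own illustration, not anything from Batman's actual code): the game asks the runtime whether the adapter supports a given multisample level and, if it does, requests it on the back buffer. Any card whose driver reports support gets it, whichever vendor made it.

```cpp
// Minimal sketch of "standard" AA in Direct3D 9: query the runtime for
// multisample support and request it when creating the device. Nothing here
// is vendor-specific; this is NOT taken from any shipping game's code.
#include <d3d9.h>

bool CreateDeviceWith4xMsaa(IDirect3D9* d3d, HWND hwnd, IDirect3DDevice9** outDevice)
{
    DWORD qualityLevels = 0;

    // Ask whether 4x multisampling is supported for this back-buffer format.
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        TRUE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed               = TRUE;
    pp.SwapEffect             = D3DSWAPEFFECT_DISCARD; // required for MSAA
    pp.BackBufferFormat       = D3DFMT_X8R8G8B8;
    pp.EnableAutoDepthStencil = TRUE;
    pp.AutoDepthStencilFormat = D3DFMT_D24S8;

    if (SUCCEEDED(hr) && qualityLevels > 0) {
        // Supported: request standard 4x MSAA on the back buffer.
        pp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
        pp.MultiSampleQuality = qualityLevels - 1;
    }

    return SUCCEEDED(d3d->CreateDevice(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
        D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, outDevice));
}
```

The point of the dispute is that UE3's deferred renderer can't just flip that switch, so whatever AA Batman uses is custom code layered on top instead.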
DX11 is an API, but whatever code you write over that API to implement whatever features you want is your own work.
The AA we are talking about here is programmed over DX9/10 (dunno which). So it's exactly the same: it's a feature you're coding over the API you are using.
Any code you have to write (beyond what's already written for you) is code you have to write, whether it implements features based on the new possibilities of a new API or features based on the old possibilities of the old API.
The fact that the AA didn't come by default in UE3 is no different from the fact that the new DX11-based effects being programmed into GRID2 didn't come by default in the DX9/10 version of the game.
But unless they used some NV-hardware-specific AA method, they probably used industry-standard function calls. If they used standard shader calls then there is no reason it shouldn't work on both manufacturers' cards. And being able to change the card ID and enable AA on ATI cards in the demo bears this out.
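Nobody outside the studio knows exactly how the check is coded, but a device-ID gate of the kind being described (the one the demo tweak works around) would look roughly like this in Direct3D 9. The 0x10DE and 0x1002 values are the real PCI vendor IDs for NVIDIA and ATI/AMD; the function name is just for illustration.

```cpp
// Hypothetical sketch of a vendor-ID gate, NOT the actual Batman: AA code.
// D3D9 exposes the PCI vendor ID of the adapter, and a game can branch on it.
// Spoofing the reported ID (as the demo tweak does) flips the branch, which is
// why the AA option "just works" on ATI cards afterwards.
#include <d3d9.h>

static const DWORD VENDOR_NVIDIA = 0x10DE; // PCI vendor ID: NVIDIA
static const DWORD VENDOR_ATI    = 0x1002; // PCI vendor ID: ATI/AMD

bool IsIngameAaExposed(IDirect3D9* d3d)
{
    D3DADAPTER_IDENTIFIER9 id = {};
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
        return false;

    // The contentious part: the AA option is only offered when the adapter
    // reports itself as an NVIDIA part, even though the underlying rendering
    // calls are ordinary D3D ones that ATI hardware also handles.
    return id.VendorId == VENDOR_NVIDIA;
}
```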
The difference here is that the code for the DX11 path of Grid 2 still runs via the layer known as DX11. If the AA used in Batman: AA truly isn't standard DX10-specification AA (if the game runs DX9 then there's NO WAY it's DX spec; UE3 uses deferred rendering, which can't support MSAA naturally since the lighting is done in a screen-space pass from a G-buffer rather than on a multisampled back buffer), then it truly is a different situation. I'm pretty sure B:AA is DX10 though.
It IS. How do you think it runs on ATi cards if it's not written over a standard layer known as DX9/10? Have they implemented it in GT200 assembly and it just happens to run on ATi cards by chance, or what?
Writing a shader or a piece of code over the DX9/10 API to implement an AA filter is no different from writing a shader in DirectCompute 5.0 or Domain/Hull Shaders to implement any other effect (AA or whatever you want).
It's not a different situation. It's implementing a piece of code, and you use whatever standard API you choose to do it. Nothing else.
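To illustrate that point: a custom post-process AA pass is just ordinary code written over the same API. Here's a rough, self-invented sketch in Direct3D 9 (g_device, g_aaShader and the quad drawing are assumed to exist elsewhere; no claim this mirrors UE3's internals):

```cpp
// Illustrative only: render the scene into an off-screen texture, then run a
// full-screen pass through whatever AA filter shader you wrote. Every call
// here is plain Direct3D 9 and works the same on any compliant card.
#include <d3d9.h>

extern IDirect3DDevice9*      g_device;   // created at startup (assumed)
extern IDirect3DPixelShader9* g_aaShader; // your edge-filter shader (assumed)

void RenderFrameWithCustomAa(UINT width, UINT height)
{
    // 1. Off-screen colour target (a real engine would create this once,
    //    not every frame; done here to keep the sketch self-contained).
    IDirect3DTexture9* sceneTex = nullptr;
    g_device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
                            D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &sceneTex, nullptr);

    IDirect3DSurface9* sceneSurf  = nullptr;
    IDirect3DSurface9* backBuffer = nullptr;
    sceneTex->GetSurfaceLevel(0, &sceneSurf);
    g_device->GetRenderTarget(0, &backBuffer);

    // 2. Draw the scene into the off-screen target.
    g_device->SetRenderTarget(0, sceneSurf);
    g_device->Clear(0, nullptr, D3DCLEAR_TARGET | D3DCLEAR_ZBUFFER, 0, 1.0f, 0);
    // ... normal scene rendering here ...

    // 3. Full-screen pass: sample the scene texture through the AA filter
    //    shader and write the filtered result to the back buffer.
    g_device->SetRenderTarget(0, backBuffer);
    g_device->SetTexture(0, sceneTex);
    g_device->SetPixelShader(g_aaShader);
    // ... draw a full-screen quad here ...

    sceneSurf->Release();
    backBuffer->Release();
    sceneTex->Release();
}
```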
Spoken like a true worker bee..
The point is nobody is innocent.
And the saying "innocent until proven guilty" only applies to the US.
In the real world everybody is guilty until proven innocent.
As always, it's hard to explain to people who don't do business on a large scale.
Side note:
PS: so you still think Iraq had biochemical weapons ready to bomb the US??
lol.. wonder where the "innocent until proven guilty" part applies there.
And the saying "innocent until proven guilty" only applies to the US.
Eh? That's a very sweeping statement.
CPU: Intel 2500k (4.8ghz)
Mobo: Asus P8P67 PRO
GPU: HIS 6950 flashed to Asus 6970 (1000/1400) under water
Sound: Corsair SP2500 with X-Fi
Storage: Intel X-25M g2 160GB + 1x1TB f1
Case: Silverstone Raven RV02
PSU: Corsair HX850
Cooling: Custom loop: EK Supreme HF, EK 6970
Screens: BenQ XL2410T 120hz
Help for Heroes
Yeah, I was thinking that could be the case, but then the survey would really be "those you hate will earn more than you", or "those hated the most earn the most", not "jerks earn more". To be done right, each person would need a psychological profile, which would probably show that people who are jerks are more aggressive about asking for raises they probably don't deserve, and the opposite for passive people, who are scared to ask for more.
OK, end of off-topic. I would post more on-topic stuff, but honestly it's the same argument every 5 posts and I gave up on page 3. (Reminds me of the new Bud Light commercials where first they show "too light", then "too heavy", which is exactly what's going on here: some think Nvidia went too far, others think they didn't.)
Anyway, before this runs off topic and I get a perma-ban..
nvidia vs ati.. amd vs intel.. microsoft vs apple.. osama bin laden vs the world.. asus vs gigabyte..
All are guilty of self-justification of "the way to do business".
Heck, Hitler's own "way to do business" differed from any sane person's.
That's the point of contention really: it would work, it should work, it could work.
But since we don't have any nvidia DX11 hardware to verify it on, why should we let nvidia hardware enable it down the road when ATI has given us hardware to verify on? Who knows what kind of bugs might turn up with nvidia's hardware. It's not like every game works flawlessly on every piece of hardware even if it's made on a common API, or that their (ATI & nvidia) DX11 implementations are so similar that you can simply assume there won't be any problems.
Case in point, NFS: Shift -
Now, if the game developers are good enough they can probably update the game with a patch, but who's to say a similar thing won't happen with DIRT2?
Need for Speed: Shift
In another TWIMTBP title, we submitted a list of issues that we discovered during the game's development. These issues include inefficiencies in how the game engine worked with our hardware, in addition to real bugs, etc. We have sent this list to the developer for review.
And regarding ATI's dev relations: contrary to popular perception based upon the TWIMTBP logo at the start of many popular titles, ATI do have good game developer relations. Why don't they put up a logo like nvidia does?
http://www.hexus.net/content/item.php?item=794&page=2
We try to make this a pure win-win for the games developers and players, and we do this without throwing marketing money around. This means that you don't usually see many ATI logos at the start of games, because paying for that is just advertising, and GITG is not an advertising campaign.
Instead you see recommendations on our web site, you see frequent driver releases on there too, all of which are tuned to give the smoothest and highest quality playing experience, and you see games developers and publishers walking around with smiles on their faces because we reduce QA problems for them.
ATI's Richard Huddy talks about Get In The Game
That's from 2004, and his recent interview posted a few pages back reiterates the same point; he even mentions that a game developer refused their help, saying that nvidia had been there before and they would do just fine without them.
Godwin's law at its best!
Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)
Heh, last time I checked the Rights act applies to everyone in the world, and your comparison with Iraq makes no sense; obviously there were hidden agendas in it, and "they said they had proof", fake or not, once you have "proof" you are guilty xD. Also it's off topic. And explain to me, what's your issue, "worker bee"? Are you some kind of big corporate guy that looks down on decent, hard-working human beings?
The point of the topic is: why did they have to do such things? I understand they spent money and time, but still, you can at least make it run on all hardware and show not only that you can make it happen and have the resources to do it, but also that you are willing to cooperate with others to make gaming better for the consumer too. I don't see any loss of profit in making their special AA available as standard to everyone; if anything it makes consumers happier and increases their profits. As for AMD not being 100% perfect at helping out game developers, it's understandable; they aren't as powerful as Nvidia, and I'm sure it's sometimes hard for them to approach developers, or they simply can't help out.
They should just make it so that if hardware companies want to help out game developers, they can only fix bugs and test hardware, and are not permitted to influence the developer to modify the game so it runs worse on the competitor's hardware.
Regarding NFS: Shift, the game plays fine on ATi hardware. ATi's complaint is rather that it doesn't use all of their shader power, because code has to be very specialized to take full advantage of it given the shaders' complexity. Code just works outright on the nvidia parts because of the simplicity of the shaders themselves. Why do you think NVidia have done so well against AMD with only a fraction of the shaders?
Intel Core i7-3770K
ASUS P8Z77-I DELUXE
EVGA GTX 970 SC
Corsair 16GB (2x8GB) Vengeance LP 1600
Corsair H80
120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
Corsair RM650
Cooler Master Elite 120 Advanced
OC: 5Ghz | +0.185 offset : 1.352v
I was referring to the mention of bugs when running the game on ATI hardware despite the fact that it's built on an API supported by both vendors, and to the possibility that DIRT2 could turn out to be a similar case.
As for the simplicity of the shaders and the shader power, I could just as well ask why AMD does so well against nvidia with half the ROPs and TMUs.
FWIW, the compiler takes care of generating code for the shaders. The game engine's inefficiencies don't have to be malicious code that makes the compiler go haywire; they can be related to other hardware components, or simply to not using a faster method/optimisation on a given piece of hardware.
shenanigans!!!
AMD Phenom II 550
4x 2GB Corsair XMS2 800 MHz
Cooler Master 750W PSU
XFX HD4870 1GB
Samsung SpinPoint 1TB HD
Samsung 22x Dual Layer DVD Burner
HAF-932