Well, for people who really care about games with GPU PhysX, you can always do something like this:
http://img134.imageshack.us/img134/38/18bat.png
hmm, HD5870 + GTX275 faster than 2xGTX285 SLI at a GPU PhysX game.
:D
I wrote Physics.
Does it matter if I wrote Physics or PhysX? It's OBVIOUS we're talking about one setting.
The point is, thousands of games (minus the dozen that use PhysX) didn't need nVidia's fancy PhysX to make rigid bodies, cloth, and blowing papers.
In 2002, Hitman 2 had cloth simulation. Half-Life 2's floating barrels and colliding objects didn't require an nVidia card. And of course Crysis runs just as well on a Radeon as on a GeForce.
But Batman is a TWIMTBP game. nVidia wants only their loyal gamers to have access to all the features. I'm all for acceleration of existing content, but requiring PhysX for additional game content (cloth, debris, papers, etc.) is just wrong. And if Crysis managed to model village houses collapsing from exploding barrels on a CPU, why, other than "cheating", would some crumpled paper in Batman cause it to always drop to 15fps with a Radeon?
FYI: in an older post I have a link to tomshardware which shows around 30% CPU usage (i7) both WITH AND WITHOUT PHYSX, with most cores idle... clearly it's artificially generating low results.
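To make the "CPU can do cloth" point concrete, here's a minimal sketch of roughly the kind of mass-spring cloth games were already doing on the CPU years ago: Verlet integration plus a few constraint-relaxation passes. The grid size, spacing, and constants are all illustrative, not taken from any particular game:

Code:

// Minimal CPU cloth: Verlet integration + distance-constraint relaxation.
// Illustrative sketch only -- grid size, spacing, and constants are made up;
// a real engine adds collision, damping, tearing, SIMD, etc.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a) { return std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z); }

struct Particle { Vec3 pos, prev; bool pinned; };
struct Spring   { int a, b; float rest; };

int main() {
    const int   W = 16, H = 16;                 // 16x16 particle grid
    const float spacing = 0.1f, dt = 1.0f / 60.0f;
    const Vec3  gravity = {0.0f, -9.81f, 0.0f};

    std::vector<Particle> cloth;
    std::vector<Spring>   springs;
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            Vec3 p = {x * spacing, 0.0f, y * spacing};
            cloth.push_back({p, p, y == 0});    // pin the top row
        }
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            int i = y * W + x;
            if (x + 1 < W) springs.push_back({i, i + 1, spacing}); // horizontal
            if (y + 1 < H) springs.push_back({i, i + W, spacing}); // vertical
        }

    for (int step = 0; step < 600; ++step) {    // simulate 10 s at 60 Hz
        for (auto& q : cloth) {                 // Verlet: next pos from last two
            if (q.pinned) continue;
            Vec3 vel = sub(q.pos, q.prev);
            q.prev = q.pos;
            q.pos  = add(add(q.pos, vel), mul(gravity, dt * dt));
        }
        for (int iter = 0; iter < 4; ++iter)    // relax springs toward rest length
            for (auto& s : springs) {
                Vec3  d = sub(cloth[s.b].pos, cloth[s.a].pos);
                float L = len(d);
                if (L < 1e-6f) continue;
                Vec3 corr = mul(d, 0.5f * (L - s.rest) / L);
                if (!cloth[s.a].pinned) cloth[s.a].pos = add(cloth[s.a].pos, corr);
                if (!cloth[s.b].pinned) cloth[s.b].pos = sub(cloth[s.b].pos, corr);
            }
    }
    std::printf("free corner after 10s: (%.2f, %.2f, %.2f)\n",
                cloth[W*H - 1].pos.x, cloth[W*H - 1].pos.y, cloth[W*H - 1].pos.z);
}

Nothing in there needs any particular GPU, which is the whole point.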
no, using partially functioning chips on low/middle-end parts is to save money. they have also been tenacious about moving to new processes for the past few years. why are you so focused on frequency? g80 had 24 rops and gt200 had 32. they can also do 4 multisamples per clock.
i would take this as sensationalism. half of the "facts" you have stated are wrong or just made up. they are in better shape than amd right now too. it's a 384-bit bus, so my estimate for the die size is less than 500mm2. that would put clocks at a g80 level. gf100 is simply a monster of a chip and there is no denying that.
Quote:
And power budget is clearly no prob for a 32sp 40nm nVidia chip... so how does AMD make a 2B-transistor chip run at 850MHz while nVidia only gets 600MHz, with shaders barely running as high as the original 90nm GTX...?
1. They do it on purpose, so Fermi looks good compared to GT220/GT240 :ROTF::rofl:
2. A Fermi delay will be announced and they'll ship a 40nm G92 in the meantime :shocked:
3. The design isn't scalable. :shrug: Could be a prob if even a crippled, cut-down Fermi can only run at (extrapolating)... 400MHz.
4. Simply inexperienced and incompetent engineers. Can't be the "process", since the 4770 was getting great clocks early on, before things were ironed out.
5. Management. :mad:
Hate to repeat it so many times, but nVidia's Fermi is way late, their DX10.1 cards are a joke, the huge-die GT200s are sinking profits, and they don't even have Intel/AMD chipset business to fall back on. If they don't get a PERFECT Fermi out, with a whole lineup down to the bottom, it could be not just NV30, but more like 3dfx time.
it's not a PhysX game then...
and a lot more than 14 do it
eeeeeh no. Only 14 GPU PhysX-accelerated titles. The rest of the titles are CPU-accelerated.
I'm sorry. I thought there were a LOT more than that. My mistake.
While I do think there's something wrong with Radeons and the performance levels with PhysX, I'm not sure what it is. The code itself is being run on the CPU, which has done this sort of stuff for years. It's like it's purposely horribly coded for CPUs, and then told not to work in the presence of a Radeon GPU (hence why hacks can get PhysX working with great performance on Radeons). It just seems anti-competitive. I don't have anything against PhysX, but it seems it's being used for 'silly' things and not used to its full potential, especially since it's proprietary.
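To put the "most cores idle" observation in context: the per-object part of a physics step is embarrassingly parallel, so there's no inherent reason a CPU fallback has to sit on one core. Here's a hypothetical sketch (this is NOT PhysX's actual code, just an illustration of how the work could be fanned out across hardware threads):

Code:

// Hypothetical illustration -- NOT PhysX's actual code. Shows how per-body
// integration, being independent per object, splits cleanly across cores.
#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct Body { float pos[3], vel[3]; };

// Integrate one contiguous slice; slices never write to each other's bodies.
static void integrateSlice(std::vector<Body>& bodies, std::size_t begin,
                           std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].vel[1] -= 9.81f * dt;               // gravity
        for (int k = 0; k < 3; ++k)
            bodies[i].pos[k] += bodies[i].vel[k] * dt;
    }
}

// Fan the slices out over every hardware thread instead of pinning one core.
static void integrateAll(std::vector<Body>& bodies, float dt) {
    unsigned n = std::max(1u, std::thread::hardware_concurrency());
    std::size_t chunk = (bodies.size() + n - 1) / n;
    std::vector<std::thread> workers;
    for (unsigned t = 0; t < n; ++t) {
        std::size_t begin = t * chunk;
        std::size_t end   = std::min(bodies.size(), begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(integrateSlice, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers) w.join();
}

int main() {
    std::vector<Body> bodies(100000, Body{{0, 10, 0}, {0, 0, 0}});
    for (int step = 0; step < 60; ++step)             // one simulated second
        integrateAll(bodies, 1.0f / 60.0f);
}

The collision/constraint phase is harder to parallelize, sure, but if an i7 sits at ~30% both with and without PhysX, the fallback path clearly isn't even trying.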
So are you saying nVidia will be fine if Fermi isn't launched in Q1 and perfect?
or
Are you giving nVidia the benefit of the doubt and reassuring me that a non-perfect Fermi is impossible?
I'm not particularly concerned about ultra-high-end Fermi products, but more worried about the lack of anything other than Fermi on the roadmaps. What's gonna replace the 8800GT->GTS250?
I wouldn't be surprised in the least. G92 is probably cheap as dirt now and still matches up with the HD 57xx performance-wise. It'll be a tough sell without even DX10.1 support, but since there's no word of anything faster than GT216 on the horizon, they probably have no choice but to rebrand (or just keep selling) old reliable.
uh trinibwoy, g92 isn't even really close to the 57xx series
that's like saying the 8800 ultra = 4870, which is just a blatant lie
Fermi better be the second coming of the 8800 or I will be giving up on nvidia.
even if fermi sucks for gaming, so what? it'll only take nvidia 6-12 months to get back on its feet, and they have a big enough gobbly fat belly to make it through that time easily without going bankrupt :D
i really hope fermi will deliver... i don't even want to think about what will happen if they don't...
ati prices will get out of control, nvidia will go crazy pr-wise, trying to make up for bad performance with more pr, they will go for lots of perf tweaks again, be even more aggressive in trying to lock ati out of games they support during development... it would get real ugly...
it matches the 5750, but it can't match the 5770, and it can't match the 5750's and 5770's power consumption, OC potential, features, etc...
Maybe that's why even the most expensive PCs at the time couldn't run Crysis at a stable 60+ FPS on enthusiast settings. Have you heard the phrase "Can it run Crysis?" Wouldn't you like to be able to run this game at 100 FPS with all the eye candy enabled by only adding a $60 secondary card?
Look, I totally understand all this "nVidia is evil" deal and the frustration caused by their business practices, but the bottom line is, they invented the technology and they can use it however they want. You don't like how they market their technology? The solution is simple: just don't purchase their cards or the games that only support their technology. Let your wallet do the talking. Even this thread, six pages of nonsense speculation over a stupid picture on Facebook, is nothing but fuel for the nVidia hype machine. Besides, AMD has DirectX 11 cards, and some games will show more features on ATI cards and not on nVidia cards (like tessellation). Just buy a 5870 and enjoy your games.
Nvidia invented PhysX? ;)
Well, I think the PPU cards floating about out there are part of the gripe; people purchased them and used them with their ATI GPUs for years, and now they can't. The ATI GPU + Ageia PPU combination must have been the QA and support nightmare that drove Ageia to auction itself off ;)