
Thread: nVidia plays dirty "again"...

  1. #76
    Xtreme Member
    Join Date
    Sep 2009
    Location
    Czech Republic, 50°4'52.22"N, 14°23'30.45"E
    Posts
    474
    Wholly true. But for me personally, NVIDIA is not an option when I see how they screw people. Yet AMD is losing more and more ground with me too, because they are beginning to screw me as well (good features don't work anymore with new drivers, and new features that would be possible don't get implemented because AMD is trying to push people into buying new cards, etc.). The way AMD has behaved these last months is just making another NVIDIA out of 'em.

    So what's the option now?!

  2. #77
    Xtreme Mentor
    Join Date
    Jun 2008
    Location
    France - Bx
    Posts
    2,601
    Quote Originally Posted by SKYMTL View Post
    Seems ATI says a lot but does nothing.

    Bullet Physics? Supported by NVIDIA, debugged on NVIDIA hardware, presented some great OpenCL stuff at the GTC. Yet ATI used them in their PR slides.


    Pixelux? Not a full physics library by a long shot. Once again a talking point by ATI but not directly supported by them.


    Havok? Don't hold your breath.


    I'm in the same boat as everyone else here when I say that ATI needs to stop talking smack and start supporting ALTERNATIVES to PhysX. Their entire game-oriented PR strategy these days seems to be centered around bashing PhysX without offering anything in return.

    The same goes for supporting AA in the Unreal Engine games. NVIDIA has the opportunity to do it and they do...by spending countless man hours as well. Then AMD whines about it while claiming they're locked out.

    As a gamer first and foremost, I don't care about closed standards. All I care about is the advancement of the PC gaming experience. If AMD / ATI spent as much time working with developers as they do complaining, we'd have some truly unique games and great performance across ALL GPUs to boot. Instead we get what happens again and again, most recently the debacle with The Saboteur.

    Considering the BS I get in the way of email saying "you should publish this" from certain parties, this crap gets real old very fast.


    It's good to read a fair opinion, yeah, really...

  3. #78
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by Behemot View Post
    Wholly true. But for me personally, NVIDIA is not an option when I see how they screw people. Yet AMD is losing more and more ground with me too, because they are beginning to screw me as well (good features don't work anymore with new drivers, and new features that would be possible don't get implemented because AMD is trying to push people into buying new cards, etc.). The way AMD has behaved these last months is just making another NVIDIA out of 'em.

    So what's the option now?!

    540gtx
    http://www.s3graphics.com/en/product...x?productId=19
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  4. #79
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Ottawa, Canada
    Posts
    2,443
    Quote Originally Posted by zanzabar View Post
    HAHAHA, even they have DX 10.1 support! Might actually be an alternative if this crap keeps going the way it is. It's pathetic!

  5. #80
    Registered User
    Join Date
    Jan 2007
    Posts
    94
    Isn't it far more likely that developers making PhysX-enabled games just aren't bothering to implement a multi-threaded CPU path anymore? Why would they? Multi-threaded software development is relatively expensive, and it will still have inferior performance to the GPU-accelerated mode.

    If the GPU acceleration isn't important to you as a developer, wouldn't you probably just be going with Havok?

    As an aside, software-based physics does offer an inferior experience to PhysX at this point in time. Pretty much every Havok-based game I've played with has pretty jarring glitches due to the lack of physics precision necessary to make it playable on a cpu. Most of us have seen corpses occasionally launching 300 feet in the air in Oblivion and Fallout 3.

    Games that use Havok physics for core gameplay are even worse. Red Faction: Guerrilla is probably the most aggressive implementation of Havok as it has fully deformable structures. While this is a completely awesome concept, the player is constantly presented with scenarios where you have entire multi-story buildings that still stand when only a tiny sliver of a single wall remains intact.
    i7 920 @ 4.2Ghz
    Asus P6T6 Revolution
    3x GTX260
    6x2GB Corsair DDR3-1600
    G.Skill Falcon 128GB SSD
    G.SKill Titan 128GB SSD
    Segate 7200.11 1.5TB
    Vista 64 Ultimate

  6. #81
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    PhysX games are almost all TWIMTBP; I can't actually think of any that aren't (I guess there are a few). But with things like Borderlands having ATI optimization disabled in a config, I assume that NV does not push for multi-threaded support and would even say not to do it.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  7. #82
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by zanzabar View Post
    PhysX games are almost all TWIMTBP; I can't actually think of any that aren't (I guess there are a few). But with things like Borderlands having ATI optimization disabled in a config, I assume that NV does not push for multi-threaded support and would even say not to do it.
    I've gone through the Borderlands config files with a fine-toothed comb over the last month and I don't recall seeing anything that would be considered an "ATI optimization" that was disabled.

    Maybe you can enlighten us?

  8. #83
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    From the WillowEngine config:

    [Engine.ISVHacks]
    DisableATITextureFilterOptimizationChecks=True
    UseMinimalNVIDIADriverShaderOptimization=True

    Changing them to False with a 4890 I got a good 20-30% jump, and that was after the huge jump from disabling smooth frames, changing the max refresh to 100 and upping the max FPS to 101.
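
    So the section ends up looking like this (just those two flags flipped to False; I haven't dug into what the engine actually does differently with them off, this is simply how I set mine):

    [Engine.ISVHacks]
    DisableATITextureFilterOptimizationChecks=False
    UseMinimalNVIDIADriverShaderOptimization=False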
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi

  9. #84
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Vancouver,British Columbia, Canada
    Posts
    1,178
    Quote Originally Posted by SKYMTL View Post



    DiRT 2 does not use Havok. DR does not use GPU accelerated physics through Havok.

    That sure looks like a Havok logo here:
    Attached Images


    World Community Grid's mission is to create the world's largest public computing grid to tackle projects that benefit humanity.
    Our success depends upon individuals collectively contributing their unused computer time to change the world for the better.

  10. #85
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by aka1nas View Post
    Isn't it far more likely that developers making PhysX-enabled games just aren't bothering to implement a multi-threaded CPU path anymore? Why would they? Multi-threaded software development is relatively expensive, and it will still have inferior performance to the GPU-accelerated mode.

    If the GPU acceleration isn't important to you as a developer, wouldn't you probably just be going with Havok?

    As an aside, software-based physics does offer an inferior experience to PhysX at this point in time. Pretty much every Havok-based game I've played with has pretty jarring glitches due to the lack of physics precision necessary to make it playable on a cpu. Most of us have seen corpses occasionally launching 300 feet in the air in Oblivion and Fallout 3.

    Games that use Havok physics for core gameplay are even worse. Red Faction: Guerrilla is probably the most aggressive implementation of Havok as it has fully deformable structures. While this is a completely awesome concept, the player is constantly presented with scenarios where you have entire multi-story buildings that still stand when only a tiny sliver of a single wall remains intact.
    60fps is inferior performance to 240fps, but that does not make 60fps unplayable or not worthwhile.
    It's all about being sufficient to get the job done; it doesn't always have to be the superior option or nothing at all.
    Last edited by Final8ty; 01-21-2010 at 09:53 PM.

  11. #86
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by aka1nas View Post
    Isn't it far more likely that developers making PhysX-enabled games just aren't bothering to implement a multi-threaded CPU path anymore? Why would they? Multi-threaded software development is relatively expensive, and it will still have inferior performance to the GPU-accelerated mode.
    Yep, I'm wondering when the anti-PhysX brigade will realize that Havok games aren't using more than one or two cores either.

    Games that use Havok physics for core gameplay are even worse. Red Faction: Guerrilla is probably the most aggressive implementation of Havok as it has fully deformable structures. While this is a completely awesome concept, the player is constantly presented with scenarios where you have entire multi-story buildings that still stand when only a tiny sliver of a single wall remains intact.
    Red Faction just has a lot of pre-modelled breakable structures. If they were really simulating physics, then hitting a building with a sledgehammer wouldn't cause it to crumble, and it wouldn't be able to stand without a foundation. It looks great and is a lot of fun, but it isn't a physics simulation.

  12. #87
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by SKYMTL View Post
    Old news. However, it is up to the DEVELOPER to enable multiple cores in PhysX according to the SDK.

    Batman: AA doesn't do this but Dark Void does...

    However, what NVIDIA hasn't answered is why PhysX works so well on consoles but poorly on the PC.
    Yeah, that's exactly the way I thought it worked (it's been some time since I last checked the PhysX library), and that's the usual way to do it when a physics library offers multi-core support. It's surprising, though, to find game developers choosing not to use multithreading for physics calculations nowadays when it's offered. Even more surprising when the same developers are marketing a running mode as "massively multi-threaded" (which is what CUDA is, in the end). I wonder why they do it...

    Quote Originally Posted by Dami3n View Post
    GPU maker investing in games to sell more gpus? where is the problem?
    AMD could learn something of them and start to invest in games and give us better drivers and crossfire support.
    Wow, that seems to have become a great mantra for some people... but I don't see how it relates to the topic. Maybe you are suggesting that the reason developers chose not to use multithreading with PhysX is because NVIDIA is "investing" in them in order to do so? And that that is a good way to invest money? No, I definitely think I'm not following your reasoning.

    Quote Originally Posted by SKYMTL View Post
    Bullet Physics? Supported by NVIDIA, debugged on NVIDIA hardware, presented some great OpenCL stuff at the GTC. Yet ATI used them in their PR slides.
    First: Bullet Physics is supported mainly by neither NVIDIA nor AMD, but by Sony. Second: NVIDIA does support Bullet, and AMD does support Bullet. You say no; Erwin Coumans (the main developer behind the library) says yes. I prefer to believe what Erwin says. Not that I don't trust you, but Erwin is the one who writes the code, after all.

    I think you're saying that because Bullet Physics currently supports CUDA, and only supports OpenCL through a provisional, in-development, not-very-stable patch right now. But keep in mind that CUDA has been around for much longer than OpenCL. Even so, Erwin says that there is a working (just not final) OpenCL multithreading module for Bullet Physics, and that it's planned to be merged into the main Bullet branch for the 3.0 version (in development), among some other changes. And yes, he has said several times that the project is being supported by AMD.

    Quote Originally Posted by Luka_Aveiro View Post
    Business is business.

    If I bought a product to improve the quality of my own product, why should I give that product to my competitor's and improve the quality of their own products?

    Does this make any business sense? No.

    So, nVidia handles PhysX the way they want, because it's their product, not everyone's product.

    Anyway, I thought PhysX was just gimmicks and stuff, is it that important, really?
    I agree with most of your comment. It's logical for NVIDIA to do with their own software whatever is best for their own interests. It's also logical that the consumer then wants to weigh what they get with that product against its downsides, to see whether it's good for their own interests, and buy it or not.

    But I want to comment on your last sentence about PhysX and why many people consider it gimmicky. The point is not that PhysX is just gimmicks in itself.

    PhysX is nothing more than a fairly decent real-time physics simulation library focused on game development, like many others (including some better libraries, like Havok or Bullet). The thing that makes PhysX special compared to other libraries is its support for GPGPU processing, which is a great and interesting feature because it should allow more complex calculations for more complex simulations (leaving aside the question of how sensible it is to move workload from the CPU to the GPU in a type of software that is, at least currently, almost completely GPU-bottlenecked).

    The trap with the GPGPU support that PhysX offers is that it's done through CUDA, a proprietary GPGPU API for which only hardware from a single vendor (NVIDIA) is compatible. That means there are 2 options:

    1) Leaving out a lot of potential customers by making the software compatible only with NVIDIA hardware (mmm, that doesn't sound very good for the game developer, from my POV).

    2) Creating optional content that is only displayed on compatible hardware. If you do so, you will realise that you can't do anything that affects gameplay, nothing that is more than an ornamental add-on. Furthermore, every single bit of time or resources you invest in that content could be invested in other things that would be attractive to users of every hardware vendor, so... most developers using the GPGPU capabilities of PhysX are going to do the bare minimum needed to claim the feature and put a PhysX sticker on the box, or in other words, gimmicks and stuff. Even Mirror's Edge and Batman: AA, the 2 main PhysX games, have to be made compatible with non-NVIDIA graphics hardware on the PC, and what's more, their main market is not the PC but consoles, and no console platform supports CUDA. How could it be otherwise, then?

    The question here is not the importance of PhysX, but the importance of NVIDIA (supposedly) purposely slowing down their library when it runs on a CPU (by not using its multithreading capabilities). Which is not the real question IMO, since I believe, as SKYMTL pointed out earlier, NVIDIA offers the multithreading option with the library. The astonishing question (IMO again) is why the hell a developer would choose to run it single-threaded when a better option is available. And why the hell they do it.
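
    Just to illustrate what "the developer chooses" means in practice, here is roughly how it looks with the 2.8-era SDK as I remember it (identifiers from memory, so check the SDK docs rather than trusting me; the only point is that the worker thread count is an explicit choice the game makes when creating the scene, and the default is effectively single-threaded):

    // Rough sketch, 2.8-era PhysX SDK; names from memory, not guaranteed exact
    #include "NxPhysics.h"

    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);

    NxSceneDesc sceneDesc;
    sceneDesc.gravity             = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType             = NX_SIMULATION_SW;   // software (CPU) simulation
    sceneDesc.internalThreadCount = 3;                   // extra worker threads; leave the default
                                                         // and everything runs on the calling thread
    sceneDesc.flags              |= NX_SF_ENABLE_MULTITHREAD; // IIRC this flag is needed too
    NxScene* scene = sdk->createScene(sceneDesc);

    // Per frame; the application decides which thread drives this and what else runs in parallel
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);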

    EDIT: I have reordered the quotes to leave the longest response at the end.
    Last edited by Farinorco; 01-22-2010 at 05:18 AM.

  13. #88
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by SocketMan View Post
    That sure looks like a Havok logo here:
    That's in Dragon Rising, and it's accelerated on the CPU, not the GPU.

  14. #89
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by zanzabar View Post
    From the WillowEngine config:

    [Engine.ISVHacks]
    DisableATITextureFilterOptimizationChecks=True
    UseMinimalNVIDIADriverShaderOptimization=True

    Changing them to False with a 4890 I got a good 20-30% jump, and that was after the huge jump from disabling smooth frames, changing the max refresh to 100 and upping the max FPS to 101.
    Thanks for that. Definitely food for thought for future reviews and whatnot.


    Quote Originally Posted by Farinorco View Post
    First: Bullet Physics is supported mainly by neither NVIDIA nor AMD, but by Sony. Second: NVIDIA does support Bullet, and AMD does support Bullet. You say no; Erwin Coumans (the main developer behind the library) says yes. I prefer to believe what Erwin says. Not that I don't trust you, but Erwin is the one who writes the code, after all.
    Things may have changed since the last time I talked to Erwin, then. Back then NVIDIA was giving him direct support in terms of hardware and error checking. On the other hand, AMD's support was passive at best, and Erwin had to buy ATI-based cards out of pocket. This was right before the HD 5870 launch, so it is possible that AMD started more active support right after they decided to trumpet Bullet in their PR slides.

  15. #90
    Xtreme Member
    Join Date
    Jul 2007
    Location
    Inside a floppy drive
    Posts
    366
    Quote Originally Posted by Farinorco View Post
    Wow, that seems to have become a great mantra for some people... but I don't see how it relates to the topic. Maybe you are suggesting that the reason developers chose not to use multithreading with PhysX is because NVIDIA is "investing" in them in order to do so? And that that is a good way to invest money? No, I definitely think I'm not following your reasoning.
    The topic is about AMD whining about how NVIDIA promotes the use of its PhysX engine, so my point fits perfectly in the topic.
    Since NVIDIA is a GPU maker and the owner of the PhysX engine, it's obvious that they will invest their money into making GPUs more important than CPUs in games in order to gain sales. You and I would do the same if we were in NVIDIA's shoes, because it's a business.

    Whether this is good or bad for PC gamers is another debate.

  16. #91
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Farinorco View Post
    PhysX is nothing more than a fairly decent real-time physics simulation library focused on game development, like many others (including some better libraries, like Havok or Bullet).
    I'm sorry, but I don't know which deep dark hole you pulled that gem out of. You should try browsing some game developer forums to see what the people who use these libraries actually think of them. Bullet is better than PhysX? lolz...

  17. #92
    Xtreme Member
    Join Date
    Jul 2009
    Location
    Madrid (Spain)
    Posts
    352
    Quote Originally Posted by SKYMTL View Post
    Things may have changed since the last time I talked to Erwin, then. Back then NVIDIA was giving him direct support in terms of hardware and error checking. On the other hand, AMD's support was passive at best, and Erwin had to buy ATI-based cards out of pocket. This was right before the HD 5870 launch, so it is possible that AMD started more active support right after they decided to trumpet Bullet in their PR slides.
    Well, I don't have any kind of personal relationship with Erwin (aside from great respect and great gratitude for giving us such an awesome library, not only for free but open source), so if you actually get to talk to him, you may well have more first-hand information than I do.

    What I say is based on what he writes in forums and the like; for example:

    Quote Originally Posted by Erwin Coumans
    AMD indeed offers support in accelerating the OpenCL version of Bullet. Aside from AMD, we are also involved in several collaborations with professional game and movie companies and hardware vendors, including Sony, Intel and NVidia.

    The OpenCL acceleration, planned for Bullet 3.x release is cross-platform and works with all compliant OpenCL implementations. We make sure our implementation is optimized and compatible with NVidia Geforce, ATI Radeon and Intel Larrabee GPUs, Intel and AMD CPU and Apple Snow Leopard.
    There will be also continued support for PlayStation 3 Cell SPU acceleration, and possibly other platform-specific optimizations.
    Thanks,
    Erwin
    from here (I don't think it's a bad thing to link to this forum from here, since it's a forum for developers using Bullet or physics simulation in general, but if any moderator thinks it's a problem, I'll remove the link).

    I believe that, as you say, AMD has only begun supporting this project since that "OpenCL in physics" support program has been running and advertised, not before. But AMD is advertising that support because they are actually giving it, AFAIK.

    Quote Originally Posted by Dami3n View Post
    The topic is about AMD whining about how NVIDIA promotes the use of its PhysX engine, so my point fits perfectly in the topic.
    Since NVIDIA is a GPU maker and the owner of the PhysX engine, it's obvious that they will invest their money into making GPUs more important than CPUs in games in order to gain sales. You and I would do the same if we were in NVIDIA's shoes, because it's a business.

    Whether this is good or bad for PC gamers is another debate.
    But if NV has the right to do whatever they want with their own products, because they are their own products, I would say that AMD has the right to whine all they want about a competitor's products, because they are a competitor's products.

    Obviously, if NV spots a way of making more money with their products (be it good or bad for PC gamers), they are going to take it. And obviously, if AMD spots a way to counter this by making people see why (or how) it's a bad thing for them, they are going to take it as well...

    EDIT:
    Quote Originally Posted by trinibwoy View Post
    I'm sorry, but I don't know which deep dark hole you pulled that gem out of. You should try browsing some game developer forums to see what the people who use these libraries actually think of them. Bullet is better than PhysX? lolz...
    I do browse and use some game developer forums, like the OGRE 3D rendering engine forums and GameDev. Indeed, I've done a lot of research on those engines to choose one to use myself, and I've downloaded both the PhysX and the Bullet libraries and experimented a little with both (but very little; I wouldn't give any personal opinion about them at this point). Don't make assumptions about people so quickly, and even less laugh at what they are saying.
    Last edited by Farinorco; 01-22-2010 at 07:30 AM.

  18. #93
    Xtreme Member
    Join Date
    Jul 2007
    Location
    Inside a floppy drive
    Posts
    366
    Quote Originally Posted by Farinorco View Post
    Obviously, if NV spots a way of making more money with their products (be it good or bad for PC gamers), they are going to take it. And obviously, if AMD spots a way to counter this by making people see why (or how) it's a bad thing for them, they are going to take it as well...
    Yes, but the first action is totally justified and the second is very childish and doesn't make any sense at all.
    How many times will AMD whine about PhysX before they do something serious about it? Their policy of whining is becoming ridiculous.

  19. #94
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Farinorco View Post
    I do browse and use some game developer forums, like the OGRE 3D rendering engine forums and GameDev. Indeed, I've done a lot of research on those engines to choose one to use myself, and I've downloaded both the PhysX and the Bullet libraries and experimented a little with both (but very little; I wouldn't give any personal opinion about them at this point). Don't make assumptions about people so quickly, and even less laugh at what they are saying.
    I'm not sure what my post has to do with assumptions about you as a person. You stated something as fact which is completely false. Bullet has neither the coverage, the performance, nor the toolset that PhysX provides (and I'm not even talking about the GPU bits).

  20. #95
    Xtreme Enthusiast
    Join Date
    Jan 2008
    Posts
    637
    Quote Originally Posted by Dami3n View Post
    Yes, but the first action is totally justified and the second is very childish and doesn't make any sense at all.
    How many times will AMD whine about PhysX before they do something serious about it? Their policy of whining is becoming ridiculous.
    What do you want them to do? Buy Havok and do the same thing NVIDIA does with PhysX, thus doing exactly what they are denouncing?

    Physics on the GPU is a nice advantage, but not when half the user base gets a castrated version when it runs on the CPU.

    If I need to run PhysX on my CPU, I expect it to work as efficiently as possible, without imposed limitations (one thread).

  21. #96
    Xtreme Member
    Join Date
    Jul 2007
    Location
    Inside a floppy drive
    Posts
    366
    Quote Originally Posted by Pontos View Post
    What do you want them to do? Buy Havok and do the same thing NVIDIA does with PhysX, thus doing exactly what they are denouncing?

    Physics on the GPU is a nice advantage, but not when half the user base gets a castrated version when it runs on the CPU.

    If I need to run PhysX on my CPU, I expect it to work as efficiently as possible, without imposed limitations (one thread).
    Intel owns Havok, so it's difficult for AMD to buy it.
    But anyway, stopping the whining and giving us better drivers and CrossFire support would be enough to start. And they can also promote an alternative, open way of doing physics (not only showing it on paper).

  22. #97
    Xtreme Addict
    Join Date
    Jul 2005
    Posts
    1,646
    Quote Originally Posted by Dami3n View Post
    Intel owns Havok, so it's difficult for AMD to buy it.
    But anyway, stopping the whining and giving us better drivers and CrossFire support would be enough to start. And they can also promote an alternative, open way of doing physics (not only showing it on paper).
    Aren't they constantly improving drivers and crossfire support already? Seems like a silly and broad request.

  23. #98
    I am Xtreme
    Join Date
    Feb 2007
    Posts
    5,413
    Do you know how I know this thread is retarded? The guy who actually KNOWS what he is talking about speaks up.
    http://blogs.nvidia.com/ntersect/
    PhysX, the Preferred Solution for all Platforms
    By Nadeem Mohammad, posted Jan 20 2010 at 05:39:33 PM

    Recently, an interview ran with an AMD developer relations manager, who claimed that NVIDIA (after acquiring Ageia) had purposely reduced the performance and scalability of NVIDIA PhysX technology, with regards to CPU core utilization.

    I have been a member of the PhysX team, first with AGEIA, and then with NVIDIA, and I can honestly say that since the merger with NVIDIA there have been no changes to the SDK code which purposely reduce the software performance of PhysX or its use of CPU multi-cores.

    Our PhysX SDK API is designed such that thread control is done explicitly by the application developer, not by the SDK functions themselves. One of the best examples is 3DMarkVantage which can use 12 threads while running in software-only PhysX. This can easily be tested by anyone with a multi-core CPU system and a PhysX-capable GeForce GPU. This level of multi-core support and programming methodology has not changed since day one. And to anticipate another ridiculous claim, it would be nonsense to say we “tuned” PhysX multi-core support for this case.

    PhysX is a cross platform solution. Our SDKs and tools are available for the Wii, PS3, Xbox 360, the PC and even the iPhone through one of our partners. We continue to invest substantial resources into improving PhysX support on ALL platforms--not just for those supporting GPU acceleration.

    As is par for the course, this is yet another completely unsubstantiated accusation made by an employee of one of our competitors. I am writing here to address it directly and call it for what it is, completely false. NVIDIA PhysX fully supports multi-core CPUs and multithreaded applications, period. Our developer tools allow developers to design their use of PhysX in PC games to take full advantage of multi-core CPUs and to fully use the multithreaded capabilities.

    There is a lot more I could say on this topic; however, I really have to get back to my day job, which is working to help make gaming great for all users! And today that includes cracking open a new copy of Dark Void, the latest PhysX title, which incorporates some awesome particle weapon effects, an insane Disintegrator gun with fluid particles and a jetpack with physical smoke turbulence. I know, hard work, right? But someone has to do it!

    Happy 3D gaming!
    "Thing is, I no longer consider you a member but, rather a parasite...one that should be expunged."

  24. #99
    Xtreme Addict
    Join Date
    Jul 2005
    Posts
    1,646
    That sums it up then, the AMD fella got it wrong.

  25. #100
    I am Xtreme
    Join Date
    Feb 2007
    Posts
    5,413
    Quote Originally Posted by MrMojoZ View Post
    That sums it up then, the AMD fella got it wrong.
    And it shows the NVIDIA-hating <<insert description here>> got it wrong as well. Anyone who jumps on the anti-NVIDIA or anti-AMD bandwagon just to be a hater is an <<insert description here>>. Holding their feet to the fire is good, as both companies should be held accountable to consumers. If you want to be a fanboi and talk GOOD about your company, that is one thing, but when you have to talk crap about your less-preferred company, that is crossing the line. The bottom line is these are just video cards... none of us is making a living from them. They will not cure cancer or make your wife have a DD cup size, so please chill out, people. It is just a piece of hardware you will upgrade in a year or so anyway. Get off your fat arses and go jogging or hit the gym. It will relieve some of that stress (not pointing at you, Mojo, in particular).
    Last edited by DarthBeavis; 01-22-2010 at 08:41 AM.
    "Thing is, I no longer consider you a member but, rather a parasite...one that should be expunged."
