Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 / EVGA GTX 295 C=650 S=1512 M=1188 (Graphics) / EVGA GTX 280 C=756 S=1512 M=1296 (PhysX) / G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptors RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and Blu-ray Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)
Take it to PM you two
Anandtech's poll is VERY interesting:
And what is surprising is that the majority of those who responded were in fact Nvidia users:
Perkam
You're telling me to take it to PM, and then posting a poor PhysX poll?
What PhysX games were they playing when that valuable poll was made?
"Before we get to the questions, last week saw the announcement of several upcoming titles that will support PhysX:
Terminator Salvation
Dark Void
Darkest of Days
U-Wars"
At least it shows that modern PhysX games are still being made...
I have owned either a GTX 260 or a GTX 280 since Nvidia started with this PhysX crap and have never played a game that uses PhysX, and I don't have an urge to pick up any game that does. Mirror's Edge sounds like a bore fest and Cryostasis doesn't sound much better. All PhysX did was bug up Nvidia's drivers for MONTHS. PhysX is a joke that will NEVER take off. I'll pick up one of those stupid Killer NIC cards before I buy a dedicated PhysX card. Ambient occlusion is a bigger deal to me than PhysX, and I have only used it once, but it was pretty cool.
I don't respond to posts like that m8. You should know better.
@ babbaloey, I agree. I believe the next step in the evolution of gaming is not in the hands of the hardware makers, but in the hands of the software developers. Crysis is merely one small step in the history books. The only game that might show new graphical breakthroughs on the PC in 2009 would probably be Operation Flashpoint: Dragon Rising, if what I've seen from the previews ends up in the final game:
http://www.gametrailers.com/video/we...ashpoint/48947
Perkam
Please look again; that farking GTX 260 will beat even the GTX 275 due to its clock speed.
http://www.hardforum.com/showpost.ph...86&postcount=7
Both the 4870 and GTX 260 are overclocked. When we set up this review initially, prices were different; for one, the GTX 260 wasn't as cheap. One thing we wanted to know was how an OC 4870 would compare to a 4890, since the only difference is clock speed. Obviously the 4890 will far surpass the 4870 if overclocked itself to 1GHz+, but at stock frequencies we wanted to show that comparison. The ASUS 4890 we used in this evaluation runs AMD stock frequencies; most sites used the OC SKU 4890, which has higher clocks.
This was pretty much our last look at the 4870 and GTX 260 cards; we've done them to death, and we'll be moving forward with 4890, GTX 275 and 4770 evaluations from here. We will be taking a look at an XFX 4890 and a highly overclockable ASUS 4890 that we hope will show big returns when overclocked past 1GHz.
Posting results from [H] throws credibility out the window.
I personally consider PhysX GPU acceleration and DX10.1 gimmicks, and anybody who puts too much weight on either of them is a FANBOY themselves. I only take notice of hardware-accelerated physics solutions that work on ALL IHVs (including Intel) and DX APIs that are supported by all players.
To a certain extent, I beg to differ about these being gimmicks.
While you will find me to be a VERY large opponent of the current crop of PhysX games, I find that the overall goal of PhysX and other physics APIs has merit. Why? Because at this point the majority of games out there will continue to be co-developed for multiple platforms. This means that the graphics themselves are usually designed to perform well on the current crop of consoles, and (even with graphical tweaks) a modern PC will have no issue handling them. Sure, there are some exceptions, but they seem to be few and far between these days. Case in point: check out the first 10 pages of PC trailers on Gametrailers.com; it's brutal other than Diablo III.
What I am trying to say is that since the next generation of consoles isn't due out until 2011/2012, the graphics quality of many future PC games could very well be based on consoles that are 5-6 years old. A 2012 release date could mean graphical stagnation for another 3 years. Both Nvidia and ATI saw this long ago and have worked pretty hard to make sure their GPUs would be more than graphics-crunching machines. Hence why you see Nvidia pimping CUDA so much and, to a lesser extent, ATI pushing its Stream GPGPU technology.
So where does that leave PC gamers? We have a technological advantage over the current crop of consoles, yet many games won't be using our systems to the fullest of their capabilities. Yes, I know DX11 is coming, but the number of games that will actually use it has yet to be determined. Without many games being on the cutting edge graphics-wise for the next while, we PC gamers will need something else to take advantage of our expensive hardware. What better way is there to do this than to add realism not through cutting-edge graphics but through physics? At this point it hasn't been done well, but when you look at games like the new Ghostbusters or even an older title like Company of Heroes, it is obvious that the potential is there. Whether the future is PhysX, Havok or whatever Intel is developing remains to be seen, but I am personally pretty excited for better physics in my games.
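To give a rough idea of the kind of work that gets offloaded, a game physics step is mostly the same small update applied to thousands of independent objects, which is exactly the sort of thing a GPU chews through in parallel. Here's a toy sketch in plain Python (nothing to do with the real PhysX API, just the general idea):

```python
import random

GRAVITY = -9.81
DT = 1.0 / 60.0  # one frame at 60 FPS

# Thousands of debris particles, e.g. from an explosion (made-up setup for illustration).
particles = [{"pos": [random.uniform(-1, 1), 5.0, random.uniform(-1, 1)],
              "vel": [random.uniform(-3, 3), random.uniform(0, 5), random.uniform(-3, 3)]}
             for _ in range(10_000)]

def step(particles, dt):
    """One Euler integration step. Every particle is independent of the others,
    so a GPU can update them all in parallel instead of looping on the CPU."""
    for p in particles:
        p["vel"][1] += GRAVITY * dt          # apply gravity to the vertical velocity
        for i in range(3):
            p["pos"][i] += p["vel"][i] * dt  # move the particle
        if p["pos"][1] < 0.0:                # crude ground bounce
            p["pos"][1] = 0.0
            p["vel"][1] *= -0.4

step(particles, DT)
```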
As for DX10.1, I think many people are missing the point. While there are some very minor graphical tweaks and additions, the main selling point of DX10.1 is its rendering efficiency, particularly when AA is enabled. Anyone who has played HawX with an ATI card in Vista will know what I am talking about. There is barely any impact on framerates when going from 0xAA to 4xAA, and that alone makes a huge difference to the overall look and feel of a game.
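To put rough numbers on that efficiency argument: as I understand it, one commonly cited reason for the small AA hit is that DX10.1 lets shaders read the multisampled depth buffer directly, saving an extra render/resolve pass. The millisecond figures below are invented, purely to show how a saved pass translates into FPS:

```python
# Hypothetical frame-time budget; all values are illustrative, not measured.
base_frame_ms = 16.0          # frame time with 0xAA -> 62.5 FPS

# Typical DX10 path: the multisampled depth buffer can't be read directly,
# so the engine spends extra time re-rendering or resolving it.
dx10_extra_pass_ms = 5.0
# DX10.1 path: the shader reads the multisampled buffer directly,
# so most of that extra cost goes away.
dx10_1_extra_pass_ms = 0.5

def fps(frame_ms):
    return 1000.0 / frame_ms

print(f"0xAA:          {fps(base_frame_ms):.1f} FPS")
print(f"4xAA (DX10):   {fps(base_frame_ms + dx10_extra_pass_ms):.1f} FPS")
print(f"4xAA (DX10.1): {fps(base_frame_ms + dx10_1_extra_pass_ms):.1f} FPS")
```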
Whoa SKY, that's one long reply that deserves further comment and clarification from me. BTW, I kinda miss your posts and PSU reviews on the jG forum. To clarify, I don't dislike PhysX acceleration on the GPU, but I do hate a proprietary solution that's monopolized by one IHV rather than a standard followed by the industry as a whole. Regarding DX10.1, all the advantages of this feature become moot with limited support from developers. I know that's not really ATI's fault, but at least for now, before this API becomes part of DX11 and is supported by all IHVs, it's just a gimmicky selling point for ATI's products.
I'm going to disagree with that point; there is really no need for a new console until there is new technology to put in it. The jump to maxed-out details at 1080p with just some new cool effects will not, I believe, be nearly enough. If we have 2160p TVs in 2012 that use a new cable type, then I could see a desire for a new gaming console.
I guess it all depends on perspective. For someone who loves the Call of Duty series or plays Left 4 Dead like it was going out of style, it could sure be considered a gimmick. However, games like Assassin's Creed and HawX have proved that the technology actually works and people playing those games can benefit.
You are completely right though; the number of games with DX10.1 is absolutely pathetic and many that have been released lately are pure crap. Case in point: Stormrise and Battle Forge.
I was basing that statement on rumors circulating and reports from JPR. It wasn't based in fact and I completely agree with you. However, my point is still the same: more and more PC games are nothing more than ports of console games. Without a major advancement in console technology, many future PC games won't stress even today's mid-range PCs (unless they are poorly coded). Therefore, Nvidia, ATI and soon Intel will need some other selling points (physics, GPGPU, etc.) for their upcoming technologies.
100% agreed. Current consoles hold back PCs within 2 years of their release (and the Wii looks like crap on any HD TV), and it seems to be getting worse with each new console. The PS2 seemed to hold its own for 3-4 years, and before then I didn't know what a gaming computer was like.
I think they should really work on a scalable console. Imagine if you could take your Xbox or PS3 and upgrade the internals with parts that produce less heat, or pack in a few more cards. With the way the computer and the console are merging closer and closer, it's quite possible one will be obsolete by the time we see the next gens come out.
Sky, Thanks for your thoughts on the matter...
I found it to be a good read.
ATI Radeon HD 4890 Roundup (ASUS, Diamond, HIS, Sapphire, XFX)
http://www.hardwarecanucks.com/forum...phire-xfx.html
Overclocking Extravaganza: GTX 275's Complex Characteristics
http://anandtech.com/video/showdoc.aspx?i=3575
After our in-depth look at overclocking with AMD's Radeon HD 4890, many of our readers wanted to see the same thing done with NVIDIA's GTX 275. We had planned on looking at both parts from the beginning, but we knew each review would take a bit of time and effort to design and put together. Our goal has been to try and design tests that would best show the particular overclocking characteristics of the different hardware, and shoehorning all that into one review would be difficult. Different approaches are needed to evaluate overclocking with AMD and NVIDIA hardware.
For our AMD tests, we only needed to worry about memory and core clock speed. This gave us some freedom to look at clock scaling in order to better understand the hardware. On the other hand, NVIDIA divides their GPU up a bit more and has another, higher speed, clock domain for shader hardware. Throwing another variable in there has a multiplicative impact on our testing, and we had a hard time deciding what tests really mattered. If we had simply used the same approach we did with the 4890 article, we would have ended up with way too much data to easily present or meaningfully analyze...
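To see why that extra shader clock domain blows up the test matrix, here's a quick illustration (the clock steps below are made-up values, not the ones Anandtech actually tested):

```python
from itertools import product

# Hypothetical clock steps in MHz, purely for illustration.
core_clocks   = [633, 670, 700, 730]        # core domain
shader_clocks = [1404, 1458, 1512, 1566]    # separate shader domain (NVIDIA only)
memory_clocks = [1134, 1200, 1260]          # memory domain

# AMD-style matrix: core x memory only.
amd_style = list(product(core_clocks, memory_clocks))

# NVIDIA-style matrix: core x shader x memory.
nvidia_style = list(product(core_clocks, shader_clocks, memory_clocks))

print(len(amd_style))     # 12 combinations to benchmark
print(len(nvidia_style))  # 48 combinations; each extra domain multiplies the run count
```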
Far Cry 2 DX9 1920 4xAA
Far Cry 2 DX10 1920 4xAA
And here I thought FPS would be higher on DX9. Wow 0_0
Also, what is with the FPS hit from AA on the Nvidia cards? Some of them lose close to 30 FPS when 4xAA is enabled.
Perkam
AA is small potatoes...
PhysX will add more realism to the gaming experience than anti-aliasing ever dreamed of...
http://www.youtube.com/watch?v=mo54DJHBZWk
ATI guys try and find jaggies...
Nvidia boys look for realistic interactive in-game PhysX effects.
Ya got to have priorities... Your call!