Silverstone Temjin TJ-09BW w/ Silverstone DA750
Asus P8P67
2600K w/ Thermalright Venomous X Black w/ Sanyo Denki San Ace 109R1212H1011
8GB G.Skill DDR-1600 7-8-7-24
Gigabyte GTX 460 1G
Modded Creative X-Fi Fatal1ty w/ Klipsch Promedia 2.1
1 X 120GB OCZ Vertex
1 X 300GB WD Velociraptor HLFS
1 X Hitachi 7K1000 1TB
Pioneer DVR-216L DVD-RW
Windows 7 Ultimate 64
Terrible performance with HD4870 CrossFire with the hotfix. Ultra high at 1920x1200, 4xAA 16xAF and I get about 20fps.
Update: Disabled Catalyst AI and Adaptive AA and I'm now getting about 35 fps in-game average. Looks like CrossFire doesn't scale that well, though, since even with CrossFire disabled I still get similar results.
Last edited by iTravis; 10-23-2008 at 09:13 AM.
CPU: Core i7-2600K@4.8Ghz Mobo: Asus Sabertooth P67 Case: Corsair 700D w/ 800D window
CPU Cooler: Corsair H70 w/ 2 GTs AP-15 GPU: 2xGigabyte GTX 670 WindForce OC SLI
RAM: 2x8GB G.Skill Ripjaws PSU: Corsair AX850W Sound card: Asus Xonar DX + Fiio E9
HDD: Crucial M4 128GB + 4TB HDD Display: 3x30" Dell UltraSharp 3007WFP-HC
Speakers: Logitech Z-5500 Headphone: Sennheiser HD650
I'm really disliking this game so far. I enjoy the physics and gameplay a bit, but the graphics really blow.
Running all settings at max (Ultra High, with the last few at Very High since they can't go any higher) with Bloom and HDR at 1680x1050 2xAA, and it runs great, roughly twice as well as Crysis does for me at mid settings at 1200x1000. All smooth!
Old 175.19 driver 8800GT 835/1970/1905
Last edited by nullface; 10-23-2008 at 09:20 AM.
Intel Core i7 6900K
Noctua NH-D15
Asus X99A II
32 GB G.Skill TridentZ @ 3400 CL15 CR1
NVidia Titan Xp
Creative Sound BlasterX AE-5
Sennheiser HD-598
Samsung 960 Pro 1TB
Western Digital Raptor 600GB
Asus 12x Blu-Ray Burner
Sony Optiarc 24x DVD Burner with NEC chipset
Antec HCP-1200w Power Supply
Viewsonic XG2703-GS
Thermaltake Level 10 GT Snow Edition
Logitech G502 gaming mouse w/Razer Exact Mat
Logitech G910 mechanical gaming keyboard
Windows 8 x64 Pro
I'll be giving it a go in the next few days, when I finally get home. Gaming Rig in Sig, 1920x1200.
My question is, why did they only test GTX 260 SLI? Because of the price point (~$500 for GTX 260 SLI vs. $500 for a 4870 X2)?
Just seems odd: if they were testing SLI, why not also test GTX 280 SLI and 4870 X2 CrossFireX?
Donate to XtremeSystems!
Workstation: Intel Core i7 4770, Asus Maximus VI Gene, 32GB Corsair Dominator Platinum DDR3-1866, eVGA SC GTX Titan, 256GB Crucial M4, Corsair HX850, Corsair H100i. Corsair Carbide 350D
Fileserver: 2x AMD Opteron 2425HE, Supermicro H8DME-2, 24GB DDR2-667, Supermicro 846TQ 24bay Chassis, Redundant 920w, 256 Crucial M4 boot, 20TB Storage
Notebook Asus Zenbook UX32VD-DH71, Intel Core i7 3517u, 10GB DDR3-1600, 256GB Crucial M4, Geforce GT 620M
CPU - Intel Core i7-7700 3.6GHz
CPU Cooler - Enermax ETS-T40F-BK
Memory - Corsair 8GB DDR4-2400
Mobo - GIGABYTE GA-B250-HD3P
GPU - Gigabyte Windforce OC 8G GTX 1080
OS - Windows 10
HDD - Crucial MX300, 275GB SSD M.2 2280
Case - Enermax Ostrog ADV Blue
PSU - Corsair CX650M
Oh sorry, forgot I removed my sig.
3.6GHz E3110 and 2GB Dominator 6400C4D 800MHz 4-4-4-12 on a Commando, nothing special.
I just cancelled my order because of the dodgy widescreen support.
If someone wanted to make nVidia look better, they might run without AA. Nobody with high end gear is going to want to run with 0xAA, and ATI's 4k series pretty much gets AA for free. At 1600x1200, anything up to 8xAA doesn't chop off much framerate.
Particle's First Rule of Online Technical Discussion:
As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.
Rule 1A:
Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.
Rule 2:
When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.
Rule 2A:
When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.
Rule 3:
When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.
Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!
Random Tip o' the Whatever
You just can't win. If your product offers feature A instead of B, people will moan how A is stupid and it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out A is what the market wants.
Running pretty well on G80 so far. Not any better than Crysis for the visual quality you get IMO, but the game with 180.42 certainly runs pretty well.
Currently doing 1680/1050, Very High graphics, DX9, no AA. That's with the 8800 GTS 640mb in my sig of course.
This is all of course with none of the jittering/stuttering that you get with lack of VRAM.
Gotta love that extra RAM on old G80, I'm raping my friend's 8800 GT. Hah, G92 is trash
As for FPS, if you're curious: 31 FPS minimum, 67 FPS maximum, 38 FPS average. This is from playing with Fraps on last night and just running all over the place.
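In case anyone wants to pull the same stats out of their own logs, here's a rough sketch of computing min/max/avg FPS from a Fraps-style frame-times dump. The column layout (frame index, cumulative time in milliseconds) and the filename are assumptions, and counting frames per whole second only approximates what Fraps itself reports:

[CODE]
import csv

def fps_stats(frametimes_csv):
    """Rough min/max/avg FPS from a Fraps-style frame-times log
    (assumed columns: frame index, cumulative time in milliseconds)."""
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        times_ms = [float(row[1]) for row in reader]

    # Count how many frames land in each whole second of the run.
    per_second = {}
    for t in times_ms:
        second = int(t // 1000)
        per_second[second] = per_second.get(second, 0) + 1

    duration_s = (times_ms[-1] - times_ms[0]) / 1000.0
    avg = (len(times_ms) - 1) / duration_s
    # Partial first/last seconds will drag the min down a little.
    return min(per_second.values()), max(per_second.values()), avg

print(fps_stats("FarCry2 frametimes.csv"))  # hypothetical filename
[/CODE]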
Run 1 is dx9 and Run 2 is dx10. Exact same settings (1680*1050 and ultra high preset) and using hotfix drivers. What's going on? Does dx10 perform better or did I make a mistake?
Here are the original result files:
http://dl-client.getdropbox.com/u/20...2021.18.43.rar
System in sig.
Last edited by ghost101; 10-23-2008 at 12:40 PM.
Q9300 l 4GB DDR2 l HD 4850 l GA-X38-DQ6 l 2.5TB HD l VX550 l Dell S2409W l Vista X64
No. With the hotfix, DX10 performs better, but the game is buggy with the hotfix drivers on Vista.
Before you complain about lag, think about Jesus. He lagged three days before respawning.
Yes, DX10 performs better, but the HDR option in the menu is greyed out. That does not happen with DX9. The game still looks as if it had HDR in DX10, though. And of course you get those annoying lags everywhere.
Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)
Q9300 l 4GB DDR2 l HD 4850 l GA-X38-DQ6 l 2.5TB HD l VX550 l Dell S2409W l Vista X64
I'll let these results speak for themselves.
Test Setup:
Intel Core 2 Duo E8400 @ 4.0 GHz (500x8)
Cellshock 2x1GB @ DDR2-1000 4-4-4-8
NVIDIA 8800GT 512 MB @ 705/1750/1015 MHz
NVIDIA ForceWare 180.42
Windows XP SP3 32bit & Windows Vista SP1 32bit
Config:
1280x960, 2xAA and the settings in the pic. Starting from the "Ultra High" preset I've set Shading, Geometry and Shadows to "Very High", as I find this optimal for my machine: comparing image quality, there's no noticeable difference for me from lowering those settings (in fact Shading almost looks better on Very High on my 8800GT), and I gain a couple of FPS. I tested both DX9 and DX10 mode in Vista vs. XP, and the performance results are very interesting.
Three loops of "Small Ranch" were run, and the graphs use the average result of the three runs.
[settings screenshot]
Results:
(Windows XP, Windows Vista DX9 and Windows Vista DX10, in that order)
[benchmark graphs]
This table shows the %-advantage for Vista DX9 and DX10 mode compared to Windows XP FPS rate.
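For reference, that table is just the averaged loop results compared against the XP baseline; here's a minimal sketch of the calculation with placeholder FPS numbers (the real values are in the graphs above):

[CODE]
def average(runs):
    return sum(runs) / len(runs)

def advantage_pct(candidate_avg, baseline_avg):
    # Percentage advantage over the Windows XP baseline.
    return (candidate_avg / baseline_avg - 1.0) * 100.0

# Placeholder FPS values for the three "Small Ranch" loops per configuration.
xp_dx9     = average([50.0, 51.0, 49.0])
vista_dx9  = average([52.0, 53.0, 51.0])
vista_dx10 = average([60.0, 61.0, 59.0])

print(f"Vista DX9  vs XP: {advantage_pct(vista_dx9, xp_dx9):+.1f}%")
print(f"Vista DX10 vs XP: {advantage_pct(vista_dx10, xp_dx9):+.1f}%")
[/CODE]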
This is the first game I've benchmarked where Vista has even offered an advantage over Windows XP, and the advantage of the DX10 mode is huge! I guess the gain shows up mostly when using antialiasing, as the game is supposed to support DX10.1. I could test 1600x1200 and 4xAA later if I'm not too lazy, as I'm willing to bet the difference might be even slightly greater then. Finally I have a good reason to dual-boot with Vista.
Thumbs up to the devs for this game finally making proper use of DX10.
EDIT: Ooops, updated the %-table to show correct FPS values for Vista DX10 mode. xD
Last edited by RPGWiZaRD; 10-23-2008 at 10:51 PM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
I was referring to this thread by that statement:
"NVIDIA can use the DX10.1 effects in Far Cry 2"
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
Nice OS results, RPGWiZaRD. This is giving me some reasons to upgrade to Vista DX10 now.
Bring... bring the amber lamps.
I think the point when you benchmark DX9 vs. DX10 is the change in API execution. The ~20% increase could be accounted for by the 20-40% lower execution overhead of the DX10 API versus DX9. In short, devs are finally shipping an optimized DX10 path.
The XP vs. Vista difference could just be Vista handling small things a bit better: disk I/O and so on.
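As a back-of-the-envelope illustration of that overhead argument (the frame time and overhead numbers below are made up, not measured from the game):

[CODE]
def fps_gain_pct(frame_ms, api_overhead_ms, overhead_reduction):
    """How much faster a frame gets if only the API/driver overhead shrinks."""
    new_frame_ms = frame_ms - api_overhead_ms * overhead_reduction
    return (frame_ms / new_frame_ms - 1.0) * 100.0

# e.g. a 20 ms frame (50 FPS) where 10 ms is API/driver overhead,
# and DX10 trims that overhead by 30%:
print(f"{fps_gain_pct(20.0, 10.0, 0.30):.1f}% higher FPS")  # ~17.6%
[/CODE]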
Last edited by Shintai; 10-23-2008 at 01:58 PM.
Crunching for Comrades and the Common good of the People.
Yeah, that's what I meant: only a performance advantage. The image quality is of course the same, and that goes for XP too from what I've seen, although performance there is significantly worse.
I'm pretty sure the advantage may even rise slightly at a higher resolution and a higher AA mode. I'd also need to check how it compares with AA disabled, but I'll do that sometime this weekend when I'm not too busy (I want to play this awesome game, but I also have to study for a physics test next week and I have a maths test tomorrow xD)
Last edited by RPGWiZaRD; 10-23-2008 at 01:50 PM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
Here is an OC'd Core i7 965 EE running Far Cry 2:
http://www.vr-zone.com/articles/x58-...ry-2/6149.html
"Next, we checked out the performance of GeForce GTX 280 SLI on the Core i7 965XE/ ASUS P6T Deluxe X58 board setup. We can observe that at "Enthusiast Mode", Crysis Warhead is mostly GPU limited and is almost unplayable at 2560x1600 resolution. For Far Cry 2 with settings at "Ultra High", 1600x1200 and 1920x1200 resolutions are becoming CPU limited while at 2560x1600 resolution, it is balanced between the CPU and GPU. Let us know what benchmarks you would like to see.
Avg. FPS          1600x1200   1920x1200   2560x1600
Far Cry 2             98.71       91.27       69.33
Crysis Warhead        46.01       40.12       16.15"
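A quick way to sanity-check the "CPU limited" claim from those quoted numbers: if FPS falls much more slowly than the pixel count rises, the GPU isn't the bottleneck. A small sketch using the Far Cry 2 figures above:

[CODE]
# Quoted Far Cry 2 averages on GTX 280 SLI + Core i7 965XE.
results = {
    (1600, 1200): 98.71,
    (1920, 1200): 91.27,
    (2560, 1600): 69.33,
}

base_fps = results[(1600, 1200)]
base_pixels = 1600 * 1200

for (w, h), fps in results.items():
    pixel_increase = (w * h / base_pixels - 1) * 100
    fps_drop = (1 - fps / base_fps) * 100
    print(f"{w}x{h}: +{pixel_increase:.0f}% pixels, -{fps_drop:.1f}% FPS")

# 1920x1200 pushes 20% more pixels for only a ~7.5% FPS drop, which is why
# the lower resolutions look CPU limited; 2560x1600 (+113% pixels, ~30% drop)
# shifts the load back toward the GPU.
[/CODE]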
Bring... bring the amber lamps.
If they coded for it nVidia will be able to use multisample readback in FC2.
"It's useful to point out that, in spite of the fact that NVIDIA doesn't support DX10.1 and DX10 offers no caps bits, NVIDIA does enable developers to query their driver on support for a feature. This is how they can support multisample readback and any other DX10.1 feature that they chose to expose in this manner."
Source:
http://www.anandtech.com/video/showdoc.aspx?i=3334&p=7
Still trying to figure out why it runs so poorly in DX10; even Crysis ran better in DX10 on Vista 64.
Oh well, on Very High @ 1680 with 2xAA the game runs at around 50 FPS in DX9 and looks great. Will wait for patches.
What's up with that sniper at the airfield, though?
He hits you from across the map, no matter what.
~
I made it home about 2hrs ago. Played the game for an hour or so, then ran the benchmark. Lots of stuff going on in the background, so the numbers might not be accurate.
Gaming Rig in sig.
I'll try re-running it tomorrow with nothing going on in the background, and I also might knock it down to 4xAA.
BTW, Running Vista Business x64, with hotfix drivers.
EDIT: Decided to run it quick with everything maxed out. Also, the FOV thing is annoying in 16:10.
Ranch Small BTW.
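For anyone wondering what the 16:10 FOV complaint is about: widescreen players expect the horizontal FOV to widen with the aspect ratio (Hor+). A quick sketch of that conversion; the 75° base FOV and the 4:3 baseline are placeholder assumptions, not the game's actual values:

[CODE]
import math

def hor_plus_fov(hfov_base_deg, aspect_base, aspect_target):
    """Hor+ scaling: keep the vertical FOV fixed and widen the horizontal
    FOV in proportion to the aspect ratio."""
    half = math.radians(hfov_base_deg) / 2.0
    return math.degrees(2.0 * math.atan(math.tan(half) * aspect_target / aspect_base))

# A 75 degree horizontal FOV designed for 4:3, shown on a 16:10 panel:
print(f"{hor_plus_fov(75.0, 4 / 3, 16 / 10):.1f} degrees")  # ~85.3 degrees
[/CODE]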
Last edited by WesM63; 10-23-2008 at 09:30 PM.
Donate to XtremeSystems!
Workstation: Intel Core i7 4770, Asus Maximus VI Gene, 32GB Corsair Dominator Platinum DDR3-1866, eVGA SC GTX Titan, 256GB Crucial M4, Corsair HX850, Corsair H100i. Corsair Carbide 350D
Fileserver: 2x AMD Opteron 2425HE, Supermicro H8DME-2, 24GB DDR2-667, Supermicro 846TQ 24bay Chassis, Redundant 920w, 256 Crucial M4 boot, 20TB Storage
Notebook Asus Zenbook UX32VD-DH71, Intel Core i7 3517u, 10GB DDR3-1600, 256GB Crucial M4, Geforce GT 620M