ha nice try.
I bet 20 bucks you couldn't tell the difference between 80 FPS and 2000 FPS unless you were using a benchmarking tool.
Plus, the reason I asked: no matter how you look at it, he didn't word it right, which confused me. The right wording would be either 1) "No, there is a difference" or 2) "Yes, there is no difference."
Last edited by Warboy; 12-27-2007 at 09:19 AM.
My Rig can do EpicFLOPs, Can yours?
Once this baby hits 88 TeraFLOPs, You're going to see some serious $@#%....
Build XT7 is currently active.
Current OS Systems: Windows 10 64bit
30fps is laggy, 60fps makes a big difference to me. My 8800GT averages ~60fps with AA off, shadows and postprocessing set to medium, very high dx9 mod and 1280x1024 resolution.
Q6600 3.6Ghz @ 1.276v | DFI LP LT P35 | Ballistix Tracer PC2-8500 600Mhz @ 2.074v | GeForce 260 GTX | Auzen X-Fi Prelude | PCP&C 750 | Arch Linux ~ WinXP
AMD Athlon 64 X2 3800 | DFI LANParty NF4 SLI-DR | 2x512MB G.SKILL PC3200 BH-5 | OCZ PowerStream 520W | Palit GeForce 9600 GT | Gentoo
Long Live DFI.
Yepp! Install and run UT #1 in WinXP mode and then in Win98 mode for laughs. Against the computer you click start and, bang, you're dead in about one 10th of a second. I got an error trying to count the FPS.
My problem is that Crysis on mine is at a dead stop when the levels start, but then it speeds up to a playable level. This makes the average frame rate look lower than it is. Crysis does feature motion blur, so lower frame rates aren't as bad, even anything near 24fps, like old projectors back in the day. I have mine set to 1600 x 1050 and the custom settings from that Hi-Def hack.
Originally Posted by Movieman
Posted by duploxxx
I am sure JF is relaxed and smiling these days with there intended launch schedule. SNB Xeon servers on the other hand....
Posted by gallag
there yo go bringing intel into a amd thread again lol, if that was someone droping a dig at amd you would be crying like a girl.
qft!
Intel Core i7 920 #3841A437 @ 3.8ghz 1.26v HT off
Thermalright True-120 Extreme
Gigabyte ex58-UD3R @ 190x20
6gb (3x2gb) G.Skill Pi @ 1520mhz 7-7-7-21-1T
PNY 8800gtx @ 640/1000
abs (tagan) 700w
Antec Nine Hundred
Seagate 320gb
LG 25.5" LCD
Logitech G11 + mx518, Logitech x530 5.1 + plantronics DSP500
Redeemed!
20 Logs on the fire for WCG: i7 920@2.8
X3220@3.0
X3220@2.4
E8400@4.05
E6600@2.4
When I play CS 1.6, 100fps makes SO much difference from 60fps. The human eye can register something that appears for as little as about 1/200th of a second anyway, probably even faster. But let's not slip into a discussion of the eye here. It's too complex.
it depends on the game.
CSS at 45FPS is worse than crysis at 25FPS for me.
the eye doesn't see in frames; it absorbs light from the environment and then releases an electro-chemical signal to the brain.
basically it fades from the previous moment to the current. Think of an LCD with a near-infinite input rate.
seriously, sit in your chair, spin around really fast, and move your hand around a bit while you're at it. That is what rapid movement looks like in reality.
then open your favorite game and start spinning around like crazy... see the difference?
that's part of why Crysis with motion blurring seems smoother at 30FPS than CSS would at 100, especially if you're on a monitor that doesn't have a high output frame rate.
Last edited by xlink; 12-27-2007 at 01:53 PM.
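xlink's point about motion blur can be shown with a toy simulation (nothing like Crysis's actual shader, just an illustration): a bright dot sweeping across a 1-D strip of pixels. Without blur, each frame samples a single instant and the dot teleports; with blur, each frame averages several sub-frame instants, like a camera shutter, so the in-between positions get smeared in.

```python
STRIP = 12          # pixels in our tiny 1-D "screen"
SUBSAMPLES = 4      # sub-frame samples averaged per output frame

def render(position):
    """One instantaneous frame: full brightness at a single pixel."""
    frame = [0.0] * STRIP
    frame[int(position) % STRIP] = 1.0
    return frame

def blurred(position, velocity):
    """Average SUBSAMPLES instants spread across one frame time."""
    acc = [0.0] * STRIP
    for i in range(SUBSAMPLES):
        sub = render(position + velocity * i / SUBSAMPLES)
        acc = [a + s / SUBSAMPLES for a, s in zip(acc, sub)]
    return acc

# A dot moving 3 pixels per frame: fast motion at a low frame rate.
print(render(0))      # only pixel 0 lit - the dot teleports frame to frame
print(blurred(0, 3))  # pixels 0-2 share the light (0.5, 0.25, 0.25) - smeared
```

The smeared version carries information about where the dot was during the whole frame, which is roughly why a blurred 30FPS image can read as smoother than a sharp one.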
I played Crysis SP @ average 20-25fps and I enjoyed it.
I play multiplayer (as o3bot) at the same settings, and often I'm on the better side of the score charts, sometimes even on top. And I never use the rocket launcher against infantry, nor do I snipe. Ever. I hate snipers. On top of that, I have never flown air vehicles. Also, Crysis is the second FPS I've ever played in multiplayer (the first was the original CounterStrike back in 2000, totalling a couple dozen matches on LAN).
Last edited by largon; 12-27-2007 at 02:01 PM.
You were not supposed to see this.
Sorry for saying this as an ATI enthusiast... but now I see why nvidia crapped their pants and tuned down the Forceware driver so the whole game ran faster. oO, oh noes, too scared of reality vs hardware to pull it ^^
-Although I know ATI does not have anything to pull up against 2x 8800 Ultra
My *Old* rig:
Intel Core 2 Quad 6600 G0 @ 3.6ghz
Asus Rampage x48
8 GB Kingston DDR2-RAM
GTX 480 AMP!
Asus Xonar D2
Corsair Force3 120gb (or something)
Loads of other drives ^^
Silverstone Fortress 2
Powered by Enermax Galaxy Dx 1000W
Or even 1x 8800Ultra.
:P
That's absolutely, totally untrue.
And if you have gamed long enough, you would know the difference.
The human eye can see 60fps and it CAN tell the difference between 30 and 60fps.
Notice I said the human EYE AND BRAIN CAN differentiate 30 from 60fps.
This is a 100% proven fact.
The gray area is around 45fps, where most people lose the ability to see the difference.
I can 100% tell 30fps from 45 from 60fps. There is an absolute visual smoothness from 45 to 60 that is undeniable.
That's why on LCD monitors, since 99.99% of them cannot do over 60hz, you ENABLE V-Sync and lock the FPS at 60fps, so those resources can be used for other things. You won't gain a lot by doing so, but it does help.
I love hearing people say you can't see over 30fps... just total garbage spewed by people who don't have the hardware to experience 60fps gaming and never have, so they have become accustomed to 30fps and think it's the best. It's not.
Try a racing game at 30fps and then at 60fps and you will see.
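For what it's worth, the "lock the FPS at 60 so resources go elsewhere" idea is basically a frame limiter. Here's a minimal sketch in Python (not real v-sync, which waits on the vertical blank in hardware; this just sleeps off the leftover frame time so no work is burned on frames a 60Hz panel could never show):

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS      # ~16.7 ms budget per frame

def game_loop(n_frames, render=lambda: None):
    """Render n_frames, sleeping off the leftover time each frame so
    no effort is wasted on frames a 60Hz panel could never display."""
    start = time.monotonic()
    for _ in range(n_frames):
        frame_start = time.monotonic()
        render()                                  # draw/simulate here
        spent = time.monotonic() - frame_start
        if spent < FRAME_TIME:
            time.sleep(FRAME_TIME - spent)        # freed-up idle time
    return time.monotonic() - start

# 30 capped frames should take roughly 30/60 = 0.5 seconds of wall time.
print(game_loop(30))
```

The slept-off time is exactly the headroom that can go to AI, physics, or just lower temperatures; real v-sync additionally aligns the buffer swap with the display's refresh.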
"These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
Welcome to the Roughnecks"
"Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"
I don't know what it is with Crysis, but when I run Fraps at High I get 30-45 fps. Now if I were to go play COD4 and get 30 fps it's laggy... Crysis runs smooth at 30fps.
Weird.
Those graphics look amazing!
3DMark06: 20974
If you guys think I don't know what I'm talking about, pick up FRAPS.
Look at the default recording settings: 29.9 FPS. That is the average for most videos. Yes, you can edit the recording speed, and please try it: set it to 60 FPS, then record. Your videos record too fast and don't stay within the right time, so a 2 min video becomes about a 55 sec video.
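That sped-up playback is just arithmetic: if the machine can only capture frames at some lower real rate, but the file header claims 60 FPS playback, the clip plays fast. A quick sketch (the 27.5 fps capture rate is an illustrative number chosen to reproduce the "2 min becomes ~55 sec" figure, not a measured one):

```python
real_capture_fps = 27.5      # what the machine actually managed (illustrative)
tagged_playback_fps = 60.0   # the rate written into the file header
recorded_seconds = 120       # 2 minutes of real gameplay

frames_captured = real_capture_fps * recorded_seconds     # 3300 frames
playback_seconds = frames_captured / tagged_playback_fps  # how long it plays
print(playback_seconds)  # -> 55.0 (the "2 min becomes ~55 sec" effect)
```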
Also, it's sorta interesting.
"and if you have gamed long enough you would know the difference."
I've gamed since I was 2. The only difference I see between 30 and 60 FPS is 1) when sprites are normally used, like gun flashes and such.
There is a difference between tearing and low FPS. Once a game hits 30 FPS minimum, the "jitters" slowly go away with a small FPS gain. You guys remember when FEAR came out, right? It was just like Crysis on the hardware back then.
Also, on Halo 1 for PC, on my CRT monitor at 1600x1200, I get about 2000-6000 FPS and there is no tearing.
Now, if you take Crysis and run it at a decent res, there will not be lag even if you're only getting 20 FPS.
But if you bump the res up to high/highest, and somehow get 40 FPS, there will still be lag and/or tearing.
You can see the difference between 30 and 60FPS, but when motion blur is enabled the difference is reduced... a bit...
When V-sync is enabled on LCD monitors it doesn't boost performance (other resources being used for other things); quite the opposite, it hampers performance.
Frame tearing happens on LCDs, not on CRTs (that's why you don't have tearing @ 2000FPS in Halo 1: because you use a CRT), and it happens on LCDs when the video card pumps out higher frame rates than the refresh rate of the LCD (mostly 60hz on PC monitors).
@Lestat
Here is a quote from Wikipedia.
Vertical synchronization (v-sync, vbl-sync) refers generally to the synchronization of frame changes with the vertical blanking interval. Since CRTs were nearly the only common video display technology prior to the widespread adoption of LCDs, the frame buffers in computer graphics hardware are designed to match the CRT characteristic of drawing images from the top down a line at a time by replacing the data of the previous frame in the buffer with that of the next frame in a similar fashion. When the display requests current contents of the frame buffer before the current frame is done being written on top of the previous one, the frame buffer gives it the current mishmash of both frames, producing a page tearing artifact partway down the image.
Vertical synchronization eliminates this by timing frame buffer fills to coincide with the display's data requests, thus ensuring that only whole frames are seen on-screen.
Computer games and other advanced programs often allow vertical synchronization as an option, but it is sometimes disabled because it often has the effect of hampering performance on slower hardware (and/or in programs that were not adequately designed for v-synced operation) to the point where frame rates drop below that of the display hardware.
Last edited by XS2K; 12-27-2007 at 06:32 PM.
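The tearing mechanism in that quote can be modeled in a few lines: the display reads the frame buffer one scanline at a time, and if the buffer is overwritten partway through the read-out, the top of the old frame meets the bottom of the new one. A toy sketch (line count and frame names are made up for illustration):

```python
LINES = 8  # scanlines per (tiny) frame

def scan_out(frame_in_buffer_at):
    """Read LINES scanlines top to bottom; frame_in_buffer_at(i)
    says whose data sits in the frame buffer when line i is read."""
    return [frame_in_buffer_at(i) for i in range(LINES)]

# No v-sync: frame "B" overwrites the buffer after line 3 is read out.
torn = scan_out(lambda i: "A" if i <= 3 else "B")
print(torn)   # ['A', 'A', 'A', 'A', 'B', 'B', 'B', 'B'] - a tear at line 4

# With v-sync the swap waits for the vertical blanking interval,
# so every read-out sees one whole frame.
whole = scan_out(lambda i: "B")
print(whole)  # ['B', 'B', 'B', 'B', 'B', 'B', 'B', 'B'] - no tear
```

The higher the frame rate relative to the refresh rate, the more swaps land mid-scan-out, which is why tearing gets worse as FPS climbs past the display's refresh.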
Before you complain about lag, think about Jesus. He lagged three days before respawning.
Yes, I know. That's why when you record a CRT vs an LCD with a camera, you can see lines through the CRT and barely any on the LCD, because cameras can't keep up with the high refresh rates that a CRT can push out.
But my main monitor is the one in my sig, a 42" LCD HDTV.
The only time I get tearing is when I use 1080p with super high graphics in Crysis; that's why I run 720p with higher settings and get no tearing. It plays smooth as silk.