Is that an agreement or a disagreement? :shrug:
ha nice try.
I bet 20 bucks you couldn't tell the difference between 80 FPS and 2000 FPS unless you were using a benchmarking tool.
Plus, the reason I asked is that no matter how you look at it, he didn't word it right, which confused me. The right wording would be either "No, there is a difference" or "Yes, there is no difference."
30fps is laggy; 60fps makes a big difference to me. My 8800GT averages ~60fps with AA off, shadows and post-processing set to medium, the Very High DX9 mod, and 1280x1024 resolution.
Yep! Install and run UT#1 in WinXP mode and then in Win98 mode for laughs. Against the computer you click start and bang, you're dead in about one tenth of a second :rofl: I got an error trying to count the FPS. :D
My problem is that Crysis on mine is at a dead stop when the levels start, but then it speeds up to a playable level. This makes the average frame rate look lower than it really is. Crysis does feature motion blur, so lower frame rates aren't as bad; anything near 24fps is fine, like old film projectors back in the day. I have mine set to 1600x1050 and the custom settings from that Hi-Def hack.
Redeemed!
When I play CS 1.6, 100fps makes SO much difference compared to 60fps. The human eye can register something that appears for as little as about 1/200th of a second anyway, probably even faster. But let's not slip into that age-old discussion about the eye. It's too complex.
it depends on the game.
CSS at 45FPS is worse than Crysis at 25FPS for me.
The eye doesn't see in frames; it absorbs light from the environment and then sends an electro-chemical signal to the brain.
Basically it fades from the previous moment to the current one. Think of an LCD with a near-infinite input rate.
Seriously, get in your chair, spin around really fast, and move your hand around a bit while you're at it. That is how rapid movement looks in reality.
Then open your favorite game and start spinning around like crazy... see the difference?
That's part of why Crysis with motion blurring seems smoother at 30FPS than CSS would at 100, especially if you're on a monitor that doesn't have a high refresh rate.
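If you want a toy model of that "fading" idea, think of each thing you see as a blend of the newest frame with whatever was there before. Rough sketch only, with made-up numbers; not how the eye or any actual engine literally works:
Code:
#include <cstdio>

// Toy model of the "fade from the previous moment to the current" idea:
// each perceived value is a blend of the newest frame with the previous
// perceived value, like an exponential moving average. Purely illustrative.
int main() {
    double perceived = 0.0;     // what "lingers" from earlier frames
    const double blend = 0.4;   // made-up persistence factor

    // a fake brightness signal: a bright flash on one frame only
    double frames[] = {0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0};

    for (double f : frames) {
        perceived = blend * f + (1.0 - blend) * perceived;
        std::printf("input %.1f -> perceived %.2f\n", f, perceived);
    }
    // The flash doesn't vanish instantly; it decays over several frames,
    // which is why abrupt frame steps look smoother when blurred/blended.
    return 0;
}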
I played Crysis SP @ average 20-25fps and I enjoyed it.
I play multiplayer (as o3bot) at the same settings and I'm often on the better side of the score charts, sometimes even on top. And I never use the rocket launcher against infantry, nor do I snipe. Ever. I hate snipers. On top of that, I have never flown air vehicles. Also, Crysis is the second FPS I've ever played in multiplayer (the first was the original Counter-Strike back in 2000, totalling a couple dozen matches on LAN).
Sorry for saying this as an ATI enthusiast... but now I see why Nvidia crapped their pants and tweaked the ForceWare driver so the whole game ran faster. oO Oh noes, too scared of reality vs hardware to pull it off ^^
-Although I know ATI doesn't have anything to put up against 2x 8800 Ultra :(
Or even 1x 8800 Ultra.
:P
That's absolutely, totally untrue.
And if you have gamed long enough, you would know the difference.
The human eye can see 60fps, and it CAN tell the difference between 30 and 60fps.
Notice I said the human EYE AND BRAIN CAN differentiate 30 from 60fps.
This is a 100% proven fact.
The gray area is around 45fps, where most people lose the ability to see the difference.
I can 100% tell 30fps from 45 from 60fps. There is an absolute visual smoothness from 45 to 60 that is undeniable.
That's why on LCD monitors, since 99.99% of them can't do over 60Hz, you ENABLE V-Sync and lock the FPS at 60fps (rough sketch of what that means in code below), so the spare resources can be used for other things. You won't gain a lot by doing so, but it does help.
I love hearing people say you can't see over 30fps... just total garbage spewed out by people who don't have the hardware to experience 60fps gaming and never have, so they've become accustomed to 30fps and think it's the best. It's not.
Try a racing game at 30fps, then 60fps, and you will see.
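And to be clear about the V-Sync bit above: turning it on is basically just asking the driver to wait for the monitor's refresh before swapping frames. Rough sketch of the idea, using GLFW purely as an example library (not what any particular game actually uses):
Code:
#include <GLFW/glfw3.h>

// Minimal sketch of what "enable V-Sync" amounts to in a render loop.
// GLFW is just an illustrative windowing library; real games have their own paths.
int main() {
    if (!glfwInit()) return 1;

    GLFWwindow* window = glfwCreateWindow(1280, 1024, "vsync demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }

    glfwMakeContextCurrent(window);
    glfwSwapInterval(1);  // 1 = wait for the monitor's refresh (V-Sync on), 0 = off

    while (!glfwWindowShouldClose(window)) {
        glClear(GL_COLOR_BUFFER_BIT);   // draw the frame here
        glfwSwapBuffers(window);        // with V-Sync, this blocks until the next refresh
        glfwPollEvents();
    }

    glfwTerminate();
    return 0;
}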
I don't know what it is with Crysis, but when I run FRAPS at high I get 30-45 fps. Now if I were to go play COD4 and get 30 fps, it's laggy... Crysis runs smooth at 30fps.
Weird :shrug:
Those graphics look amazing! :slobber:
If you guys think I don't know what I'm talking about, pick up FRAPS.
Look at the default recording settings: 29.9 FPS. This is the average for most videos. Yes, you can edit the recording speed, and please try. Set it to 60 FPS, then record; your videos record too fast and don't stay within the right time, so a 2 min video becomes about a 55 sec video.
Also, it's sorta interesting.
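Here's roughly how the math could work out for that 2 min -> ~55 sec thing, assuming (and this is just my guess, not verified against FRAPS internals) that the file gets tagged at the 60 FPS target while the capture only actually grabs frames at a lower rate:
Code:
#include <cstdio>

// Back-of-the-envelope for the "2 min becomes ~55 sec" observation.
// Assumption (not verified): the output file is tagged with the target rate
// (60 fps) while the game/capture only delivers fewer frames per second,
// so playback runs short.
int main() {
    const double real_seconds  = 120.0;  // how long you actually recorded
    const double captured_fps  = 27.5;   // hypothetical rate frames were really grabbed at
    const double container_fps = 60.0;   // rate the video file claims

    double frames_written   = real_seconds * captured_fps;
    double playback_seconds = frames_written / container_fps;

    std::printf("%.0f s of gameplay plays back in about %.0f s\n",
                real_seconds, playback_seconds);  // ~55 s
    return 0;
}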
"and if you have gamed long enough you would know the difference."
I've gamed since I was 2. The only difference I see between 30 and 60 FPS is 1) when sprites are normally used, like gun flashes and such.
There is a difference between tearing and low FPS. Once a game hits 30 FPS minimum, the "jitters" slowly go away with each small FPS gain. You guys remember when FEAR came out, right? It was just like Crysis on the hardware of the time.
Also, in Halo 1 for PC, on my CRT monitor at 1600x1200 I get about 2000-6000 FPS and there is no tearing.
Now, if you take Crysis and run it at a decent res, there will not be lag even if you're only getting 20 FPS.
But if you bump the res up to high/highest and somehow get 40 FPS, there will still be lag and/or tearing.
You can see the difference between 30 and 60FPS, but when motion blur is enabled the difference is reduced... a bit...
When V-Sync is enabled on LCD monitors it doesn't boost performance (other resources being used for other things); quite the opposite, it hampers performance.
Frame tearing happens on LCDs, not on CRTs (that's why you don't have tearing @ 2000FPS in Halo 1, because you use a CRT), and it happens on LCDs when the video card pumps out higher framerates than the refresh rate of the LCD (mostly 60Hz on PC monitors).
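For what it's worth, here's a toy calculation of where a tear line would show up when a new frame gets handed over partway through the screen being drawn. Numbers are made up, and it's simplified (ignores the blanking interval and the fact that one scanout can show bits of several frames):
Code:
#include <cstdio>
#include <cmath>

// Where does a tear line land? If a new frame is swapped in while the display
// is partway through scanning out the previous one, the join shows up at that
// point of the screen. Toy numbers, not measured from any real setup.
int main() {
    const double refresh_hz  = 60.0;              // display refresh rate
    const double game_fps    = 150.0;             // GPU pushing frames faster than that
    const double scanout_s   = 1.0 / refresh_hz;  // time to draw one full screen
    const int    screen_rows = 1080;

    // First few frame completions, expressed as a fraction of the current scanout
    for (int frame = 1; frame <= 4; ++frame) {
        double t = frame / game_fps;                           // when the frame is ready
        double fraction = std::fmod(t, scanout_s) / scanout_s; // position within scanout
        std::printf("frame %d ready -> tear near row %d\n",
                    frame, static_cast<int>(fraction * screen_rows));
    }
    return 0;
}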
@Lestat
Here is a quote from Wikipedia.
Quote:
Vertical synchronization (v-sync, vbl-sync) refers generally to the synchronization of frame changes with the vertical blanking interval. Since CRTs were nearly the only common video display technology prior to the widespread adoption of LCDs, the frame buffers in computer graphics hardware are designed to match the CRT characteristic of drawing images from the top down a line at a time by replacing the data of the previous frame in the buffer with that of the next frame in a similar fashion. When the display requests current contents of the frame buffer before the current frame is done being written on top of the previous one, the frame buffer gives it the current mishmash of both frames, producing a page tearing artifact partway down the image.
Vertical synchronization eliminates this by timing frame buffer fills to coincide with the display's data requests, thus ensuring that only whole frames are seen on-screen.
Computer games and other advanced programs often allow vertical synchronization as an option, but it is sometimes disabled because it often has the effect of hampering performance on slower hardware (and/or in programs that were not adequately designed for v-synced operation) to the point where frame rates drop below that of the display hardware.
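That last paragraph is the key bit. With plain double buffering and V-Sync, a frame that misses a refresh has to wait for the next one, so the framerate doesn't just dip a little, it snaps down in steps (60, 30, 20...). Rough sketch of the math, simplified (triple buffering and adaptive V-Sync behave differently):
Code:
#include <cstdio>
#include <cmath>

// Illustration of the "hampering performance" part of the quote above:
// with plain double buffering and V-Sync, a frame that misses a refresh has to
// wait for the next one, so the rate snaps to 60, 30, 20, 15... fps.
int main() {
    const double refresh_hz = 60.0;
    const double render_ms_samples[] = {10.0, 16.0, 17.0, 25.0, 34.0};

    for (double render_ms : render_ms_samples) {
        double refresh_ms = 1000.0 / refresh_hz;
        // number of whole refresh intervals the frame occupies before it can swap
        double intervals  = std::ceil(render_ms / refresh_ms);
        double vsync_fps  = refresh_hz / intervals;

        std::printf("frame time %5.1f ms -> %5.1f fps uncapped, %4.1f fps with V-Sync\n",
                    render_ms, 1000.0 / render_ms, vsync_fps);
    }
    return 0;
}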
Yes, I know. That's why when you record a CRT vs an LCD with a camera, you can see lines through the CRT and barely any on the LCD, because cameras can't keep up with the high refresh rates that a CRT can push out.
But my main monitor is the one in my sig, a 42" LCD HDTV.
The only time I get tearing is when I use 1080p with super high graphics in Crysis; that's why I run 720p with higher settings and get no tearing. It plays smooth as silk.