Is that an agreement or a disagreement? :shrug:
ha nice try.
I bet 20 bucks you couldn't tell the difference between 80 FPS and 2000 FPS, unless you were using a benchmarking tool.
Plus, the reason I asked is that no matter how you look at it, he didn't word it right, which confused me. The right wording would be either "No, there is a difference" or "Yes, there is no difference."
30fps is laggy; 60fps makes a big difference to me. My 8800GT averages ~60fps with AA off, shadows and postprocessing set to Medium, the Very High DX9 mod, and 1280x1024 resolution.
Yep! Install and run UT1 in WinXP mode and then in Win98 mode for laughs. Against the computer, you click Start and bang, you're dead in about one tenth of a second :rofl: I got an error trying to count the FPS. :D
My problem is that Crysis on mine is at a dead stop when levels start, but then it speeds up to a playable level. This makes the average frame rate look lower than it is. Crysis does feature motion blur, so lower frame rates, anything near 24fps, aren't as bad. Like old film projectors back in the day. I have mine set to 1600x1050 and the custom settings from that hi-def hack.
Redeemed!
When I play CS 1.6, 100fps makes SO much difference from 60fps. The human eye can register something that appears for as little as about 1/200th of a second anyway. Probably even faster. But let's not slip into another discussion of the eye. It's too complex.
it depends on the game.
CSS at 45FPS is worse than crysis at 25FPS for me.
The eye doesn't see in frames; it absorbs light from the environment and then sends an electro-chemical signal to the brain.
Basically it fades from the previous moment to the current one. Think of an LCD with a near-infinite input rate.
Seriously, get in your chair, spin around really fast, and move your hand around a bit while you're at it. That is how rapid movement looks in reality.
Then open your favorite game and start spinning around like crazy... see the difference?
That's part of why Crysis with motion blur seems smoother at 30FPS than CSS would at 100, especially if you're on a monitor that doesn't have a high refresh rate.
I played Crysis SP @ average 20-25fps and I enjoyed it.
I play multiplayer (as o3bot) at the same settings, and often I'm on the better side of the score charts, sometimes even on top. And I never use the rocket launcher against infantry, nor do I snipe. Ever. I hate snipers. On top of that, I have never flown air vehicles. Also, Crysis is the second FPS I've ever played in multiplayer (the first was the original Counter-Strike back in 2000, totalling a couple dozen matches on LAN).
Sorry for saying this as an ATI enthusiast... but now I see why Nvidia crapped their pants and dumbed down the ForceWare driver so the whole game ran faster. oO Oh noes, too scared of reality vs. hardware to pull it ^^
Although I know ATI doesn't have anything to put up against 2x 8800 Ultra :(
Or even 1x 8800Ultra.
:P
That's absolutely, totally untrue.
And if you have gamed long enough, you would know the difference.
The human eye can see 60fps and it CAN tell the difference between 30 and 60fps.
Notice I said the human EYE AND BRAIN CAN differentiate 30 from 60fps.
This is a 100% proven fact.
The gray area is around 45fps, where most people lose the ability to see the difference.
I can 100% tell 30fps from 45 from 60fps. There is an absolute visual smoothness from 45 to 60 that is undeniable.
That's why on LCD monitors, since 99.99% of them cannot do over 60Hz, you ENABLE V-Sync and lock the FPS at 60, so those resources can be used for other things. You won't gain a lot by doing so, but it does help.
I love hearing people say you can't see over 30fps... just total garbage spewed out by people who don't have the hardware to experience 60fps gaming and never have, so they have become accustomed to 30fps and think it's the best. It's not.
Try a racing game at 30fps, then at 60fps, and you will see.
I don't know what it is with Crysis, but when I run it at High, Fraps shows 30-45 fps. Now if I go play COD4 and get 30 fps, it's laggy... Crysis runs smooth at 30fps.
weird :shrug:
Those graphics look amazing! :slobber:
If you guys think I don't know what I'm talking about, pick up FRAPS.
Look at the default recording setting: 29.9 FPS. That's the average for most videos. Yes, you can edit the recording speed, and please try it: set it to 60 FPS, then record. Your videos record too fast and don't stay within the right time, so a 2-minute video becomes about a 55-second video.
Also, it's sorta interesting.
"and if you have gamed long enough you would know the difference."
I've gamed since I was 2. The only difference I see between 30 and 60 FPS is 1) when sprites are used, like gun flashes and such.
There is a difference between tearing and low FPS. Once a game hits a 30 FPS minimum, the "jitters" slowly go away with a small FPS gain. You guys remember when FEAR came out, right? It was just like Crysis on the hardware back then too.
Also, in Halo 1 for PC on my CRT monitor at 1600x1200, I get about 2000-6000 FPS and there is no tearing.
Now, if you take Crysis and run it at a decent res, there will be no lag even if you're only getting 20 FPS.
But if you bump the res up to high/highest and somehow get 40 FPS, there will still be lag and/or tearing.
You can see the difference between 30 and 60FPS, but when motion blur is enabled the difference is reduced... a bit...
When V-sync is enabled on LCD monitors it doesn't boost performance (other resources being freed for other things); quite the opposite, it hampers performance.
Frame tearing happens on LCDs, not on CRTs (that's why you don't get tearing @ 2000FPS in Halo 1: you're using a CRT), and it happens on LCDs when the video card pumps out higher framerates than the refresh rate of the LCD (mostly 60Hz on PC monitors).
@Lestat
Here is a quote from the wiki.
Quote:
Vertical synchronization (v-sync, vbl-sync) refers generally to the synchronization of frame changes with the vertical blanking interval. Since CRTs were nearly the only common video display technology prior to the widespread adoption of LCDs, the frame buffers in computer graphics hardware are designed to match the CRT characteristic of drawing images from the top down a line at a time by replacing the data of the previous frame in the buffer with that of the next frame in a similar fashion. When the display requests current contents of the frame buffer before the current frame is done being written on top of the previous one, the frame buffer gives it the current mishmash of both frames, producing a page tearing artifact partway down the image.
Vertical synchronization eliminates this by timing frame buffer fills to coincide with the display's data requests, thus ensuring that only whole frames are seen on-screen.
Computer games and other advanced programs often allow vertical synchronization as an option, but it is sometimes disabled because it often has the effect of hampering performance on slower hardware (and/or in programs that were not adequately designed for v-synced operation) to the point where frame rates drop below that of the display hardware.
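The mechanism the quote describes can be toy-modeled: a "scanout" reads the framebuffer line by line while the renderer may swap the buffer mid-scan. All names and the tiny 8-line display here are invented for illustration, not how any real driver works.

```python
# Toy model of page tearing: the display scans the framebuffer line by
# line while the renderer may replace its contents mid-scan.

LINES = 8  # scanlines per frame in this toy display

def scanout(framebuffer, swap_at_line=None, new_frame=None):
    """Read one refresh worth of lines. If swap_at_line is given,
    the renderer swaps in new_frame partway through the scan."""
    fb = list(framebuffer)
    shown = []
    for line in range(LINES):
        if swap_at_line is not None and line == swap_at_line:
            fb = list(new_frame)  # buffer replaced mid-scan
        shown.append(fb[line])
    return shown

frame_a = ["A"] * LINES
frame_b = ["B"] * LINES

# No v-sync: the swap lands while line 3 is being scanned -> a visible
# "mishmash of both frames", exactly the tear the quote describes.
torn = scanout(frame_a, swap_at_line=3, new_frame=frame_b)
# torn == ['A', 'A', 'A', 'B', 'B', 'B', 'B', 'B']

# V-sync: the swap is deferred to the vertical blanking interval, i.e.
# between refreshes, so each scan sees exactly one whole frame.
clean = scanout(frame_b)
```

With v-sync the renderer simply waits for the blanking interval before swapping, which is also why it can cost performance when frames take longer than a refresh to draw.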
Yes, I know. That's why when you record a CRT vs. an LCD with a camera, you can see lines on the CRT and barely any on the LCD: cameras can't keep up with the high refresh rates that CRTs push out.
But My main Monitor is the one in my sig, a 42" LCD HDTV.
The only time I get tearing is when I use 1080p with super high graphics in Crysis; that's why I run 720p with higher settings and get no tearing. It plays smooth as silk.
:doh: :shakes:
I think I saw a guy from youtube play that level.
I think I'm going to agree with you!
Here's that problem I mentioned earlier.
http://img2.putfile.com/thumb/12/36022291139.jpg
@Warboy - Weird, I get exactly the opposite, and that's the way it should logically be: framerates over 60 = tearing without V-sync on an LCD; framerates under 60 = no tearing, so no need for V-sync.
It may be a problem with your LCD @ 1080P.....
I have a Dell 2407 A04 1920x1200 and a 1080P LG 37" but I don't get tearing @ 1080 with the LG as long as I have under 50-60 frames.
The lines come from the scanned beam of light sweeping horizontally from top to bottom on a CRT. The active or passive matrix on an LCD has an electric charge doing it instead.
Here's how it works.
http://electronics.howstuffworks.com/lcd4.htm
http://computer.howstuffworks.com/monitor8.htm
The bars/lines you see when videotaping a CRT come from your camera's shutter catching the lines as they are drawn. Just think of a strobe light flashing on a fan, sort of like stop motion.
Analog film with a shutter speed of only 24 frames per second can seem smoother than a video game running at 48fps without motion blur. Something else 3dfx was right about: one frame is blurred into the next, instead of one abrupt frame and then the other as in a video game, which can't fool the eye as well.
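The blur-one-frame-into-the-next idea can be sketched as a simple blend of consecutive frames. The weights and the tiny four-pixel "frames" below are invented for illustration; this is not any engine's actual algorithm.

```python
# Crude motion-blur sketch: each displayed frame is a weighted blend of
# the current and previous rendered frame, smoothing abrupt jumps.

def blend(prev_frame, cur_frame, weight=0.5):
    """Blend per-pixel brightness values (0-255) of two frames."""
    return [round(weight * p + (1 - weight) * c)
            for p, c in zip(prev_frame, cur_frame)]

# A bright object jumping abruptly between pixel positions:
frames = [
    [255, 0, 0, 0],
    [0, 255, 0, 0],
    [0, 0, 255, 0],
]

blurred = [frames[0]]
for prev, cur in zip(frames, frames[1:]):
    blurred.append(blend(prev, cur))

# blurred[1] == [128, 128, 0, 0]: the object smears across both
# positions instead of snapping, which is why blurred 24 fps film can
# look smoother than an unblurred game at a higher rate.
```

Real film gets this blur for free because the shutter stays open while the subject moves; games have to fake it in a post-process.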
http://www.pcworld.com/article/id,15...1/article.html
The bars on the CRT can be harder to see if the refresh rate is increased. Try it if you don't believe me. They're hard to see at 85Hz and almost impossible to see at 100Hz unless you use a new high-speed shutter camera.
Quote:
The resolution of the monitor--which also acts as a gauge for the amount of detail a display offers--comes from the number of pixels and lines. For example, in a CRT monitor with a 1024-by-768 resolution, the beam lights up 1024 pixels as it passes horizontally from left to right. When it reaches the edge of the screen, it stops and moves down to the next line. The beam repeats this process until it has passed over the 768 lines of pixels on the screen. When the beam reaches the bottom, it returns to the top and begins again. A monitor with a 75Hz refresh rate completes this round-trip 75 times per second. If a CRT refreshes too slowly, you'll see a flicker, which is widely believed to lead to eyestrain.
Please note that there are no bars in this photo.
http://img2.putfile.com/thumb/7/19700170163.jpg
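The quoted numbers are easy to work out. A back-of-the-envelope sketch (ignoring blanking intervals, which real CRTs also spend time in):

```python
# Timing for the 1024x768 @ 75 Hz CRT described in the quote.

lines_per_frame = 768
refresh_hz = 75

# The beam redraws every line 75 times a second:
lines_per_second = lines_per_frame * refresh_hz   # 57,600 lines/s
line_time_us = 1_000_000 / lines_per_second       # ~17.4 microseconds

# A camera shutter open for 1/60 s spans more than one full refresh at
# 75 Hz, so it catches the beam mid-sweep -> the dark band in photos.
refreshes_seen = refresh_hz / 60                  # 1.25 refreshes

print(lines_per_second, round(line_time_us, 1), refreshes_seen)
```

That microsecond-scale line time is why only a very fast shutter can actually freeze the beam partway down the screen.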
Well, I can play my 360 at 1080p just fine, and other games on my PC in XP at 1080p just fine.
It's just when I use Vista at 1080p that I have to enable V-sync most of the time. But if I don't enable V-sync and just use 720p, the FPS seems better and there is no tearing.
That picture looks like it was taken with a digital camera; most modern digital cameras capture at a faster rate, therefore you don't see the bars. I was mainly talking about video recording, not a still camera.
The funny thing is, most TV and movies don't break 60 FPS; most don't even break 30 FPS.
So I don't think anyone runs around saying "OMG THAT TV PROGRAM (or movie) LAGS".
It does, especially in certain scenes when the camera pans...
Why do you think they are trying to up the refresh of current HDTVs? (not that it really matters since, like you mentioned most movies run much slower than that)
BTW- Gaming since two... LMAO! :rofl: :ROTF:
Pfft, I remember having to stick in those old-school floppy disks to play this ALF game back in the late '80s, before the NES.
The theory of threading
Implementing threading is not an easy thing to do, by any means. The nature of a game is that it really wants to use 100% of the CPU. As Tom Leonard, Valve's multi-threading guru, points out, "If you're not at 100%, you're doing a disservice to the game."
Now that's something for Crytek to take note of... :D
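The "use 100% of the CPU" idea from the quoted article is basically: fan each frame's independent jobs out across worker threads instead of running them serially on one core. A generic sketch of that structure; the job names and workloads are invented, and this is nobody's actual engine code (Python's GIL also limits CPU-bound threading, so treat this as structure, not a benchmark).

```python
# Per-frame task parallelism: independent subsystems (AI, physics,
# particles...) run on a worker pool each frame instead of serially.
from concurrent.futures import ThreadPoolExecutor

def simulate(job):
    """Stand-in for one subsystem's work for this frame."""
    name, workload = job
    return name, sum(range(workload))  # dummy computation

frame_jobs = [("ai", 1000), ("physics", 2000), ("particles", 500)]

with ThreadPoolExecutor(max_workers=4) as pool:
    # map() dispatches the jobs to workers and collects results in order
    results = dict(pool.map(simulate, frame_jobs))

# results["physics"] holds that subsystem's output for the frame
```

A native engine would do the same fan-out with real OS threads or a job system, which is what lets it keep every core busy.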
Valve ftw :D
AFAIK the human eye can't see more than 75-80 fps... so no real point in having fps above that, right??
I have played UT4 @ 500fps and can't tell the diff between 80fps and 500fps.
All I know is that I can tell V-sync is on within the first 2 seconds of playing CS. Then I have to quit, turn off V-sync, and then all is good again at 100fps.
Then that's going to be a coding error on the part of FRAPS or the codec you're using. Heck, it could even be due to the way Windows is reading the timing crystal and/or PLL on the motherboard... it could even be BIOS-related in that case.
The difference being pointed out is that gaming at an average of 30fps may produce peaks of 40-60fps and low points (large scenes or battles) of 10-15fps. Unplayable. Gaming at 60fps may make the highs a bit lower, but the lows also pick up quite significantly, making for a playable and enjoyable experience.
Quote:
Also, Its sorta interesting.
"and if you have gamed long enough you would know the difference."
I've gamed since I was 2, The Only difference I see between between 30 and 60 FPS, is 1) When Sprites are normally used, Like Gun Flashes and such.
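The averages-vs-lows point above is easy to demonstrate: the same "average fps" can hide very different minimums. A sketch with made-up per-frame render times:

```python
# Why 'average fps' is misleading: compute average and worst-case fps
# from per-frame render times in milliseconds (invented numbers).

def fps_stats(frame_times_ms):
    avg_ms = sum(frame_times_ms) / len(frame_times_ms)
    return 1000 / avg_ms, 1000 / max(frame_times_ms)  # (avg fps, min fps)

steady = [33.3] * 10                  # consistent ~30 fps every frame
spiky = [16.7] * 8 + [100.0, 100.0]  # fast frames plus two big stalls

avg1, low1 = fps_stats(steady)
avg2, low2 = fps_stats(spiky)

# Both runs average about 30 fps, but the spiky one dips to 10 fps
# during its stalls, which is what you actually feel in-game.
print(round(avg1), round(low1), round(avg2), round(low2))
```

This is why benchmark reviews report minimum fps alongside the average.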
What are "Jitters?" If you are referring to tearing then that is due to the frames being output by the GPU and your monitor's refresh frequency not syncing. It can happen at lower than the monitors refresh rate and higher, and even when vertical syncronization is enabled... just happens less with it enabled.Quote:
There is a difference between Tearing and Low FPS., Its because Once a game gets 30 FPS Minimal. The "Jitters" will slowly go away over a Small FPS gain. Please You guys remember when FEAR came out right? It was just like crysis on hardware back then too.
FRAPS video of this FPS please?
Quote:
Also On Halo 1 for PC, on my CRT Monitor at 1600x1200 i get about 2000-6000 FPS and there is no tearing.
That would be because the game programmers did well on the input side of the game's engine, though at your example of 20fps I can feel lag. Not all games are created equal.
Quote:
Now, If You take Crysis and run it at a decent Res, There will not be lag even if your only getting 20 FPS.
What? Going from 20fps with no lag to 40fps with lag? You should pick a different subject; you're not doing well on this one.
Quote:
But If you bump up the res to a high/highest, and get 40 FPS somehow, there will still be lag and/or tearing.
I dunno actually, I get better fps in places than my friend with the same computer, and he has 1x 8800 Ultra but on a 680i mobo, so I don't know :S
I'm not saying ATI is uber, I just loved the $300 price I paid for 2x HD3870 Full-R Sapphire cards...
And versus an 8800 Ultra... I think I got more bang for the buck...
I'm not talking about tearing, i'm talking about lag.
You doubt it?
Yes. I know that. I can't feel much lag
You're clearly not understanding, so technically you're not doing well at this subject.
When I'm running at 1280x720 with my cfg and 4xAA, I get the best visual performance, at a rated 17-38 FPS.
But if I run at 1920x1080 with my cfg and no AA, I get generally poor performance, though I still get the same FPS range as at the 720 res. And it's not tearing; I tested with V-sync on and off and still get the same problems.
I have a digital SLR; the shutter was at 1/60 of a second and the monitor's refresh rate is 85Hz. At 70, you get a light bar on the screen.
I can slow the shutter speed down and get the same result.
It does the same thing with my analog Hi-8 Handycam :)
Most TVs are still doing the same thing though: 24 frames per second with only a 60Hz refresh rate for NTSC, and 50Hz for PAL in Germany etc. Remember the pixelation problems the first LCDs had?
All I'm saying is that the beam of light scanning across the screen can be seen, while the whole screen refreshing at once can't be seen as easily. You're right about the frames per second on TV, BUT remember, it is motion blurred. 99% of video games are NOT. MB is just a transition from one frame to the next to help fool the brain into thinking there is no break between frames. TV's slower 24FPS is constant as well. If an average of 30 and 60 feels different, it's because a 30 FPS average might mean a minimum of 10 and a maximum of 50. Now do you think you couldn't tell the difference between 10 and 50 FPS?
I'm 50. I remember when the big deal was going from 280 lines to a hi-def of 400 to 440 lines LOL! That's 440i, not the better 440p.
Crysis v1.1 Patch
Quote:
Fixes:
Damage dealt to vehicles when shot by LAW has been made consistent
F12 (screenshot) now works in restricted mode as well
When player melees during gun raise animation, their gun will not be in a permanently raised position anymore
Memory leaks and potential crashes
Updates:
Improved SLI / Crossfire support and performance
Improved overall rendering performance (DX9 and DX10)
Enabled VSync functionality in D3D10
Tweaks:
Reduced grenade explosion radius in multiplayer
Clamped water tessellation to avoid cheating in MP
Reduced LAW splash damage vs. infantry in PowerStruggle mode
Slowed Rocket projectile speed down in MP slightly
clicks... downloads.. prays it will be good.. :D
Any perf improvement?? I'm DLing now but my net is so slow!!!
Crysis review @ Escapist :up:
Look @ the other reviews to form an opinion about his style.