:doh: :shakes:
I think I saw a guy from youtube play that level.
I think I'm going to agree with you!
Here's that problem I mentioned earlier.
http://img2.putfile.com/thumb/12/36022291139.jpg
@Warboy-Weird, I get exactly the opposite, and that's the way it should logically be: framerates over 60 = tearing without Vsync on an LCD; framerates under 60 = no tearing, so no need for Vsync.
It may be a problem with your LCD @ 1080p...
I have a Dell 2407 A04 (1920x1200) and a 1080p LG 37", but I don't get tearing at 1080p with the LG as long as I stay under 50-60 frames.
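Here's a toy model of that (all numbers invented, and it assumes the GPU and panel clocks are perfectly aligned, which real hardware never is; clock drift is why even sub-60 framerates can tear without Vsync). The point it shows: tearing happens when a buffer swap lands mid-scanout, so it's about alignment with the refresh, not simply being over or under 60fps.
Code:
from fractions import Fraction

REFRESH = Fraction(1, 60)          # seconds per refresh of a 60 Hz panel

def tears_per_second(fps):
    """Count buffer swaps that land mid-scanout over one second."""
    frame_time = Fraction(1, fps)
    tears = 0
    t = Fraction(0)
    while t < 1:
        phase = (t % REFRESH) / REFRESH   # 0 = top of screen, just under 1 = bottom
        if phase != 0:                    # swap while the screen is mid-repaint
            tears += 1
        t += frame_time
    return tears

for fps in (30, 59, 60, 61, 120):
    print(fps, "fps ->", tears_per_second(fps), "tears/sec")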
The lines come from the scanned beam of light sweeping horizontally from top to bottom on a CRT. An active- or passive-matrix LCD uses an electric charge to do it instead.
Here's how it works.
http://electronics.howstuffworks.com/lcd4.htm
http://computer.howstuffworks.com/monitor8.htm
The bars/lines you see when videotaping a CRT come from your camera's shutter catching the lines as they're drawn. Just think of a strobe light flashing on a fan, sort of like stop motion.
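The strobe-on-a-fan analogy is easy to put in numbers (rates below are made up for illustration): the blade only looks frozen when the flashes land on whole revolutions; otherwise it appears to creep forward or backward.
Code:
def apparent_drift_deg(fan_hz, strobe_hz):
    """Degrees the blade appears to move between consecutive flashes."""
    revs_per_flash = fan_hz / strobe_hz
    frac = revs_per_flash % 1.0          # whole revolutions are invisible
    return frac * 360.0

print(apparent_drift_deg(60, 60))  # 0.0   -> blade looks frozen
print(apparent_drift_deg(61, 60))  # 6.0   -> blade appears to creep forward
print(apparent_drift_deg(59, 60))  # 354.0 -> looks like a slow backward spin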
Analog film with a shutter speed of only 24 frames per second can seem smoother than a video game running at 48fps without motion blur. Something else 3dfx was right about. On film, one frame is blurred into the next, instead of the abrupt one-frame-then-the-next you get in a video game, so the game can't fool the eye as well.
http://www.pcworld.com/article/id,15...1/article.html
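Here's a minimal sketch of that blending idea, with a 1-D row of pixels standing in for an image (pure illustration, nothing from a real renderer): the film-style frame averages several sub-frame positions, so a moving dot smears along its path instead of snapping from spot to spot.
Code:
def render(position, width=16):
    """A 1-pixel-wide bright dot at `position` on a 1-D screen."""
    return [1.0 if i == position else 0.0 for i in range(width)]

def blurred_frame(start, end, samples=4):
    """Average several sub-frame renders, like a film camera's open shutter."""
    acc = [0.0] * 16
    for s in range(samples):
        pos = start + (end - start) * s // samples
        frame = render(pos)
        acc = [a + f / samples for a, f in zip(acc, frame)]
    return acc

sharp = render(4)              # game-style frame: dot snaps to one spot
smooth = blurred_frame(4, 8)   # film-style frame: dot smeared over its path
print([round(v, 2) for v in sharp])
print([round(v, 2) for v in smooth])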
The bars on the CRT get harder to see as the refresh rate is increased. Try it if you don't believe me. They're hard to see at 85Hz and almost impossible to see at 100Hz unless you use a new high-speed-shutter camera.
Quote:
The resolution of the monitor--which also acts as a gauge for the amount of detail a display offers--comes from the number of pixels and lines. For example, in a CRT monitor with a 1024-by-768 resolution, the beam lights up 1024 pixels as it passes horizontally from left to right. When it reaches the edge of the screen, it stops and moves down to the next line. The beam repeats this process until it has passed over the 768 lines of pixels on the screen. When the beam reaches the bottom, it returns to the top and begins again. A monitor with a 75Hz refresh rate completes this round-trip 75 times per second. If a CRT refreshes too slowly, you'll see a flicker, which is widely believed to lead to eyestrain.
Please note that there are no bars in this photo.
http://img2.putfile.com/thumb/7/19700170163.jpg
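The quoted numbers work out to a pretty brisk beam; quick back-of-the-envelope math (this ignores the blanking intervals, which add roughly another 25% on real CRT timings):
Code:
width, height, refresh = 1024, 768, 75

lines_per_second  = height * refresh          # 57,600 scanlines/s
pixels_per_second = width * height * refresh  # ~59 million pixels/s
time_per_frame_ms = 1000 / refresh            # ~13.3 ms per full pass

print(f"{lines_per_second:,} lines/s, {pixels_per_second:,} px/s, "
      f"{time_per_frame_ms:.1f} ms per refresh")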
Well, I can play my 360 at 1080p just fine, and other games on my PC in XP at 1080p just fine.
It's just when I use Vista at 1080p that I have to enable Vsync most of the time. But if I don't enable Vsync and just use 720p, the FPS seems better and there is no tearing.
That picture looks like it was taken with a digital camera; most modern digital cameras capture at a better rate, so you don't see the bars. I was mainly talking about video recording, not a still camera.
The funny thing is, most TV and movies don't break 60 FPS; most don't even break 30 FPS.
So I don't think anyone runs around saying "OMG THAT TV PROGRAM (or movie) LAGS".
It does, especially in certain scenes when the camera pans...
Why do you think they're trying to up the refresh rate of current HDTVs? (Not that it really matters, since, like you mentioned, most movies run much slower than that.)
BTW- Gaming since two... LMAO! :rofl: :ROTF:
Pfft, I remember having to stick in those old-school floppy disks to play this ALF game back in the late '80s, before the NES.
The theory of threading
Implementing threading is not an easy thing to do, by any means. The nature of a game is that it really wants to have 100% CPU requirement. As Tom Leonard, Valve's multi-threading guru points out, "If you're not at 100%, you're doing a disservice to the game."
Now that's something for Crytek to take note of... :D
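To make that quote concrete, here's a minimal sketch of the pattern (function names are invented for illustration, and a real engine would do this in native code, since Python's GIL limits true CPU parallelism): fan a frame's independent systems out to worker threads, then join before the next frame so every core stays busy.
Code:
from concurrent.futures import ThreadPoolExecutor

def run_physics(dt):     return f"physics step {dt:.4f}"
def run_ai(dt):          return f"ai step {dt:.4f}"
def build_draw_list(dt): return f"draw list {dt:.4f}"

def game_frame(pool, dt):
    # Dispatch independent systems in parallel, then sync at frame end.
    jobs = [pool.submit(f, dt) for f in (run_physics, run_ai, build_draw_list)]
    return [j.result() for j in jobs]   # join: no system leaks into the next frame

with ThreadPoolExecutor(max_workers=4) as pool:
    for frame in range(3):
        print(game_frame(pool, dt=1/60))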
Valve ftw :D
afaik the human eye can't see more than 75-80 frames a second... so no real point in having fps above that, right??
I have played ut4 @ 500fps and can't tell the diff between 80fps and 500fps.
All I know is that when I play CS, I can tell Vsync is on within the first 2 seconds. Then I have to quit CS and turn off Vsync. Then all is good again at 100fps.
Then that's going to be a coding error on the part of FRAPS or the codec you're using. Heck, it could even be due to the way Windows is reading the timing crystal and/or PLL on the motherboard... it could even be BIOS-related in that case.
The difference being pointed out is that gaming at an average of 30fps may produce peaks of 40-60fps and low points (large scenes or battles) of 10-15fps. Unplayable. Gaming at 60fps may make the highs a bit lower, but the lows also pick up quite significantly, making for a playable and enjoyable experience.
Quote:
Also, it's sorta interesting.
"and if you have gamed long enough you would know the difference."
I've gamed since I was 2. The only difference I see between 30 and 60 FPS is 1) when sprites are normally used, like gun flashes and such.
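The averages point is easy to show with a couple of invented frame-rate traces (numbers made up to match the 40-60 high / 10-15 low description above): the same 30fps average, completely different feel.
Code:
steady = [30] * 10                                  # every frame ~30 fps
spiky  = [45, 50, 55, 40, 45, 12, 10, 11, 14, 18]   # also averages 30 fps

def summarize(fps_samples):
    avg = sum(fps_samples) / len(fps_samples)
    low = min(fps_samples)
    return f"avg {avg:.0f} fps, min {low} fps"

print("steady:", summarize(steady))  # avg 30, min 30 -> playable
print("spiky :", summarize(spiky))   # avg 30, min 10 -> feels awful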
What are "jitters?" If you are referring to tearing, then that is due to the frames being output by the GPU and your monitor's refresh frequency not syncing. It can happen below the monitor's refresh rate and above it, and even when vertical synchronization is enabled... it just happens less with it enabled.
Quote:
There is a difference between tearing and low FPS. It's because once a game gets 30 FPS minimum, the "jitters" will slowly go away with a small FPS gain. You guys remember when FEAR came out, right? It was just like Crysis on the hardware back then, too.
FRAPS video of this FPS please?
Quote:
Also, on Halo 1 for PC, on my CRT monitor at 1600x1200, I get about 2000-6000 FPS and there is no tearing.
That would be because the game programmers did well on the input side of the game's engine, though at your example of 20fps I can feel lag. Not all games are created equal.
Quote:
Now, if you take Crysis and run it at a decent res, there will not be lag even if you're only getting 20 FPS.
What? Going from 20fps with no lag to 40fps with lag? You should pick a different subject; you're not doing well on this one.
Quote:
But if you bump up the res to high/highest and somehow get 40 FPS, there will still be lag and/or tearing.
I dunno actually, I get better fps in some places than my friend with the same computer, and he has 1x 8800 Ultra but on a 680i mobo, so I don't know :S
I don't say ATI is uber, I just loved the $300 price I had to pay for 2x HD3870 Full-R Sapphire cards...
And versus an 8800 Ultra... I think I got more bang for the buck...
I'm not talking about tearing, I'm talking about lag.
You doubt it?
Yes, I know that. I can't feel much lag.
You're clearly not understanding, so technically you're not doing well at this subject.
When I'm running at 1280x720 with my cfg and 4xAA, it gives me the best visual performance, at a rated 17-38 FPS.
But if I run it at 1920x1080 with my cfg and no AA, it gives me generally poor performance, yet I still get the same FPS range as I did at 720p. And it's not tearing; I tested with Vsync on and off and still get the same problems.
I have a digital SLR; the shutter was at 1/60 of a second and the monitor's refresh rate is 85Hz. At 70Hz, you get a light bar on the screen.
I can slow the shutter speed down and get the same result.
It does the same thing with my analog Hi-8 Handycam :)
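The geometry of that bar works out like this (rough math only; how visible the band actually is also depends on phosphor decay, which these numbers ignore): a 1/60 s exposure catches some whole repaints plus a leftover fraction, and that partial repaint is the band you photograph.
Code:
def refreshes_in_exposure(refresh_hz, shutter_denom=60):
    """Screen repaints that fit inside a 1/shutter_denom second exposure."""
    return refresh_hz / shutter_denom

for hz in (60, 70, 85, 100):
    n = refreshes_in_exposure(hz)
    band = n % 1.0           # the leftover partial repaint = the visible band
    print(f"{hz} Hz: {n:.2f} refreshes, partial band ~{band:.0%} of the screen")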
Most TVs are still doing the same thing though: 24 frames per second with only a refresh rate of 60Hz for NTSC, or 50Hz for PAL in Germany etc. Remember the pixelation problems the first LCDs had?
All I'm saying is that the beam of light scanning across the screen can be seen, while the whole screen being refreshed at once can't be as easily seen. You're right about the frames per second on TV, BUT remember, it is motion blurred. 99% of video games are NOT. MB is just a transition from one frame to the next that helps fool the brain into thinking there is no break between frames. TV's slower 24FPS is constant as well. If an average of 30 and an average of 60 feel different, it's because a 30 FPS average might mean a minimum of 10 and a maximum of 50. Now do you think you couldn't tell the difference between 10 and 50 FPS?
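On the 24-frames-into-60Hz part: NTSC gear does it with 3:2 pulldown, holding film frames for alternating 3 and 2 fields. A quick sanity check of the arithmetic (simplified; real NTSC actually runs at 59.94 fields/s):
Code:
film_fps = 24
pattern = [3, 2]   # fields shown per film frame, alternating

fields = sum(pattern[i % 2] for i in range(film_fps))
print(fields)      # 60 fields/s: 12 frames x 3 + 12 frames x 2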
I'm 50. I remember when the big deal was going from 280 lines to a "hi-def" 400 to 440 lines, LOL! That's 440i, not the better 440p.