Originally Posted by Movieman
Posted by duploxxx
I am sure JF is relaxed and smiling these days with their intended launch schedule. SNB Xeon servers on the other hand....
Posted by gallag
there you go bringing Intel into an AMD thread again lol, if that was someone dropping a dig at AMD you would be crying like a girl.
QFT!
@Warboy - Weird, I get exactly the opposite, and that's the way it should logically be: frame rates over 60 = tearing without vsync on an LCD; frame rates under 60 = no tearing, so no need for vsync.
It may be a problem with your LCD @ 1080p.....
I have a Dell 2407 A04 (1920x1200) and a 1080p LG 37", but I don't get tearing @ 1080 with the LG as long as I stay under 50-60 frames.
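Whether tearing shows up actually depends less on being above or below 60 fps than on where the buffer swap lands relative to the scanout. Here is a minimal sketch, assuming an idealized 60 Hz display and perfectly regular frame times (real frame times drift, so in practice any unsynced rate can tear at least sometimes):

```python
# Idealized model: does a buffer swap land mid-scanout (where it can tear)?
# Assumes perfectly regular frame times and a 60 Hz display -- real frame
# times drift, so in practice any unsynced rate can tear at least sometimes.

def tear_fraction(fps, refresh_hz=60.0, frames=1000):
    """Fraction of frame swaps that land mid-scanout."""
    frame_time = 1.0 / fps
    refresh_time = 1.0 / refresh_hz
    tears = 0
    t = 0.0
    for _ in range(frames):
        t += frame_time
        # How far down the screen the scanout is when the swap happens (0..1):
        phase = (t % refresh_time) / refresh_time
        # Swaps at the very top/bottom coincide with blanking: no visible tear.
        if min(phase, 1.0 - phase) > 0.01:
            tears += 1
    return tears / frames

for fps in (30, 59, 60, 120):
    print(f"{fps} fps -> {tear_fraction(fps):.2f} of swaps can tear")
```

In this idealized model, rates that divide the refresh evenly (30, 60) stay phase-aligned and never tear, while 59 fps tears on almost every swap and 120 fps on half of them; vsync sidesteps all of this by forcing swaps into the blanking interval.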
Before you complain about lag, think about Jesus. He lagged three days before respawning.
The lines come from the difference in how the image is drawn: a scanned beam of light sweeps horizontally from top to bottom on a CRT, while on an LCD an active or passive matrix drives the pixels with an electric charge instead.
Here's how it works.
http://electronics.howstuffworks.com/lcd4.htm
http://computer.howstuffworks.com/monitor8.htm
The bars/lines you see when videotaping a CRT are your camera's shutter freezing them as the lines are drawn. Just think of a strobe light flashing on a fan, a sort of stop-action effect.
Analog film with a shutter speed of only 24 frames per second can seem smoother than a video game running at 48fps without motion blur. Something else 3dfx was right about. One frame is blurred into the next instead of cutting abruptly from one frame to the other, as a video game does, so the abrupt cut can't fool the eye as well.
http://www.pcworld.com/article/id,15...1/article.html
The bars on the CRT are harder to see if the refresh rate is increased. Try it if you don't believe me. They're hard to see at 85Hz and almost impossible to see at 100Hz unless you use a new high-speed shutter camera.
The resolution of the monitor--which also acts as a gauge for the amount of detail a display offers--comes from the number of pixels and lines. For example, in a CRT monitor with a 1024-by-768 resolution, the beam lights up 1024 pixels as it passes horizontally from left to right. When it reaches the edge of the screen, it stops and moves down to the next line. The beam repeats this process until it has passed over the 768 lines of pixels on the screen. When the beam reaches the bottom, it returns to the top and begins again. A monitor with a 75Hz refresh rate completes this round-trip 75 times per second. If a CRT refreshes too slowly, you'll see a flicker, which is widely believed to lead to eyestrain.
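The 1024-by-768-at-75Hz numbers above can be checked with simple arithmetic. This is a sketch that ignores horizontal/vertical blanking intervals, so a real CRT's pixel clock is somewhat higher than the estimate:

```python
# Checking the quoted 1024x768 @ 75 Hz numbers with simple arithmetic.
# This ignores horizontal/vertical blanking, so a real CRT's pixel clock
# is somewhat higher than the estimate below.

refresh_hz = 75
lines = 768
pixels_per_line = 1024

frame_time_ms = 1000.0 / refresh_hz           # one full top-to-bottom pass
line_time_us = frame_time_ms * 1000 / lines   # one horizontal sweep
pixel_rate_mhz = refresh_hz * lines * pixels_per_line / 1e6

print(f"frame: {frame_time_ms:.2f} ms")       # 13.33 ms
print(f"line: {line_time_us:.2f} us")         # 17.36 us
print(f"pixel rate (no blanking): {pixel_rate_mhz:.1f} MHz")  # 59.0 MHz
```

So at 75Hz the beam repaints the whole screen roughly every 13 ms, which is exactly the kind of interval a camera shutter can catch mid-draw.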
Please note that there are no bars in this photo.
Well, I can play my 360 at 1080p just fine, and other games on my PC in XP at 1080p just fine.
It's just when I use Vista at 1080p that I have to enable vsync most of the time. But if I don't enable vsync and just use 720p, the FPS seems better and there is no tearing.
That picture looks like it was taken with a digital camera; most modern digital cameras capture at a fast enough rate that you don't see the bars. I was mainly talking about video recording, not a still camera.
The funny thing is, most TV and movies don't break 60 FPS; most don't even break 30 FPS.
So I don't think anyone runs around saying "OMG THAT TV PROGRAM (or movie) LAGS."
My Rig can do EpicFLOPs, Can yours?
Once this baby hits 88 TeraFLOPs, You're going to see some serious $@#%....
Build XT7 is currently active.
Current OS Systems: Windows 10 64bit
|-------Conner-------|
RIP JimmyMoonDog
2,147,222 F@H Points - My F@H Statistics:
http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530
pfft, I remember having to stick in those old-school floppy disks to play this ALF game back in the late '80s, before the NES.
Asus Striker Extreme, BIOS 1305
Intel Core2Duo E6750 @ 3.5Ghz
Crucial Ballistix Tracer 2x1GB PC2-8500 1050Mhz 4-4-4-8
Zalman CNPS9700
WD 2x250 7200rpm Raid0
2xBFG 8800gt oc SLI
OCZ 850W GameXStream PSU
Raidmax Smilodon
The theory of threading
Implementing threading is not an easy thing to do, by any means. The nature of a game is that it really wants to use 100% of the CPU. As Tom Leonard, Valve's multi-threading guru, points out, "If you're not at 100%, you're doing a disservice to the game."
Now that's something for Crytek to take note of...
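The idea in the quote can be sketched as fanning independent per-frame subsystem tasks out to a pool of worker threads. The subsystem names and workloads below are hypothetical stand-ins, and a real engine would use native threads (Python's GIL limits CPU-bound parallelism), so this only illustrates the structure:

```python
# A minimal sketch of "keep the CPU busy": fan independent per-frame
# subsystem tasks out to a worker pool and join them before presenting.
# Subsystem names and workloads are hypothetical stand-ins; a real engine
# would use native threads (Python's GIL limits CPU-bound parallelism).
from concurrent.futures import ThreadPoolExecutor

def simulate_physics(dt):
    return sum(i * dt for i in range(1000))   # stand-in workload

def update_ai(dt):
    return sum(i + dt for i in range(1000))   # stand-in workload

def mix_audio(dt):
    return dt * 48000                         # samples to mix this frame

def run_frame(pool, dt=1.0 / 60):
    # Independent subsystems run concurrently within one frame; the frame
    # is only "presented" once every result has been joined.
    futures = [pool.submit(fn, dt) for fn in (simulate_physics, update_ai, mix_audio)]
    return [f.result() for f in futures]

with ThreadPoolExecutor(max_workers=4) as pool:
    print(run_frame(pool))
```

The hard part the quote alludes to is not spawning the threads but finding enough genuinely independent work each frame to keep every core saturated.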
| Intel Core i5 2500K | Asrock P67 Extreme 6 | Gskill Ripjaws 8GB CL7 |
| Sapphire HD6970 | Creative X-Fi modded | Corsair HX850 | Corsair H60 |
Valve ftw
All I know is that I can tell within the first 2 seconds of playing CS whether vsync is on. Then I have to quit, turn off vsync, and all is good again at 100fps.
Then that's going to be a coding error on the part of FRAPS or the codec you're using. Heck, it could even be due to the way Windows is reading the timing crystal and/or PLL on the motherboard... it could even be BIOS-related in that case.
The difference being pointed out is that gaming at an average of 30fps may mean peaks of 40-60fps and low points (large scenes or battles) of 10-15fps. Unplayable. Gaming at an average of 60fps may make the highs a bit lower, but the lows also pick up quite significantly, making for a playable and enjoyable experience. Also, it's sorta interesting.
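A quick way to see why an average hides the lows (the sample numbers below are made up for illustration):

```python
# Two runs with the same average fps can feel completely different.
# The sample numbers below are made up for illustration.

def fps_stats(samples):
    return sum(samples) / len(samples), min(samples), max(samples)

steady = [28, 30, 31, 29, 30, 32]   # a "30 fps" game that feels fine
spiky = [55, 50, 12, 10, 48, 5]     # same 30 fps average, unplayable lows

print(fps_stats(steady))  # (30.0, 28, 32)
print(fps_stats(spiky))   # (30.0, 5, 55)
```

Both runs report "30 fps average," but only the minimum tells you which one is playable.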
"and if you have gamed long enough you would know the difference."
I've gamed since I was 2. The only difference I see between 30 and 60 FPS is 1) when sprites are normally used, like gun flashes and such.
What are "jitters"? If you are referring to tearing, then that is due to the frames being output by the GPU and your monitor's refresh frequency not syncing. It can happen below the monitor's refresh rate and above it, and even when vertical synchronization is enabled... it just happens less with it enabled.

There is a difference between tearing and low FPS. It's because once a game gets 30 FPS minimum, the "jitters" will slowly go away over a small FPS gain. You guys remember when FEAR came out, right? It was just like Crysis on the hardware back then, too.
FRAPS video of this FPS, please?

Also, on Halo 1 for PC, on my CRT monitor at 1600x1200 I get about 2000-6000 FPS and there is no tearing.
That would be because the game programmers did well on the input side of the game's engine, though at your example of 20fps I can feel lag. Not all games are created equal.

Now, if you take Crysis and run it at a decent res, there will not be lag even if you're only getting 20 FPS.
What? Going from 20fps with no lag, then to 40fps with lag? You should pick a different subject; you're not doing well on this one.

But if you bump up the res to high/highest and somehow get 40 FPS, there will still be lag and/or tearing.
Last edited by STEvil; 12-28-2007 at 05:21 PM.
All along the watchtower the watchmen watch the eternal return.
I dunno actually, I get better fps somewhere than my friend with the same computer, and he has 1x 8800 Ultra, but on a 680i mobo, so I don't know :S
I don't say ATI is uber, I just loved the $300 price I had to pay for 2x HD3870 Full-R Sapphire cards...
And versus an 8800 Ultra... I think I got more bang for the buck...
My *Old* rig:
Intel Core 2 Quad 6600 G0 @ 3.6ghz
Asus Rampage x48
8 GB Kingston DDR2-RAM
GTX 480 AMP!
Asus Xonar D2
Corsair Force3 120GB (or something)
Loads of other drives ^^
Silverstone Fortress 2
Powered by Enermax Galaxy Dx 1000W
Last edited by XS2K; 12-28-2007 at 05:33 PM.
I'm not talking about tearing, I'm talking about lag.
You doubt it?
Yes, I know that. I can't feel much lag.
You're clearly not understanding, so technically you're not doing well on this subject.
When I'm running at 1280x720 with my cfg and 4xAA, it gives me the best visual performance, with a reported 17-38 FPS.
But if I run it at 1920x1080 with my cfg and no AA, it gives me generally poor performance, yet I still get the same FPS range as I did at the 720 res. And it's not tearing; I tested with vsync on and off and still get the same problems.
I have a digital SLR; the shutter was at 1/60 of a second and the monitor's refresh rate is 85Hz. At 70Hz, you get a light bar on the screen.
I can slow the shutter speed down and get the same result.
It does the same thing with my analog Hi-8 Handycam.
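The shutter-versus-refresh interaction described above comes down to whether the exposure covers a whole number of refreshes. A quick sketch using the post's numbers (1/60 s shutter, various refresh rates):

```python
# The band shows up when the exposure covers a non-integer number of
# refreshes: some region of the tube is always caught mid-draw.
# Numbers follow the post above: 1/60 s shutter, various refresh rates.

def refreshes_captured(shutter_s, refresh_hz):
    return shutter_s * refresh_hz

for refresh_hz in (60, 70, 85, 100):
    n = refreshes_captured(1 / 60, refresh_hz)
    print(f"{refresh_hz} Hz: {n:.3f} refreshes per exposure"
          f" (partial fraction {n % 1.0:.3f})")
```

At 60Hz the exposure spans exactly one refresh, so the whole screen gets an even dose of light; at 85Hz it spans about 1.417 refreshes, leaving a partial sweep, which is the visible band.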
Most TVs are still doing the same thing, though: 24 frames per second with a refresh rate of only 60Hz for NTSC and 50Hz for PAL in Germany, etc. Remember the pixelation problems the first LCDs had?
All I'm saying is that the beam of light scanning across the screen can be seen, while the whole screen being refreshed at once can't be seen as easily. You're right about the frames per second on TV, BUT remember, it is motion-blurred. 99% of video games are NOT. Motion blur is just a transition from one frame to the next that helps fool the brain into thinking there is no break between frames. TV's slower 24FPS is constant as well. If an average of 30 and an average of 60 feel different, it's because a 30 FPS average might mean a minimum of 10 and a maximum of 50. Now do you think you couldn't tell the difference between 10 and 50 FPS?
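The motion blur being described, as a toy sketch: pixels are plain brightness values, and the 0.5 persistence weight is an arbitrary choice for illustration:

```python
# A toy version of the motion blur described above: instead of cutting
# abruptly from frame A to frame B, display a weighted blend so one frame
# smears into the next. Pixels here are plain brightness values (0-255);
# the 0.5 persistence weight is an arbitrary choice for illustration.

def blend_frames(prev, curr, weight=0.5):
    """Blend two frames; `weight` is how much of the previous frame persists."""
    return [round(weight * p + (1 - weight) * c) for p, c in zip(prev, curr)]

frame_a = [0, 0, 255, 0]   # bright object at position 2
frame_b = [0, 0, 0, 255]   # the object has moved to position 3

print(blend_frames(frame_a, frame_b))  # [0, 0, 128, 128] -- a dim trail
```

The moving object shows up in both positions at half brightness, which is the "transition between frames" that helps the brain smooth over the cut.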
I'm 50. I remember when the big deal was going from 280 lines to a "hi-def" 400 to 440 lines, LOL! That's 440i, not the better 440p.
This post above was delayed 90 times by Nvidia. 'Cause that's their thing; that's what they do.
This announcement of the delayed post above has been brought to you by Nvidia Inc.
RIGGY
case:Antec 1200
MB: XFX Nforce 750I SLI 72D9
CPU:E8400 (1651/4x9) 3712.48
MEM: 4GB G.Skill DDR2-1000 (5-5-5-15)
GPU: NVIDIA GTX260 EVGA SSC (X2 in SLI) both 652/1403
PS:Corsair 650TX
OS: Windows 7 64-bit Ultimate
--Cooling--
5x120mm 1x200mm
Zalman 9700LED
Displays: Samsung LN32B650/Samsung 2243BWX/samsung P2350