I thought this thread was about nehalem?
Enough about LCDs, CRTs, frame rates and refresh rates. Take your infinite ignorance elsewhere.
People are just too anal about lots of things and tend to discuss them to the point of tears (of boredom) :/
Who cares...
Man... take into consideration the rendering engine of the game, too.
Most games are designed to finish an animation within a certain number of frames, simply because... it's easier to do that. If the render rate is faster or slower than expected, things go out of sync, and either the game feels very sluggish or it runs too fast. (A famous problem with 2D games. :) I'd say quite a few Flash developers have run into this as well.)
There are solutions for this, but a common one is to use a timer to measure how long the last frame took to render, then scale the animation accordingly. In other words... either delay the rendering of a frame, or skip a frame entirely.
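Just to make the timer-scaling idea concrete, here's a rough C++ sketch (not any real engine's code; the Player struct, the speed value, and the stop condition are made up purely for illustration):
Code:
#include <chrono>

// Player, speed, and the stop condition are made-up placeholders.
struct Player { double x = 0.0, speed = 120.0; };  // speed in units per second

int main() {
    using clock = std::chrono::steady_clock;
    Player player;
    auto previous = clock::now();

    while (player.x < 1000.0) {            // stop condition just so the sketch terminates
        auto now = clock::now();
        double dt = std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Scale movement by the measured frame time, so the animation covers
        // the same distance per second whether the loop runs at 30, 60, or
        // 200 iterations per second.
        player.x += player.speed * dt;

        // render(player);  // drawing omitted
    }
}
The point is that the player moves the same distance per second no matter how fast or slow the render loop spins.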
So anyway, what I am trying to say is that... a game will not run any faster or smoother than the frame rate it was designed to run at. And I don't think any game developer would be so crazy as to build a game that natively runs at over 60 frames per second.
As for the responsiveness of games, that doesn't exactly depend on the frame rate. It's entirely a developer's preference, and let's just say... depending on how the game is coded, it can feel more responsive or less responsive. Sure, a higher frame rate means a faster event-handling rate, which makes the game "feel" smoother, but if the event-handling rate was high to begin with (input is handled every time a frame finishes, for instance) then it would not feel any less responsive.
So in the end, it depends solely on the game engine and the way the developers use the timer. Your eyes are not seeing anything more than they can see. It's your body, or your other senses, that feels the smooth responsiveness of the game.
P.S.: And about tearing... or unresponsive periods, well, let's just say... it's arbitrary when you have to skip 1.123 frames or so, so in most cases only 1 frame (sometimes 2) is skipped, and sometimes that means the input handling is skipped as well. About the tearing: when the tick of the game timer is not synced to the tick of the LCD refresh timer, an incomplete frame is grabbed, like Jack said, and the frame appears torn (please excuse my Engrish).
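To make the fractional-frame point concrete, one common pattern is a fixed-timestep accumulator (just a sketch, not claiming any particular engine does exactly this; poll_input, update_game, and render are hypothetical placeholders):
Code:
#include <chrono>

int main() {
    using clock = std::chrono::steady_clock;
    const double step = 1.0 / 60.0;   // the frame rate the game was designed around
    double accumulator = 0.0;
    auto previous = clock::now();

    for (int i = 0; i < 600; ++i) {   // bounded loop so the sketch terminates
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // If 1.123 frames' worth of time has piled up, only one whole step runs
        // now and the leftover 0.123 stays in the accumulator. On a slow frame,
        // two or more steps run back to back, which is the "skipped frame"
        // (and skipped input poll) described above.
        while (accumulator >= step) {
            // poll_input();        // hypothetical: input handled once per simulated frame
            // update_game(step);   // hypothetical: logic advanced by one fixed step
            accumulator -= step;
        }
        // render();                // hypothetical: one picture drawn per loop pass
    }
}
Only whole steps ever get consumed, so 1.123 frames of accumulated time runs one update now and leaves the 0.123 remainder for a later pass.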
Metroid... you didn't read the article I linked. Unless you're in a branch of neuroscience we haven't heard about, the human eye does not work the way you are framing it. Pun intended.
http://whisper.ausgamers.com/wiki/in...an_eye_can_see
That was written by a guy who wanted to prove his point; it's not an established truth, and no concrete evidence was shown. I do believe some of the things he said, but it's far from even basic research. So I will keep believing that we can not see more than what the professionals call real time, until some miraculous person earns a PhD for research on this topic and proves that what the other professionals have found up to this very day was all wrong. That's not too hard, is it?
I will be waiting. As far as things go, I can say that more than 70 FPS is just a waste. If the monitor is 120 Hz, great, because it will show the same frame twice, which is easier on our eyes, but that does not mean you can see 120 frames at all. Right now movies run at 24 to 30 FPS, so monitors refresh the same image twice, making it smoother.
Like I said, show me the PhD award, the proposed topic, and all the evidence, and then I will believe.
Metroid.
Metroid.
For simplicity, I'll pose an illustration that shows the eye can obviously see more. No PhD needed.
1. You are in a pitch-black room. No light whatsoever.
2. I flash a bright spot light for 1/100 of a second.
3. Do you see the light? ....
That is what the Navy experiment shows: at upwards of 1/220th of a second, the eye can see an image and discern it, not just detect the change but identify it.
Back back back... there. On topic: I guess I wonder why people are still so surprised that the gaming results are on par with Yorkfield. If you look at the specific elements Intel chose to improve in their CPU design, you'd see that games don't yet take advantage of those types of computation. I don't think it's a fact that Nehalem doesn't improve game performance; a truer statement would be that, given the way games are currently coded, Nehalem's improvements are not fully utilized.
Can somebody post the video scores? I knew Nehalem wouldn't improve games much.
BUT I LOVE MY CRT AND I HATE ALL LCDS B/C MY CRT HAS A BETTER PICTURE QUALITY 'N SUM HIGHER REFRESH RATES 'N RLY GOOD DOT PITCH. LOLOLOLOLOLOLOLOLOL. I STILL PREFER TO USE A 90 POUND DISPLAY BECAUSE I'M SUCH AN IMAGE QUALITY ELITIST DOUCHE THAT I CAN'T BE BOTHERED WITH INFERIOR LCD TECHNOLOGY. ELECTRON GUNZ!
lol @ this nonsense.
Good thread. I am also thinking of going with the i7 (920). I have a good E6600 now and have some tax money coming, enough to get the i7, P6T, memory, and a TRUE heatsink. I could try to save it for the next gen, but yeah right, like that's going to happen, heh heh. As was mentioned, there are other factors, such as doing other things besides gaming. It holds its own with games right now, and I suspect it will be better equipped for future games. Another factor for me in this upgrade is that I would be moving my daughter from her X2 4800 setup to my existing E6600 setup, so she would be getting a nice upgrade in the process too.
The i7 is a lot faster; it's just that those games can't take advantage of it, and don't really need to in order to be playable.
CPUs just don't do a lot in these games, so it doesn't really matter how fast they are once they pass a threshold value.
A true measure of performance can be seen in the video editing section of the review.
Games are basically a GPU bench, and they aren't that good at comparing CPUs or memory.
Very poor choice of games to run
whoa, it took me the whole first page to realize how old this thread is.
Still? Hmmm, I would have thought that by now we'd at least see some gains of 4 vs 2 cores... I'm really worried that Intel and AMD lost it... they keep doubling their cores like there's no tomorrow when it actually doesn't help AT ALL 95% of the time...
Quote:
Looks like video games don't make a damn bit of difference with nehalem vs. penryn or 2 vs. 4 cores.
It's hard to really justify 4 cores, and we will have 8 cores, and even 16 virtual cores, within a couple of months...
Only if you're looking at games...
Most scientific apps I work with are multithreaded and they just love cores -> the more the merrier.
When people only surf and occasionally game, why even bother with a quad-core?
IMHO that's one of the reasons Intel's mainstream CPUs will still be dual-cores.
Well before reading this thread, I was pretty sure that I wanted to buy an i7 system. I currently have a very poor AMD 64 x2 4600 system.
I want to get a system that gives me the best bang for my buck. I am buying everything, from the case to the dvd burner, to the monitor.
From what I read here, if I am going to be MAINLY playing games, plus some other projects like folding, Microsoft Office, and Adobe, then maybe I shouldn't get the i7 system.
Maybe I should lean towards the new Phenom CPUs and put the other money into CrossFire or SLI.
Now I need to rethink my last selections, yet again. lol. I guess the adage is true: having too much knowledge can be a burden.