
Thread: 3.2G OC'd CORE i7 940 vs. 3.16Ghz Stock E8500 vs. 3.2Ghz Stock QX9770 Complete Review

  1. #76
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    I see your point, but who is buying a Nehalem system in order to play CPU-limited DX7 and DX8 games that already run in the hundreds of fps, though?

    The thing you *need* more performance for is current-day games, and if they're so GPU-bound at 12x10 on average, I'd say it's probably not worth the price to someone who wants gaming performance. Just my opinion though.

    Do you mean to imply that we don't know the CPU's gaming potential purely because they tested 12x10 with GPU-limited settings? If so, you might have problems in the future, because that trend is only going to continue, and the average resolution is going up, not down.

    Should they have bumped the resolution down to settings that no one plays at to gauge the CPU's "gaming performance"? Sure, if you're benchmarking by running a game at 800x600 you can call it "gaming performance", but it's not real-world gaming performance.

    I guess the fundamental question is ... what is "gaming performance"?
    Exactly ... to your rhetorical question ... meaning, who would buy this purely for gaming? (See my response to Glow above.)

    I own probably 200 games, from Doom to my most recent purchase, Spore ... I have played maybe 10 all the way through ... but I use them to stress and play with GPU, CPU, etc. combos ... it's just what I like to do.

    I use my computer mostly for other things, some modeling, NLE editing, etc. So Nehalem is looking pretty darn nice to me.

    EDIT: In terms of gaming 'performance', i.e. what is it ... that is in the eye of the beholder, in my opinion. The output that we measure to ascertain a performance number is frames per second ... which actually isn't even that: your monitor will only display 60 (in some cases 75 or 100) frames per second, and anything over 60 FPS is a waste. What you are actually measuring is the number of times the frame buffer is refreshed by the GPU. Even that is meaningless, because the performance that we 'measure' is a small slice of the overall game -- typically 30, 60 or 120 seconds of a game that should last several hours ... this is not very representative (statistically) of the population anyway.

    Jack
    Last edited by JumpingJack; 10-18-2008 at 10:18 PM.
    One hundred years from now it will not matter
    what kind of car I drove, what kind of house I lived in,
    how much money I had in the bank, nor what my clothes looked like ...
    But the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  2. #77
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Quote Originally Posted by JumpingJack View Post
    Precisely, if you are looking for a CPU for gaming and that is all you want your CPU for ... then the CPU in your signature line is plenty sufficient. However, if you are a hobbyist who likes to study the fundamental comp sci of the device, and how architectural differences play into the computational result ... the data set produced by this site (the gaming results, not the others) is worthless.
    Well, gaming is pretty demanding software, is it not? I'd say it's relevant; otherwise, how is anyone going to justify buying one of these if it can't run software better than the last gen?
    i3 2100, MSI H61M-E33. 8GB G.Skill Ripjaws.
    MSI GTX 460 Twin Frozr II. 1TB Caviar Blue.
    Corsair HX 620, CM 690, Win 7 Ultimate 64bit.

  3. #78
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Glow9 View Post
    Well, gaming is pretty demanding software, is it not? I'd say it's relevant; otherwise, how is anyone going to justify buying one of these if it can't run software better than the last gen?
    They are ... but at the moment (and the 'at the moment' clause is a different debate) games are the only software (with some notable exceptions) that depend upon two completely different computational resources ... and the most demanding part is the visual fidelity ... once the complete rendering pipeline moved off the CPU, the CPU became secondary to gaming performance ... this happened in the late 1990s/early 2000s (1998-2001-ish timeframe) ...

    The eye candy is what drives the current progress in games, and this is solely on the GPU.

    However, gaming is not the only software -- and in cases where the CPU is the only dependent variable, Nehalem is showing impressive 20-40% clock for clock boosts.

    Jack

  4. #79
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,476
    Quote Originally Posted by JumpingJack View Post
    However, gaming is not the only software -- and in cases where the CPU is the only dependent variable, Nehalem is showing impressive 20-40% clock for clock boosts.

    Jack
    Where and what? Cause if it's like winzip and some DVD ripper then I'm going to say this is definitely not worth the cost to upgrade lol

  5. #80
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    166
    Quote Originally Posted by Mats View Post
    the €i7 isn't made for gamers specifically.
    Quote Originally Posted by Macadamia View Post
    I see whut you did there
    Last edited by Alcibiades; 10-18-2008 at 10:40 PM.
    CoolerMaster Stacker 830SE|Antec Signature 850W|Gigabyte X58A-UD5 F5 slic2.1
    Intel Core i7 930 16x200@3,200Mhz|vcore 1.14|Intel Stock CPU Cooler
    GSKILL DDR3 Perfect Storm 2000 @6-6-6-16-1T 1600Mhz|ATI 5870 1024MB 850/1200
    Windows 7 Ultimate x64 bootdisk: Crucial RealSSD-C300 128GB SATA-III

  6. #81
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Glow9 View Post
    Where and what? Cause if it's like winzip and some DVD ripper then I'm going to say this is definitely not worth the cost to upgrade lol
    I flip for the 2500-buck version of Adobe Creative Suite ... the Master Collection. I will probably spend upward of 8000-10000 on toys this year alone.

  7. #82
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by JumpingJack View Post
    Exactly ... to your rhetorical question ... meaning, who would buy this purely for gaming? (See my response to Glow above.)

    I own probably 200 games, from Doom to my most recent purchase, Spore ... I have played maybe 10 all the way through ... but I use them to stress and play with GPU, CPU, etc. combos ... it's just what I like to do.

    I use my computer mostly for other things, some modeling, NLE editing, etc. So Nehalem is looking pretty darn nice to me.

    EDIT: In terms of gaming 'performance', i.e. what is it ... that is in the eye of the beholder, in my opinion. The output that we measure to ascertain a performance number is frames per second ... which actually isn't even that: your monitor will only display 60 (in some cases 75 or 100) frames per second, and anything over 60 FPS is a waste. What you are actually measuring is the number of times the frame buffer is refreshed by the GPU. Even that is meaningless, because the performance that we 'measure' is a small slice of the overall game -- typically 30, 60 or 120 seconds of a game that should last several hours ... this is not very representative (statistically) of the population anyway.

    Jack
    Well the other part of my question was "do you see a major benefit in everyday use for anything over current processors?" I don't think it's a noticeable performance increase at this point, because things are so fast as it is. Granted certain operations in workstation apps might process some data faster, but I can't think of much else that you'd benefit from by having one of these processors when the GPU already does some of the things better. Granted there aren't many GPGPU consumer apps out there right now, but I'm just speaking in theoreticals.. in terms of "the device best suited for workload x."

    As for your claim that a monitor is unable to display more than 60 frames a second, that's actually wrong.

    It's true that if you have v-sync on you'll only see 60 fps, exactly, at maximum (1 frame per vertical blank period).

    But with v-sync off, if you have 120fps on a 60Hz monitor, you see parts of multiple frames.

    By this I mean that the vertical refresh is not instantaneous.. it will write out whatever is in the front buffer of the swap chain, at the time of the pixel being lit on that refresh pass.

    So let's say you have 1 frame ready to display. Your monitor starts displaying that frame 1 pixel at a time, working from left to right, and moving down row by row, displaying what's in the buffer. If the next frame is done and presented when the pixel being refreshed is halfway down the monitor in that pixel refresh process, the second half of that monitor's pixels will show the contents of this new frame, so you get actual visual feedback on your position/environment in-game faster, instead of having to wait for the next refresh cycle to see *any* of that new frame.

    Now whether you can turn this faster visual feedback into a meaningful response/reaction in-game is a different story, since it's all happening in a very short period

    If you had 180FPS you'd see roughly the top third of your monitor with data from frame 1, the middle third displaying data from frame 2, and the bottom third with data from frame 3. With Vsync on you would've only seen frame 1 and had to wait until it was done displaying the whole thing before the refresh moved back to the top of the screen, at which point it would've displayed frame 4, and you'd have never seen 2 or 3.
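
    To make that concrete, here's a rough Python sketch of the idea -- the 1080-row panel, uniform scan-out timing and perfectly steady frame times are illustrative assumptions on my part, not measurements:

[CODE]
# Rough sketch: which rendered frame feeds each scanline during one
# 60 Hz refresh pass with v-sync off. Assumes (purely for illustration)
# a 1080-row panel scanned top to bottom at a uniform rate, and a game
# presenting frames at a perfectly steady interval.

REFRESH_HZ = 60
ROWS = 1080

def frames_in_one_refresh(fps):
    scan_time = 1.0 / REFRESH_HZ              # one top-to-bottom pass
    frames = []
    for row in range(ROWS):
        t = (row / ROWS) * scan_time          # when this row is read out
        frames.append(int(t * fps))           # newest frame presented by t
    return frames

for fps in (60, 120, 180):
    visible = sorted(set(frames_in_one_refresh(fps)))
    print("%d fps -> frames visible in one pass: %s" % (fps, visible))

# 60 fps  -> [0]        (one frame per pass)
# 120 fps -> [0, 1]     (top half one frame, bottom half the next)
# 180 fps -> [0, 1, 2]  (roughly thirds, as described above)
[/CODE]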

    My point is that your statement about a monitor not displaying more than 60 frames in a second is false.
    Last edited by Sr7; 10-18-2008 at 10:49 PM.

  8. #83
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    Well the other part of my question was "do you see a major benefit in everyday use for anything over current processors?" I don't think it's a noticeable performance increase at this point, because things are so fast as it is. Granted certain operations in workstation apps might process some data faster, but I can't think of much else that you'd benefit from by having one of these processors when the GPU already does some of the things better. Granted there aren't many GPGPU consumer apps out there right now, but I'm just speaking in theoreticals.. in terms of "the device best suited for workload x."

    As for your claim that a monitor is unable to display more than 60 frames a second, that's actually wrong.

    It's true that if you have v-sync on you'll only see 60 fps, exactly, at maximum (1 frame per vertical blank period).

    But with v-sync off, if you have 120fps on a 60Hz monitor, you see parts of multiple frames.

    By this I mean that the vertical refresh is not instantaneous.. it will write out whatever is in the front buffer of the swap chain, at the time of the pixel being lit on that refresh pass.

    So let's say you have 1 frame ready to display. Your monitor starts displaying that frame 1 pixel at a time, working from left to right, and moving down row by row, displaying what's in the buffer. If halfway down in that pixel refresh process the next frame is done and presented, the second half of that monitor's pixels will show the contents of this new frame, so you get actual visual feedback on your position/environment in-game faster, instead of having to wait for the next refresh cycle to see *any* of that frame.

    Now whether you can turn this faster feedback into a meaningful response/reaction is a different story, since it's all happening in a very short period.

    If you had 180FPS you'd see roughly the top third of your monitor as frame 1, the middle third as frame 2, and the bottom third as frame 3. With Vsync on you would've only seen frame 1 and had to wait until it was done displaying the whole thing before the refresh moved back to the top of the screen, at which point it would've displayed frame 4.

    My point is that your statement about not seeing more than 60 frames in a second is false.
    Wow ... you are talkative ... this is my last tonight, gotta get in bed.

    In terms of noticeable difference ... again, that is in the eye of the beholder. Ironically, just a few days ago I was using Lightroom on a dual-core rig for some quick touch-ups and got very annoyed ... (this was an X6800) ... which at the time was a darn fast rig, but I was annoyed because it wasn't as snappy or responsive ... my quad does it so much faster, and I notice it. Also, when I am transcoding or importing from a different video format into Premiere, or for simple routine stuff -- Pinnacle Studio 11 -- it is very noticeable, especially when it takes 2x longer to build the DVD ... but see, this is me, I prefer it faster ... you may not be doing this level of computing.

    In terms of GPGPU, this is really a different topic, a different thread for debate -- but I don't see CUDA, for example, really taking off -- for a few reasons which I will not elaborate on ... my personal opinion is that CUDA/nVidia will be victimized very much like AMD victimized Itanium.

    For your video monitor commentary, I am correct on this; you should do some more research ... google is your friend ... video monitors refresh the entire screen at typically 60 Hz; that means each pixel within the field is updated in tandem 60 times (progressive), that is, 60 frames in one second. Some monitors support higher, but the human brain cannot distinguish between individual frames above about 20 FPS ... if you run the game full bore, your 60 Hz refresh will often capture the frame buffer mid-refresh since they are not synced -- this is what creates the lines and tearing in the image ... Vsync gives the smoothest, most enjoyable gameplay because each refresh of the monitor coincides with a completed frame buffer, i.e. that is what it means to be synced. We may be saying the same thing but differently ...

  9. #84
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by JumpingJack View Post
    For your video monitor commentary, I am correct on this; you should do some more research ... google is your friend ... video monitors refresh the entire screen at typically 60 Hz; that means each pixel within the field is updated in tandem 60 times (progressive), that is, 60 frames in one second. Some monitors support higher, but the human brain cannot distinguish between individual frames above about 20 FPS ... if you run the game full bore, your 60 Hz refresh will often capture the frame buffer mid-refresh since they are not synced -- this is what creates the lines and tearing in the image ... Vsync gives the smoothest, most enjoyable gameplay because each refresh of the monitor coincides with a completed frame buffer, i.e. that is what it means to be synced. We may be saying the same thing but differently ...
    Sorry, that's not correct. If what you're saying were correct, you'd never see tearing on an LCD, which is obviously not the case. Tearing comes from seeing part of an old frame at the same time as part of a new frame, and this is what happens on LCDs just the same as CRT monitors.

    The per-pixel update is done progressively, not simultaneously.. or perhaps you can explain just how horizontal tearing shows up on an LCD?

    What you're describing is the V-sync specific case, where the monitor only displays a single frame per refresh cycle.

    There is no such thing as catching the frame buffer mid-refresh because the flip is done by swapping pointers to the buffer.. which is instantaneous. It's not filling in the buffer. The front buffer in the swap chain (which is all that the monitor sees.. it's what is actually displayed) only ever contains full frames because of the swapping behavior. It's not a "filling" process.
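
    For what it's worth, here's a toy sketch of that flip -- the names and structure are made up for illustration, not any real driver's API:

[CODE]
# Toy illustration (made-up names, nothing like a real driver API):
# "presenting" a frame swaps which buffer scan-out reads from. No pixel
# data is copied, so the front buffer always holds one complete frame.

class SwapChain:
    def __init__(self, width, height):
        self.front = bytearray(width * height)  # what the monitor reads
        self.back = bytearray(width * height)   # what the GPU renders into

    def present(self):
        # The flip: swap two references. Effectively instantaneous --
        # there is no window where the front buffer is half-written.
        self.front, self.back = self.back, self.front

chain = SwapChain(1920, 1080)
# ... render the next frame into chain.back ...
chain.present()  # tearing comes from *when* scan-out samples the front
                 # buffer mid-pass, not from a partially written buffer
[/CODE]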

    If what you were saying were true, that would mean that the bottom of the torn monitor image were part of the *old* frame (i.e. you're saying the buffer being updated hasn't gotten to that lower part of the buffer yet), when in fact it's actually part of the most recently processed frame.

    While Vsync gives the cleanest image, it results in skipped frames (frames that get processed but never displayed) and adds input latency, though it's below the threshold of anything I could ever notice. If you don't believe me, ask a seriously high level competitive gamer if they'd ever dream of playing with v-sync on.
    Last edited by Sr7; 10-18-2008 at 11:13 PM.

  10. #85
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    Sorry, that's not correct. If what you're saying were correct, you'd never see tearing on an LCD, which is obviously not the case. Tearing comes from seeing part of an old frame at the same time as part of a new frame, and this is what happens on LCDs just the same as CRT monitors.

    The per-pixel update is done progressively, not simultaneously.. or perhaps you can explain just how horizontal tearing shows up on an LCD?

    What you're describing is the V-sync specific case, where the monitor only displays a single frame per refresh cycle.
    Dude ... refresh rate is the number of times the monitor updates the image on screen, regardless of the condition of the frame data being fed to it ... that is what refresh rate means. If the frame buffer refreshes faster than the refresh rate of the monitor, you will see distorted images (tearing) because the image frame is inconsistent with the monitor frame; if the frame buffer refreshes slower than the refresh rate, you will see the same effect.

    Progressive and simultaneous are the same thing. LCDs are not scanned like CRTs, nor are they interlaced; by definition they are progressive.

  11. #86
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by JumpingJack View Post
    Dude ... refresh rate is the number of times the monitor updates the image on screen, regardless of the condition of the frame data being fed to it ... that is what refresh rate means. If the frame buffer refreshes faster than the refresh rate of the monitor, you will see distorted images (tearing) because the image frame is inconsistent with the monitor frame; if the frame buffer refreshes slower than the refresh rate, you will see the same effect.

    Progressive and simultaneous are the same thing. LCDs are not scanned like CRTs, nor are they interlaced; by definition they are progressive.
    Obviously an image is being displayed a set number of times per second. The question we are debating is how it is doing it ... instantaneously and simultaneously, or serially. If it were truly instantaneous, you'd see no tearing.

    The tearing artifact does not exist in the front buffer of a graphics device's swap chain, which is what goes to the monitor. So since the artifact doesn't exist there, where else do you suppose the tearing comes from? It's obviously introduced somewhere after that ... it comes from the LCD display.

    Trust me on this one.

    Read here for an explanation of why LCDs still essentially "refresh" serially ... it's because they serially read from the framebuffer (at a rate that is also, not coincidentally, 60Hz):

    http://www.hardforum.com/showpost.ph...8&postcount=21
    Last edited by Sr7; 10-18-2008 at 11:29 PM.

  12. #87
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    Obviously an image is being displayed a set number of times per second. The question we are debating is how it is doing it ... instantaneously and simultaneously, or serially. If it were truly instantaneous, you'd see no tearing.

    The tearing artifact does not exist in the front buffer of a graphics device's swap chain, which is what goes to the monitor. So since the artifact doesn't exist there, where else do you suppose the tearing comes from? It's obviously introduced somewhere after that ... it comes from the LCD display.

    Trust me on this one.
    We can confine our discussion to LCDs, since CRTs are out. However, in the analog world, CRTs raster the image to the screen pixel by pixel, such that the RAMDACs scan the framebuffer in video RAM, produce the signal, and then match the frequency to the refresh rate necessary to produce the image.

    In LCDs, the frame is also scanned from the frame buffer; that data is sent to the LCD, where the video processor in the LCD screen assembles the frame and all the pixels update simultaneously. LCDs accept interlaced or progressive scanned data, but build the image and display it all at once, all lines ... google LCD and you will find multiple references saying that all LCD displays are progressive.

    The tearing etc. that occurs when the refresh rate is not synced happens because the frame is grabbed from the frame buffer as the frame buffer is rastered; if the buffer is read mid-refresh (for example, exactly 1/2 refreshed) ... the image will be sent complete to the monitor with 1/2 of one image frame and the other 1/2 of the new frame ... the artifacts come from grabbing the frame buffer out of sync, mid-refresh.

    I am sorry, you are a good fella -- but I cannot trust you on this because I have done the research... if you like I will begin linking up the patents and technical documentation that support my case.

    Jack

  13. #88
    Xtreme Enthusiast
    Join Date
    Aug 2008
    Posts
    577
    And this has what to do with the 3.2G OC'd CORE i7 940 vs. 3.16Ghz Stock E8500 vs. 3.2Ghz Stock QX9770 Complete Review? Not being nasty, just following both of the arguments here and trying to figure out how the thread got to arguing about the finer details of screen 'tearing' on LCDs.
    --Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))

  14. #89
    Xtreme Mentor
    Join Date
    May 2005
    Location
    Westlake Village, West Hills
    Posts
    3,046
    Quote Originally Posted by Macadamia View Post
    HL2 EP2 graph is


    ROFLMAOMFGWTFBBQSAUSAGE
    lmao, well said lol
    PC Lab Qmicra V2 Case SFFi7 950 4.4GHz 200 x 22 1.36 volts
    Cooled by Swiftech GTZ - CPX-Pro - MCR420+MCR320+MCR220 | Completely Silent loads at 62c
    GTX 470 EVGA SuperClocked Plain stock
    12 Gigs OCZ Reaper DDR3 1600MHz) 8-8-8-24
    ASUS Rampage Gene II |Four OCZ Vertex 2 in RAID-0(60Gig x 4) | WD 2000Gig Storage


    Theater ::: Panasonic G20 50" Plasma | Onkyo SC5508 Processor | Emotiva XPA-5 and XPA-2 | CSi A6 Center| 2 x Polk RTi A9 Front Towers| 2 x Klipsch RW-12d
    Lian-LI HTPC | Panasonic Blu Ray 655k| APC AV J10BLK Conditioner |

  15. #90
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Stukov View Post
    And this has what to do with 3.2G OC'd CORE i7 940 vs. 3.16Ghz Stock E8500 vs. 3.2Ghz Stock QX9770 Complete Review? Not being nasty, just following both the arguments here and trying to figure out how the thread got to be arguing about the finer details of screen 'tearing' on LCDs?
    lol sorry. We got here by his contesting my argument on the need for these new CPUs, since he said "you can only see 60fps on a monitor with 60Hz refresh rate anyway" which I wanted to correct.

    JJ, while you're right that LCDs don't serially light pixel by pixel, you're forgetting that they serially read pixel by pixel, at the same speed as the refresh rate. So effectively it's displaying pixel by pixel, leading to the tearing. There's no tearing in the front buffer, I promise you.

    If you don't want to agree with me that's fine, but I am positive on this one dude

    My last post I swear!
    Last edited by Sr7; 10-18-2008 at 11:36 PM.

  16. #91
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Sr7 View Post
    lol sorry. We got here by his contesting my argument on the need for these new CPUs, since he said "you can only see 60fps on a monitor with 60Hz refresh rate anyway" which I wanted to correct.

    JJ, while you're right that LCDs don't serially light pixel by pixel, you're forgetting that they serially read pixel by pixel, at the same speed as the refresh rate. So effectively it's displaying pixel by pixel, leading to the tearing. There's no tearing in the front buffer, I promise you.

    If you don't want to agree with me that's fine, but I am positive on this one dude

    My last post I swear!
    My last post too ... and I am positive as well -- positive that you are wrong about this. Refresh rate is the rate at which the display refreshes the image ... the frame rate, in terms of what we measure, is the number of times the frame buffer (the frame) is updated ... at a 60 Hz refresh rate you will only see 60 updates on the monitor, regardless of the number of times the frame buffer is updated ... I will leave you with this: http://www.google.com/search?client=...UTF-8&oe=UTF-8
    Last edited by JumpingJack; 10-19-2008 at 08:55 AM.

  17. #92
    Xtreme Enthusiast
    Join Date
    Mar 2008
    Location
    Dallas, TX
    Posts
    965
    This is why I'm probably going to skip Nehalem and go with a hardcore dual-core system tricked out with the next generation of video cards ...
    "fightoffyourdemons"


  18. #93
    Xtreme Cruncher
    Join Date
    Oct 2006
    Location
    1000 Elysian Park Ave
    Posts
    2,669
    Oh dear.......don't get me started on crappy 60hz LCDs.....
    i3-8100 | GTX 970
    Ryzen 5 1600 | RX 580
    Assume nothing; Question everything

  19. #94
    I am Xtreme
    Join Date
    Feb 2005
    Location
    SiliCORN Valley
    Posts
    5,543
    My point is that your statement about not seeing more than 60 frames in a second is false.
    In terms of the human eye, it can only physically see 60 fps and no more; that is a scientific fact of the human body.
    MOST people cannot see the difference from 45 fps upward.

    Any GAMER can see the difference most of the time, and most here will go on a rampage about how you can see more than this or that; a few bold ones will even claim that they can see over 60 fps (which is physically impossible).
    60Hz monitors are perfect; you can't see anything different.
    The difference is solely in LCD vs. CRT and how it affects your eyes, but your eyes adjust themselves very quickly.
    "These are the rules. Everybody fights, nobody quits. If you don't do your job I'll kill you myself.
    Welcome to the Roughnecks"

    "Anytime you think I'm being too rough, anytime you think I'm being too tough, anytime you miss-your-mommy, QUIT!
    You sign your 1248, you get your gear, and you take a stroll down washout lane. Do you get me?"

    Heat Ebay Feedback

  20. #95
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Quote Originally Posted by Lestat View Post
    In terms of the human eye, it can only physically see 60 fps and no more; that is a scientific fact of the human body.
    And that is where most people are wrong. You don't see frames. You can't see frames. Your eye doesn't work using frames. Whatever comes after that statement is total BS.

    Seriously, who started this "the human eye can't see more than 60fps, period" thing?
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  21. #96
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Kingcarcas View Post
    Oh dear.......don't get me started on crappy 60hz LCDs.....
    I think we share the same opinion.

  22. #97
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Barack Hussein Obama-Biden's Nation
    Posts
    1,084
    I wish I never read this carp!

    --two awesome rigs, wildly customized with
    5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
    --SONY GDM-FW900 24" widescreen CRT, overclocked to:
    2560x1600 resolution at 68Hz!(from 2304x1440@80Hz)

    Updated List of Video Card GPU Voodoopower Ratings!!!!!

  23. #98
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Lestat View Post
    In terms of the human eye, it can only physically see 60 fps and no more; that is a scientific fact of the human body.
    MOST people cannot see the difference from 45 fps upward.

    Any GAMER can see the difference most of the time, and most here will go on a rampage about how you can see more than this or that; a few bold ones will even claim that they can see over 60 fps (which is physically impossible).
    60Hz monitors are perfect; you can't see anything different.
    The difference is solely in LCD vs. CRT and how it affects your eyes, but your eyes adjust themselves very quickly.
    Assembling frames to produce a moving image is an 'optical illusion' ... your brain cannot process information fast enough to rationalize a picture changing more than 15 or 20 times per second. Technically, you can't see even 30 FPS ... it will simply be a blur in time, hence, a moving image.

  24. #99
    Xtreme Monster
    Join Date
    May 2006
    Location
    United Kingdom
    Posts
    2,182
    Quote Originally Posted by JumpingJack View Post
    Assembling frames to produce a moving image is an 'optical illusion' ... your brain cannot process information fast enough to rationalize a picture changing more than 15 or 20 times per second. Technically, you can't see even 30 FPS ... it will simply be a blur in time, hence, a moving image.
    Yes, but what you are trying to do is a hard job; from time to time I try this as well, but then again there will be 30 of them who will not agree with anything you say. For them, higher FPS means amazing gameplay even if they cannot see 85% of the frames. They misunderstand response time versus FPS.

    A good example for the people who disagree is the way you move your hand: check how many frames your eyes can see when moving your hand at different speeds (slow, medium, fast). Some of you can try this to check whether you are human or not.

    If you can see all the frames when moving your hand fast, then we are ready to give you a place in the Skywalker family. I wonder if Luke or Anakin Skywalker could see more than 60 frames per second as well.

    Metroid.
    Last edited by Metroid; 10-20-2008 at 02:32 AM. Reason: Grammar errors :)

  25. #100
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Metroid View Post
    Yes, but what you are trying to do is a hard job; from time to time I try this as well, but then again there will be 30 of them who will not agree with anything you say. For them, higher FPS means amazing gameplay even if they cannot see 85% of the frames. They misunderstand response time versus FPS.

    A good example for the people who disagree is the way you move your hand: check how many frames your eyes can see when moving your hand at different speeds (slow, medium, fast). Some of you can try this to check whether you are human or not.

    If you can see all the frames when moving your hand fast, then we are ready to give you a place in the Skywalker family. I wonder if Luke or Anakin Skywalker could see more than 60 frames per second as well.

    Metroid.
    Your argument is a little bit bogus, because the push for the highest framerate is not about seeing more than 60 frames per second ... it's about seeing the next frame SOONER (i.e. reaction time) than you would have with v-sync on (because the absolute most recent data ends up reading into part of your current refresh cycle instead of waiting up to 33ms for the next refresh reading loop to begin).

    Whether having that updated frame data in the same refresh cycle is able to be perceived is a different question, but I can guarantee you you'll see tearing on your LCD. If you can perceive the tearing, you can perceive the difference, and could benefit by having faster reaction times to changes in the frames, even if you don't know it (obviously no one can sit there and say "oh i moved 16ms faster than I would have otherwise").
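
    Here's the back-of-the-envelope version of that latency argument -- my own toy model assuming a 60Hz panel and perfectly steady frame times, so treat the exact numbers as illustrative:

[CODE]
# Rough upper bound on how stale a scanline's source frame can be when
# it is drawn, under an assumed 60 Hz panel and steady frame times.

REFRESH_HZ = 60
refresh_ms = 1000.0 / REFRESH_HZ  # one scan-out pass, ~16.7 ms

def max_staleness_ms(fps, vsync):
    frame_ms = 1000.0 / fps
    if vsync:
        # a finished frame can wait for the next vblank, and the bottom
        # scanlines draw a full pass after that
        return frame_ms + refresh_ms
    # v-sync off: every scanline reads the most recently presented
    # frame, which is never older than one render interval
    return frame_ms

for fps in (60, 120, 180):
    print("%3d fps: vsync on <= %.1f ms, vsync off <= %.1f ms"
          % (fps, max_staleness_ms(fps, True), max_staleness_ms(fps, False)))

# At 60 fps with v-sync on, the bound is the ~33 ms mentioned above;
# at 180 fps with v-sync off, parts of the screen are only ~5.6 ms old.
[/CODE]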

    Explaining how this works to you guys would be a lot easier with an animated gif or something

    That said, I think we should discuss this in a dedicated thread. If you want to continue that discussion feel free to start the thread.
    Last edited by Sr7; 10-19-2008 at 11:42 PM.

