
Thread: GT300 new leaks?

  1. #76
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    There's a site that thoroughly debunks the 30 fps myth; I'm surprised it hasn't been posted yet:

    http://www.100fps.com/how_many_frame...humans_see.htm


    There's a better one, but I can't seem to find it, hold on.

    Oh well, I can't find it.


    Besides, Crysis at 60 fps looks and plays far better than it does at 30 fps, even though 30 fps is very playable.

  2. #77
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by hurleybird View Post
    Funny how no one can remember the days of CRTs. Can you see a CRT flickering @ 60Hz? If yes, then you can see 60FPS at the very least.
    The reason you can see a CRT flickering is that it physically scans the screen at that frequency, line by line. LCDs don't scan, so that removes a big part of the problem right there.

  3. #78
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    speculation and rumors...

  4. #79
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by trinibwoy View Post
    The reason you can see a CRT flickering is that it physically scans the screen at that frequency, line by line. LCDs don't scan, so that removes a big part of the problem right there.
    The comment was not about why; it was merely to demonstrate that people can see the flicker between refreshes at 60Hz.
    Now imagine the flicker at a 30Hz refresh, and yet people here are saying you would not notice the difference.
    Last edited by Final8ty; 08-19-2009 at 08:49 PM.

  5. #80
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by quake6 View Post
    And you (human eye) just can't see the difference between 30 and 60fps
    No.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  6. #81
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Final8ty View Post
    The comment was not about why; it was merely to demonstrate that people can see the flicker between refreshes at 60Hz.
    Now imagine the flicker at a 30Hz refresh, and yet people here are saying you would not notice the difference.
    Ah, gotcha.

  7. #82
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    So how much memory will the GT300 have? 1024MB or 2048?

    I hope it is a 2GB card...
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  8. #83
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Quote Originally Posted by Chumbucket843 View Post
    How could higher margins make them less money?
    I don't think that NVIDIA's GTX 285 has a higher margin than AMD's HD-4890 (big chip vs small chip). Or are we talking about different margins?

    For me: margin = (price - costs)/costs

    Or am I confusing things? I'm a mechanical engineer (or rather, will be in a few weeks' time) and we are not known for liking economics xD

    Which has the higher margin?
    Chip A:
    cost: 50 $
    sold at: 80 $

    Chip B:
    cost: 30 $
    sold at: 50 $
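
    As a rough Python sketch of that calculation (using the hypothetical $50/$80 and $30/$50 figures above; strictly speaking this formula is markup on cost, while gross margin in the accounting sense divides by price instead):

    Code:
    # Margin as defined above: (price - cost) / cost, for the two hypothetical chips.
    def margin(price, cost):
        return (price - cost) / cost

    chip_a = margin(price=80, cost=50)  # big, expensive chip
    chip_b = margin(price=50, cost=30)  # small, cheap chip

    print(f"Chip A: {chip_a:.1%}")  # 60.0%
    print(f"Chip B: {chip_b:.1%}")  # 66.7%

    So by this definition the cheaper chip B actually carries the higher margin, even though chip A makes more absolute profit per unit ($30 vs $20).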

    Quote Originally Posted by Chumbucket843 View Post
    For the making-money part, you can't say NVIDIA is going in the wrong direction. Have you heard of Tegra? The mobile market is a cash cow because laptops are cute and trendy.
    My comment was on the performance desktop chips only, not on the company in general. Honestly, I expected a little more interest from companies in Tegra. It is a very capable solution.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  9. #84
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by FischOderAal View Post
    For me: margin = (price - costs)/costs

    Or am I confusing things?
    Nope, that's correct.

  10. #85
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    112
    GT300 targets X13k pts?

    http://translate.google.com/translat...hl=en&ie=UTF-8

    If true, that would be a real monster.

  11. #86
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by saaya View Post
    spculation and rumors...
    Well, the ATI card details are out, so why not create a thread for Nvidia to balance things, right?
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  12. #87
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    I thought he was asking us to get back to the speculation and rumours.

  13. #88
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    I sure hope it's 2GB, as I can already tell my 1GB card is running out of VRAM in certain games (GTA4, Fallout 3 [with custom settings that make it look about 20x better], Crysis with texture packs that do the same thing).

  14. #89
    Registered User
    Join Date
    Jul 2009
    Location
    Switzerland
    Posts
    52
    One thing concerning the eye-and-fps story:
    - In a movie, the frames are distributed evenly across each second, so yes, there are 24 fps and each frame is 1/24 of a second "long".
    - In a game, the frame rate varies all the time. Even when the counter reads 24 fps, you may get the equivalent of 40 fps in the first half of a second and only 8 fps in the second half, which still averages to 24 fps; but during that 8 fps stretch you notice the lag, and that's the trick. When people say "I need at least 60 fps", it's because at roughly 60 fps (or above) so many frames are packed into each second that the chance of ever dropping below 30 is practically zero. I hope I explained it correctly; the important notion here is how the frames are distributed in time (see the quick sketch below).
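
    To make that concrete, here is a rough Python sketch (the 20-frame/4-frame split is just the 40 fps / 8 fps example above turned into frame counts, so treat the numbers as illustrative):

    Code:
    # Two hypothetical runs that both read "24 fps" on an fps counter,
    # but with very different frame pacing within the second.
    even_pacing   = [12, 12]  # frames rendered in each half of a second
    uneven_pacing = [20, 4]   # 40 fps-worth, then only 8 fps-worth

    for name, halves in [("even pacing", even_pacing), ("uneven pacing", uneven_pacing)]:
        average_fps = sum(halves)         # total frames over the full second
        worst_half_fps = min(halves) * 2  # slowest half-second, expressed as fps
        print(f"{name}: {average_fps} fps average, worst half-second feels like {worst_half_fps} fps")

    Same average, very different feel.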

    Concerning the GT300 now: for the X13k points in 3DMark Vantage, for comparison we have this:

    With an i7 920 @ 4GHz, that seems promising!
    Last edited by Wh!te Sh4dow; 08-22-2009 at 12:16 PM.
    Q6600 (VID : 1.325v ) || P5Q PRO || 3GB Corsair || 8800 GTS 512 || 1.5 TB HDD || Thermaltake 550W || Antec 900 || Logitech G15 || Logitech G3 || Pioneer HDJ-1000 Headphones || Razer mousepad

    WaterCooling System:
    MCP 655 || XSPC 360 Rad || HeatKiller 3.0 Cu || 3 x NoiseBlock BlackSilentFan XL2 rev 3.0 || and some tubes/fittings etc

  15. #90
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    Uh, that's wrong, White Shadow.


    The reason 24 fps doesn't seem laggy is that the frames are blurred into each other...

    That's why Crysis doesn't seem laggy at 30 fps while other games do: because of the extensive blurring of the image.


    Having 60 fps negates the need for that because of, as you said, the sheer number of frames being shown.

  16. #91
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    640
    Quote Originally Posted by orangekiwii View Post
    Uh, that's wrong, White Shadow.


    The reason 24 fps doesn't seem laggy is that the frames are blurred into each other...

    What the heck are you talking about... films are "blurred" between frames?

    Are you crazy? Have you never seen how films are shown in theatres? I'm not speaking of digital streaming or anything, but actual film projection.

    Films are nothing more than a series of stills, each projected onto the screen for 1/24 of a second. The film is stopped for 1/24 of a second while light is projected through the projector's aperture and the stopped frame onto the screen; the shutter then closes, the film advances one frame, the shutter opens with the next frame's image on the screen, and the process repeats.

    That's why you hear the clicking and whirring from film projectors when a film is shown: the rapid opening and closing of the projector's shutter, and the mechanism ratcheting the film forward one frame at a time, every 1/24 of a second.


    They don't blur the frames together; the sense of motion is provided by your brain and retina, which can be fooled into "seeing" smooth motion if a series of stills is shown fast enough and smoothly enough.

  17. #92
    Xtreme Enthusiast
    Join Date
    Jul 2006
    Posts
    756
    Quote Originally Posted by orangekiwii View Post
    Uh, that's wrong, White Shadow.


    The reason 24 fps doesn't seem laggy is that the frames are blurred into each other...

    That's why Crysis doesn't seem laggy at 30 fps while other games do: because of the extensive blurring of the image.


    Having 60 fps negates the need for that because of, as you said, the sheer number of frames being shown.
    What about 59 fps? Or is 60 the magic number?

    Crysis doesn't seem laggy at 30 fps because it ALWAYS puts your GPU under a huge amount of demanding work, whether physics processing or just rendering. You rarely get frame-rate dips (which cause "lag") because almost everything in Crysis, from the GPU's point of view, is about equally demanding, so 30 fps means a constant 30 fps (or close to it). Normally when I'm shopping for a new graphics card I look at the minimum fps in games; who cares if it can render 9001 fps if it hits minimums of 1 fps? (Extreme numbers, but point made, I hope.)
    I don't know exactly what a GPU does internally, but that's where the myth "30 fps in Crysis = 60 fps in other games" comes from. That, and the Crysis engine is RELATIVELY well optimized against dips.
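
    As a quick illustration of why the minimums matter more than the average (the frame-time traces below are completely made up):

    Code:
    # Hypothetical frame times in milliseconds for two cards over ten frames.
    card_x = [5] * 9 + [100]   # ~200 fps most of the time, but one 100 ms hitch
    card_y = [33] * 10         # steady ~30 fps, no dips

    def average_fps(times_ms):
        return 1000 * len(times_ms) / sum(times_ms)

    def minimum_fps(times_ms):
        return 1000 / max(times_ms)  # rate implied by the single slowest frame

    for name, trace in [("card X", card_x), ("card Y", card_y)]:
        print(f"{name}: average {average_fps(trace):.0f} fps, minimum {minimum_fps(trace):.0f} fps")

    Card X wins the average by a mile, yet it is the one that stutters.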
    My Rig:
    i7 3770k
    Sabertooth Z77
    MSI GTX680 reference
    2x8gb Mushkin Redline 8-8-8-18 @1600
    Corsair Neutron 256gb
    2x Toshiba 7200rpm 3TB(Raid 0)
    Corsair 600T White
    Corsair HX750
    Asus PCE-AC66
    Hanns-G 27"

  18. #93
    Registered User
    Join Date
    Jul 2007
    Posts
    59
    Another advantage of going above 30 FPS is reduced input lag. At an average of 30 FPS, each frame takes an average of 33.33 ms to render; at an average of 120 FPS, each frame takes 8.33 ms.
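
    That arithmetic in a couple of lines of Python (the 60 fps row is added here for comparison; frame time is only one component of total input lag, since the display and input polling add their own delay on top):

    Code:
    # Average time spent on each frame at a given average frame rate.
    def frame_time_ms(fps):
        return 1000.0 / fps

    for fps in (30, 60, 120):
        print(f"{fps:>3} fps -> {frame_time_ms(fps):6.2f} ms per frame")
    # 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 120 fps -> 8.33 ms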

    Personally I notice a difference in both input lag and smoothness between 30 and 60 FPS even in Crysis.
    i7 930 @ 4.3GHz|Ultra 120 Extreme|EVGA X58 SLI Classified|EVGA GTX 580 SLI|Intel X25-M G2 80GB + WD 1TB|6GB OCZ Platinum (3x2GB)|Antec TPQ 1200W|CM HAF 932|

  19. #94
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    Quote Originally Posted by Wh!te Sh4dow View Post
    Concerning the GT300 now: for the X13k points in 3DMark Vantage, for comparison we have this: img-snip...

    With an i7 920 @ 4GHz, that seems promising!
    Indeed...

    X13K would be fun.
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  20. #95
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    Frames are blurred in movies, films, and even TV shows...

    Why do you think that when you pause one... IT'S BLURRY?

    Seriously...

    http://en.wikipedia.org/wiki/Frame_rate

    Read down a bit... it's not a lot, but it's enough to make it seem fluid.

    Crysis has blurring, which lessens the impact of its lower fps compared to other games.

    Turn motion blur on and off in Crysis at 20 fps and see which looks more FLUID.

  21. #96
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by Talonman View Post
    Indeed...

    X13K would be fun.
    Sure would!

  22. #97
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Quote Originally Posted by orangekiwii View Post
    Crysis has blurring, which lessens the impact of its lower fps compared to other games.

    Turn motion blur on and off in Crysis at 20 fps and see which looks more FLUID.
    So why is it still fluid when motion blur is off?
    You were not supposed to see this.

  23. #98
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Humminn55 View Post
    They don't blur the frames together; the sense of motion is provided by your brain and retina, which can be fooled into "seeing" smooth motion if a series of stills is shown fast enough and smoothly enough.
    Film/TV frames are naturally blurred together due to the capture mechanism; it's not a series of discrete "stills" like in games.

  24. #99
    Xtreme Enthusiast
    Join Date
    Aug 2006
    Location
    Oxford, UK
    Posts
    747
    Lol, I can't believe that after all these years people are still having these FPS debates.
    || 2500K @ 5GHz 1 thread, 4.8 2 threads, 4.7 3, 4.6 4 1.284V ||
    || P8P67-M Pro || 8GB @ 2133MHz ||
    || 5850 @ 1000/1225 || XFX 650W || Silverstone FT03B ||
    || 37" LCD TV || CM Hyper 212+ || Samsung 2.1 Soundbar ||

  25. #100
    Xtreme Addict
    Join Date
    May 2003
    Location
    Peoples Republic of Kalifornia
    Posts
    1,541
    Quote Originally Posted by GoldenTiger View Post
    Yup, it's been proven scientifically that fighter pilots in the USAF have been able to tell 140FPS or more, per articles I read at the time of the tests. Many can tell 30-50 FPS, and a lot can tell higher than that with ease. The whole "eye can only see 30fps" is a myth that's been debunked long ago.
    Please be honest here and include in your "facts" that those pilots can distinguish such things because, in stressful combat situations with adrenaline pumping, the human brain goes into overdrive in its "fight or flight" mode, making time seem to slow down. I saw a demonstration where a digital screen flashed numbers by far faster than people can normally read them, yet when the same guy tried to read them while skydiving in free fall... he read them perfectly.

    "If the representatives of the people betray their constituents, there is then no resource left but in the exertion of that original right of self-defense which is paramount to all positive forms of government"
    -- Alexander Hamilton

