Page 3 of 5
Results 51 to 75 of 121

Thread: GT300 new leaks?

  1. #51
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    112
    Quote Originally Posted by jaredpace View Post
    Yah, if someone says they have an Evergreen card at this time, it's believable. But to say you have a GT300 card now... kinda BS. But who knows? Maybe some people have GT300 prototype cards of the A1 revision?
    Hm, but who has said that they have a GT300 card?

  2. #52
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Who knows...
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  3. #53
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    319
    very impressive specs, bring it on Nvidia
    So, are we expecting scores of 35,000+ in 3DMark 06
    and 25,000+ in 3DMark Vantage?
    2x Asus P8Z68-V PRO Bios 0501
    i7 2600K @ 4.6GHz 1.325v / i5 2500K @ 4.4GHz 1.300v
    2x G.SKILL Ripjaws X Series 8GB DDR3 1600
    Plextor M5P 256GB SSD / Samsung 840 Pro 256GB SSD
    Seasonic X-1050 PSU / SeaSonic X Series X650 Gold PSU
    EVGA GTX 690 (+135%/+100MHz/+200MHz/75%) / EVGA GTX 680 SC Signature+ (+130%/+80MHz/+200MHz/70%)


  4. #54
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    I can't wait to start a 300 Vantage recording thread over at EVGA...

    (After they make a 300 Section.)

    They will be some fun numbers to look at. Vantage is so GPU-dependent.

    It should be a good test to let the 300 shine, if it can...
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  5. #55
    Xtreme Member
    Join Date
    Nov 2007
    Location
    Russia
    Posts
    138
    My comp in Crysis

  6. #56
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Good. I wish NVIDIA success with GT300, and hope they don't get surprised by the red team again.

    Quote Originally Posted by Dainas View Post
    Hit the nail on the head, it is fun seeing people go crazy in hope of ATI finally breaking the status quo though. All known hints scream more of the same from both teams. Every time half the forums scream that ATI is going to blow up the performance crown, and every time Nvidia nips back the crown.
    Who cares if NVIDIA's chip is 10 percent faster but costs twice as much to make? (I'm exaggerating.) The performance crown is good for reputation (and boosts sales of the slower cards a bit, which is where the real money is made), but not necessarily for NVIDIA's wallet. And for a company, money is usually more important than reputation.

    That being said, my humble opinion is that GT300 might be the last big monolithic GPU made by NVIDIA. ATI's way is wiser from a business POV.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  7. #57
    Xtreme Enthusiast
    Join Date
    Jan 2008
    Posts
    614
    After having switched to ATI's X2, I'm thinking of going back to Nvidia. ATI's driver support sucks, the aftermarket heat sinks arrived so late, etc. If the same cycle repeats itself this time, I'll wait till the 5870 X2 drives GT300 prices way down and jump on it. By then, Accelero heatsinks should be ready soon after, and hopefully the drivers will be good enough for 8x/16x AA.
    Modded Cosmos. | Maximus II Formula. Bios 1307| 2x2 Mushkin XP ASCENT 8500 | Q9550-E0- 4.10 + TRUE | Visiontek HD4870X2 | LN32A550 1920x1080 | X-FI Extreme Gamer | Z5300E | G15v.1 | G7 | MX518 | Corsair HX1000 | X25-MG2 80G | 5xHDD
    ____________________________________
    Quote Originally Posted by saaya View Post
    most people don't care about OpenCL, PhysX, Folding@home and DirectCompute... they want cool explosions and things blowing up and boobs jumping around realistically...

  8. #58
    Xtreme Mentor
    Join Date
    Sep 2007
    Location
    Ohio
    Posts
    2,977
    Big monolithic GPUs are fun though...
    Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 /EVGA GTX 295 C=650 S=1512 M=1188 (Graphics)/ EVGA GTX 280 C=756 S=1512 M=1296 (PhysX)/ G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptor's RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and BlueRay Drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)

  9. #59
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,462
    Quote Originally Posted by RagzaroK View Post
    After having switched to ATI's X2, I'm thinking of going back to Nvidia. ATI's driver support sucks, the aftermarket heat sinks arrived so late, etc. If the same cycle repeats itself this time, I'll wait till the 5870 X2 drives GT300 prices way down and jump on it. By then, Accelero heatsinks should be ready soon after, and hopefully the drivers will be good enough for 8x/16x AA.
    Shouldn't have bought an X2 then.

    Quote Originally Posted by Talonman View Post
    Big monolithic GPUs are fun though...
    If you need the power, then yes, monolithic > CF/SLI every time. SLI and CF are a mess, imho.
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  10. #60
    Xtreme Member
    Join Date
    Jan 2008
    Location
    Lexington, KY
    Posts
    401
    Quote Originally Posted by RagzaroK View Post
    After having switched to ATI's X2, I'm thinking of going back to Nvidia. ATI's driver support sucks, the aftermarket heat sinks arrived so late, etc. If the same cycle repeats itself this time, I'll wait till the 5870 X2 drives GT300 prices way down and jump on it. By then, Accelero heatsinks should be ready soon after, and hopefully the drivers will be good enough for 8x/16x AA.
    You're blaming the lag time for the introduction of third-party coolers on ATI?

    Gaming Box

    Ryzen R7 1700X * ASUS PRIME X370-Pro * 2x8GB Corsair Vengeance LPX 3200 * XFX Radeon RX 480 8GB * Corsair HX620 * 250GB Crucial BX100 * 1TB Seagate 7200.11

    EK Supremacy MX * Swiftech MCR320 * 3x Fractal Venture HP-12 * EK D5 PWM

  11. #61
    Xtreme Member
    Join Date
    Dec 2004
    Location
    .ca
    Posts
    476
    GDDR5 + 512-bit sounds all good, but if I'm not mistaken, that is only the bandwidth between the memory and the GPU. If the GPU is slow, what is the benefit of that much bandwidth? If all this data cannot be processed by the GPU, then it's going to come off as a big marketing gimmick. Nvidia has already suffered from big, expensive chips, so I'm wondering what exactly they are planning. If the production process has matured, maybe they won't have as much trouble as they did with GT200.
    i9 9900K/1080 Ti
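The bandwidth point above can be put in rough numbers: theoretical peak memory bandwidth is just bus width times effective data rate. A minimal sketch; the clocks used here are illustrative assumptions, not confirmed GT300 specs:

```python
# Theoretical peak memory bandwidth = (bus width in bytes) * effective data rate.
# GDDR5 is quad-pumped per pin, so a 1000 MHz memory clock gives 4000 MT/s effective.

def peak_bandwidth_gbps(bus_width_bits, effective_rate_mts):
    """Peak bandwidth in GB/s for a given bus width (bits) and effective data rate (MT/s)."""
    return bus_width_bits / 8 * effective_rate_mts * 1e6 / 1e9

# Hypothetical 512-bit bus with GDDR5 at 4000 MT/s effective:
print(peak_bandwidth_gbps(512, 4000))  # 256.0 GB/s

# Compare a 512-bit GDDR3 bus at ~2200 MT/s effective (roughly GT200 territory):
print(peak_bandwidth_gbps(512, 2200))  # 140.8 GB/s
```

Whether the shader core can actually consume that much data is exactly the open question the post raises.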

  12. #62
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by FischOderAal View Post
    Good. I wish NVIDIA success with GT300, and hope they don't get surprised by the red team again.



    Who cares if NVIDIA's chip is 10 percent faster but costs twice as much to make? (I'm exaggerating.) The performance crown is good for reputation (and boosts sales of the slower cards a bit, which is where the real money is made), but not necessarily for NVIDIA's wallet. And for a company, money is usually more important than reputation.

    That being said, my humble opinion is that GT300 might be the last big monolithic GPU made by NVIDIA. ATI's way is wiser from a business POV.
    How could higher margins make them less money? Also, upholding a reputation (like being socially responsible) is a very important part of business. As for making money, you can't say Nvidia is going in the wrong direction. Have you heard of Tegra? The mobile market is a cash cow because laptops are cute and trendy.

  13. #63
    Xtreme Addict
    Join Date
    Aug 2008
    Location
    Hollywierd, CA
    Posts
    1,284
    Quote Originally Posted by Oberon View Post
    You're blaming the lag time for the introduction of third party coolers on ATI?

    I don't think he's blaming ATI; if you're into modding, having to wait for aftermarket support is a reason to look at other products.
    [SIGPIC][/SIGPIC]

    I am an artist (EDM producer/DJ), pls check out mah stuff.

  14. #64
    Xtreme Addict
    Join Date
    Nov 2007
    Location
    Illinois
    Posts
    2,095
    Quote Originally Posted by Talonman View Post
    Big monolithic are fun though...
    Really fun for Nvidia, I'll bet.
    E7200 @ 3.4 ; 7870 GHz 2 GB
    Intel's atom is a terrible chip.

  15. #65
    Banned
    Join Date
    Jul 2006
    Posts
    291
    Well, you don't have to be Nostradamus to come up with those six lines from above.
    If you had asked most people before this thread, "What specs do you reckon GT300 will have?", they'd have answered in a similar way, no?

  16. #66
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Andrew LB View Post
    I play Crysis just fine @ 1920x1200 with 8xAA and 8xAF with everything on "Very High".
    No, you don't. Perhaps you enjoy 15-20 fps; I don't know. But I certainly do not. As a general rule, I like my minimum framerate not to dip below 60 in FPS games.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  17. #67
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Lancaster, UK
    Posts
    473
    Quote Originally Posted by 003 View Post
    No, you don't. Perhaps you enjoy 15-20 fps; I don't know. But I certainly do not. As a general rule, I like my minimum framerate not to dip below 60 in FPS games.
    +1. This

  18. #68
    HWGurus
    Join Date
    Mar 2006
    Posts
    798
    And the human eye just can't see the difference between 30 and 60 fps anyway.

  19. #69
    Banned Movieman...
    Join Date
    May 2009
    Location
    illinois
    Posts
    1,809
    Quote Originally Posted by quake6 View Post
    And the human eye just can't see the difference between 30 and 60 fps anyway.
    I can, so don't be an ...

    A lot of people can. Just because you can't tell that 30 fps is choppy doesn't mean others can't. Especially in Source games, I can tell the difference between 60 and 100.

  20. #70
    I am Xtreme zanzabar's Avatar
    Join Date
    Jul 2007
    Location
    SF bay area, CA
    Posts
    15,871
    Quote Originally Posted by quake6 View Post
    And the human eye just can't see the difference between 30 and 60 fps anyway.
    You may not see the difference, but the computer works differently, and you can feel the difference as the input lag changes.
    5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
    samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
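The "feel" argument above comes down to frame time: at 30 fps each frame stays on screen twice as long as at 60 fps, and that delay adds directly to input latency even when the image looks similar. A quick sketch:

```python
# Frame time is the inverse of frame rate; it bounds how stale your input can get
# before the result is shown on screen.

def frame_time_ms(fps):
    """Time one frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 100):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 100 fps -> 10.0 ms:
# the extra ~17 ms at 30 fps shows up as input lag, which is what you "feel".
```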

  21. #71
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by stangracin3 View Post
    I can, so don't be an ...

    A lot of people can. Just because you can't tell that 30 fps is choppy doesn't mean others can't. Especially in Source games, I can tell the difference between 60 and 100.
    Yup, it's been shown scientifically that USAF fighter pilots can perceive 140 FPS or more, per articles I read at the time of the tests. Many people can tell 30 from 50 FPS, and a lot can tell higher than that with ease. The whole "the eye can only see 30 fps" claim is a myth that was debunked long ago.

  22. #72
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by quake6 View Post
    And the human eye just can't see the difference between 30 and 60 fps anyway.
    Funny how the people who always say that never have any research links to back it up... never.

    30 fps and 60 fps are like night and day to me.

    Play them on a repeat loop:
    www.echo147.adsl24.co.uk/temp/q3_30.avi (30 fps)
    www.echo147.adsl24.co.uk/temp/q3_60.avi (60 fps)
    www.echo147.adsl24.co.uk/temp/q3_120.avi (120 fps; you won't notice the difference on a 60 Hz LCD)


    http://amo.net/NT/02-21-01FPS.html
    Last edited by Final8ty; 08-19-2009 at 04:11 PM.

  23. #73
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by 003 View Post
    No, you don't. Perhaps you enjoy 15-20 fps; I don't know. But I certainly do not. As a general rule, I like my minimum framerate not to dip below 60 in FPS games.
    To be honest, it is not all that hard to get good performance in Crysis. I built a system for a buddy of mine using an overclocked i7 (over 4 GHz) and a pair of 1 GB 4870s in CrossFire, and it runs Crysis Warhead like a champ. Everything set to Very High with 8xAA at 1920x1080 runs anywhere between 20-60 fps, normally around 30-40 fps. Now, normally I agree that anything below 60 fps is not fun to use, but CryEngine 2 seems to run smoother at lower frame rates than most other engines (I think, anyway, although you can still kind of tell). I am willing to bet that with a high-clocked i7 and three overclocked GTX 285s in tri-SLI, you could get around 60 fps minimum frames with the settings he mentioned. SLI scales much, much better in Crysis than CrossFire does.
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  24. #74
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Funny how no one remembers the days of CRTs. Could you see a CRT flickering at 60 Hz? If yes, then you can see 60 FPS at the very least.

  25. #75
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Exactly!

