Who knows...
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P.), Juan J. Guerrero
Very impressive specs, bring it on NVIDIA.
So, are we expecting a score of 35,000+ in 3DMark 06
and 25,000+ in 3DMark Vantage?
2x Asus P8Z68-V PRO Bios 0501
i7 2600K @ 4.6GHz 1.325v / i5 2500K @ 4.4GHz 1.300v
2x G.SKILL Ripjaws X Series 8GB DDR3 1600
Plextor M5P 256GB SSD / Samsung 840 Pro 256GB SSD
Seasonic X-1050 PSU / SeaSonic X Series X650 Gold PSU
EVGA GTX 690 (+135%/+100MHz/+200MHz/75%) / EVGA GTX 680 SC Signature+ (+130%/+80MHz/+200MHz/70%)
I can't wait to start a 300 Vantage recording thread over at EVGA...
(After they make a 300 section.)
Those will be some fun numbers to look at. Vantage is so GPU-dependent.
It should be a good test to let the 300 shine, if it can...
Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 / EVGA GTX 295 C=650 S=1512 M=1188 (Graphics) / EVGA GTX 280 C=756 S=1512 M=1296 (PhysX) / G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptors in RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and Blu-ray drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)
Good. I hope GT300 is a success for NVIDIA and that they don't get caught off guard by ATI again.
Who cares if NVIDIA's chip is 10 percent faster but costs twice as much to make? (I'm exaggerating.) The performance crown is good for reputation (and helps sales of the slower cards a bit, which is where the real money is made), but not necessarily for NVIDIA's wallet. And for a company, money is usually more important than reputation.
That being said, my humble opinion is that GT300 might be the last big monolithic GPU made by NVIDIA. ATI's approach is wiser from a business point of view.
Notice any grammar or spelling mistakes? Feel free to correct me! Thanks
After having switched to ATI's X2, I'm thinking of going back to Nvidia. ATI's driver support sucks, the aftermarket heatsink arrived so late, etc. If the same cycle repeats itself this time, I'll wait until the 5870 X2 drives GT300 prices way down and jump on it. By that time, Accelero heatsinks should be ready soon after, and hopefully the drivers will be good enough for 8x/16x AA.
Big monolithic GPUs are fun though...
Asus Maximus SE X38 / Lapped Q6600 G0 @ 3.8GHz (L726B397 stock VID=1.224) / 7 Ultimate x64 / EVGA GTX 295 C=650 S=1512 M=1188 (Graphics) / EVGA GTX 280 C=756 S=1512 M=1296 (PhysX) / G.SKILL 8GB (4 x 2GB) SDRAM DDR2 1000 (PC2 8000) / Gateway FPD2485W (1920 x 1200 res) / Toughpower 1,000-Watt modular PSU / SilverStone TJ-09 BW / (2) 150 GB Raptors in RAID-0 / (1) Western Digital Caviar 750 GB / LG GGC-H20L (CD, DVD, HD-DVD, and Blu-ray drive) / WaterKegIII Xtreme / D-TEK FuZion CPU, EVGA Hydro Copper 16 GPU, and EK NB S-MAX Acetal Waterblocks / Enzotech Forged Copper CNB-S1L (South Bridge heat sink)
Gaming Box
Ryzen R7 1700X * ASUS PRIME X370-Pro * 2x8GB Corsair Vengeance LPX 3200 * XFX Radeon RX 480 8GB * Corsair HX620 * 250GB Crucial BX100 * 1TB Seagate 7200.11
EK Supremacy MX * Swiftech MCR320 * 3x Fractal Venture HP-12 * EK D5 PWM
GDDR5 + 512-bit sounds all good, but if I'm not mistaken that is only the bandwidth between the memory and the GPU. If the GPU is slow, what is the benefit of this much bandwidth? If all that data cannot be processed by the GPU, then it's going to come out looking like a big marketing gimmick. NVIDIA has already suffered from big, expensive chips, so I'm wondering what exactly they are planning. If the production process has matured, maybe they won't have as much trouble as they did with GT200.
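For a rough sense of what those numbers mean, here's a back-of-the-envelope calculation. The 4.0 Gbps effective GDDR5 data rate below is just an assumed placeholder, not a confirmed GT300 spec; the point is only that peak bandwidth is the bus width (in bytes) times the per-pin data rate, and it says nothing about whether the shader core can actually consume it.

```python
# Back-of-the-envelope GDDR5 bandwidth math (illustrative numbers only,
# not confirmed GT300 clocks).
def theoretical_bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * (effective data rate per pin)."""
    return (bus_width_bits / 8) * gbps_per_pin

# 512-bit bus with GDDR5 at an assumed 4.0 Gbps effective per pin:
print(theoretical_bandwidth_gb_s(512, 4.0))  # 256.0 GB/s
# The same memory speed on a 256-bit bus would halve that:
print(theoretical_bandwidth_gb_s(256, 4.0))  # 128.0 GB/s
```

Whether that peak figure matters depends entirely on how fast the GPU can chew through the data, which is exactly the point above.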
i9 9900K/1080 Ti
How could higher margins make them less money? Also, upholding a reputation, like being socially responsible, is a very important part of business. As for the making-money part, you can't say NVIDIA is going in the wrong direction. Have you heard of Tegra? The mobile market is a cash cow because laptops are cute and trendy.
Well, you don't have to be Nostradamus to come up with those six lines from above.
If you had asked most people before this thread "what specs do you reckon GT300 will have?", they'd have answered in a similar way, no?
Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
—Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.
And you (the human eye) just can't see the difference between 30 and 60fps
5930k, R5E, samsung 8GBx4 d-die, vega 56, wd gold 8TB, wd 4TB red, 2TB raid1 wd blue 5400
samsung 840 evo 500GB, HP EX 1TB NVME , CM690II, swiftech h220, corsair 750hxi
Yup, it's been proven scientifically that USAF fighter pilots can distinguish 140 FPS or more, per articles I read at the time of the tests. Many people can tell the difference at 30-50 FPS, and a lot can tell higher than that with ease. The whole "the eye can only see 30fps" claim is a myth that was debunked long ago.
Funny how the people who always say that never have any research links to back that up.....never.
30fps & 60fps is like night & day to me.
Play them on a repeat loop:
www.echo147.adsl24.co.uk/temp/q3_30.avi (30fps)
www.echo147.adsl24.co.uk/temp/q3_60.avi (60fps)
www.echo147.adsl24.co.uk/temp/q3_120.avi (120fps; you won't notice the difference on a 60Hz LCD)
http://amo.net/NT/02-21-01FPS.html
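The arithmetic behind that 60Hz caveat is simple; here's a quick sketch (my own illustration, not taken from the linked article): each frame's on-screen time is 1000 ms divided by the frame rate, and a 60Hz panel refreshes only every ~16.7 ms, so it can never show more than 60 distinct frames per second.

```python
# Frame-time / refresh-rate arithmetic behind the 60Hz caveat above
# (a quick illustration, not taken from the linked article).
def frame_time_ms(fps: float) -> float:
    """How long each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.1f} ms per frame")
# 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms

# A 60Hz LCD refreshes every 16.7 ms, so at most 60 distinct frames per second
# ever reach your eyes; a 120fps clip simply has every other frame dropped.
refresh_hz = 60
clip_fps = 120
print(f"Frames actually shown per second: {min(clip_fps, refresh_hz)}")
```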
To be honest, it is not all that hard to get good performance in Crysis. I built a system for a buddy of mine using an overclocked i7 (over 4GHz) and a pair of 1GB 4870s in CrossFire, and it runs Crysis Warhead like a champ. With everything set to Very High and 8xAA at 1920x1080 it runs anywhere between 20-60fps, normally around 30-40fps. Now, normally I agree that anything below 60fps is not fun to use, but CryEngine 2 seems to run smoother at lower frame rates than most other engines (I think, anyway, although you can still kinda tell). I am willing to bet that with a high-clocked i7 and three overclocked GTX 285s in Tri-SLI you could get around 60fps minimum frames with the settings he mentioned. SLI scales much, much better in Crysis than CrossFire does.
CPU: Intel Core i7 3930K @ 4.5GHz
Mobo: Asus Rampage IV Extreme
RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
GPU: EVGA GTX Titan (1087Boost/6700Mem)
Physx: Evga GTX 560 2GB
Sound: Creative XFI Titanium
Case: Modded 700D
PSU: Corsair 1200AX (Fully Sleeved)
Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's
Funny how no one can remember the days of CRTs. Can you see a CRT flickering @ 60Hz? If yes, then you can see 60FPS at the very least.