Man, that's a huge difference. I wonder if it would even be possible for NVIDIA and ATI to move to that model without too much driver reworking.
I heard though that the G80 does some form of tile rendering.
http://www.icare3d.org/GPU/CN08
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
Well, not really.
It's possible with Larrabee because it keeps tiles in the L2 cache, so each pixel only has to be read from memory once.
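Roughly, a binned/tile-based renderer works like the sketch below: triangles are binned to screen tiles, each tile is shaded entirely in a buffer small enough to stay in cache, and the finished tile is written out to the framebuffer exactly once. This is only an illustrative CPU-side sketch with made-up sizes, not Larrabee's actual pipeline:

```c
/* Illustrative sketch of tile-based (binned) rendering, the scheme described
 * above: the frame is split into tiles that fit in on-chip cache, triangles
 * are tested per tile, and each tile is shaded in a local buffer before being
 * written out once. Sizes and names are made up for the example. */
#include <stdio.h>
#include <string.h>

#define SCREEN_W   1920
#define SCREEN_H   1080
#define TILE_SIZE  64                      /* a 64x64 tile fits easily in L2 */
#define TILES_X    ((SCREEN_W + TILE_SIZE - 1) / TILE_SIZE)
#define TILES_Y    ((SCREEN_H + TILE_SIZE - 1) / TILE_SIZE)

typedef struct { float x0, y0, x1, y1; } Bounds;   /* triangle screen bounds */

static void render_tile(int tx, int ty, const Bounds *tris, int n)
{
    unsigned int tile_buf[TILE_SIZE * TILE_SIZE];   /* stays in cache */
    memset(tile_buf, 0, sizeof(tile_buf));

    float tile_x0 = tx * TILE_SIZE, tile_y0 = ty * TILE_SIZE;
    float tile_x1 = tile_x0 + TILE_SIZE, tile_y1 = tile_y0 + TILE_SIZE;

    for (int i = 0; i < n; i++) {
        /* only triangles overlapping this tile touch the tile buffer */
        if (tris[i].x1 < tile_x0 || tris[i].x0 > tile_x1 ||
            tris[i].y1 < tile_y0 || tris[i].y0 > tile_y1)
            continue;
        /* ...rasterize and shade into tile_buf here... */
    }

    /* tile finished: its pixels go out to external memory exactly once */
    /* write_tile_to_framebuffer(tx, ty, tile_buf); */
}

int main(void)
{
    Bounds tris[] = { {100, 100, 400, 300}, {900, 500, 1500, 900} };
    for (int ty = 0; ty < TILES_Y; ty++)
        for (int tx = 0; tx < TILES_X; tx++)
            render_tile(tx, ty, tris, 2);
    printf("rendered %d x %d tiles\n", TILES_X, TILES_Y);
    return 0;
}
```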
Bandwidth has always been a problem! Ever since EverQuest, video cards have had a hard time dealing with the game world, simply due to the limitations of bandwidth and memory size.
All the stuttering that goes on is almost solely due to a lack of bandwidth and/or only 1GB of memory. I'm sick of it. Who cares about max fps or even shaders when every game glitches as you enter a building or an awesome spot... totally ruining the experience.
--Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))
Partially true.
Xoulz man... that game and the previous Elder Scrolls suck big time, and their programmers even more.
Just because you see stuttering doesn't make your assumption true.
That's not an issue of graphics card memory bandwidth (which is godly high for those games), nor of the card's framebuffer capacity (512MB is more than OK for them).
It's just lazy programming and a failure to future-proof the game with advanced options to preload the whole world into VRAM.
You'd be surprised if you checked the VRAM consumption of today's games (older games aren't even worth mentioning... ~200MB max) and how much of the VRAM bandwidth the cards actually utilize.
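Something like the sketch below is what such a preload option would boil down to: upload every world texture once at load time so nothing has to be streamed in mid-game. This is a hedged, generic OpenGL sketch, not any real game's code; load_tga() and the file names are hypothetical placeholders:

```c
/* Hedged sketch of the "preload the whole world into VRAM" option described
 * above: upload every texture once at load time instead of streaming it in
 * when the player enters a new area. load_tga() and the file list are
 * hypothetical placeholders. Assumes a GL context already exists. */
#include <GL/gl.h>

#define NUM_WORLD_TEXTURES 3

/* hypothetical loader: returns RGBA8 pixels and fills w/h */
extern unsigned char *load_tga(const char *path, int *w, int *h);

static GLuint world_textures[NUM_WORLD_TEXTURES];

void preload_world_textures(void)
{
    const char *files[NUM_WORLD_TEXTURES] = {
        "terrain.tga", "buildings.tga", "characters.tga"
    };

    glGenTextures(NUM_WORLD_TEXTURES, world_textures);

    for (int i = 0; i < NUM_WORLD_TEXTURES; i++) {
        int w, h;
        unsigned char *pixels = load_tga(files[i], &w, &h);

        glBindTexture(GL_TEXTURE_2D, world_textures[i]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        /* one upfront upload per texture; as long as it all fits, the driver
         * keeps it resident in VRAM, so entering a building later doesn't
         * trigger a mid-game upload and the stutter that comes with it */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }
}
```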
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P.), Juan J. Guerrero
Ooooh - GTX 350... It had better not require more power, else it's a flop in my case no matter what performance numbers it brings! But with 2GB of GDDR5, surely that's gotta be an X2 card of some sort. Hopefully they've gone the ATI route with X2 cards.
Considering the HD 4870 X2 consumes around 100W more than the GTX 280, nVidia has plenty of room to increase power consumption, and I am sure that they will do so. I hope the GTX 350 doesn't consume as much as the 4870 X2, though; the insane power consumption is what's keeping me away. I could probably handle the X2, but my 520W PSU would be crying.
Intel Core i7 920 @ 3.8GHz - Asus P6T Deluxe X58 - 6GB (2GBx3) G. SKILL DDR3-1600 @ 8-8-8-20 - 2 x EVGA GTX 280 1GB SLI - Corsair TX750 PSU - Windows Vista HP 64-bit
Hum?
This is a small card with plenty of technology and speed.
The HD 4550/9400 GT are micro cards with pitiful performance for a standalone card. They decode HD video and not much more.
The HD 4600, on the other hand, plays every game at good resolutions and good settings. Performance sits between the HD 3850 and HD 3870.
I find it hard to believe that the GTX 280+ and the GT300 will BOTH be on 55nm, with the GT300 using GDDR5 memory (a 512-bit bus should already provide enough bandwidth). Perhaps the so-called "GTX 350" will just be a dual-chip card using the same 55nm chip, since that is what 2GB of GDDR5 memory suggests. Then it makes sense that both use the same chip after all (and that the GT300 is not a new generation).
--two awesome rigs, wildly customized with
5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
--SONY GDM-FW900 24" widescreen CRT, overclocked to:
2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)
Updated List of Video Card GPU Voodoopower Ratings!!!!!
--Intel i5 3570k 4.4ghz (stock volts) - Corsair H100 - 6970 UL XFX 2GB - - Asrock Z77 Professional - 16GB Gskill 1866mhz - 2x90GB Agility 3 - WD640GB - 2xWD320GB - 2TB Samsung Spinpoint F4 - Audigy-- --NZXT Phantom - Samsung SATA DVD--(old systems Intel E8400 Wolfdale/Asus P45, AMD965BEC3 790X, Antec 180, Sapphire 4870 X2 (dead twice))
The GTX 350 could easily be a medium-powered 55nm GT200 derivative on a 256-bit bus, with two chips totaling 512 bits of GDDR5 bandwidth. That would be a very good option for Nvidia to pursue; I doubt that with the "350" name it will be a GPU that needs both a 512-bit bus and GDDR5 on a single core. After G80 I would have thought Nvidia had learned about the problem of huge, hot cores; maybe GT300 will be smaller and scale like RV770.
" Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^
Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance
Rig 2
i7 980x, h70, Antec Lanboy Air, Samsung md230 x3, Sapphire 6970 CrossFired, Antec ax1200w, x-25m 160gb, 2 x OCZ vertex 2 180gb, 12gb Corsair Vengeance, MSI Big Bang Xpower
Please, let's quote that picture 500 more times just in case someone missed it!
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
I just gave it a head start and posted it on 6 other forums!
I stumbled upon a Hardspell article written July 18, 2008 that speaks of a GTX 350 engineering sample with the same specs as listed above:
HardSpell.com - NVIDIA GTX 350 ES version is ready and the specs revealed?!
We got to know the related news, but we are not so sure about it:
NVIDIA GTX 350
GT300 core
55nm process
576mm² die
512-bit memory interface
2GB GDDR5 memory, double the GTX 280
480 SPs, double the GTX 280
64 raster operation units, the same as the GTX 280
216GB/s memory bandwidth
Default clocks 830/2075/3360MHz
Pixel fill rate 36.3Gpixels/s
Texture fill rate 84.4Gtexels/s
Does not support DX10.1, only DX10.0/SM4.0
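For what it's worth, the 216GB/s figure is at least internally consistent with the rest of the quote, since memory bandwidth is just bus width times effective data rate. A quick sanity check, assuming the 3360MHz number is the effective GDDR5 data rate and the 512-bit number is the total interface width (e.g. 2 x 256-bit on a dual-chip card):

```c
/* Quick sanity check of the rumored numbers above: bandwidth = bus width
 * in bytes times effective transfer rate. The interpretation of the 3360MHz
 * and 512-bit figures is an assumption, not confirmed spec. */
#include <stdio.h>

int main(void)
{
    double bus_bits      = 512.0;      /* total memory interface width   */
    double data_rate_mhz = 3360.0;     /* effective transfers per second, in millions */

    /* (bits / 8) bytes per transfer * millions of transfers/s -> MB/s; /1000 -> GB/s */
    double bandwidth_gbs = (bus_bits / 8.0) * data_rate_mhz / 1000.0;

    printf("theoretical bandwidth: %.1f GB/s\n", bandwidth_gbs);  /* ~215 GB/s */
    return 0;
}
```

That works out to about 215GB/s, so the quoted 216GB/s lines up with the listed clocks and bus width.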
EVGA X58 SLI Classified E759 Limited Edition
Intel Core i7 Extreme 980X Gulftown six-core
Thermalright TRUE Copper w/ 2x Noctua NF-P12s (push-pull)
2x EVGA GeForce GTX 590 Classified [Quad-SLI]
6GB Mushkin XP Series DDR3 1600MHz 7-8-7-20
SilverStone Strider ST1500 1500W
OCZ RevoDrive 3 240GB 1.0GB/s PCI-Express SSD
Creative X-Fi Fatal1ty Professional / Logitech G51 5.1 Surround
SilverStone Raven RV02
Windows 7 Ultimate x64 RTM
Yeah, 2 x 256-bit for a dual-chip configuration makes perfect sense, paired with GDDR5.
However, don't expect Nvidia to maintain integrity in its naming scheme. Remember the 9800 GX2? It was sold as a new "generation" over the 8800 GT and 8800 GTS 512MB. Nvidia might just as well do the same thing with the GTX 350, just to make it sound more attractive.
--two awesome rigs, wildly customized with
5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
--SONY GDM-FW900 24" widescreen CRT, overclocked to:
2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)
Updated List of Video Card GPU Voodoopower Ratings!!!!!
Tell me, what is the best way to make your computer store world-renowned overnight? Post false GPU SKUs.
--two awesome rigs, wildly customized with
5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
--SONY GDM-FW900 24" widescreen CRT, overclocked to:
2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)
Updated List of Video Card GPU Voodoopower Ratings!!!!!
Case-Coolermaster Cosmos S
MoBo- ASUS Crosshair IV
Graphics Card-XFX R9 280X [out for RMA] using HD5870
Hard Drive-Kingston 240Gig V300 master Seagate 160Gb slave Seagate 250Gb slave Seagate 500Gb slave Western Digital 500Gb
CPU-AMD FX-8320 5Ghz
RAM 8Gig Corsair C8
Logitech 5.1 Z5500 BOOST22
300Gb of MUSICA!!
Steam ID: alphamonkeywoman
http://www.techpowerup.com/gpuz/933ab/
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
Oh yeah, I overlooked this absurd claim: an 830MHz core clock and 2075MHz for the 240 shaders... on a dual-chip configuration? I think that's far-fetched.
--two awesome rigs, wildly customized with
5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
--SONY GDM-FW900 24" widescreen CRT, overclocked to:
2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)
Updated List of Video Card GPU Voodoopower Ratings!!!!!