Yep.
Man, that's a huge difference. I wonder if it would even be possible for NVIDIA and ATI to move to that model without too much driver reworking.
I heard though that the G80 does some form of tile rendering.
http://www.icare3d.org/GPU/CN08
Well, not really.
It's possible in Larrabee, because it keeps tiles in L2 cache, so it reads each pixel only once.
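To make the idea concrete, here's a minimal sketch of binned, tile-based rendering in Python. The tile size, triangle format, and shade() callback are made-up illustrations, not Larrabee's actual renderer; the point is simply that all read-modify-write work for a tile stays in an on-chip buffer, so external memory sees each pixel once per frame rather than once per overlapping triangle.

```python
TILE = 64  # pixels per tile edge (hypothetical; the real size depends on the cache budget)

def bin_triangles(triangles):
    """Pass 1: assign each triangle to every tile its bounding box overlaps."""
    bins = {}
    for tri in triangles:                      # tri = [(x, y), (x, y), (x, y)] in screen space
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        for ty in range(int(min(ys)) // TILE, int(max(ys)) // TILE + 1):
            for tx in range(int(min(xs)) // TILE, int(max(xs)) // TILE + 1):
                bins.setdefault((tx, ty), []).append(tri)
    return bins

def render(triangles, width, height, shade):
    """Pass 2: process one tile at a time entirely in a small local buffer."""
    framebuffer = [[(0, 0, 0)] * width for _ in range(height)]
    for (tx, ty), tris in bin_triangles(triangles).items():
        tile = [[(0, 0, 0)] * TILE for _ in range(TILE)]   # stays "on chip" (Larrabee: L2)
        for tri in tris:
            shade(tri, tile, tx * TILE, ty * TILE)          # all blending hits the tile buffer
        for y in range(TILE):                               # single write-out per pixel
            for x in range(TILE):
                px, py = tx * TILE + x, ty * TILE + y
                if 0 <= px < width and 0 <= py < height:
                    framebuffer[py][px] = tile[y][x]
    return framebuffer
```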
:rolleyes:
Bandwidth has always been a problem! Since Everquest, video cards have always had a hard time dealing with the game world, simply due to the limitations of bandwidth and memory size.
All the stuttering that goes on is almost solely due to lack of bandwidth and/or having only 1GB of memory. I'm sick of this. Who cares about max fps or even shaders, when every game glitches when entering a building or an awesome spot... totally ruining the experience.
Partially true.
Xoulz man... that game and the previous Elder Scrolls suck big time, and their programmers even more.
Just because you see stuttering doesn't make your assumption true.
That's not an issue of the graphics card's memory bandwidth (which is sky-high for those games), nor of the card's framebuffer capacity (512MB is more than enough for those games).
It's just sloppy programming and no future-proofing of the game with advanced "options" to preload the whole world into VRAM.
You'd be surprised if you checked the VRAM consumption of today's games (older games aren't even worth mentioning... ~200MB max) and how much of the VRAM bandwidth the cards actually use.
Ooooh - GTX 350... It had better not require more power, or else it's a flop in my case no matter what performance numbers it brings! But with 2GB of GDDR5, surely that's got to be an X2 card of some sort. Hopefully they've gone the ATI route with X2 cards.
Considering the HD 4870 X2 consumes around 100W more than the GTX 280, nVidia has plenty of room to increase power consumption and I am sure that they will do so. I hope that the GTX 350 doesn't consume as much as the 4870 X2 though, the insane power consumption is something that is keeping me away. I could probably handle the X2 but my 520W PSU would be crying.
Hum?
http://resources.vr-zone.com//newspi...RO_V3750_M.jpg
This is a small card with plenty of technology and speed.
The HD 4550/9400GT are micro cards with pitiful performance for standalone cards. They decode HD video and not much more.
The HD 4600, on the other hand, plays every game at good resolutions and good settings. Performance sits between the HD 3850 and HD 3870.
I find it hard to believe that the GTX 280+ and the GT300 will both be on 55nm, with the GT300 using GDDR5 memory (when a 512-bit bus should already provide enough bandwidth). Perhaps the so-called "GTX 350" will just be a dual-chip card using the same 55nm chip, since that is what 2GB of GDDR5 memory suggests. Then it makes sense that both use the same chip after all (and that the GT300 is not a new generation).
The GTX 350 could easily be a medium-powered 55nm GT200 derivative on a 256-bit bus, with two chips totaling 512 bits of GDDR5 bandwidth. That would be a very good option for Nvidia to pursue; given the "350" name, I doubt it'll be a single-core GPU that needs both a 512-bit bus and GDDR5. After G80 I would have thought Nvidia had learned about the problem of huge, hot cores; maybe GT300 will be smaller and scale like RV770.
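To put rough numbers on that: bandwidth is just bus width times the effective data rate. A quick Python back-of-the-envelope, using the launch memory clocks as I recall them (treat them as approximate):

```python
# Back-of-the-envelope memory bandwidth: bus width in bytes x effective data rate.
def bandwidth_gb_s(bus_bits, effective_mt_s):
    return bus_bits / 8 * effective_mt_s / 1000.0

print(bandwidth_gb_s(512, 2214))   # GTX 280, 512-bit GDDR3           -> ~141.7 GB/s
print(bandwidth_gb_s(256, 3600))   # HD 4870, 256-bit GDDR5           -> ~115.2 GB/s
print(bandwidth_gb_s(512, 3600))   # two 256-bit GDDR5 chips combined -> ~230.4 GB/s
```

So two 256-bit GDDR5 chips would comfortably out-run a single 512-bit GDDR3 bus.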
Please, let's quote that picture 500 more times just in case someone missed it!
I just gave a headstart and posted it on 6 other forums! :D
I stumbled upon a Hardspell article written July 18, 2008 that speaks of a GTX 350 engineering sample with the same specs as listed above:
HardSpell.com - NVIDIA GTX 350 ES version is ready and the specs revealed?!
We heard the related news, but we are not so sure about this:
NVIDIA GTX 350
GT300 core
55nm process
576 mm² die
512-bit memory bus
2GB GDDR5 memory, double the GTX 280
480 SPs, double the GTX 280
64 raster operation units (ROPs), the same as the GTX 280
216 GB/s memory bandwidth
Default clocks 830/2075/3360 MHz
Pixel fill rate 36.3 Gpixels/s
Texture fill rate 84.4 Gtexels/s
Does not support DX10.1, only DX10.0/SM4.0
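For what it's worth, the bandwidth figure above is at least self-consistent. Assuming the listed 3360 MHz is the effective GDDR5 data rate across a 512-bit bus, you get roughly the quoted 216 GB/s:

```python
# Sanity check of the rumoured figure (assumes 3360 MHz is the effective GDDR5 rate).
bus_bits = 512
effective_mt_s = 3360
print(bus_bits / 8 * effective_mt_s / 1000.0)   # -> 215.04 GB/s, i.e. the ~216 GB/s listed
```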
Yeah, 256x2 bits for a dual-chip configuration makes perfect sense, paired with GDDR5.
However, do not expect Nvidia to maintain any integrity in its naming scheme. Remember the 9800GX2? It was presented as a new "generation" over the 8800GT and 8800GT 512MB. Nvidia might well do the same thing with the GTX 350, just to make it sound more attractive.
Tell me, what is the best way to make your computer store world-renowned overnight? Post fake GPU SKUs. ;)
Oh yeah, I overlooked this absurd claim: an 830 MHz core clock and 2075 MHz for the 240 shaders... on a dual-chip configuration? I think that's far-fetched.