Why all the talk about new graphics cards when the presentation sheet clearly says Driver Timeline
New HW Features (hardware, maybe?)
SLI Connectivity Features
Display connectivity
Quality Improvements
Performance Improvements
OpenGL 3.0
If Big Bang II is as big as Big Bang I... then we'd need SLI/CrossFire... link an NVIDIA card to an ATI card! Now that would make heads turn!
A lot more has to be said before whatever is being discussed here can be considered massively mind-blowing...
It's more of a Big Update II than a big bang.
Perkam
Line 1:
Wendy Quest -something in caps- Roadhouse Pitballer
Line 2:
Hi'yall Whipped Appelog
Line 3:
Sping Noodlebrook... feck knows
Line 4:
Nope. Nothing.
Line 5:
Big Bang III - Fullwillipuppetsoh
That's as much as I can get from it, anyway. :shrug:
lol, blurry on purpose
Maybe they're going to have some kind of CUDA ray tracing engine built into the drivers.
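Purely for illustration, since that's complete speculation on my part: "ray tracing in CUDA" would boil down to a kernel that fires one ray per pixel and intersects it against the scene. Here's a minimal, hypothetical sketch against a single made-up sphere; nothing below is actual NVIDIA driver code or API, just the general shape of the idea:

Code:
// Hypothetical sketch only: one CUDA thread per pixel, each thread
// intersecting its ray against a single sphere. Not real driver code.
#include <cuda_runtime.h>
#include <cstdio>
#include <cstdlib>

struct Sphere { float3 center; float radius; };

// Distance along the ray to the sphere, or -1 on a miss (dir normalized).
__device__ float intersect(const Sphere& s, float3 orig, float3 dir)
{
    float3 oc = make_float3(orig.x - s.center.x,
                            orig.y - s.center.y,
                            orig.z - s.center.z);
    float b = oc.x * dir.x + oc.y * dir.y + oc.z * dir.z;
    float c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - s.radius * s.radius;
    float disc = b * b - c;
    return (disc < 0.0f) ? -1.0f : -b - sqrtf(disc);
}

__global__ void trace(unsigned char* image, int width, int height, Sphere s)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    // Simple orthographic camera looking down -z.
    float3 orig = make_float3(x - width / 2.0f, y - height / 2.0f, 100.0f);
    float3 dir  = make_float3(0.0f, 0.0f, -1.0f);

    float t = intersect(s, orig, dir);
    image[y * width + x] = (t > 0.0f) ? 255 : 0;  // white hit, black miss
}

int main()
{
    const int w = 256, h = 256;
    unsigned char* d_img;
    cudaMalloc(&d_img, w * h);

    Sphere s = { make_float3(0.0f, 0.0f, 0.0f), 60.0f };
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    trace<<<grid, block>>>(d_img, w, h, s);

    unsigned char* h_img = (unsigned char*)malloc(w * h);
    cudaMemcpy(h_img, d_img, w * h, cudaMemcpyDeviceToHost);
    int hits = 0;
    for (int i = 0; i < w * h; ++i) hits += (h_img[i] != 0);
    printf("pixels hitting the sphere: %d of %d\n", hits, w * h);

    free(h_img);
    cudaFree(d_img);
    return 0;
}

One thread per pixel is the whole trick; a real engine would add bounces, shading, and an acceleration structure on top of that loop.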
"Big Bang II" will be a card not based on G80 I bet. (first in 2 years!)
Hopefully something that can run Crysis at 1920x1200, Very High, full AA/AF, 60 FPS minimum.
I think the Master Chief asked Cortana if she could run Crysis at those settings, and then she blue-screened :p:
But seriously, this looks more like a feature-set update/improvement for the ForceWare drivers and future NV platforms than a new card launch.
I know what it is: the ability to run multiple monitors in SLI :p
Well, that'd catch them up to CrossFire in that area.
Hopefully it's something bigger than multi-monitor SLI. That would just be disappointing. If they name it Big Bang II, it had better be big, and not monolithic dies like the GTX 200 series. Here's to hoping for a nice surprise!
The picture is a fake, Photoshopped, mythbusting NVIDIA scam.
There will be no can of whoop-ass, ever.
Big Bang I:
SLI introduced
Big Bang II:
SLI that actually scales introduced...
Makes sense to me :rofl:
Hmm, the OpenGL 3.0 part implies a new card, but the Big Bang name implies some type of new SLI.
My guess:
- Easier CUDA implementation (which includes PhysX).
I doubt they'll release a new flagship graphics line. Maybe they'll drop some hints about what comes after GT200b, but that's it.
Free performance improvements sound good to me...
From the Inq:
Quote:
Nvidia's Big Bang II is more of a whimper
Time to stop artificially breaking SLI
By Charlie Demerjian: Friday, 25 July 2008, 4:55 PM
WHAT DO YOU do when you suddenly find yourself in second place, trailing badly with no hope for the rest of the year? You stop artificially crippling your drivers and spin it to the users as magnanimous, welcome to Nvidia's Big Bang II.
The good folk at Chile Hardware were the first to notice it on a blurry slide. The obvious inference is to 'Big Bang', aka the SLI introduction, which was a big deal. Big Bang II is simply not. It is the code name for Release 180 drivers, and they are coming from September 08 to February 09.
R180 has five bullet points: 10-bit DisplayPort support, OpenGL 3.0, SLI on multi-monitors, transcoding on the GPU, and some performance 'optimisations' over R177. Come autumn, they will be catching up to ATI on several key checkboxes; this is a big bang?
The only one in there worth getting excited over is SLI on multi-monitors, and that is kind of a sham. Nvidia will obviously tout it as the greatest thing since sliced bread, but they are just unbreaking the driver. They could have turned it on any time they wanted to; it works just fine in the Quadro line (1). Nvidia hurts its most loyal customer base by artificially turning it off in the normal card line. Why anyone thinks they deserve kudos for stopping breaking drivers is beyond me.
In any case, Big Bang II is a small step forward, and it brings a few nice features and one very needed one. It is a real pity that their hand had to be forced to stop hurting users for margins, but that is the way they operate. µ
(1) I wonder what this will do to Quadro sales, and thus NV margins?
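For what it's worth, the OpenGL 3.0 bullet is at least easy to verify once R180-class drivers ship: with any current GL context, the driver reports its version through glGetString. A minimal sketch, using GLUT only because it's the shortest way to get a context (the window itself is incidental):

Code:
// Queries the OpenGL version string the installed driver reports.
// A current GL context is required; GLUT provides one with a single call.
#include <GL/glut.h>
#include <cstdio>

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutCreateWindow("gl version check");  // makes a GL context current

    // A driver actually exposing OpenGL 3.0 should report "3.0..." here.
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    return 0;
}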
It is a Can of Whoop-Ass renamed to Big Bang II.
My guess is also PhysX support for all G80-and-up GPUs. That's the only thing that makes sense.