LOL..... That would be kind of funny..... I also believe they will be using GDDR4
~Mike
So I assume this card is an ES and nV will modify it when they start shipping retail cards?
I love your avatar, with the cat coming out of the computer case...Quote:
Originally Posted by arisythila
A long time ago, in a galaxy far far away, in school we started computer programming on 286's. Since we never used the floppies, we would cram candy bar wrappers into the floppy slot, and every nook and cranny. Boy, were they surprised when they tried to save some guy's stuff onto a floppy!!
Make sure it is a thong .. this is extreme after all.Quote:
Originally Posted by ***Deimos***
You are a girly, right ? :confused:
:D
Regards
Andy
The DX10 reference whitepaper, which I'm using as my source:
http://download.microsoft.com/downlo...indexing%22%22
I like this
"Operation Direct3D 9 Direct3D 10 (reference) Draw 1470 154 Bind VS Shader 6636 416 Set Constant 3297 916 Set Blend Function 787 530 Table 3: Command cycle counts on Pentium IV."
no longer limited to just specular maps, bump maps, normal maps, or even parallax maps... now you can do displacement mapping (in hardware).
the old 2048x2048 texture size limit is gone.. now it's 8192x8192.. just try to imagine the size of that compared to your monitor's resolution (quick numbers below).
each invocation of the geometry shader can emit anywhere from zero to 1024 vertices per primitive. Where would you need to "grow" new vertices? Dynamically growing trees. A dragon that grows a second head. Realistic, non-predetermined debris from collisions/explosions. Displacement mapping. A Transformers video game?
And finally, sampling of shadow maps!.. No more jaggy shadows!!
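To put that 8192x8192 figure in perspective, here's a quick back-of-the-envelope calc in Python (assuming uncompressed 4-byte RGBA texels and a 1600x1200 monitor purely for illustration):
Code:
# Quick size comparison: an 8192x8192 texture vs. the old limit and a typical monitor.
# Assumes 4 bytes per texel (uncompressed RGBA8) -- purely illustrative numbers.
bytes_per_texel = 4

old_limit = 2048 * 2048   # DX9-era maximum texture size (texels)
new_limit = 8192 * 8192   # DX10 maximum texture size (texels)
monitor = 1600 * 1200     # a common high-end desktop resolution of the day

print(f"8192^2: {new_limit:,} texels ({new_limit * bytes_per_texel / 2**20:.0f} MB uncompressed)")
print(f"vs 2048^2: {new_limit // old_limit}x more texels")
print(f"vs 1600x1200: {new_limit / monitor:.0f}x the pixels on screen")

That single texture is about 256 MB uncompressed, 16x the texel count of the old limit and roughly 35x the pixels on a 1600x1200 screen.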
for the bet, does it matter?Quote:
Make sure it is a thong .. this is extreme after all.
You are a girly, right ?
Regards
Andy
I got the perfect pink panties in mind.. ;)
Yay!!!!
So DX10 will fix the dodgy shadows in BF2!!!!
lol
Prolly not in BF2; this new stuff will most likely only show up in DX10 games, which still have to be released.
Yeah. BF2's dodgy shadows suck like hell. Not even super high resolutions fix it.
Quote:
Originally Posted by Helmore
I was jokin' m8, while giving an example of crappy shadows!
Being a DX9 game, obviously DX10 won't fix it!
Sometimes it's hard to tell if you are joking.Quote:
Originally Posted by Dublin_Gunner
but, I suppose if a company like Bungie updated Halo, then DICE can update BF2 too (HL1 Source also comes to mind). Get it all Vista "compatible" and showcase those dark silky smooth shadows.. oh yeah baby.
Likewise with 3DMark2005.. especially apparent on the deck when the captain is looking out for the beast in game test 3.
But, of course you'll only see these benefits if the hardware supports it... and I doubt folks would buy a G80/R600 to relive the "classics".
FROM THE ARCHIVES:
I got so excited to see 4-5 trees at a time in the Unreal 2 preview.. I thought wow, this is next generation technology. Yet, all those teaser clips and screenshots of HL2 and Doom 3 looked absolutely incredible (back then). At least half the people discredited them as pre-rendered, or photoshopped. Then, I think it was March/April 2004, OUT OF NOWHERE, Far Cry. My hippie pot-smoking friend, eyes ablaze, turned down a reefer to barge into my room and get me to download this "demo".. pff.. another bunch of screenshots or some stupid clip of "supposed gameplay".
My heart skipped a beat. OMG, I couldn't take my eyes off the screen. I spent hours literally crawling through the vegetation. Exploring the island. Taking long luxurious swims. Probably took more in-game screenshots that day than in all the years before combined! For such a long time, in all sorts of games, you "imagined" you were in a jungle (e.g. Goldeneye on N64), or in a forest, or just walking on gravel or through grass... now, you no longer had to imagine.. it was there. Probably the 3rd religious awakening, after seeing 3D for the first time (Quake), and of course the enchanting music and that memorable Warthog in Halo.
Well, stop being so serious all the time Deimos ;)Quote:
Originally Posted by ***Deimos***
How much power will this tower really eat up?? They say 300W, but I seriously doubt it. If yes, then they would fry like hell. Can't say about the R600 since they will be very hot. Will my OCZ 700W be fine powering an 8800GTX?
Also was wondering, since this card IS SUPPOSED TO BE VERY LONG, how long will the R600 be? (ATI say they will be the biggest cards ever) OMG. :D :rolleyes: :rolleyes:
I don't like it when they release these school-bus-size cards. They are just bleh. I remember the 6800 Ultra; that was a huge card, then the 7900GTX came out and that was pretty big. Pretty soon we are going to need a case just for the GPU. It will be linked with a floppy cable haha.
Pretty soon you will buy a video card and plug the motherboard into it!!
Quote:
Originally Posted by ANP !!!
Your PSU should handle one at least. Quad SLI might not work though ;)
I doubt they would try quad again.. That whole project was doomed from the start. Direct3D games see little benefit over regular SLI, since Direct3D typically only allows triple buffering (so only three frames in flight). OpenGL has no such limit on buffers, but then again, there aren't many OpenGL games either. Quad SLI is just way too impractical, and inefficient.Quote:
Originally Posted by Poodle
However, even SLI'd G80s will surely be a big burden. That's probably where they came up with the 250-300W figures. But also remember you need at least 100-150W for the rest of the system, plus headroom for load spikes and some margin for stability. All of a sudden a 500W+ PSU requirement doesn't sound crazy anymore.
And one other important thing.. heat. And not just the challenge of cooling the G80 (dual-slot heatsink or aftermarket water cooling). G80 SLI will surely stress other components too. Some special applications may generate huge PCIe bandwidth load. The chipset will probably run hotter than ever. Likewise, more CPU-limited scenarios, and more stress on the CPU. If the PCIe slot is relied on for power, significant strain there too. And of course, with nearly 500W total being used by the system, even an efficient 80% power supply will be putting out around 100W of heat all by itself (quick math below).
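For what it's worth, that waste-heat figure is just efficiency arithmetic. A minimal sketch in Python, assuming a flat 80% efficiency and round-number loads (real PSU efficiency varies with load):
Code:
# Waste heat dissipated by the PSU itself at a given DC load and efficiency.
# Round numbers for illustration; real efficiency curves vary with load.
def psu_waste_heat(dc_load_watts, efficiency):
    wall_draw = dc_load_watts / efficiency   # power pulled from the socket
    return wall_draw - dc_load_watts         # the difference ends up as heat in the PSU

for load in (400, 500):
    print(f"{load} W DC load @ 80% efficiency -> ~{psu_waste_heat(load, 0.80):.0f} W of heat in the PSU alone")

So depending on whether that ~500W is counted at the wall or on the DC side, the PSU alone dumps roughly 100-125W of heat.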
i wonder if there will be a GX2 version of the G80, aka a more efficient SLI solution? who knows? it would be cool tho.
feeling better about my $230 7900gt(x) LOL
Ah, so T&L is more a weaker less efficient form of what STMicro used in the Kyro series, though limited to the "frustrum". That makes more sense now.
Having mentioned STMicro, anyone else been keeping tabs on the things they've been doing? I poke over there once in a blue moon and saw a few rather interesting things..
EDIT - Looking forward to the panty pic, though there are bound to be more "entry level" consumer cards as well.. so maybe not.
Perhaps you misunderstood meQuote:
Originally Posted by STEvil
there IS NO SUBSTITUTE for T&L (or TCL). It's an integral part of the graphics pipeline. Doesn't matter if you're using pixel shaders or vertex shaders.. those are add-ons. You're always doing T&L.
STMicro's Kyro chips had no hardware T&L acceleration. It was done in software, i.e. the CPU did the calculations. Games which don't have a compatibility fallback.. games that require hardware pixel shaders or T&L.. will not work at all on the Kyro.
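To make "the CPU did the calculations" concrete, here's a rough Python sketch of what a per-vertex software T&L pass boils down to: transform each vertex by a combined model-view-projection matrix, then compute a simple diffuse lighting term. Purely illustrative and made up by me, not actual driver or game code:
Code:
import math

def mat_vec_mul(m, v):
    """Multiply a 4x4 row-major matrix by a 4-component vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return [x / length for x in v]

def transform_and_light(vertices, normals, mvp, light_dir):
    """Software T&L: transform object-space vertices to clip space (the "T")
    and compute per-vertex Lambert diffuse lighting (the "L")."""
    light_dir = normalize(light_dir)
    result = []
    for pos, nrm in zip(vertices, normals):
        clip = mat_vec_mul(mvp, pos + [1.0])                            # transform
        diffuse = max(0.0, sum(n * l for n, l in zip(nrm, light_dir)))  # lighting
        result.append((clip, diffuse))
    return result

# Tiny usage example: one triangle, identity "MVP" matrix, light along +Z.
identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
tri = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
nrms = [[0.0, 0.0, 1.0]] * 3
print(transform_and_light(tri, nrms, identity, [0.0, 0.0, 1.0]))

Do that for every vertex of every frame on the CPU and it's easy to see why a hardware T&L unit (or a vertex shader) matters.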
the die is huge!!!!
using 1.1ns GDDR3
it has something like HT built-in
Is it at least small enough for a MCW60 to make contact with the whole thing?
ur kidding right?Quote:
Originally Posted by ***Deimos***
Bungie didn't update Halo 2; they are just gonna put some code in it so it will only run on Vista, no other changes are being made to it. It won't use DX10 and it won't look anything fancier than on the old Xbox. So to be short, it's a very easy job for a developer.
And DICE/EA will have BF2142 out when Vista comes, so I highly doubt they'll waste any energy on fixing or upgrading anything about BF2 by then.
Ya, I guess they are the kind of company that will slap a Vista-compatible sticker on it for a "value" version of BF2 just to sell to the suckers out there, but I doubt they'll fix anything once BF2142 is out.
Sometimes it's hard to tell if you are joking.
but, I suppose if a company like Bungie updated Halo, then DICE can update BF2 too (HL1 Source also comes to mind). Get it all Vista "compatible" and showcase those dark silky smooth shadows.. oh yeah baby.
EDIT: it's very well possible... I didn't say they WILL do it.
I had an odd thought
Regarding the power connectors.
Okay, so instead of running two separate PCI-e cables to two cards, what if:
It would use one cable, with a "jumper" to the second card.
Running two cards off of one PCI-e cable?
As long as there's sufficient current on that rail, I don't see how that would become a problem (rough numbers below).
I still fail to see why they would need 2.
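For what it's worth, the current on a shared cable is easy to estimate. A quick sketch, assuming the usual 75W-per-6-pin-connector figure on the 12V rail (whether one cable and rail can actually deliver that depends entirely on the PSU's wiring and rail limits):
Code:
# Rough current on a single 12 V PCIe power cable feeding one vs. two cards.
# Assumes each 6-pin connector supplies up to 75 W at 12 V (the PCIe spec figure).
RAIL_VOLTAGE = 12.0
WATTS_PER_CONNECTOR = 75.0

for cards in (1, 2):
    watts = WATTS_PER_CONNECTOR * cards
    amps = watts / RAIL_VOLTAGE
    print(f"{cards} card(s) on one cable: {watts:.0f} W -> {amps:.2f} A on that cable")

So a jumpered second card roughly doubles the load on that one cable to about 12.5A at 12V, which is presumably why they spec separate connectors in the first place.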