What do you people think? I say the GTX will score pretty damn close to that overclocked.
http://www.theinquirer.net/default.aspx?article=35195
Now if we could only get a good SLI mobo for these cards...
Not quite fast enough to beat an R600 so they say. Wow, I would love to see what the ATI card can do.
:woot: if it's true.. I can easily double my graphics performance without going SLI! Hellz yeah. Step-Up program, here I most likely come!
omg...we are going to see 20k easy soon :D
I wonder what a 5Ghz Kentsfield on a nForce 680 SLI mobo with dual G80's cooled with LN2 will score. 22k?
I really hope they live up to the hype; I've been putting off upgrading for far too long in eager anticipation.
OMG... I just ask myself: which game will stop these beasts? I mean... such a game won't exist :D
At least once DX10 games arrive...
It's hard to say since it's the Inq: no screenies and no information about the test bed or anything.
Quote:
Originally Posted by Scimitar
much more, the rig must have been at stock speeds, most probably an X6800, and it seems like 12k is single card (not sure though).
Imho.. I can see a recurrence of the X1800/7800GTX 512 days.
We have the G80 coming out now.
The R600 @ Christmas (or later).. that gives Nvidia two or more strong months.
In that time, prices can drop.. and they might have something else up their sleeve to combat the R600.
8850GX2/8900GTX /or 8950GX2 anyone?
probably 8900GTX? :fact:
8850GX2 in Quad SLI will:slapass: ATI.
well, if it's an FX62 testbed then that's a decent score; if it's a Core 2 @ 3.33GHz or something then not so impressive.
Not really, as the whole A64 (FX included) line will cause a bit of a bottleneck with a G80, let alone 2 of them. Right now 06 (as well as the FEAR and Oblivion engines) is GPU-bottlenecked pretty badly; the G80 will allow the Conroe to stretch its wings. Furthermore, I for one do not consider an FX CPU to be much of a gamer's CPU anymore, so they had best not use them on G80 testbeds;)Quote:
well, if it's a fx62 testbed then that's a decent score
i think that's my point. In a CPU-limited setup, 12k is not bad; but if it gets 12k in a non-CPU-restrictive setup, then that's not so impressive.Quote:
Originally Posted by rodman
LOL, I guess it was, wasn't it? ;)Quote:
i think that's my point.
It's not clear what kind of setup gets you 12K... but I am sure it was Core 2, as FX ain't been the norm since July :)
the score should be like this
Core 2 Duo E6600 @ 3.15Ghz (350x9)
Geforce 8800GTX 768MB 384bit GDDR3 @ 575MHz/1800MHz
3DMark06 Score 11934 3DMarks
SM 2.0 Score 5562 Marks
SM 3.0 Score 5443 Marks
CPU Score 2727 Marks
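As a sanity check, those component scores really do add up to that total. Futuremark's 3DMark06 whitepaper gives the final score as a weighted harmonic mean of the graphics and CPU sub-scores (the 2.5/1.7/0.3 constants here are my recollection of the whitepaper, so treat them as an assumption):

```python
# 3DMark06 final score: weighted harmonic mean of graphics and CPU sub-scores
# (constants per the Futuremark whitepaper, quoted from memory).
def score_3dmark06(sm2, sm3, cpu):
    gs = (sm2 + sm3) / 2.0                       # combined graphics sub-score
    return 2.5 / ((1.7 / gs + 0.3 / cpu) / 2.0)  # harmonic mean, GPU-weighted

print(round(score_3dmark06(5562, 5443, 2727)))  # -> 11934, matching the post
```

Note how heavily the GPU side is weighted (1.7 vs 0.3), which is why a faster card moves the total so much more than a faster CPU.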
a single 8800GTX can't compare with this 7900GTX SLI
http://service.futuremark.com/compare?3dm06=440284
Intel(R) Xeon(R) Clovertown 2.66GHz + Quadro FX 4500
(2x Intel Woodcrest Xeon 5150 CPU)
3DMark06 CPU score 5581
http://img180.imageshack.us/img180/5765/3dmarkkk6.jpg
That's indeed cool, especially because for all I know, and for all that "news" tells me, I could score 12k in 3D06 with a P2 333MHz and 16MB RAM, surely :rolleyes:
I'd like to know the SM3.0 part of the score only....
Regards
Andy
Whatever the CPU, it's probably at stock clocks, and the video card is also probably not overclocked at all and on beta drivers. This card may make 15k single-card pretty easily.
I care more about how many watts this thing is chucking out; scores at this point are little more than speculation. Reality will probably be good scores with a ridiculous heat output.
G
The FX62 is right up there with the X6800 and E6700.. to say that it isn't a gaming CPU is a bit of a stretch.Quote:
Originally Posted by rodman
i guess it depends on system configuration: what if there's a PhysX card? 3DMark06 supports PhysX, doesn't it?
anyways - my X2 3800+ @ 2.6ghz and 7900gtx 512mb scored 6166 in 3dmark06... if they double the amount of power it could put out, and then made it a dual card like the 7950gx2... that would be SO SICK lol
personally i prefer ATI but if a deal comes along then i'll go with the flow.. lol
Not really. Especially if we are talking about overclocking.Quote:
Originally Posted by Gag3
A "gaming" CPU would be E6400 or E6300 as then you can spend money on what's really important for games - the videocard.
Considering how hot and big the single card is, it doesn't look too likely that we'll see a multi-GPU card anytime soonQuote:
Originally Posted by fade2green514
Yea, but they will do a die shrink and put them together (maybe even use GDDR4) in time for the R600;)Quote:
Considering how hot and big the single card is, it doesn't look too likely that we'll see a multi-GPU card anytime soon
Actually what die size is the G80, lol.
No, they won't do a die shrink in 3 months :eek: Unless they have already been working on it for a while.Quote:
Originally Posted by rodman
Even then, the power reduction would probably not be enough unless they clock it down a LOT.
Yields are already horrible, so no way they would have enough chips for dual-chip cards anyway.
What die size is the G80?
I think both have been working on smaller-die variants for some time. The G80/R600 cards were never supposed to be energy efficient. They were designed to be fast at all costs and get to market ASAP. You're going to have to wait until later next year for something that takes less energy.
I wonder if it will really have 2 power connectors:eek: (edit) Nope, I guess the retail versions will only have one; the prototype cards had them for testing. Either that, or the GTX will have 2 and the GTS one.
bingoQuote:
Originally Posted by ewitte
sure, sure.. just like how each year 9700pro gained 1000 in 3DMark2005... right?Quote:
Originally Posted by ewitte
The only time you ever really see any significant performance increase from drivers is when a new game has just come out; it's all rough, rushed and unoptimized, and they're patching it and the drivers to fix the mistakes. DX10 is new. Vista is new. We may very well see big driver performance improvements.. but I think that 12000 (or some say 11000) score came from a WinXP setup.
Usually the first 3-4 months are pretty big with gains. Then it's pretty minimal from there. There were several times I got a 500-1000 point gain. It tapered back to 100, 50, or even nothing or negative after that. Usually that's with the newer versions, while the older versions drop a little.Quote:
Originally Posted by ***Deimos***
The 12k number floating around for 06 supposedly came from a system with an E6600 @ 3.6 or 400 x 9. If you look @ the 06 rankings here on the forum that's as fast as X1900 Crossfire @ around 750|850 with the same CPU support. Example, the number 7 score:
7. rob[GL] - 12004.00 - dual Radeon X1900 XT @ 756/828mhz - Intel Conroe 3.75GHz X1900XT Crossfire 756/828
So if it really puts up 12k for a single card, I'll certainly be looking @ it.
I agree! Sounds like it's going to drain stupid amounts of power. When are they going to work on improving the efficiency of these things?Quote:
Originally Posted by Master_G
The stock cooler can deal with the heat, and the stock cooler doesn't look like anything TOO special. No doubt this will have the heftiest power consumption of any card, ever.....but it won't be 300W, 250W, or even pushing the 225W envelope nV gave themselves at stock....but it will eat up a lot of power, and OCing should only push that higher.
Let's assume hypothetically for a minute that we're taking the existing control logic from the 7900, doubling the execution units, and using similar 90nm fabrication. The 7900GT uses 1.2V set at 450MHz, and OCs to about 550 without overvolting. The GTX uses 1.4V set at 650MHz.. and OCs only a little bit (700) without overvolting. The increased MHz only slightly raises the power consumption.. but the voltage makes the big difference... the GTX draws almost twice the power of the GT.Quote:
Originally Posted by Vapor
The G80 will be a big, complex chip. Like I rationalized earlier, they will use lower voltage and lower clocks to rein in the power consumption. Let's say, for example, 1.2V. But then you get Macci, PCIce, OPPainter etc.. overvolting one of these devils on cascade... 1.3, 1.4, 1.5.. maybe even 1.7V!! That alone is going to be a HUGE increase in power. It will certainly put those 1000W PSUs to good use. And the increased clockrate on such an SLI system might just push even a high-end PSU past the breaking point.
http://www.pcwelt.de/news/hardware/v...081/index.html
Can anyone here translate this?
Do a google for PC welt and it should translate.
Graphics chip GeForce 8800 GTX
Code name G80
Street price approx. 650 euros
Transistors approx. 700 million
Process 90 nanometers
Chip clock 575 MHz
Streaming processors (SPs) 128
SP clock 1350 MHz
Theoretical pixel fill rate 36800 MPix/s
Memory 768 MB GDDR3
Number of memory chips 12
Memory clock 900 MHz
Memory interface 384 bits
Memory bandwidth 86.4 GB/s
Shader Model 4.0
Direct3D version 10
OpenGL version 2.0
SLI yes
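Two of those figures can be cross-checked from the others. The bandwidth follows from the memory clock and bus width (assuming GDDR3 transfers twice per clock), and 36800 MPix/s works out to exactly 64 pixels per chip clock (that per-clock count is my inference from the numbers, not something the sheet states):

```python
# Cross-check the spec sheet: bandwidth = clock * 2 (DDR) * bus width in bytes,
# fill rate = chip clock * pixels per clock (64 inferred from the quoted figure).
mem_clock_mhz = 900
bus_width_bits = 384
bandwidth_gb_s = mem_clock_mhz * 2 * (bus_width_bits / 8) / 1000  # MB/s -> GB/s

core_clock_mhz = 575
fill_rate_mpix = core_clock_mhz * 64

print(bandwidth_gb_s, fill_rate_mpix)  # -> 86.4 36800
```

So the 86.4 GB/s and 36800 MPix/s lines are internally consistent with the 900 MHz / 384-bit / 575 MHz entries.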
What does 650 euros end up as in U.S. dollars?
€650.00 = $819.848
http://www.xe.com/ucc
Likely to not be a direct currency conversion on the retail price in the USA though.
G
Does anybody else here think that 1350MHz for the streaming processors is a really odd number? Why isn't it the chip clock of 575MHz? Why isn't it at least a rational-fraction multiple? How is nVidia able to run such a drastically different clockrate on a portion of the chip? How has nVidia managed to double the clockspeed compared to the 7900's pixel/vertex shaders? If it's true, what kind of IPC sacrifices were required (i.e. P4 vs Athlon)?
In tech....Euros are usually 1:1 with USD.
In regards to the varying clocks, they could do it with the G7x as well, and did. As for why the difference is so much....we'll probably find out in a few weeks....
Yikes 0_0
Its time to make a trip with 4 of those guys with me :D
good news
Why do I get the strange feeling that a Kentsfield over a Conroe will be needed to get the most out of the G80:rolleyes: I wonder if Intel and Nvidia worked something out with how the Intel chipset/CPU work together with the G80 (like using the 3rd and/or 4th core to help in rendering somehow) to get the most performance:confused:
They say the scores are like 1500 points higher with quad core, but I wonder if it's just cuz the CPU score is higher, thus giving you a higher final score. 05 and 03 do not include the CPU mark in the final score, so I say 05 would be a better bench for pure GPU performance.
By *getting the most out of*, if you mean running 3DMarks, then yeah, maybe. A single-core A64 @ 2.4GHz+ (preferably 1MB L2) will be all you need for gaming. Don't! Let's not talk about Alan Wake, YET.Quote:
Originally Posted by rodman
New pics...
http://bbs.mychat.to/read.php?tid=578438
WOW...Quote:
Originally Posted by rodman
that PCB looks completely different.
It's shorter. Still has 2 PCIe power connectors. But many more electrolytic capacitors. I can count quite a few large inductors too (as expected for a high-power-consumption device). Can't see under the heatsink to check the 12-memory-chip thing though. Dual-slot dense-fin heatsink similar to the 7900GTX heatsink.. probably a bit bigger/heavier. I certainly hope that nVidia can continue the tradition of the excellent 7900GTX heatsink.. elegant, quiet, and runs the card cool.
From what I see elsewhere it's 12000 with quad core and about 10500 with an X6800. I'm almost thinking it would be a good idea to grab a used 7950GX2 for cheap off someone upgrading and wait for the R600 ;) I already know I probably won't be able to resist waiting without something to keep me occupied.
LIVE PHOTOS:
http://we.pcinlife.com/attachments/f...asPyuZXTiC.jpg
http://we.pcinlife.com/attachments/f...DJS6LipkN1.jpg
http://we.pcinlife.com/attachments/f...WWn6L1AifP.jpg
http://we.pcinlife.com/attachments/f...YYuiTR2B93.jpg
http://we.pcinlife.com/attachments/f...9U3nisUrmD.jpg
http://we.pcinlife.com/attachments/f...yjlLYHGKHm.jpg
http://we.pcinlife.com/attachments/f...IUFJxmrLqL.jpg
http://we.pcinlife.com/attachments/f...0pq3Y8KcYE.jpg
http://we.pcinlife.com/attachments/f...HuOlrkcH2a.jpg
http://we.pcinlife.com/attachments/f...y28M3vhOWN.jpg
http://we.pcinlife.com/attachments/f...BqZSKiXViz.jpg
:banana: :banana:
What kind of PSU will be necessary to run that monster ?
( single 8800 I mean ..:p: )
Quote:
Originally Posted by Mykou
8800 GTX ---------450 Watts & 30A +12v.
8800 GTS ---------400 Watts & 26A +12v.
regards
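As a side note, those +12V amperage figures translate to rail wattage with simple P = V * I. This is just a sketch: the 450W/400W numbers above are whole-system PSU ratings, not the card's own draw.

```python
# Watts available on the +12V rail for each stated amperage requirement (P = V * I).
watts_12v = {card: amps * 12 for card, amps in [("8800 GTX", 30), ("8800 GTS", 26)]}
print(watts_12v)  # -> {'8800 GTX': 360, '8800 GTS': 312}
```

So the GTX spec effectively asks for 360W deliverable on +12V alone, which is why older PSUs with big +5V/+3.3V rails but weak +12V may fail the requirement despite their label wattage.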
Why are they still using analogue power regulation?
Also
thats a BBBBIIIGGGG card :eek:
They said that the 8800GTS (not GTX) is 1.8 times faster than the X1950XTX in 3DMark06
http://we.pcinlife.com/viewthread.ph...&extra=&page=1
ASUS EN8800GTS/HTDP/640M , is: USD $410 (FOB).
ASUS EN8800GTX/HTDP/768M , is: USD $540 (FOB).
8800GTS 320-bit 640MB GDDR3 @ 500MHz/1800MHz
3DMark06 8800+
8800GTX 384-bit 768MB GDDR3 @ 575MHz/1800MHz
3DMark06 11800+
http://img201.imageshack.us/img201/9...finedbgxx5.jpg
Has anyone else seen the note in the Inquirer that says nvidia has told all its partners it won't be allowing overclocking on the G80, due to high returns on the 7900GT? Probably more likely due to low yields though :).
Could see coolbits etc being blocked, but surely a bios mod would work? Just be a pain in the ass to figure out your max clocks....
Change clock, flash bios, boot up, 3dmark, change clock, flash bios, 3dmark etc etc...
Mind you ATI tried it and failed already...I'm sure in time that some clever sod will figure out how to do it in software or a driver level hack.
Matt.
X6800 overclocked @ 3600MHz (no tweaks, nothing) - 2 x 8800GTXs at default = 15500 Marks in 2k6.......BETA BIOSes - BETA drivers = it's gonna get faster...
I think this is some bad news for ATI for the time being.......Damn it....:(
Quote:
Originally Posted by PiLsY
yeh :
Quote:
NVIDIA forbids factory overclocking of G80
NVIDIA has told all its partners that it will not allow factory overclocking of its cards. NVIDIA wants vendors to stick to the core and memory clocks that it sets, to prevent fiascos like what happened with the 7900GT. Overclocked editions of that card caused a very high return rate, and NVIDIA just does not want that to happen again.
http://www.theinq.com/default.aspx?article=35244
Wow, I am really worried that one of those (let alone 2) will not fit in my rig's case:confused: The GTX looks like 2-3 inches longer than my 7800GTX:eek:
http://we.pcinlife.com/attachments/f...UMLAQvfIWk.jpg
Look at the size of the die on that baby:eek: My waterblock will never fit that:p:
That is such a damn huge card.
@hipro5
You're already playing with the new hardware, huh? I guess we shouldn't be surprised lol. I wanna see them all under LN2 now!
Just out of curiosity, what kind of stock OCs are you getting, and also what temps do you get on the stock coolers @ load?
Yes, an oldy but a goodie.
I like the bottom right hand corner :p:
It's about 2 - 3cm longer than a motherboard....:)Quote:
Originally Posted by rodman
They are not mine.....I just happened to be in the place where they had them, and tested them at noon......Quote:
Originally Posted by madgravity34
They had about 67*C measured with a laser digital thermometer at the back of the PCB.....So it might be about 75*C+ with stock heatsinks running 2k6...
The heatsink is a copy/paste of the ATI ones (pulling air from inside the case and exhausting it out the back).....The heatsink ALSO cools the core's power MOSFETs.....They were on a reference NVIDIA-chipset mobo.....:) ;)
i take it we should see some new world records with this card
i have been holding testing pics for about 2 to 3 weeks now.... i wish tomorrow were the 7th of Nov ... :(
Is that when the NDA is up? I take it it is.Quote:
Originally Posted by guess2098
:lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :lol: :fact:Quote:
Originally Posted by Gam3Ra
lmao there is always one of these when new cards come out. Gotta love it.Quote:
Originally Posted by dinos22
In the meantime I will be saving money for a new PSU. The 510w Turbocool just is not enough for R600.
you don't need to get a new system PSUQuote:
Originally Posted by RangerXLT8
get one or two of these it's perfect (single unit handles 2xX1900XT cards in Xfire no problems at all)
http://www.thermaltake.com/product/P...0099/w0099.asp
i'll buy two of these i reckon
or just follow nVidia recommendation:Quote:
Originally Posted by dinos22
450W PSU for computer system with 8800GTX
800W PSU for 8800GTX SLI...
was there some part of that which was too complicated to understand??
LOL what's with all the smartarses around hereQuote:
Originally Posted by ***Deimos***
if you had some grey matter you would work it all out easily
8800gts
http://www.tomshw.it/articles/200610...yYpicNQ3_c.jpg
note the lack of dual link SLI...
Wonder what that separate processor is near the backplate. It's blanked out in the picture on the page before...
Separate video processing unit maybe? Looks like it's near where all the TV-out stuff normally is.
Another take on nvidia's statement about no overclocking... looking at the phrasing, it could just mean no factory-overclocked cards?
Matt.
Quote:
Originally Posted by hipro5
damn..at that rate the next milestone will be 20-25K in 3dm06!! :D
Nice! :O
I wonder if these will have a cold bug with LN2?
Assuming the IHS has good contact with the die, it might be that the pots do quite well with such a large area to touch. Comments from the LN2 guys on this?
Regards
Andy
Victor, can I please ask for the size of the core?
Great scores :)
@.@ can only say...very big............Quote:
Originally Posted by |-jokker-|
btw, default clk 3dmark06 with a qx6600es @ 3.6G, nearly 12k4
and... overclk a little bit @.@
http://vic.expreview.com/attachment/1161849814.jpg
the quad core Kentsfield CPU power is horrible :slobber:
Core 2 Quad Kentsfield Q6600 @ 3600MHz (400x9)
3DMark06 CPU Score 5417 Marks
2x AMD Opteron(tm) 8220 SE processor @ 2800MHz
3DMark06 CPU Score 4782 Marks
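For a rough feel of how those multi-core CPU marks scale, normalize score per core per GHz: the quad's 5417 @ 3.6GHz here, against the 3216 an X6800 @ 2.93GHz reportedly gets elsewhere in the thread. A back-of-envelope sketch, nothing more:

```python
# 3DMark06 CPU marks normalized per core per GHz, to see how far from
# linear the quad-core result scales (numbers quoted in this thread).
def per_core_ghz(score, ghz, cores):
    return score / (ghz * cores)

kentsfield = per_core_ghz(5417, 3.6, 4)   # quad core @ 3.6GHz
x6800      = per_core_ghz(3216, 2.93, 2)  # dual core @ 2.93GHz
print(round(kentsfield), round(x6800))    # -> 376 549
```

So each quad core contributes noticeably less per GHz than each X6800 core: doubling cores gives a big absolute gain but well short of 2x scaling in this test.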
Victor,
We're definitely waiting on some 8800 GTX SLi action! BTW impressive score to say the least.
The 8220SE is an MP proc... the DP proc is supposed to be the 2220 SE...Quote:
Originally Posted by milkcafe
Difference? None... except the price is double for the 8220 proc, as they're supposed to be run in a 4x 8220SE environment.
Perkam
So when time allows, could you please measure that? Thanks a lot for the response.Quote:
Originally Posted by VictorWang
Great scores, but that's not the point :P overclock that CPU :)
He doesn't know how big it is because of the heat spreader. He can't take it off for the simple reason that removing the IHS would make the cooling system useless.
It also looks like the shim is pop-riveted in.
http://img87.imageshack.us/img87/6949/5888tb7.jpg
IHS is also 43mm square
wow, single card almost 14K, I'm impressed :)
SLI G80 will hit 18-20K for sure, and more with LN2.. I bet kingpin is happy about this
Not sure about that AMD score. But Victor got 3216 on an X6800. So I think Kentsfield 3.6GHz at 5417 vs 2.93GHz at 3216 is not so bad. It's not like anybody actually expects the 3DMark06 CPU score to scale linearly, right?Quote:
Originally Posted by milkcafe
does it really matter? It could be 200mm^2, 300mm^2 or even 999mm^2. It doesn't matter, because nVidia will produce relatively few for the hardcore enthusiast sector either way. The big surprise will be how well the architecture scales (i.e. "how efficient is it").Quote:
Originally Posted by |-jokker-|
If nVidia can get amazing compact 8800 design like 7600 out.. everything else will fall into place.
One thing I find very interesting is how 3DMark takes advantage of 4 cores, and the fact that 4 cores clocked slower than the X6800 give higher graphics marks as well. Either 3DMark is taking advantage of quad core, or something else is going on. It's as if you have 2 cores on CPU duty and 2 cores on graphics duty.
I wonder if 4 cores will help that much in gaming? Not because the game was threaded for 4 cores, but more along the lines of 4 cores working with the 8800GTX.
Oh, and I wonder how much of a boost quad core will give in SLI;)
3D graphics is very parallelizable. That's why GPUs with increasing numbers of pipelines scale so well. However, the 3D pipeline is also a serial progression of calculations: vertices->geometry->T&L->PS->rendering etc... and thus it also relies on serial CPU control and assignment of work/data.Quote:
Originally Posted by rodman
Example: first you need to update the world (i.e. your BF2 dude ran forward).. send stuff to the GPU to do, it will do all the polygons and pixels, output the frame, and then you can work on the next one. The CPU can't work on, like, 10 frames in the future.. it obviously doesn't know where you moved the mouse, what keys you pressed, and what the other multiplayer guys have done... etc
So typically, when game developers talk about multi-threading, I think they mean more along the lines of doing pre-fetching in a separate thread, or AI, or any other non-graphics-pipeline stuff. Thus the benefits of multi-core CPUs in games are probably never going to be quite as good as imagined.
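That serial frame dependency can be shown with a toy sketch (hypothetical code, not modeled on any real engine): AI-ish work is handed to a worker thread, but the frame loop still has to wait for frame N's simulation result before it can submit frame N's draw calls.

```python
import threading
import queue

# Toy frame loop: a worker thread handles the "AI/prefetch" work, but each
# frame still waits on its own result before the render step, so per-frame
# work stays serial no matter how many cores you have.
work = queue.Queue()
done = queue.Queue()

def ai_worker():
    while True:
        state = work.get()
        if state is None:       # sentinel: shut down
            break
        done.put(state + 1)     # stand-in for pathfinding / world update

t = threading.Thread(target=ai_worker)
t.start()

frame = 0
for _ in range(3):
    work.put(frame)             # hand off non-graphics work for this frame
    frame = done.get()          # must block here before submitting draw calls

work.put(None)
t.join()
print(frame)  # -> 3
```

The worker only helps if it has something to do while the main thread is busy elsewhere; here the main thread just blocks, which is exactly the "not quite as good as imagined" case described above.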
That is one powerful card. Victor what are the max clocks you have reached with it so far?
14k single card.. that's very impressive.
The Kentsfield XE won't be the most powerful CPU in 2007;
the Clovertown, Tulsa Xeon, Yorkfield XE and AMD Altair K8L will be faster:rolleyes:
AMD Athlon(tm) FX57 @ 4208 MHz (210x20) single core
3DMark06 CPU Score 1696 Marks
http://img199.imageshack.us/img199/9243/cpu42089kx.jpg
Intel Pentium M 780 2MB L2cache @ 4000 MHz (266.6x15) single core
3DMark06 CPU Score 1713 Marks
Intel(R) Core(TM)2 Solo E6600 @ 4050MHz (450x9) single core
3DMark06 CPU score 1753 Marks
Intel Core 2 Extreme X6800 4MB L2cache 2.93GHz 2x core
3DMark06 CPU score 2512
http://enthusiast.hardocp.com/images...a7RG_2_2_l.gif
Intel(R) Core(TM)2 Duo E6600 @ 3150MHz (350x9) 2x core
3DMark06 CPU score 2727
AMD Athlon(tm) FX62 @ 3676 MHz 2x core
3DMark06 CPU Score 2787 Marks
http://service.futuremark.com/compare?3dm06=329639
Intel Core Duo T2600 @ 3516 MHz 2x core
3DMark06 CPU Score 2874 Marks
http://service.futuremark.com/compare?3dm06=303322
Intel Pentium 965XE @ 6172 MHz 4x core
3DMark06 CPU Score 3314 Marks
http://service.futuremark.com/compare?3dm06=86920
AMD 2x Opteron 275 @ 2797 MHz 4x core
3DMark06 CPU Score 3824 Marks
http://service.futuremark.com/compare?3dm06=91145
Intel(R) Core(TM)2 Extreme X6800 @ 5058 MHz 2x core
3DMark06 CPU score 4387 Marks
http://service.futuremark.com/compare?3dm06=440284
Intel XEON 5160 @ 3000MHz (Woodcrest 3.0GHz 4MB L2cache) 2x core
3DMark06 CPU Score 4570 Marks
http://service.futuremark.com/compare?3dm06=501026
2x AMD Opteron(tm) 8220 SE processor @ 2800MHz 4x core
3DMark06 CPU Score 4782 Marks
Intel(R) Core 2 Quad Kentsfield Q6600 @ 3600MHz (400x9) 4x core
3DMark06 CPU score 5417 Marks
http://vic.expreview.com/attachment/1161849814.jpg
Intel(R) Xeon(R) Clovertown 2.66GHz (2x Intel Woodcrest Xeon 5150 CPU) 4x core
3DMark06 CPU score 5581 Marks
http://img180.imageshack.us/img180/5765/3dmarkkk6.jpg
AMD Athlon(tm) FX8x (Altair) 2.93 GHz (4x512KB L2cache + 2MB L3cache) 4x core
3DMark06 CPU score ???? Marks
Intel Core 2 Quad QX6700 Kentsfield 8MB L2cache @ 4104MHz 4x core
3DMark06 CPU Score 6057 Marks
http://www.tyrou.net/screens/12856-3d06.png
Intel(R) Xeon(R) 7140M (Tulsa 3.40GHz 16MB L3cache ) 4x core
3DMark06 CPU score 7824 Marks
Intel Core 2 Quad Yorkfield XE 3.73GHz 2x6MB L2cache 4x core
3DMark06 CPU Score ????? Marks
Borrowed from VW.... Comparison to CF...
http://img205.imageshack.us/img205/3...shot150vx0.jpg
http://img205.imageshack.us/img205/9...shot149zt3.jpg
:slobber: :slobber: :slobber: :slobber:
But!Quote:
Originally Posted by Dumo
CPU scores are quite different. 3400 vs 5400.
Yorkfield is my quad core upgrade. I'm just sticking with Conroe when Kentsfield comes out.Quote:
Originally Posted by milkcafe