G70 to be power hungry



SLaY3r07
04-30-2005, 08:19 AM
Nvidia just recently asked PCI-SIG for an additional 75W of power, on top of the 75W from the connector and 75W from the mobo. This card could end up at 225W. Imagine 2 G70s in SLI :eek:

Link (http://theinquirer.net/?article=22917)
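
For what it's worth, the 225W figure is just the sum of the three 75W sources; a quick sanity check (an illustrative Python sketch, not anything from Nvidia):

slot_w      = 75   # from the PCIe slot itself
connector_w = 75   # from the existing 6-pin power connector
extra_w     = 75   # the additional allowance Nvidia reportedly asked for

total_w = slot_w + connector_w + extra_w
print(total_w)          # 225
print(total_w / 12.0)   # 18.75 -> roughly the "20 amps" joked about later in the thread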

eva2000
04-30-2005, 08:45 AM
:eek: home made heaters LOL

grimREEFER
04-30-2005, 09:10 AM
this is why there will be no g70 agp, it would need like 5 molex connectors!

R.Rabbit
04-30-2005, 01:51 PM
isn't it enough that the card will be expensive? and now your utility bill will be too!!

saaya
04-30-2005, 01:56 PM
the card won't end up with 225W :P

afaik g70 is nothing more than SLI on a package, or a dual core nv4x... nothing really new :)
it should be 120W max... depends on the clockspeeds nvidia will need to beat r520 though...

MaxxxRacer
04-30-2005, 02:08 PM
it's more than just dual core... it's gonna have more pipelines...

(sin)morpheus
04-30-2005, 02:49 PM
They haven't said anything on it yet though. So, it could be anything. We'll just have to wait until it's released. I am excited like a little school girl over the upcoming cards, as I'm sure most of the other people here are. Should be an interesting battle. :D

shadowing
04-30-2005, 03:31 PM
If it turns out that the extra juice is needed to surpass ATI with dual cores and 32 pipelines and a lot more, I'm already happy. :D

MaxxxRacer
04-30-2005, 04:30 PM
nVidia will be using the 110 or 130nm process, meaning that they really couldn't even fit a 32-pipeline dual core onto one package.. it's not possible.. and the chip costs would be 4x just in pure size. On top of that, the yields would be about 1/4 of what they are now, as the die size would be so huge that it would be unlikely at best that many cores would be good in their entirety.
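
For the curious, the yield argument can be sketched with the textbook Poisson yield model; the defect density and die area below are made-up illustrative numbers, not real fab data:

import math

D0 = 0.5        # hypothetical defect density, defects per cm^2
area = 2.0      # hypothetical die area in cm^2 for a current NV4x-class chip

yield_now = math.exp(-D0 * area)        # ~0.37
yield_4x  = math.exp(-D0 * area * 4)    # ~0.02 for a die 4x the size

print(f"{yield_now:.2f} vs {yield_4x:.2f}")

With these invented numbers the drop is even steeper than the 1/4 guessed above; the point is just that yield falls off exponentially with die area.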

Northwood
05-01-2005, 01:18 AM
there's only 3 possibilities for this extra power consumption:

1: Nvidia has moved over to dual-core for its G70 line

2: They have put 2 GPUs on 1 card, so maybe 4 GPUs altogether with 2 cards in SLI.

3: They have made a huge die with a silly amount of pipelines, and an even sillier amount of transistors.

Take your pick :)

$a1Ty
05-01-2005, 01:50 AM
knowing nvidia, i'll take number 3 :toast:

saaya
05-01-2005, 07:46 AM
heh, yeah that'd make the most sense :D

i think it's 1+3 :)

EMC2
05-01-2005, 03:53 PM
This is one reason you have to take anything from the inquirer with a BIG grain of salt :lol:

Here's what they said in that linked article:

WE HEARD THAT Nvidia asked PCI-Sig, the PCI and PCIe standardisation body, to provide some more juice for Nvidia's next generation cards. It turns out that 75W from external connector and 75W from motherboard is not enough. Nvidia wants an additional 75W for its G70 card.

This is something that was whispered at the WinHEC conference held this week in Seattle. When we dug a little bit more it turns out that there is such a proposal at PCI-Sig website. You can see it here and download it if you have a user name and password.

Notice their last two sentences "confirming" their hyperbole, where they state that "there is such a proposal, see it here".

The link is to the present 150W PCIe specification, which has been out since October 2004 and doesn't have anything in it about 225W cards. It is just the original addition of the 6-pin connector that got the PCIe spec up to 150W. :ROTF:

wickedld9
05-02-2005, 05:56 AM
Not that I'm an industry insider, but I spoke with one a few weeks back. They said that the power requirements of the new Nvidia core are going to be very high. They went on to note that there are only 2 PSUs on the market that are able to pass Nvidia's testing at the moment. They are both from the same manufacturer. There is a 3rd from another manufacturer that has finally passed but is not yet available. It's a newly designed piece, as all of their current models failed.
I guess, at least to me, this kind of confirms what I was told. They did say "It will be FAST" but that was about it. There was nothing about the R520, as they have not gotten anything from ATI.

ferrari_freak
05-02-2005, 12:07 PM
and how long away are these cards?

Northwood
05-02-2005, 12:44 PM
taking note of graphics card life cycles, I'd say probably around Q1 06.

alexio
05-02-2005, 01:02 PM
It will be so expensive because the yields won't be good and the die is just so big.

perkam
05-02-2005, 01:55 PM
The G70 might be several things, based on what Nvidia decides:

1. If the core is 90nm, then most likely it will be single core, following ATI's direction there, and 32-pipeline capable, with more pipelines a possibility.

2. If the core is 130nm/110nm, it'll be dual core based on the NV4x, and therefore 32-pipeline capable at most.

3. If Nvidia goes crazy on this, it'll be two 65nm cores on one chip with insane memory.

What we don't know is whether ATI or Nvidia will be using GDDR4/XDR memory, which could be an expensive option for now, as 1.2ns RAM already guarantees 700MHz on the memory, making 800MHz easy with watercooling, and hence 40k+ on air easily as well... but not at stock.

Those are my predictions.

Perkam
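
The 700MHz figure above follows from the usual rule of thumb that rated DRAM clock is the inverse of the access time; a quick sketch, assuming ideally rated chips:

access_time_ns = 1.2
rated_mhz = 1000 / access_time_ns        # ~833 MHz rated clock
print(f"{rated_mhz:.0f} MHz rated")      # 833
print(f"{2 * rated_mhz:.0f} MT/s DDR")   # 1667 effective

So 1.2ns chips, nominally good for ~833MHz, leave real headroom over a 700MHz stock clock.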

_Eduard_
05-02-2005, 02:01 PM
yeah right, this thing will suck like 20 amps off my 12v rail? :rolleyes:

I hope ATI has a better alternative

EMC2
05-02-2005, 05:27 PM
Not that I'm an industry insider, but I spoke with one a few weeks back. They said that the power requirements of the new Nvidia core are going to be very high. They went on to note that there are only 2 PSUs on the market that are able to pass Nvidia's testing at the moment. They are both from the same manufacturer. There is a 3rd from another manufacturer that has finally passed but is not yet available. It's a newly designed piece, as all of their current models failed.
I guess, at least to me, this kind of confirms what I was told. They did say "It will be FAST" but that was about it. There was nothing about the R520, as they have not gotten anything from ATI.

Let's discuss information publicly available...

As of today, there are 4 PSUs that are NVidia SLI certified... 2 from PCPnC (one of which I own), one from Silverstone that has been available for a few weeks, and the recent addition of one from Enermax. If you look at the output rails on the 4 units and their allocation by device(s), the lowest common denominator is a 17A rail split between two SLI power connectors, meaning at most 8.5A per connector. The 150W PCIe specification calls for 6.25A @ 12V on the 6-pin PCIe power cnx. There is NO way you could pull an extra 75W from a third power cnx. The extra 2.25A available above the requirements for the present 150W spec is only 27W - not even close to 75W - and is there for two predominant reasons... commonality with the other 12V rails, helping to keep design and manufacturing costs down - and - added safety margin. A lesser consideration is the simple fact that the manufacturers know that the cards will be overclocked.

Peace :toast:

{edit} Oh... and 150W maximum *is* a lot of power when compared to previous 75W maximum cards ;)
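
A quick sketch of the rail budget EMC2 describes, using only the numbers from the post (17A rail, two SLI connectors, 6.25A @ 12V per 6-pin under the 150W spec):

rail_a = 17.0                   # lowest common 12V rail among the certified PSUs
per_connector_a = rail_a / 2    # 8.5A available per SLI power connector
spec_a = 6.25                   # required per 6-pin by the 150W PCIe spec

headroom_a = per_connector_a - spec_a   # 2.25A
headroom_w = headroom_a * 12            # 27W of margin, nowhere near an extra 75W
print(per_connector_a, headroom_a, headroom_w)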

xman01
05-02-2005, 06:58 PM
ridiculous
with the s*&t components i have, i already have a heater

that would just make my room a furnace

MaxxxRacer
05-02-2005, 10:31 PM
So this all begs the question: what is the stock cooler gonna look like, and how many slots will it be...

i can see it now.. 4 slot design with an adapted SP94 strapped to it.. :slap:

ScHpAnKy
05-03-2005, 08:02 PM
So this all begs the question: what is the stock cooler gonna look like, and how many slots will it be...

i can see it now.. 4 slot design with an adapted SP94 strapped to it.. :slap:

That's actually the first thing that popped into my head, too! :woot:

NotoriousMike
05-05-2005, 02:59 PM
I think people should start furnishing their homes with miniature nuclear reactors to offset their power consumption from utilities. 65nm would be nice, but who on earth would give them access to that type of equipment?

saaya
05-08-2005, 03:33 PM
taking note of graphics card life cycles, I'd say probably around Q1 06.
:confused:

perkam, g70 is already sampling, so it's unlikely it's 65nm... there wasn't enough time to get a 65nm part ready... plus nvidia wouldn't jump from 110nm to 65nm.
all manufacturers usually try to stick to the same chip size they have been successful with. for nvidia this means double the transistors in 90nm, or the same number of transistors in 110nm but with 2 chips on one package. i think g70 is 2 improved nv4x chips in a dual core setup
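
For reference, the "double the transistors" figure follows from the usual inverse-square density rule of thumb; a rough sketch that ignores real-world scaling losses:

# Idealized transistor-density scaling: density ~ 1 / (feature size)^2
full_node = (130 / 90) ** 2   # ~2.09x - the "double the transistors" case
half_node = (110 / 90) ** 2   # ~1.49x for a 110nm -> 90nm half-node shrink
print(f"{full_node:.2f}x, {half_node:.2f}x")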