Where do you find that info? Quote:
Originally Posted by Tim
I doubt the core will scale that high on 110nm... architectural limitations... And I doubt it'll be 90nm, otherwise it wouldn't need that cooler.
Perkam
The X1800 XT is 90nm and runs extremely hot. To go from 110nm to 90nm you have to make a lot of changes in the architecture. Quote:
Originally Posted by perkam
It's not like 90nm uses 100 - (90/110 * 100) ≈ 18% less power at the same frequency and vcore. It isn't that simple (rough numbers below).
He made up half of what he said himself.
Not enough rumours? Make your own ;)
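To put rough numbers on that 18% figure: 90/110 is a ratio of feature sizes, not of power, and even die area only scales with the square of that ratio. A back-of-the-envelope sketch (illustrative arithmetic only, not a power estimate). Code:
# Naive 90nm-vs-110nm ratios -- neither number is the actual power saving.
old_node = 110.0  # nm
new_node = 90.0   # nm

linear_ratio = new_node / old_node   # ~0.82
area_ratio = linear_ratio ** 2       # ~0.67, area shrinks faster than the node number

print(f"naive linear saving: {(1 - linear_ratio) * 100:.0f}%")  # ~18%
print(f"die-area saving:     {(1 - area_ratio) * 100:.0f}%")    # ~33%
# At the same frequency and vcore, power actually depends on capacitance
# and leakage, which the node number alone doesn't tell you.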
JESUS guys...... my English isn't great, but where did I say it's gonna be 90nm??? Quote:
Originally Posted by Ubermann
Sources told the Inq that it ran at 800MHz (some test samples)
Now see what I wrote..
Maybe the Inq got fooled into thinking it was a 110nm core? Maybe it was just a test sample of an upcoming 90nm product (Q1)... The Inq got all excited thinking that this GFX was the 110nm card that we expect..... geez. That's all I said. :slap: Quote:
for all we know the 800MHz was done on a 90nm core.. and not on the 110nm.... (not saying that it will be 90nm, for those who read the sentence wrong)
Let's all just wait and see how high the improved G70 core will go....
Should be higher than the XT and GTX I guess... but I'm no hardware expert... :) Quote:
Do we have any idea of the power consumption of this thing? Looks brutal performance-wise, but power usage is a consideration, especially for those who plan to SLI it. If you need to buy a 600+ watt ($250+) PSU just to run them, that's a price bump to consider. SLI 6800GTs run just dandy on my cheap fort 400, but I highly doubt this would as well.
Ditto. Tim said "for all we know",
as in simply putting up a suggestion.
@Alexio
Yeah, too many people think that going from 110 to 90nm means the core uses less power.
Simply look at Intel and AMD.
And for the 18% bit: in hardware, 1 + 1 is rarely 2.
It's usually hard to say how much power is saved when shrinking a die from 110 to 90nm, because the companies usually also put other power-saving changes into the core along with a die shrink.
But eh, does anyone have a chart of how much power the X1800 cards use at idle and under load compared to the 7800 and X850?
And even if the X1800 uses a lot of power while on 90nm, that doesn't mean the 7800 will too if it is made on 90nm.
Quote:
Originally Posted by DilTech
Since when does nvidia have access to different RAM than ATi?
Just because they use 1.26ns currently doesn't mean that they can't get 1.1ns. ATi themselves said that this memory controller is very future-proof, so technically they could just switch out the ICs and modify the BIOS.
But I wasn't talking about an overclocked X1800XT, I was talking about an R520-based card with revised power circuits and 1.1ns RAM. As I said earlier, it seems like the core of this card is limited not by itself but by some other unknown factor; people are hitting a wall with them, and maybe ATi did this deliberately...
Anyways, enough with my conspiracy theory; ATi shows no signs of this occurring, and certainly something would have been leaked if it were going to happen.
http://graphics.tomshardware.com/gra...d_land-13.html Quote:
Originally Posted by Starscream
As you can see, the X1800XT draws a LOT of power. However, the X1800XL isn't nearly as bad, drawing slightly less power than the 7800GTX. The X1800XT's massive power draw has something to do with the high clocks, I think...
Shrinking a die doesn't really save power.
Lowering the voltage does.
But since they always also increase the clock speeds, the power consumption rises again.
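A rough sketch of that tradeoff, using the usual dynamic-power approximation P ≈ C x V^2 x f; the capacitance, vcore and clock numbers below are made up for illustration, not real G70/R520 figures. Code:
def dyn_power(cap, volts, freq_mhz):
    """Relative dynamic power, roughly C * V^2 * f (arbitrary units)."""
    return cap * volts ** 2 * freq_mhz

base    = dyn_power(cap=1.00, volts=1.40, freq_mhz=430)   # hypothetical 110nm part
shrunk  = dyn_power(cap=0.85, volts=1.30, freq_mhz=430)   # shrink + lower vcore, same clock
clocked = dyn_power(cap=0.85, volts=1.30, freq_mhz=550)   # same shrink, clocks pushed up

print(f"shrink + lower vcore, same clock:   {shrunk / base:.2f}x the power")
print(f"shrink + lower vcore, higher clock: {clocked / base:.2f}x the power")
# The voltage drop helps a lot (V is squared), but pushing the clock up
# claws most of it back -- and this ignores leakage entirely.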
Wrong... power is not a function of voltage in the sense that changing the voltage doesn't automatically make the system use more power... amps lower as voltage rises and vice versa... there is much more to this than playing with voltages... :nono: Quote:
Originally Posted by Der_KHAN
Whoa!!! I'm like, oh look, it's an informed member posting in the news for the first time... then I see... NO, it's fcg with the freaky Matrix cat avatar lol... Quote:
Originally Posted by freecableguy
As for Tim and rumours, we generally have a greater tolerance for speculation here than in other sections, but yes, Ubermann is right: backing up your statements with links or quotes helps your credibility while enhancing the discussion :)
Perkam
@Perkam
Tim got the 800MHz from The Inq.
Where they got it from, no one seems to know, although it would have been nice of them to put their source for that in the article.
BTW, today is the 7th.
Isn't there supposed to be a launch today?
http://www.scan.co.uk/
The knock-out thing.
My money is on the 7800GS today.
Not the 7th here yet :P but there is supposed to be a launch tomorrow... I'm betting on the 7800GS as well :)
Then this 512MB GTX on the 14th... :toast:
The X1800XT doesn't use much more power than the 7800GTX 256MB... Quote:
Originally Posted by RAMMAN
It has to do with faster RAM speeds (by nearly 50%!) and double the amount of RAM.
The 625MHz R520 XT core, on the other hand, probably uses less power than the 430MHz G70 core.
:confused: The universe that you live in must be backwards. ;)
Quote:
Originally Posted by freecableguy
Well, truthfully, Grayskull, Watts are a product of Volts x Amps, so given that most computer PSUs I have tested do not change amperage when you change voltage, increasing the voltage and leaving the amperage the same will result in more power, because watts are a measure of work. However, the card will not use more power unless a card parameter is changed (i.e. overclocking), because then you are looking at a different unit of measure, Watts per unit of time.
Certainly, V x amps = watts. Lower volts = lower watts = less total power.
Drop the volts and power consumption goes down; smaller processes need less voltage to run at higher speeds...
I really don't know how it all works, but my guess is that adding more voltage to a circuit is going to generate more heat somewhere; that extra power has to go somewhere, as there are going to be more electrons trying to flow through the same circuit.
Volts / ohms = amps, so would that not mean increasing the voltage to a circuit would in turn net an increase in amperage, which would simply be output as heat somewhere?
Kinda like increasing the voltage to a light bulb: the resistance across the filament is the same, but increasing the voltage causes the filament to burn brighter/hotter as the electron flow increases.
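For the light-bulb case that is exactly how the numbers work out; a quick sketch with a made-up filament resistance, purely illustrative. Code:
# Fixed resistance, so Ohm's law sets the current: I = V / R,
# and the power dissipated as heat/light is P = V * I = V^2 / R.
R = 20.0  # ohms, made-up filament resistance

for volts in (10.0, 12.0, 14.0):
    amps = volts / R
    watts = volts * amps
    print(f"{volts:4.1f} V -> {amps:.2f} A, {watts:.1f} W")
# More voltage across the same resistance means more current AND more
# power, which is why the filament burns brighter and hotter.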
Nonono... amperage * voltage = wattage. Therefore lower volts != lower amps: at the same wattage, a downward shift in one will cause the other to go up (quick numbers a couple of posts down). Since less voltage is needed with smaller processes, amps can go up within the same power envelope, which means you can power more transistors. Or, with the exact same number of transistors, you can have reduced power. Quote:
Originally Posted by Revv23
OK, back on topic... it's the 7th and I see no 7800GTX 512MB...
Next rumored date coming up is the 14th.
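And quick numbers on the same-wattage point a couple of posts up, for a load that draws a fixed amount of power (made-up figures, not a real card's draw). Code:
# Fixed power draw, so the current is whatever it takes at the
# supply voltage: I = P / V.
P = 120.0  # watts, hypothetical card

for volts in (12.0, 5.0, 3.3):
    amps = P / volts
    print(f"{volts:4.1f} V rail -> {amps:5.1f} A for {P:.0f} W")
# Same wattage, so dropping the voltage pushes the amperage up:
# lower volts does not automatically mean lower amps.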
Shadow..... it's 7 CET in the fricking morning.... and I still can't get an X1800XT :owned: :D Quote:
Originally Posted by Shadowmage
Anyway.....patience.
The 7th isn't for the 512MB GTX, the 14th is, Shadow. The 7th we're all completely clueless about; we're *guessing* it's the 7800GS.
Now, stop with the flamebaiting. It's against the rules now, you know.
:::edit:::
Shadow, here's the card...
6800GS
http://www.monarchcomputer.com/Merch...ct_Code=190443
Signed and delivered, November 7th.
:owned:
That thing will have my 6800GT for breakfast!!! :eek:
for $230! :D :D :D
Not sure about that... 12 pipes but a 425MHz core... hrm... it'll be close... looks like they found something to make out of the super extra marginal 7800GTX cores... ;)
WHOA... looks like something else new... a 512MB 6800GT?... er, maybe this is old... lol... news to me though...
Yeah, the 512MB 6800GT is old... Quote:
Originally Posted by revenant
Edit: Yeah, it's 12 pipes.... guess it ain't gonna beat my 6800GT? I paid $500 for that card :D Now you can have similar performance for $230, crazy.