sweet :p:
yeah, the 7800GTX 512 sickens me .. although it was done on a better manufacturing process (110nm vs 130nm i think .. not sure), but that doesn't justify the price tag of $750 at most e-tailers/retailers
but anyway, $750 (or in today's case $900) for a single graphics card that will be outdated in 1 year is just stupid ...
a lot of people burn $7000 on Dell/Alienware, and $10,000 on TEH OMEN voodoo ... and those machines will be outdated in about 1.5 to 2 years .. those last a bit longer lol
i mean 1.5 years after the launch of the X6800 + 8800GTX we will have Penryn ... Yorkfield, and prolly a 9800GTX or HD2950 ...
Surely a ratio increase of ~0.103 (about 10%) is worth $850? Does Nvidia think we are all stupid :confused: :confused:
tomorrow Nvidia is getting :owned: by AMD-ATI
bring on the red cards and own the greens :lsfight:
Another question @ offtopic: is Dave Orton still at AMD?
well i mean, for one, i hate the way they raise the price of premium graphics cards ... the GeForce 3 Ti 500 was no more than $399 ... and you look at the 8800GTX, it was $650 at launch and gouged to $700 on most sites ... and that's not even the worst ... the 7800GTX 512 and this Ultra ... $750 to $1000 ... wtf is wrong with these companies??
but the 8800 Ultra should OC higher, right?? i mean in the review they were able to get 720 on the core, that's another ~100MHz over stock, so an 8800 Ultra @ 720/2400 should be something like 10 to 20% faster than an 8800GTX @ 660/2200
still, this is just tooooo expensive ... and not worth that extra $300 to $500 (we don't know the price yet, the Inq said $850 but it's rumored to be $999)
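A quick back-of-the-envelope check on that guess, sketched in Python. This assumes performance scales at best linearly with clocks, which real games rarely do, so treat it as an optimistic upper bound:

```python
# Clock-scaling estimate using the numbers quoted above:
# an OC'd Ultra at 720/2400 vs an OC'd GTX at 660/2200.
# Linear scaling with clock speed is a best-case assumption.
def pct_gain(new, old):
    return (new / old - 1.0) * 100.0

core_gain = pct_gain(720, 660)   # core clock
mem_gain = pct_gain(2400, 2200)  # memory clock
print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")
# both come out to about +9%, right at the low end of the 10-20% guess
```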
What is this tomorrow crap that DAAMIT will own Nvidia?
I thought the NDA was lifted 5/14?
At that price, would it have killed them to throw in say 1, maybe 2 more heatpipes? I don't really care tons, because I'm not getting one, and if I did I could do water to the board, but jeez, the 7900 has almost better cooling.
Still, unless that price drops significantly I'll wait for the 8900 or their 65nm 2nd round. People buying now would be better off with either a stock GTX or a cherry-picked eVGA/BFG. This Ultra junk is money wasted as far as I can see right now.
If I had $900 to drop I would go buy ... another 8800GTX, some waterblocks for them and another 120.3 rad, now that's money wisely spent
Seriously! At that price you are basically in the ballpark of even 2 vanilla GTX's. You could absolutely get two GTS's and pocket the change.
Oh well. (stops beating the silly horse, sorry)
Been looking around: the Ultra core is labeled G80-450-A3, the GTXs I can find are G80-300-A2. Also, 0710A3 would be week 10, or mid-March, so a new batch. Anyone have a GTX with an A3?
http://img264.imageshack.us/img264/5540/image005lp2.jpg
http://img211.imageshack.us/img211/3...oardsbgwm5.jpg
Even with that pricing they will still sell out, which is why they price it like that. They are only raising prices because of demand, as most of you know ... as long as people keep buying the stuff like they do, they will keep the prices high and rising.
As far as how much "premium" graphics cards cost 6-7 years ago, yeah, prices were probably lower, but inflation should be factored in too, although it won't account for a whole lot.
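For what it's worth, a rough sketch of that inflation adjustment. The ~2.5%/yr figure here is an assumed round number for illustration, not actual CPI data:

```python
# Compound the GeForce 3 Ti 500's $399 launch price (2001) forward
# six years at an assumed ~2.5%/yr to get a 2007-dollar figure.
def inflate(price, annual_rate, years):
    return price * (1.0 + annual_rate) ** years

adjusted = inflate(399.0, 0.025, 6)
print(f"${adjusted:.0f}")  # roughly $460 -- inflation alone doesn't get you to $650+
```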
I like the little bump in the cooler, though.
Clearly worth the $900 price tag.
Ryan
btw, I still don't believe this ballpark $1000 garbage.
GTX, what you can see looks like the same PCB.
http://img239.imageshack.us/img239/8519/37868ji7.jpg
lol, look a bit closer. like, read the numbers on the ICs contained within :stick:
I said PCB, as in board.:stick:
I read the ICs, thank you, and I'm sure you know they can change depending on the fab. I say again, no obvious PCB changes at the back end.
Here's another GTX with a slightly different IC number, a new board revision? No. The obvious one is the Delta PWM chip, HAHE1040-1R0; I don't think that's significant.
http://img70.imageshack.us/img70/2207/52339kt3.jpg
Yes, the board has not changed, just minor IC revisions .. I bet that pictured card failed Ultra binning. Check the MOSFETs around the chokes, that is the most noticeable difference, IMHO, other than higher-rated capacitors. Minor changes, and it only confirms that these cards are nothing special.
something from guru3d tomorrow.
Quote:
You've seen dozens of rumours, specs, speculations, people overclocking a GeForce 8800 GTX and posting results. But you know what ? .. there really is only one place where you can get your information and actually trust that info. That's right, leave it to us.
Tomorrow (Wednesday) you'll learn about the real specs and performance; right here on Guru3D.com
the color of the capacitor should denote the temperature rating. you'll find some cards have all purple, some have purple/blue, some have all blue. I do believe that blue is the higher-rated cap. Colour coding is standard across a LOT of electrical components.
Of course, nV may have just changed the BOM to match what some other OEM's have been using for a while...
With clocks only mildly tweaked, it might be possible that individual board manufacturers plan to offer even higher overclocked versions of the ultra based on their own binning... I could see the "EVGA 8800 GTX Ultra KO Super-duper clocked burn your wallet on the HSF" version coming out soon! :D
EDIT: Actually, I think I may be right... http://www.fudzilla.com/index.php?op...d=757&Itemid=1
I was going to say, you'll see GTX with blue and purple and a mix of both.
http://img111.imageshack.us/img111/4839/99804147vh2.jpg
You're right, there are color codes for the solid poly/mica caps. The colors on electrolytics vary a lot by manufacturer. The ones on this board are 180uF, 16V, rated to 105C. They're just trying to confuse you; Sanyo uses purple for OS-CON and light blue for Al electrolytics.
http://www.edc.sanyo.com/english/search/pc.html#a07
http://www.edc.sanyo.com/english/pdf/oscon/E30.pdf
http://www.edc.sanyo.com/english/pdf/alm/E16.pdf
In case anyone cares, the color position indicates polarity, the top line is either the case type or the date, then comes the capacitance in microfarads, and finally the voltage.
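As a quick sanity check on those markings, the rated values let you work out the maximum energy each cap stores, E = C·V²/2 (using the 180uF / 16V figures quoted above):

```python
# Energy stored by one 180uF cap charged to its 16V rating.
C = 180e-6  # capacitance in farads (180 uF)
V = 16.0    # rated voltage in volts
energy_mJ = 0.5 * C * V**2 * 1000.0
print(f"{energy_mJ:.1f} mJ")  # about 23 mJ per cap
```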
yeah .. i was not stating as FACT that that's what is different, but what we have mentioned is all that catches the eye.