Nothing wrong with Nvidia profiteering, and it would be no different from partners profiteering. The question is whether Nvidia is going to be culling some of its partners, and if so, how many?
The company that's selling them has them in stock; it's coming from their own warehouse. The newsletter came from Yoyotech, not Nvidia, so how on earth would 'our warehouse' mean 'Nvidia's warehouse'?
I worry about this forum; every leap of logic seems to come from illegal drugs.
http://www.buy.com/prod/engtx580-2di...218169534.html
No price listed yet...
No.
You are mistaking a placeholder ad for an actual ad. Big mistake.
Plus, NVIDIA runs their own marketing campaigns in parallel with board partners' own.
Hence things like THIS.
Plus, I can promise you the ad you showed is not from NVIDIA but rather a creation from a retailer. So where does that leave your logic?
A 950MHz overclock sounds nice, especially on air. The GTX 480 was particularly attractive on water, and it looks like things just keep getting better! :up:
Perfect actually.
Asymmetry of information fed into a logic system does not debase throughput.
Picture related.
http://img607.imageshack.us/img607/3...9097119726.png
The projects are ongoing after all these years. XS is littered with dupes.
What the... I think besides a couple of guys, there are very few NV guys who stick out; they have so little at the moment.
Besides silicondoc (who got banned crazy fast and was crazy fanatical), the AMD side seems quite a bit stronger in terms of people who look like shills.
these streets keep me rollin :D
One of the key employment terms is to act rationally and fit in.
The more trollish AMD posters are, the less likely they are to be shills.
I wouldn't class myself on either side, but it's undeniable that I hold a more pro-AMD stance, since more market manipulation and nerfing of end users has originated from Nvidia's practices than from ATI/AMD's.
It gets very boring farming new info on cards for both sides. That does lead to me becoming more tense and trollish myself, and for that I apologize.
Maybe I'll take a break from 20-tabbing Google searches.
I think there will be enough reviews to go around.
Since NVIDIA spilled the beans to a certain extent, I did want to bring something up: the effect of temperature on GF100's power consumption numbers. Some may remember that lowering the temperature by about 20 degrees resulted in a significant drop in overall power consumption for GTX 480 cards, largely because transistor leakage falls as the die runs cooler. TPU experienced this with the AMP! Edition.
One may wonder what this new vapor chamber design will do for full-load, in-game power needs. :up:
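Out of curiosity, here's a rough back-of-the-envelope sketch of why a cooler die can draw noticeably less power at the wall. It assumes a simple exponential leakage model; the wattages and the leakage-doubling interval are purely illustrative placeholders I picked, not measured GF100/GF110 numbers.

[CODE]
# Rough leakage-vs-temperature sketch (illustrative numbers only, not measured data).
# Static (leakage) power in CMOS rises roughly exponentially with die temperature,
# while dynamic (switching) power is mostly temperature-independent.

def total_power(temp_c, dynamic_w=150.0, leakage_w_at_90c=100.0, doubling_deg=25.0):
    """Estimate board power at a given die temperature.

    dynamic_w        -- switching power, assumed constant with temperature
    leakage_w_at_90c -- assumed leakage at a 90 C reference die temperature
    doubling_deg     -- assumed temperature rise that doubles leakage
    All values are placeholders chosen for illustration.
    """
    leakage = leakage_w_at_90c * 2 ** ((temp_c - 90.0) / doubling_deg)
    return dynamic_w + leakage

for t in (90, 70, 50):
    print(f"{t:3d} C -> ~{total_power(t):5.1f} W")

# In this toy model, dropping the die ~20 C trims total power by roughly 40 W,
# which is the same qualitative effect the watercooled/AMP! reviews reported.
[/CODE]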
Whoa, is this for real? Hired forum members :eek: Wonder how much they pay for trolling online :D
In light of that issue, I also recall a couple of reviews (I believe it was EVGA) where the watercooled version did in fact consume less power, even when overclocked, compared to the aircooled version of the same card. Sounds like Nvidia is making good use of thermal performance this time around :D
Yes, this is a very interesting subject. BTW, I'm surprised that the vapor chamber isn't patented. I thought XFX was the first to market with the technology, or am I wrong? Anyway, patents could explain why Nvidia is only using this tech now and not with the 480s. I'm just guessing, but if someone could clear up the deal with the vapor chamber I would appreciate it very much, thanks.
Yeah, all he did in the presentation was talk about the cooler, whose tech we already have, and how quiet the card is... um, yeah, also because of the cooler.
Nothing in that presentation says anything about the actual chip. And yeah, I know it wasn't a chip or card launch, so why even mention that stuff? Why not just say, "Hey, we finally got a good aftermarket HSF and it's cooler and quieter"? That's all he had to say. :confused:
And as some other poster on here said, that city demo has been around on YouTube for over a month, so...? :confused:
He wouldn't let anyone see the card in the case because he didn't want to show the wood screws I guess. :ROTF:
I bet inside the case was some GTX480 SLI setup and they just removed the video connectors from the 2nd card and replaced it with a solid plate to make it look like a single card was installed. :D
Hey, I'm just saying...its possible. ;)