Interesting article. Any word on GK110 "in three days" or are they blowing smoke?
http://vr-zone.com/articles/how-the-...ed-/15786.html
Oops, sorry, misunderstood. I've seen it pretty frequently. Actually, the monitors I use right now will push 75Hz at 1280x1024, but none of them do it as well as a CRT could.
It depends on the technology used. For this discussion we're talking about Nvidia 3D, which is shutter style: one lens closes, then the other, alternating back and forth, and your brain fuses the two streams into one image. It does exactly what you say, putting smooth 60Hz video into each eye to create a smooth 3D image. A 60Hz display would be useless with this style; you'd need something like polarized glasses, or those awesome monitors that had two panels stacked behind each other where the glasses filtered out the front panel for one eye, lol.
Those TVs are almost always polarized, as 120Hz shutter-capable displays are very costly at that size. The "120Hz", "240Hz" and "600Hz" sets (the last usually plasma) typically just have those lovely "SmoothMotion" or "CinEngine" chips that artificially double the movie's frames to create that effect. Hell, I think some of those might even use shutter style and fake the 3D, I'm not sure; I just know shutter is uncomfortable for me.
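The arithmetic behind the posts above can be sketched in a few lines. This is a minimal illustration, not anything from Nvidia's software; the helper name `per_eye_rate` is hypothetical.

```python
# Sketch: why active-shutter 3D needs a 120Hz panel.
# With shutter glasses the panel alternates left-eye and right-eye
# frames, so each eye effectively sees half the panel's refresh rate.

def per_eye_rate(panel_hz: float) -> float:
    """Effective refresh rate each eye sees when frames alternate eyes.

    Hypothetical helper for illustration only.
    """
    return panel_hz / 2

# A 120Hz panel gives each eye a smooth 60Hz picture:
print(per_eye_rate(120))  # → 60.0

# A 60Hz panel would leave each eye with a flickery 30Hz,
# which is why a 60Hz display is useless for shutter 3D:
print(per_eye_rate(60))   # → 30.0
```

Polarized 3D avoids this halving of the temporal rate by showing both eyes' images simultaneously, at the cost of splitting spatial resolution instead.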
They're referring to GTC. They will most likely talk architecture and HPC. Don't get your hopes up for any consumer product info or release schedules. Should still be interesting though.
Quote:
Opening Keynote - May 15 @ 10:30am PT
NVIDIA CEO and co-founder Jen-Hsun Huang will kick off the conference with the opening address. He'll review the dramatic and growing impact of GPU technology in science, industry, design and many other fields.
And, he'll announce some big GPU news that you'll not want to miss.
For those of you who can't make it in person, we will provide a video livestream from the keynote.
Both NVIDIA and AMD always rename their lower-end SKUs. Heck, we're still seeing Juniper, Redwood and Cedar parts from AMD.
Sounds like AMD may need another price drop again...
Either way, while we watch the damage get done on the gpu front back and forth I'm going to sit back with a coke and a bag of chips. Picking up my first 3d monitor (deal is FAR too good to pass up) tomorrow, so I'll be ready to have some real fun when prices fall to where I believe they should be.
Hold on, hold on, hold on...
What's all this about GK100 and GK110?
Was it "always" back then? Or a trend triggered by nVidia and adopted by AMD, or the other way around?
By the way... GTX 670 on TigerDirect already.
http://www.tigerdirect.com/applicati...ywords=GTX+670
http://i46.tinypic.com/2hqruvm.jpg
What's that, a ninja launch?
By the look of things, ALL companies using TSMC's 28nm process are experiencing the exact same short-supply problem, not just nVidia. But since this is Charlie we're talking about here, publishing such facts would not further the narrative he is always trying to push.
http://www.digitimes.com/topic/28nm_...c/a001191.html
ATI has been rebranding Radeons since the beginning. The original Radeon SDR was rebranded as the 7200, and the Radeon VE became the Radeon 7000; the Radeon 7000 itself was also just a 7200 without a T&L unit. The 8500 series was a new design, but they rebranded it for the 9000, 9100, and 9200.
No, it's an early release / broken NDA.
Official release is May 10th, 2 pm BST in the UK. I can't wait to get mine; I'll probably get an EVGA reference card or a longer-PCB custom version, depending on the prices. (I'm not paying for anything with a blue PCB, and Gigabyte, Inno and Galaxy / KFA2 GTX 670s have all been pictured with blue PCBs, so it's EVGA reference or an EVGA / MSI custom design for me.)
I like how Charlie is always saying, "NVIDIA will be bankrupt soon!" while they just keep rolling along making tons of money.
Market cap this morning: AMD + ATi $5.16b, NVIDIA $7.68b.
Q1 2012 results: AMD + ATi $590m in the red, NVIDIA $135m profit.
Dual GPU: AMD's is MIA; NVIDIA's 690 launches to reviews calling it the best video card ever.
Flagship: AMD's gets price drops and slow sales; every review site in the world says NVIDIA's is the card to buy, and it sold as many cards in one month as the 7970 did in two, outselling the 7970 9:1 (how can this be if no one has any, Charlie?!).
AMD 7950/7970: about to get another price drop and lower sales; NVIDIA 670 rumored to launch tomorrow, with leaks saying it trades blows with the 7970 at 7950 prices.
Ace reporter Charlie D. looks at all this and blubbers, "NVIDIA is re-badging two low-end OEM chips and is selling out of 680s! They're imploding! They'll be broke soon!" :shakes:
How did this clown get a web site? Oh yeah, it's funded by AMD advertising.
Note to AMD: The fable of the boy who cried wolf applies to Charlie. He can yell about NVIDIA going broke every year all he likes, but when it doesn't happen, it becomes trite.
Either way you look at it, Nvidia is slowly getting squeezed out. With iGPUs becoming more powerful very quickly and very few games needing that much power to run, Nvidia is in trouble. What's going to happen when they move to the next smaller process? They're already having so much trouble and blaming everyone else for it. Maybe Intel will do a hostile takeover and assimilate Nvidia's tech for themselves RIF :p
They depend a lot on their server-level products now, so it's going to be good to watch what AMD offers in that segment to see if they are really in trouble or not.
I'm no fanboy, but I believe Nvidia is far from being squeezed out. I would agree, however, that GPUs (for their originally intended purpose) may slowly be squeezed out as a segment. High-end GPUs are here to stay, though; maybe not for gaming, but they have been redefining computing as an alternative to supercomputers when used in clusters. Look at the top ten supercomputers in the world today and you begin to see a trend.
I dream of the day when I can build my own affordable cluster of CUDA cores, or AMD's offerings, based on two or more interconnected next-generation high-end cards.
We'll always need more power. New consoles are coming, which will hopefully mean a leap for the desktop market too. Then there are 4K resolutions slowly taking off.
The same thing can be said for AMD: the more powerful iGPs become, the less revenue potential there is for their own parts in the dedicated graphics space. In addition, the R&D spent developing those parts seems to have come at the expense of AMD's server and high-end desktop processors. Server parts used to be huge money for AMD, as that was the area where they remained most competitive, but their marketshare right now is 5.5%. It seems like after purchasing ATI, their CPU parts have been either treading water or bailing it out, and the same goes for their desktop side.
The only area where AMD appears to be gaining marketshare from a CPU point of view is the mobile space, but as you said, iGPs are getting fast quickly, and what happens when Intel's graphics parts become fast enough, which could very well happen with Haswell (not to mention this market might shrink in the future because of ARM-based processors)? With Intel having the stronger brand and the faster CPU part, AMD will be :banana::banana::banana::banana:ed if they don't come up with a faster CPU. The public cares a lot about a decent GPU, but they care far more about the brand and CPU performance, hence Intel's massive lead over AMD. People are willing to take a hit on GPU performance for branding and CPU power, and when Intel's graphics parts become fast enough for the mainstream, people won't buy AMD processors anymore except for budget builds. That is an AMD that cannot stay afloat.
I think NVIDIA's chances of succeeding in the professional and supercomputing spaces are a lot better than AMD's chances of developing a competitive CPU architecture; both are necessary for the long-term viability of the respective companies.