I did :confused: Which part? Are you referring to Intel's monopoly abuse (if you are, I 110% agree) or something else? I'm not sure if I'm not following what you are saying correctly, or if you aren't following what I'm saying correctly! :p:
ain't we there yet :/
i'm hoping they've been taking lessons from ATI on how to keep a secret; recently ATI has been surprising everyone at its product launches with features and performance. AnandTech's recent article on the birth of Cypress gives good insight into how they compartmentalize certain aspects of design to control the number of people who know what's going on. also, they appear to be using different code names for the same feature across departments to confuse anyone on the outside looking in... if Nvidia has begun doing similar things, that would explain the silence... i hope.
Big LOL to the Nvidia marketing director for doing this: :shocked:
A marketing director for Santa Clara-based Nvidia is facing a misdemeanor charge after he allegedly told a flight attendant at San Francisco International Airport that he had a bomb in his jacket, prosecutors said Monday...
Full Story and Source - http://www.mercurynews.com/breaking-...nclick_check=1
EPIC FAIL for Yushing Lui :ROTF:
no wonder things at Nvidia are not shaping up with employees like this :shakes:, and no sign of Furbi yet.
Wasn't the 2% yield myth debunked a couple of months ago? It was something lost in translation or something like that...
Correct; it was supposedly ~10%, but in any case higher than 2%.
It was neliz who posted the info before Charlie posted his 2% article, neliz also somehow knew that Charlie would get bad info. ;)
http://forum.beyond3d.com/showpost.p...postcount=2123
AMD's biggest surprise was the 4800s (and mayyyyybe the 3800s somewhat), but definitely not the 5800s and not at all the 5700s. The performance of the 5800 was expected, and it was even disappointing that AMD didn't take any risks. Juniper performance was flat-out disappointing.
well he might be right... after all, Igor is the king of destroying credibility... his own, that is... :D
compute shader for games? idk... except for AI and physics, what could you use it for? and Nvidia will push devs to use PhysX instead of compute shaders, so all that's left is AI... and that works very well on the CPU, and nobody likes to touch AI programming because it's so damn complex...
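for reference, here's a rough sketch of the kind of data-parallel work we're talking about (CUDA here instead of an actual compute shader, and the kernel is purely illustrative, not anyone's real code)... physics maps well to the gpu because it's one thread per particle with no synchronization:

Code:
// Hypothetical sketch: a data-parallel physics step, one thread per particle.
#include <cuda_runtime.h>
#include <cstdio>

struct Particle { float3 pos; float3 vel; };

__global__ void integrate(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    // Simple Euler step under gravity; every particle is independent,
    // so a million of them update in parallel with no synchronization.
    p[i].vel.y -= 9.81f * dt;
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;
}

int main()
{
    const int n = 1 << 20; // ~1M particles
    Particle* d_p;
    cudaMalloc(&d_p, n * sizeof(Particle));
    cudaMemset(d_p, 0, n * sizeof(Particle));
    integrate<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
    cudaDeviceSynchronize();
    cudaFree(d_p);
    printf("stepped %d particles\n", n);
    return 0;
}

a DirectCompute shader would express the same one-thread-per-element pattern with [numthreads] and a Dispatch() call.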
damn... i can't find it... it was an Nvidia press event... very recently... like a month or two ago... and they showed that they spend a lot on R&D and that it steadily increases year over year... they had a line graph showing R&D spending...
i don't remember where i saw it... it was a site i never visited before... i think i got linked there from the news section here... hmmm
well Nvidia DOES have a booth at CeBIT this year... they didn't in previous years, iirc... so i think they will "launch" it, but whether it will actually be for sale after CeBIT... who knows...
new one? link? :)
hmmm... competition caught up, and at the same time their new architecture is delayed by at least 2 quarters if not 3 (it was supposed to come out before november originally), their GPGPU segment fails to take off, their Tegra segment fails to take off, they lost their chipset business almost entirely thanks to Intel locking them out and AMD not extending their license either, and the gaming industry moves more and more away from the PC as a gaming platform toward consoles and mobile devices instead... yeah... this is one of Nvidia's best quarters ever... 0_o
hah, that's hilarious :D
in germany there was a similar case a few months ago, where a man claimed he had a bomb in his pants... referring to his schlong... unfortunately, mr. testosterone insisted on this joke when asked repeatedly by security personnel, so they took him into custody, and he and his family missed their flight; the airline refused to give them new tickets since it was his fault they missed it. he ended up losing around €4k in ticket costs, had to pay for his own interrogation, which was another few k€, plus a fine of a few k€ for obstructing traffic and operations of the airport, PLUS his wife/gf supposedly left him with the kids after this little joke... :lol:
Yeah, ATI's biggest hit lately has been the 4800s. Nvidia's has been G80/G92. I still think the GF100 will be powerful; they just don't have enough full parts at the highest speed bins to start bragging and throwing around benchmarks. GF100 is quality performance IMO (they just can't deliver them in large quantities - hehe). Conversely, the chip's design is more "quantity" for its time. I think the "Fermi2" refresh will be a hit. This first 512-shader GF100 part should be amazing, yet hard to find.
It's awesome how people fail to see that the R600 design also took a long time, and the card ended up big, hot, and disappointing; but AMD used nearly the exact same design in the 3000s, 4000s, and 5000s, and all those cards kicked butt.
If this is a good design, it might fail for Nvidia this time (which I believe it will), but it could be the basis for some awesome products later this year or next year. An underwhelming product doesn't mean that Nvidia is dead.
A great deal of graphics effects can be implemented as massively parallel computing tasks, getting around the limitations of the traditional shader pipeline and repetitive texture fetching: the various ambient occlusion methods, full-res lens effects, order-independent transparency (a blessing for all the deferred engines out there), and many more.
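As a rough illustration (a hypothetical CUDA sketch, not any shipping engine's code), a full-res post effect like this boils down to one thread per pixel with no rasterizer involved:

Code:
// Hypothetical sketch: a full-res lens-style post effect as a pure compute task.
#include <cuda_runtime.h>

__global__ void vignette(float* img, int w, int h)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    // Darken each pixel based on its distance from the frame center.
    float dx = (x - w * 0.5f) / (w * 0.5f);
    float dy = (y - h * 0.5f) / (h * 0.5f);
    img[y * w + x] *= 1.0f - 0.5f * (dx * dx + dy * dy);
}

int main()
{
    const int w = 1920, h = 1080;
    float* d_img;
    cudaMalloc(&d_img, w * h * sizeof(float));
    // ... the renderer would fill d_img here ...
    dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
    vignette<<<grid, block>>>(d_img, w, h);
    cudaDeviceSynchronize();
    cudaFree(d_img);
    return 0;
}

Ambient occlusion or order-independent transparency follow the same per-pixel pattern, just with more work (and shared memory) per thread.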
Source: http://www.fudzilla.com/
Not so bad if you ask me. Higher TDP as expected, but it looks like the cooler will be more decent, so it's not a big deal.
Quote:
We got some word that Nvidia's upcoming Fermi-based GeForce GTX 480 single-GPU card should be about as hot as the GTX 285. These are still preliminary results, as Nvidia still has to send out the final cards with the final coolers, but this looks like a slightly better result than the GTX 480 that we saw at CES 2010.
The TDP of the GTX 480 will end up higher than on the GTX 285, but with the help of a good cooler the thermals should be very close.
The noise level should also be very comparable, and the card should not end up much noisier than the GTX 285. We heard numerous times that the GTX 480 will end up with a different cooler than the one we saw at CES 2010 and that was shown to Nvidia's loyal press at Editor Days.
All in all, you won't burn your fingers on the GTX 480, but it won't be cold either.