:rofl:
FYI - I don't think anybody is trolling here. And certainly not ATI fanbois here.
I loved the $100 specials on the 8800GT, and "supported" nVidia with my wallet over the years.
But looking at the FACTS, the situation looks grim.
chip    date    #months_difference
GF1     Oct99   -
GF2     Apr00   6
GF3     Feb01   10
GF4     Feb02   12
FX      Mar03   14
6800    Apr04   13
7800    Jun05   15
7900    Mar06   9
GTX     Nov06   8
G92     Nov07   12
280     Jun08   7
Fermi   Dec09?  19 months and counting since G200
No matter how you put it, nVidia has slowed down tremendously. It took 12 months for a simple die shrink of the 8800GTX!
It's already 19 months since the G200 line, and counting; by launch it will likely be close to 2 years! The past "record" was 6800 -> 7800 at 15 months. It's like everybody at nVidia took a one-year holiday after the summer of '08. They missed and skipped a cycle.
source
http://www.neeyik.info/timeline/
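The month gaps in the timeline above can be double-checked with a quick Python sketch (dates assumed to be the first of each listed month, since only month and year are given):

```python
from datetime import date

# Release dates from the timeline above (day assumed to be the 1st).
releases = [
    ("GF1", date(1999, 10, 1)),
    ("GF2", date(2000, 4, 1)),
    ("GF3", date(2001, 2, 1)),
    ("GF4", date(2002, 2, 1)),
    ("FX", date(2003, 3, 1)),
    ("6800", date(2004, 4, 1)),
    ("7800", date(2005, 6, 1)),
    ("7900", date(2006, 3, 1)),
    ("GTX", date(2006, 11, 1)),
    ("G92", date(2007, 11, 1)),
    ("280", date(2008, 6, 1)),
]

def months_between(a, b):
    """Whole-month difference between two dates (ignores day of month)."""
    return (b.year - a.year) * 12 + (b.month - a.month)

# Print the gap between each chip and its predecessor.
for (prev_name, prev), (name, cur) in zip(releases, releases[1:]):
    print(f"{name}: {months_between(prev, cur)} months after {prev_name}")

# Gap from G200 (Jun 2008) to a Jan 2010 "now":
print(months_between(date(2008, 6, 1), date(2010, 1, 1)))  # 19
```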
Think you need to put G200b in there if you are counting G92 and G71.
The original rumor/speculation/PR BS from Fud/Theo was that a dual card would be launched at the same time, which is highly unlikely. It is also unlikely that Nvidia will put a full GF100 on the dual card; they will either have to use a cut-down version like in the past, wait for a die shrink, or maybe both.
If you count the 9800 as part of the 8800 line (same GPU core as the 8800GTS), then it was 19 months from the 8800GTX to the GTX 280. That sounds pretty normal for a full architecture change, but if the rumors about yields are true, it might be a while still.
Each of the items listed above is a new product, based on a new chip.
Clock-speed bumps are not included,
i.e. I don't list the GF2 Pro, GF2 Ultra, etc.
G200b doesn't count, because it's a half-node shrink and didn't change the product: whether it uses G200 or G200b, the GTX260 still has the same name. The same applies to G92 and G92b. There is a much better argument for the 65nm X2600XT -> 55nm HD3650, because even with the same #SP it went from DX10 to DX10.1.
I'm not even going to acknowledge requests to put in GF9.. GTS250.. etc.
In the 38 months since the first DX10 part, the marvelous 8800 GTX, the hard-working engineers at nVidia have designed only 2 high-end chips: G92 and G200. Neither is DX10.1, and only one has any significant architectural change to the SPs.
At the same time, there have been like a dozen "mid-range" products below the 8800GT, and a whole series of much-delayed 40nm DX10.1 products to cater to OEMs... Did nVidia turn into an OEM servicing company overnight? Did I miss something? They've changed, man.
I'm still thinking that's gonna blow out in late February or March.
I got banned for posting about NVIDIA hardware in the ATI section, ask hilbert, that is the reason. I don't see why I now have trolls following me around on XS and telling everyone I got banned from Guru3D, what is the point? If I supposedly loved my 295 so much then why am I getting rid of it as soon as the GT300 hits?
It's hilarious that Guru3D members are so bored they have to come and flame-bait banned members. However, if you read the posts I made right before being banned, I guess I did lead them to these forums after saying XS can discuss hardware without attachment to brands. Just because the 295 is the latest dual-GPU card from NVIDIA and the only logical comparison for the 5970 doesn't have anything to do with what card I own. If I put a GTS 512 in my sig (it's in my closet), will this harassment over mentioning a card I own end?
so I should list my old 4850 Crossfire setup? Then I can discuss NVIDIA hardware without all the flame baiters?
lol, I've been a member here for quite a while, I just don't post as much here. Also, I couldn't care less what card you own; it's just that you are so annoying about your praise of nVidia and the GTX 295 that you can't even hold a reasonable discussion about anything graphics-card related. Say what you want, but you got banned because everyone got sick of you, and you went too far by coming back with a new profile.
I am no ATI fanboy; I use a GTX 275. The 5870 is a great card, the best single card out there. The 295 is a good card, with the disadvantage of using SLI and the problems that come with it.
I loled
Phew, a lot of bickering in this thread... The chances are big that the GT300 will beat the top ATI cards, but only until ATI releases another top card... Back and forth, the way things are...
As it should be, Anemic... one on top and then they switch around... keeps them competitive.
Is Fermi that hot chick who sings in the Black Eyed Peas? If so, I will take three and do four-way SLI pronto
LOL, that's Fergie, I think.
Please, please, no shouting.
Won't somebody think of the childrens!
AMD went through countless quarterly losses and is still around. nVidia's been around for 10+ years and even bought 3Dfx. Fermi or not, there will always be the next bigger, better, faster, stronger chip...
Or will there? Maybe the eco revolution will spread. It's already happening with netbooks... giving up speed for low power efficiency. Consoles have already grabbed the majority of gamers, and the few remaining PC developers like Infinity Ward are, well... I can't say in front of the childrenz.
If by people you mean admins, then yes, I did piss them off. I had plenty of friends on Guru3D after 4 years of posting there. If you really want proof, check my profile and look at how many friends are listed. Honestly, I don't even care if I have "friends" on forums; I'm here to learn and share knowledge, plain and simple.
However you aren't talking about anything related to this thread, so I would call that trolling.
Better to just not say anything, Ledhead; everyone will always try to pick on people like you because of the reaction it gives, it is quite lolz
I don't really care if some people on a forum hate me enough to follow me around different sites, it's their loss (of time).
I call
EPIC FAIL
for this thread
(stick to the topic)
PS: take your fan/forum issues to your psychologist
====================
Does anyone have a GPU-Z or similar screenshot of Fermi? All the "Facebook fiasco" and presentation shots and the others are just "real" photos of the card. I wanna see something substantial that proves the silicon revision and/or #SP.