Same applies to ATI's chips right? Still nothing, and that's with one week to go.
very reliable about the 5870 performing about the same as a 295. i wouldn't trust the gt300 10-40% above 295 rumor though... nvidia doesn't even have their first silicon back, so it must be based on nvidia's calculations/predictions, or they told that to their partners because that's how fast gt300 HAS to be to keep nvidia's partners from getting nervous, or it's wishful thinking... definitely not reliable...
doesn't mean it's not true... but it's not reliable...
you can do that with any ati vga since the 680 chipset :D
definitely nice :) and ati's hydravision finally works right with that, as it now centers everything on the middle display and not exactly on the border between 2 displays like it does with 2 monitors :D
To(V)bo Co(V)bo, you didn't really think nvidia didn't plan to build a dual gt300, right? when they planned to do it is another question, but they definitely had plans for it all along... will they need a dual gt300 to beat the 5870x2? of course... will their dual card beat a 5870x2? that's the more interesting question!
in theory, yes... in practice they are facing the same power limit as ati, so the only way to beat the 5870x2 is to be more energy efficient... and that's something nobody expects gt300 to be, and for a reason :D
it's interesting... originally ati used to go for the brute force approach in raw perf while nvidia went for efficiency. then ati managed to outdo nvidia in efficiency with rv670, and since then they have had the lead in perf/transistor and perf/watt, though depending on the tdp the advantage isn't very big... if nvidia reshuffles their transistors and boosts efficiency even a little, it could be enough to beat ati... and then they'd have double gt200's transistor count with higher efficiency per transistor, which is probably where the 2.x times gt200 perf rumor comes from
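The perf/transistor and perf/watt comparison above can be sketched with a few lines of Python. Note the figures below are invented placeholders, not real RV770/GT200 specs; the point is only how the two efficiency metrics are derived.

```python
# Minimal sketch of the perf/transistor and perf/watt metrics discussed above.
# All numbers are hypothetical placeholders, NOT actual chip specs.

def efficiency(perf_index, transistors_millions, tdp_watts):
    """Return (perf per million transistors, perf per watt)."""
    return perf_index / transistors_millions, perf_index / tdp_watts

# placeholder figures: (relative perf index, transistors in millions, TDP in W)
chips = {
    "chip_A": (100, 956, 160),   # smaller die, lower power draw
    "chip_B": (110, 1400, 236),  # bigger die, higher power draw
}

for name, (perf, xtors, tdp) in chips.items():
    per_xtor, per_watt = efficiency(perf, xtors, tdp)
    print(f"{name}: {per_xtor:.3f} perf/Mtransistor, {per_watt:.3f} perf/W")
```

With these made-up numbers the smaller chip wins on both metrics despite lower absolute perf, which is exactly the dynamic the post describes.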
seeing that nvidia hasn't done that well engineering-wise recently... i mean everything after g92 was just copy-paste patchwork with some tweaks... i'm not sure if gt300 will really be a new design or just gt200 doubled up with dx11 strapped to the side with tape :D
well, i guess perf is going to be better than most people expect and ati wants to surprise everybody...
not saying the mars dual gtx285 card from asus sells well... but it does sell... and the 8800 ultra wasn't far off from $999...
asus mars dual gtx285... doesn't sell well i think, but there are definitely people out there who buy them... not many, but they exist...
I think $400 is a good price for the 5870, especially if it does indeed have 1600 SPs. This is the first DX11 card, and possibly the first time ATi has ever beaten nVidia to a flagship launch.
It's not the $300 people were expecting, but that's because everybody assumed that since the 4870 was $300 at launch the 5870 would be too. That's what you get when you assume. :D
They could gouge the price even higher like nVidia does when they're first to market, i.e. 8800GTX at $500, GTX280 at $650, but they didn't. They're maybe charging $100 more than they did last gen, but so what? They've been on a roll with their cards, they're first to DX11, and it will hopefully give the company some breathing room.
I say bring it on. This is the most exciting graphics release for me since G80. As soon as the GT300 is released we will see a price reduction and the launch of the 5870x2. So people like me, who are going to purchase a card at launch, will be able to add a 2nd card for crossfire on the cheap.
I think G80 to G92 was still some notable tweaking, and R600 to RV670 was some major tweaking...
G92 to g200 was no tweaking at all... was it? just pumping it up...
and g200 65nm to 55nm was no tweaking at all either, was it? just shrinking it down... g200 to g2xx was just gluing on dx10.1 and gddr5 support and shrinking... i really hope g300 isn't just g2xx bumped up again...
that's what nvidia says, but they decide what's written on the chip... they could just as well print A5 on it... and if they had had semi-working g300 silicon that long ago, i'm pretty sure they would have used it somehow in pr or shown it to their big partners... but they haven't... so i think that's just nvidia pr... i think the big chip that taped out on 40nm that long ago was a 40nm gt200 direct shrink attempt, not gt300... of course, they might have doubled up gt200 and called it gt300 if it had worked...
For what it's worth, both companies ( AMD & nVIDIA ) have been very secretive the last few months, and do their best to make sure nothing really useful gets leaked ( months before the G200 launch I had architecture info that I partly released, and other people had that and other info as well; this time, nada from both companies ).
Does this mean that they both have something good in their hands? Possibly.
But it could be the opposite as well.
Roll on the 10th.
If no big news comes from the 10th then I'll FFUUUU
yeah, but i wonder why... the head honcho at amd's graphics division, read: ati, just admitted in an interview that gpu specs are completely frozen more than a year prior to launch... so what's the big deal then?
what's the reason to be so secretive then? creating hype in the market? idk... if you don't leak anything about perf, how would that get people excited? right now it's really backwards, because prices have leaked before perf has, which if anything creates anti-hype: prices are higher than expected and nobody knows what perf to expect...
i don't get it, really... i think all this secrecy is just because some people like playing corporate james bond and are paranoid about the competition knowing their plans... it's almost as if the apple spirit has infected ati and nvidia :lol:
edit: oh, and intel too...
g300 had better be new, for nvidia's sake... if it's not, I'll probably be a loyal AMD fan until they can make something truly new
I mean, after their epic success with g80 they have quite literally done nothing noteworthy architecturally, while ATI has stepped up... apparently 3 times in a row
yepp, definitely... r600 was a big ouch; ati has really done a great job while nvidia has been slacking... :D
gotta give nvidia lots of respect though: their arch was so great they could slack and screw up for this long and STILL be very competitive... a 40nm gt200 could still compete very well in the mainstream segment, and G92 shrunk to 40nm would be amazing for entry level and laptops... seriously, who needs dx11... but if they want to keep the perf crown they really need to get their stuff right now... if they haven't gotten a proper new chip out by mid 2010, even a theoretical 40nm GT200 and G92 won't keep them alive for long...
well, is that really such a big jump? not every unit has to be able to execute different instructions; they might just add one beefy unit to each sp block, similar to what ati did?
That's a load of crap. HD2000 and HD3000 were epic failures that didn't even remotely compete with the GeForce 8.
Then came the HD4000 series and GTX200 series. Both were a significant step up from their predecessors.
That's funny. Now you do of course realize that the RV770 and upcoming RV870 are still using the R600 (HD2000) architecture?

Quote:
they quite literally have done nothing noteworthy architecturally while ATI has stepped up... apparently 3 times in a row
Agreed. AMD said (according to Fudzilla) that 2 years prior to launch very little can be done to the architecture, so even if AMD had known about GT300 for 2 years, the changes they could make would be very minimal, and that 1-year statement surely is very true.
The only thing I can see AMD or Nvidia doing is bumping clock frequencies, I suppose... But I totally agree it is weird that they both keep this info so tight-lipped.
the 3000 series actually competed very well on price/perf... hd2000 was crap, I give you that..
and I agree about the next series, although ati won... essentially by beating them price/perf wise
and it may essentially be based on that architecture, but they have changed it and added things; all nvidia has done is a shrink + renaming, which is lame.
that's what I meant, 003
the 3, 4, and soon-to-be 5 series from AMD appear to be VERY good on price/performance compared to nvidia's offerings at the time
each brought a substantial performance boost over the previous gen (... 5 series unknown, but really it's probably going to be close to double)
Nvidia on the other hand...
8800 gtx --> 9800 gtx... what, 10-20% max? 9800 gtx --> gtx280 was 50-60% (double in some cases, but few), and it wasn't really an improved architecture, just more "schtuff" added to it... the core differences between g80 and g200 are minor compared to the differences between the 2000 series and the 4000 series
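For what it's worth, the uplift percentages thrown around above are just this ratio; here's the arithmetic as a tiny Python sketch. The frame-rate inputs are invented placeholders, not benchmark data.

```python
# Sketch of the generational-uplift arithmetic quoted in the thread.
# Inputs are hypothetical performance scores, NOT real benchmark results.

def uplift_pct(old_score, new_score):
    """Percentage performance gain of new_score over old_score."""
    return (new_score / old_score - 1) * 100

# e.g. a score going 50 -> 55 is the kind of ~10% bump claimed for
# 8800 GTX -> 9800 GTX, while 55 -> 85 is a ~55% jump like the
# 9800 GTX -> GTX 280 estimate, and 50 -> 100 would be "double".
for old, new in [(50, 55), (55, 85), (50, 100)]:
    print(f"{old} -> {new}: {uplift_pct(old, new):.1f}% uplift")
```

Worth noting the asymmetry: "double the performance" is a 100% uplift, not 200%, which is a common source of confusion in these threads.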