Wouldn't the price of $649 make the GTX 280 roughly the same price as two 4870's in Crossfire?
Any price drops for the 9800GTX?
Thinking about getting one..
The slide is also two years old, but I didn't notice.
http://www.tomshardware.com/reviews/...m,1227-17.html
Okay, but the question now is whether two OC'd 4870s will outperform one OC'd GTX 280; that would be interesting to know. What is sure is that the ATI pair will need more juice to do its job... what's left is how these $ prices will translate into €. If it's really €419, then that's really nice!
Can someone post the prices from the pic? It's filtered out here at work.
$450 and $650
Is an E6600 2.4GHz @ 3.1GHz a big bottleneck for these G-monsters?
I love the INQ for letting us know about that GT200b tape-out!!!
If the INQ were not around, some of us buyers would be real pissed off when Nvidia releases a cooler, faster, more power-efficient 55nm version 2 months later, or less!
For sure, it will be more power-efficient (nearly 100mm^2 less silicon, exactly like what happened with the 8800GTX/Ultra, and cooler temps). I also think it should be able to clock at least 5% higher, if not 10%, given the huge transistor count, power consumption, and die size, thanks to 55nm.
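Back-of-envelope on the die-size claim above: with an ideal optical shrink, area scales with the square of the feature size. A quick sketch, assuming a ~576 mm^2 GT200 die at 65nm (a commonly cited figure, not stated in this thread):

```python
# Ideal (optical-shrink) area scaling from 65nm to 55nm.
# Die area scales with the square of the linear feature size.
scale = (55 / 65) ** 2
print(f"ideal area scale factor: {scale:.3f}")  # ~0.716

# Assumed ~576 mm^2 GT200 die at 65nm (figure not from this thread):
die_65nm = 576
print(f"ideal 55nm die: {die_65nm * scale:.0f} mm^2")
```

Real shrinks never hit the ideal number, so "nearly 100mm^2 less" is well within reason even if the perfect-scaling math suggests more.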
This has just priced me out of the market. I'll have had my 8800GTS O/C card a year in July... was considering an upgrade; now... not.
Bo_Fox the 55nm refresh will more than likely be released in November/December.
And to be honest, the possibility of a 55nm refresh sounds like old news to those who have been watching the GPU "revolution" for a year or so.
History repeats itself ;)
Hmmm... this year is going to be an expensive year... €600 for the GTX 280, €500 for a new phone, plus €1600 for Nehalem.
€500 for a phone? I'm guessing a 3G iPhone?
IMO those high prices are indicative of a lack of ATI 4xxx series performance... :/
again... :/
Funny how one obviously biased idiot writing a single article can sway the opinions of so many people.
280 performs 44.4% better than 260 then? :rolleyes:
:rolleyes::rolleyes::rolleyes:
:rolleyes::rolleyes:
:rolleyes:
I'll answer that one myself:
NO, but that's the price of being at the top of the dungheap :hehe:
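For what it's worth, the "44.4%" being joked about above is just the price gap between the two cards, not a performance number. A quick check of the arithmetic:

```python
# The 44.4% is the price premium of the GTX 280 over the GTX 260,
# using the $650 / $450 prices posted earlier in the thread.
gtx280_price, gtx260_price = 650, 450
premium = (gtx280_price - gtx260_price) / gtx260_price * 100
print(f"GTX 280 costs {premium:.1f}% more than the GTX 260")  # 44.4%
```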
On a more positive note: the $450 price does sound quite reasonable... but we might have to wait a while until it actually shows up in shops. Prices will be gouged to death until supply is plentiful.
Quote:
However, patience is key in this situation.
Do you guys think the gtx 280's will come with a 6-pin to 8-pin adapter?
My PSU has two 6-pin connectors only.
3870X2: some came with adapters, and some didn't.
GOD F%$*!# D$@*& IT, I NEED BENCHMARKS, I can't wait :p:
GTX 280 is looking to be G80 reborn, right down to the launch price and potential to be futureproof. Now I 'know' my 8800GTX can go to pasture without compromise:)
I think that after the G92 and the 3800 series, people are going to have a hard time swallowing $550+ for a video card. Those parts fed us a normally unheard-of level of performance in the $150-250 price range, something that normally cost $400-500 easy.
$350 is the most I could see myself spending on a video card. And now with tri-SLI, $650 per card is a bit steep...
At least AT&T is willing to subsidize the iPhone now..
$650 US is kinda cheap (not for the Americans, but for most others). With import taxes (+25%), bought in the USA, this is still cheaper than the 8800 GTX was in my country (Denmark). Compared to my XFX 8800 Ultra XXX, it will be ridiculously cheap.
Excellent :)
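A quick sketch of the landed-cost math in the post above, assuming the flat 25% import tax the poster mentions (currency conversion to DKK and any VAT details left out):

```python
# Rough landed cost of a $650 card with a 25% import tax,
# as described in the post above (USD only; DKK conversion omitted).
usd_price = 650
import_tax = 0.25
landed = usd_price * (1 + import_tax)
print(f"landed cost: ${landed:.2f}")  # $812.50
```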
Do you fully understand the situation? Are you 100% certain that there are no benefits to the revised core? If so, it makes sense to disagree; but if you are not 100% sure, it does not. If you're paying $600+ for a high-end video card, you expect to get the best quality that series of card has to offer. That also includes any revisions that come along with it. Having said that, I couldn't disagree with your disagreement more!
Well again, if you keep waiting for better revisions you will never buy anything as there is always something better a few months down the line.
I think what's important is when this supposed revision will come out. BenchZowner, who seems like a reliable source, said that a revision will probably not come out until November/December. That's too long to wait for basically the same card.
Now if it were going to be out in late July I would strongly consider waiting.
It's revision, not revisions. That makes all the difference in our dialogue on the subject. Besides, statements like that are meant for going from one gen to the next, not for when you know ahead of time that there will be a revision of GT200 before it's actually released! Besides, there is no actual ETA set, so we really don't know when to expect GT200b based cards.
http://img67.imageshack.us/img67/950...oducingeo6.jpg
nVidia G80 = 581 million
nVidia g92= 754 million
nVidia gt200 = 1400 million
ATI R600 = 700 million
ATI RV670 = 666 million
ATI RV770 = 900 million ?
ATI R700 = 1800 million total ?
nVidia 8800GTX = 518 Gflops
nVidia 8800ultra = 576 Gflops
nVidia 8800GT = 504 Gflops
nVidia 8800GTS = 624 Gflops
nVidia gt200 = 933 Gflops
nVidia 9800GTX = 576 Gflops
ATI R600 = 474 Gflops
ATI RV670 = 500 Gflops
ATI R680 = 1006 Gflops
ATI RV770 = 1008 Gflops ?
ATI R700 = 2016 Gflops ?
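The NVIDIA GFLOPS figures above roughly follow from shader count × shader clock × FLOPs per clock (NVIDIA counted 3 per SP per clock for the dual-issue MADD+MUL). A sketch, using commonly cited shader-domain clocks that are not stated in the list itself:

```python
# Peak-GFLOPS estimate: SPs x shader clock (GHz) x FLOPs per clock.
# NVIDIA's marketing figures assume 3 FLOPs/clock (MADD + MUL).
# Clocks below are the commonly cited shader-domain clocks (assumed).
def peak_gflops(sps, shader_ghz, flops_per_clock=3):
    return sps * shader_ghz * flops_per_clock

print(peak_gflops(128, 1.35))   # 8800 GTX: 128 SPs @ 1.35 GHz -> ~518
print(peak_gflops(240, 1.296))  # GT200:    240 SPs @ 1.296 GHz -> ~933
```

Both results line up with the 518 and 933 GFLOPS figures in the list.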
The 8800GTX ruled for how long? Although its price did fall over time.
Quote:
Well again, if you keep waiting for better revisions you will never buy anything as there is always something better a few months down the line
I wonder whether the 4870X2 will compete with the GTX 280 or not?
Hmm, methinks a 4850 and a GTX 260 to replace my current cards... maybe.
The 2 extra pins are just ground. You just need the correct 6-pin and 8-pin plugs/jacks, wire, and pins.
Why are you guys so concerned about G200b? NV will do basically the same thing they did with G80 -> G92. If you want GTX 280 performance, just buy it now.
Well, I for one will take the 50 dollar step up and get rid of this GX2. I find a single card to be better than two and this looks like just the replacement for me. My GX2 flies, but I am on my third card now after RMAs.
So if the cards launch on June 18th, when will they be available in stores like Newegg?
Which should be viewed in light of the author's transparent hatred of Nvidia and his laughable history of recommending holding off purchasing the G80 in favor of waiting for the R600 numbers about which he stated:
http://www.theinquirer.net/en/inquir...rts-thickening
Quote:
Originally Posted by A Shill
Charlie's goal is to write anything that might prevent people from buying Nvidia, period.
Yeah, he didn't have a good record with R600 vs. G80, *but* no one else did either. In fact, going off the rumors at the time, half the people didn't even think it had a unified shader architecture.
Anyway, he does have a hatred of Nvidia, but a lot of what he said in that article is true - you just have to sift the facts from the spin and the conclusions.
I think the most I've ever spent on a video card was $388 for a 6800GT back in the day.
Charlie from the INQ hates on nVidia like it's part of his religion.
Usually it takes about 3 months between tape-out and product availability... am I right?
Be careful--you might end up calling yourself one if you buy a GT200, only to see a 55nm version released right after you started to enjoy your 65nm card! What one says can show a lot about one's own potential self-criticism.. (Just kidding--rubbing you the wrong way, eh? :P)
i Hope newer and better games will challenge the GTX 280's power. hehehe
Life's too short to worry about stuff like that. It's not like we are going to see a crazy amount of performance difference.
The only people who should care about this stuff are benchers aiming for records. There's not going to be a big difference in playability.
Knowing this guy's intentions, listening to a tool like Charlie is foolish and playing into his game. A guy who calls a card slow and obsolete when it's the 2nd fastest thing available at the time obviously has a different agenda on his mind.
Charlie + nVIDIA = validity = unknown word.
Please don't tell me that after reading the quotes from his "articles" ( Laugh out loud please ) in the previous page you still believe anything from this guy...
For what it's worth, I'm telling you that in February 2009 nVIDIA will come up with a 55nm "enhanced" refresh of the 55nm refresh of the GT200. Are you going to start advising people to wait till then?
In IT if you wait and believe everything that can be found in the wild from various "authors" with or without sources, with hate or without, then you'll never upgrade.
Childish behavior, pretty much like "mister" Sanders in the past.
There was news that had some correct info, but of course that wasn't from the Inquirer, DailyFetch :p: or the "trustworthy" chiphell, bleachhell, blah blah.
You just need to filter what you read.
Also, saying that nobody got the info right back then doesn't help or excuse TheInq, or any other site that calls itself reliable, for those obviously biased, purely speculative "articles".
When I don't have a clue about something, I do not talk about it, and of course I'm not saying things that originated from my "wildest dreams or nightmares".
I'd rather not say something, or not publish an article I'm unsure of, than lose my credibility. But that's just me...
;)
3 to 4 months from tape-out to production yes.
To availability ? That depends on what the manufacturer wants to do ;)
Some hate, some love some companies.
You don't have to "listen" or "believe" them though.
You have the right to filter & think about what you read.
However, posting just that isn't what I'd call constructive, offering nothing, and of course opening the ground for a flame-war.
Any preorder available yet?
I think there's nothing to worry about.
It was said from the beginning that there would be some 65nm AND 55nm parts.
It was said that Nvidia is playing it safe... and will use 55nm FIRST in the low end as a test...
I'm pretty sure that AFTER some successful testing there will be a TOP 55nm chip... just after... let's see how the low-end 55nm chips behave.
IMO
I can still remember the same stupid comments when the 7800GTX was supposed to be hot, and they had a display model with a burned-out transistor at a show. TheInq went crazy over it. That's when Fudo was anti-nVidia as well; mind you, Fuad has never been this bad.
They said the same thing about 7900, 8800, and now GT200.
Silly isn't it.
I think the new wave of graphics cards will last for a looooong time, since game devs won't bother to add more complexity to their games than what a 360 and PS3 can do; maybe a bit higher detail here and there, but not too much. We're already reaching the limits of what those consoles can do, and they are going to stay on the market for some good years until the next gen.
Aw man, I want one of these new cards, but dang, there's another one coming in Sept. Damn, I don't know wtf to do; I just don't know if I can continue to hold off :(
http://resources.vr-zone.com//newspi...erformance.jpg
http://www.vr-zone.com/articles/GeFo...ance/5817.html
Quote:
CJ sent us a performance chart comparing the Radeon HD 3870 X2 against the GeForce GTX 280 and GTX 260 in some current games. Although the benchmarks were done internally by Nvidia, it is probably still useful for comparing the GTX 280 and GTX 260 against each other. The driver for the GeForce GTX 200 cards is the upcoming GeForce Release 177, while the driver for the 3870 X2 is the rather old Catalyst 8.3.
Cat 8.3? lol
Would cat 8.6 or 8.5 make a visible difference?
A lot. Cat 8.3 dates from March 4; to Cat 8.6 (beginning of June) is a 3-month difference.
But you know, those benchmarks are from Nvidia PR. They don't use Cat 8.5 for a simple reason: the graph would look different.
For example, the Cat 8.5 release Notes:
Quote:
The following performance gains are noticed with this release of Catalyst™.
* Call of Juarez DX10: Performance increases up to 12% on systems containing an ATI Radeon™ HD 3xx0 series of product
* Halo: Performance increases by 10-30% across all of the supported ATI Radeon™ series of products
* Lost Planet DX10: Performance increases from 5 to 35% on systems containing an ATI Radeon™ HD 3xx0 series of product
* Stalker DX9: Performance increases by 20-50% when HDR is enabled in the game; across all ATI Radeon™ HD38x0 series of products
* World in Conflict DX10: Performance increases up to 25% on systems containing an ATI Radeon™ HD36x0 and/or an ATI Radeon™ 38x0 series of product. Higher performance gains are noticed on systems containing an ATI Radeon™ 3870x2 series of product
Shouldn't they have put the GX2 in that slide instead of the 3870 X2, 'cos the GTX 280 is the replacement in the GX2's price segment...
Well, either way, according to the release notes, of the games in this comparison (and I bet Nvidia chose the ones that run best on their hardware), only World in Conflict (if run in DX10) would have its green bars a bit lower in this Nvidia graph... (unless previous Cats improved the other games too: too lazy to check).
Secondly, I've seen it mentioned many times that overall performance improves, but 25% is a big one, mate... like going from 20 laggy to 25 playable frames on the same hardware...
But overall, if we can believe the data, these Nvidia cards are gonna be screamers on your screen and in your pocket too... and they are in a totally different market segment than the 3870X2, which costs half the price...
Hard to compare these cards bang for the buck...
Crysis + 1920x1200 (very HIGH) = 2009 !
Those results (if remotely true) are all over the place, especially the differences between the 260 and 280. Look at how the 260 is on par with the 280 in COJ, and nearly as good as the 280 in WiC, while in other games like:
Fear
COD4
UT3
the results are hardly impressive. In most cases the 260/280 needs to be at least 1.8 times better than the X2, but it still fell short in Fear, COD4, and UT3. When you look at it from this POV, results are up here, down there. The 260 ties the 280 here and is just as good as the 280 there. In all, the performance results are all over the board!
But the 3870X2 getting the same score in all games? I don't think so.
Just PR spin.
Even though CJ has some reputation, graphs like these must always be looked at with very careful eyes... The drivers were not the best for the HD3870X2, and it's an nVidia graph comparing a rival product. If it were from a reviewer, done with the right settings, it would carry much more weight, IMO.
The thing is, always look at info like this not as the real truth, but as nV's perspective on it ;)
It's not CJ's own graph - he just sent in nvidia's PR one
Heh... pretty bad scores on those slides. 50-70% faster than a 3870X2 on average for the GTX 260, yet a pair of 8800GT 512s in SLI runs about 50% faster than the X2 on its own and costs $250-300, versus the GTX 260 at $450 with, per these charts, only 10-15% better performance than that setup. I just pulled up my eBay listings for my pair of cards, just in case... hedging my bets ;). Especially given that these are nV slides (if true), which would suggest settings favorable to the GTX 260/280, meaning the real difference could be smaller. I find it interesting that it shows some games as 2x faster or more, but in those it shows both the 260 and 280 very high, which makes it odd.
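The value argument in the post above can be made concrete. A rough sketch using the poster's own guesses (the ~1.6x and ~1.5x over-the-X2 ratios and the $450 / ~$275 prices are the poster's estimates, not measured data):

```python
# Bang-for-buck comparison using the rough numbers from the post above.
# Performance is relative to a 3870X2 = 1.0 baseline; all figures are
# the poster's estimates, not benchmarks.
setups = {
    "GTX 260":        (1.6, 450),  # ~60% faster than X2, $450
    "8800GT 512 SLI": (1.5, 275),  # ~50% faster, $250-300 midpoint
}
for name, (perf, price) in setups.items():
    print(f"{name}: {perf / price * 1000:.1f} perf per $1000")
```

On those assumptions the SLI pair comes out well ahead per dollar, which is the poster's point.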
I'd prefer to buy a single-GPU card like the GT200 over the 3870X2, even if the performance is not as good as we might expect.
At least in that case there will be no problems with drivers for future games! :)
You guys are really underrating the 3870X2...
In a lot of games (besides Crysis), it's beating the Ultra by quite a lot...
http://bp1.blogger.com/_veswGS4uNJQ/.../08_result.png
If the GTX 280 really is twice as fast as the 3870X2 without the pain in the ass that is Crossfire/SLI, then it's a huge improvement, imo...
A single card that is twice as fast as the Ultra: what more do you want?
Guys expecting it to be much faster (50%) than a 9800GX2 in games that scale well are crazy, imo...
4870 vs GTX 260 should be quite interesting :)
Perkam
Cat 8.3 vs Cat 8.4+Hotfix:
http://www.rage3d.com/articles/vantage%2Dhotfix/
Those are the Cat 8.5 results. You can clearly see from them that the graph made by Nvidia would be totally different if they used Cat 8.5. The difference is huge... like night and day.