I'm holding out hope for under $600 USD in the U.S. at least... economy sucks :rofl:
That's right, I see endless streams of people bashing Crysis for not being able to run maxed out on their old 8800GTXs, while there are good reasons for this. Crysis looks enough better than other games to justify it for me; you can easily see how much more advanced the shaders are compared to, e.g., CoD4.
Crysis is what it is: a tech demo for future engines. Of course, it remains to be seen how many other games pick up the engine, considering the previous one didn't do so well.
http://largon.wippiespace.com/smilies/squint.gif
I've seen that pic somewhere... years ago... likely in a factory tour article.
Someone's pulling your leg.
I really hope you're right, but a quad at 4GHz and 8800GTS 512s in SLI gave me some very jerky moments... so I had to cut down on the res and gfx detail. Any game that doesn't run well and needs hardware that will only be released in the future (what, 6 months after the game's release?) to be sort of playable with decent IQ... is a major bummer to me. Crysis has pushed people to upgrade and yet, even now, it's not really fluid on high detail. On an X2 card it's way better, but how many have bought those? Gamers are usually looking for hardware in a lower price segment... and you think these guys will upgrade again in June...
I just want fluid gameplay in an FPS... at all times... surely with decent hardware...
But to each his own opinion; I hope these babies (or the ones from the red team) are worth every dime like the 8800GTX was to me...
And that stack of naked cards has indeed been seen before. Maybe it was another range of cards, but I've seen something like it before, though not with the ninja :)
Have you tried to play the game using XP and tweaked cvars? I don't think so :rolleyes:
I can play Crysis fairly decently at 1280x960, Ultra High, with my sig rig (gfx @ 760/1900/1800 core/shader/mem).
I think with some tweaking your system would run Crysis really nicely... :)
The game does run well, just not at its maximum. Note that they also kept the ability to patch the game up to Ultra settings in the future, as stated by Cevat Yerli. Even with 8800GTX SLI I had to turn settings down to a mix of High and Very High. This game has been fully optimized for the G80 architecture; they had the 8800GTX looong before it was announced.
I mean, they could have named the medium settings "high" and then everyone would be able to run it, but instead they wanted to future-proof the engine, which is a great idea really.
Far Cry did the same thing, and that was a benchmark for literally years. It scaled very, very well and still looks great to this day.
Tweaking can be crucial in any game in order to get the performance and graphics your computer deserves.
Quote:
regards
Quote:
Next-gen NVIDIA GeForce Specs Unveiled, Part 2
Although it's a bit risky to post information taken from sources unknown to me, not to mention that the site is German, it's worth trying. The guys over at Gamezoom (Google translated) reported yesterday that during NVIDIA's Editors Day, the same place where the F@H project for NVIDIA cards and the buyout of RayScale were announced, NVIDIA also unveiled the final specs for its soon to be released GT200 cards. This information complements our previous Next-gen NVIDIA GeForce Specs Unveiled story:
- GeForce GTX 280 will feature 602MHz/1296MHz/1107MHz core/shader/memory clocks, 240 stream processors, a 512-bit memory interface, and GDDR3 memory, as already mentioned.
- GeForce GTX 260 will come with 576MHz/999MHz/896MHz reference clock speeds, 192 stream processors, a 448-bit memory interface, and GDDR3 memory.
The prices are 449 U.S. dollars for the GTX 260 and more than $600 for the GeForce GTX 280. That's all for now.
Gamezoom
http://forums.techpowerup.com/showpo...31&postcount=1
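For what it's worth, here's a quick back-of-the-envelope sketch of what those rumored memory specs work out to in bandwidth, assuming GDDR3's usual double data rate (the helper function is purely illustrative, not from any source in the thread):

```python
# Peak memory bandwidth from the rumored clocks: GDDR3 is double data rate,
# so the effective transfer rate is 2x the memory clock.

def mem_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return mem_clock_mhz * 1e6 * 2 * (bus_width_bits // 8) / 1e9

for name, clock_mhz, bus_bits in [("GTX 280", 1107, 512), ("GTX 260", 896, 448)]:
    print(f"{name}: {mem_bandwidth_gb_s(clock_mhz, bus_bits):.1f} GB/s")
# GTX 280: 141.7 GB/s
# GTX 260: 100.4 GB/s
```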
Well, I predicted $450 for the 260 and $550~600 for the 280, so not that far off... I can't understand why the 280 would be above $600, though, if the 260 is priced at $450; that sounds like too huge a gap. $600 I can still accept, but say ~$650 would start being too much, at least just looking at the specs and trying to translate them into some performance difference. Guess being top of the hill just warrants that much of a price bump, eh? Or the retail price will actually be slightly lower (except for the initial price gouging)?
If it's $600, then with the initial price gouging, and if it sells out fast or stock is low, e-tailers will jack it to $800+.
In Sweden price gouging is non-existent. Hope the competition from the red team will be substantial this round, so that the MSRP on the 280 can go down to as low as $500 (this would be optimal, yeah). But that is probably far too optimistic.
Just joking but I bet NVIDIA's doing an internal poll with following options:
1. $550 - Aggressive
2. $600 - Reasonable
3. $650 - NVIDIA price-gouge™ (aka "The way it's meant to be priced")
...every NVIDIA rep votes #3. \o/
80 ROPs? Don't think so ;)
Edit: There's at least one error in the specification sheet posted above (not only the wrong units on the wrong line).
http://resources.vr-zone.com/image_d...4f592cb84b.jpg
I don't really understand everything there, but the GTX 280 doesn't look that much better than the RV770XT :shrug:
Any reviews out? :rolleyes:
Do you guys think that the GTX 280 will outperform 2x 8800GTX in SLI by a lot?
Awww @ 260 having larger power draw than G80 Ultra! :(
Time to find an AC unit on sale. :D
Yeah, you're right @ that chart here:
http://forums.vr-zone.com/showthread.php?t=280553
First of all, the ROP and TMU rows should be switched for all NVIDIA cards.
Second, I am pretty sure that the G80 8800GTX has 32 texture address units and 64 texture filtering units, vs. the G92 8800GTS 512's 64 texture address units and 64 texture filtering units.
http://www.bit-tech.net/hardware/200...8800_gtx_g80/6
http://www.bit-tech.net/hardware/200...orce_8800_gt/2
G92:
Quote:
16 stream processors, and share eight texture address units, eight texture filtering units and its own independent cache.
G80:
Quote:
The clusters of sixteen streaming processors also share four texture address units and eight texture filtering units, making up a total of 32 texture address units and 64 texture filtering units in GeForce 8800 GTX.
Heh yeah, but I think it will be gouged to like $700+ at launch; hopefully not, though...
I will probably wait 2-3 months, since no game is coming out before September that can't be run on my G92 GTS SLI, and hopefully by then the price will have dropped to around $600 USD.
Hmmm the 770XT might make some noise, considering the power draw and price... Definitely between that card and the 260 for me.
correct chart:
http://news.mydrivers.com/img/20080526/11132493.png
I'd say $649.
Sub-1 teraflop.
Meh..
If the Ultra can get 12 fps in Crysis @ 1920x1200, Very High, 8x AA & AF on a Q9650 @ 4.0GHz,
what will the GTX 280 get?
23 fps
You can't scale flops exactly. And 933 GFLOPS IS the number; it's calculated as 3 flops per SP per clock (one MADD + one MUL) × 1296MHz shader clock × 240 SPs ≈ 933 GFLOPS.
But on the other hand, apparently the G80's MUL wasn't usable 75% of the time, so it was closer to 2 × 1512 × 128 ≈ 387 GFLOPS rather than the theoretical 581.
Also, that's why you can't compare numbers directly across different architectures, even within derivatives (such as G80 Ultra vs. G92 GTX, where the G92 clearly has some memory bottlenecks: it performs up to G80 Ultra level until higher resolutions, and even loses some ground when AA and other settings are on at higher res).
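To make that arithmetic explicit, here's a small sketch of those peak-FLOPS numbers (note the 75%-unusable-MUL figure is just the forum rumor above, not a confirmed spec):

```python
# Peak shader throughput = flops per SP per clock * shader clock * SP count.

def peak_gflops(flops_per_clock: int, shader_clock_mhz: float, num_sp: int) -> float:
    """Theoretical peak in GFLOPS (flops/clock * MHz * SPs / 1000)."""
    return flops_per_clock * shader_clock_mhz * num_sp / 1000.0

# GT200 (rumored): one MADD (2 flops) + one MUL (1 flop) = 3 flops/SP/clock
print(peak_gflops(3, 1296, 240))   # ~933 GFLOPS

# G80 Ultra: on-paper number vs. the "MUL mostly idle" estimate from the post above
print(peak_gflops(3, 1512, 128))   # ~581 GFLOPS theoretical
print(peak_gflops(2, 1512, 128))   # ~387 GFLOPS if the MUL rarely issues
```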
http://www.fudzilla.com/index.php?op...=7523&Itemid=1
If Fud is right, we're looking at 600MHz for the core, which seems pretty high considering how large the die is; it's definitely going to run very hot. Also, that means what the Inquirer said can't happen, because 1296MHz just isn't reachable for the shaders now (unless we're looking at something like a 2.x shader-domain multiplier, lol), and it never could be for the core. Plus, I think it's safe to say GT200 won't be spanked by the R700.
Yep, as LordEC911 said, CJ had already stated those numbers before Fud ever got hold of them, and more than a few sources besides TheInq have quoted them.
And that's how the math for counting flops has been done for some time. Lower clocks don't mean anything bad; they're just saving us from excessive heat and power draw, and the fact that these shaders might do the full 3 operations per clock, and you have 240 of them, means you will still get tons more shader performance.
Same price as the 8800GTX launched at, IIRC, and as other previous high-end pieces of hardware.
And with "50% more efficient shader units", do they mean a 50% improvement in performance per transistor, or a 50% higher transistor count, as in more raw power? If the latter were true, wouldn't the math have to be adjusted for the FLOP count, putting it well beyond 1 TFLOP?
And how much performance will "the missing MUL component" be able to bring?
This post could probably tell you something: http://forum.beyond3d.com/showpost.p...postcount=1548
Damn power consumption is just increasing and increasing. :(
80 ROPs :shocked:
BTW, what do you guys mean the MUL wasn't working 75 percent of the time? Do you mean the third shader op can do additional operations now?
Yep, essentially the shaders can now do the full 3 operations rather than the 2 previously.
Also, it's not 80 ROPs, it's 32 ROPs.
Like the G80 architecture, it's 4 ROPs per 64 bits of memory bus, hence 512-bit bus -> 32 ROPs, 448-bit bus -> 28 ROPs.
And it looks like the ratio is 24 SPs per 8 TMUs as well, which isn't quite the 2:1 of G92, but better than the 4:1 on G80 (though that was 32 TA, 64 TF).
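A trivial sketch of that ROP rule of thumb, assuming the 4-ROPs-per-64-bit-partition layout really does carry over from G80 as the post describes:

```python
# G80-style back end: each 64-bit memory partition carries 4 ROPs,
# so the ROP count follows directly from the bus width.

def rop_count(bus_width_bits: int, rops_per_partition: int = 4) -> int:
    return (bus_width_bits // 64) * rops_per_partition

print(rop_count(512))  # 32 ROPs -> GTX 280's 512-bit bus
print(rop_count(448))  # 28 ROPs -> GTX 260's 448-bit bus
print(rop_count(384))  # 24 ROPs -> G80 8800GTX's 384-bit bus
```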
What's the story with the 260's shader clock? Some sources say 1240, others say 999.
Isn't it kind of weird that there are virtually no reviews with definitive numbers for a graphics card that is scheduled for a July release?
Will we get some benchmarks this week?
I wonder how badly this will reflect on GT200, given its gigantic die size of 576mm²:
http://www.reuters.com/article/techn...080527?sp=true
The same as if it were a 100mm² chip. TSMC seems to want to raise prices primarily on 45nm, and otherwise across the general (G) process. Size doesn't matter; what process you use does. TSMC just saw how much it starts to hurt being in the big boys' league on process advancement.
You might want to tell TSMC that, plus what Andreas said
http://www.tsmc.com/english/b_techno...10101_45nm.htm
And yes, I read it. Or are you claiming AMD and nVidia wouldn't be in the same boat? Perhaps you ain't quite sure what you posted :p:
Usually you put much more effort into Googling when you're trying to back up your statements! Why haven't you done the same this time? Wait, I know: 'cos you're wrong!
1. TSMC will skip 45nm and go directly to 40nm: http://www.semiconductor.net/blog/27.../10025801.html
2. I know it hurts, but you must face reality: AMD and NVIDIA are not in the same boat with the upcoming generation of GPUs! Why? Simple: let's say the wafer price is equal for both of them; AMD, with its RV770, is getting twice as many chips per wafer compared to GT200 in an ideal yield situation! What's realistic to expect is that AMD will have better yields due to the smaller chip... so this wafer price inflation will have more severe consequences for NVIDIA than for AMD.
BTW congrats on your 4K post ;)
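To put rough numbers on the die-size argument above, here's the standard first-order dies-per-wafer approximation (the die areas are the rumored ~576mm² for GT200 and ~256mm² for RV770; actual yields depend on defect density, which nobody outside the fabs knows):

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """First-order estimate: wafer area / die area, minus an edge-loss
    term proportional to the wafer circumference (partial dies at the rim)."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(gross_dies_per_wafer(576))  # GT200 (rumored ~576 mm^2): ~94 candidates
print(gross_dies_per_wafer(256))  # RV770 (rumored ~256 mm^2): ~234 candidates
# ~2.5x more candidates per wafer for the smaller die, before any yield gap.
```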
You are wrong :p:
From your link... Seems like they are using 40nm, but not for PC processors?
Quote:
TSMC and the Reverse Temperature Effect
April 30, 2008
Taiwan Semiconductor Manufacturing Co. Ltd. (TSMC) held its annual technology symposium in Austin Tuesday (April 29), with much of the attention on the foundry’s 40 nm technology. TSMC would prefer that its leading edge customers go directly from 65 to 40 nm design rules, making 40 nm much more than an afterthought 0.9× linear shrink. In fact, TSMC will skip 45 nm and only offer 40 nm for the general (G) and (LPG) processes, with 45 nm and 40 nm offerings for the low power (LP) process which Qualcomm Inc. and others use.
If you need twice as many ATI chips to match the performance of the nVidia chips, then you are still in the same boat.
For 45nm, they still developed and use 45nm for LP. But as they wrote, 45nm doesn't hold enough benefits, so they go for 40nm. Yet more waiting time.
If anything, it's just a sign that things aren't going as they should.
Real-world performance of these chips hasn't got anything to do with how many of them you can harvest from a wafer... The only thing that's publicly available about TSMC's next-gen manufacturing process is that they are very satisfied with the progress, and are even so confident in the quality that they dare to compare it with Intel's high-k process.
Quote:
For 45nm, they still developed and use 45nm for LP. But as they wrote, 45nm doesn't hold enough benefits, so they go for 40nm. Yet more waiting time.
If anything, it's just a sign that things aren't going as they should.
http://www.beyond3d.com/content/news/636
That's like comparing a G94-class chip with an RV770. Performance matters when you need 2 AMD chips to compete with 1 nVidia chip; that's how it works. Otherwise you are comparing the wrong chips.
And again, even if TSMC increased the price for nVidia only by 20%, it would still leave AMD/ATI in the red and nVidia shovelling in the cash as usual. Your point is useless. Any price increase hits both of them equally: if nVidia's chips get 5% more expensive, so will AMD's.
When will these new 9900s be available in stores? ...July?
How much will the 9900GTX cost? I know for sure I can't afford the GTX 280, so it's either that or a 4870 for me.
Can't really tell, AliG.
It depends on many, many variable factors (variable, as factors can change).
Normally it should launch at $349.
Don't take this statement to the bank though :p:
HD 4870 512MB GDDR5 $349
9900GTX (G92b) 512MB GDDR3 $299
HD 4850 512MB GDDR3 $229
9800GT 512MB (G92) $199
Somewhere along there. Though, the HD 48xx should be better than both of Nvidia's counterparts.
Perkam
I want to know the GTX 260/280 pricing for sure haha... hoping I can afford the 280 :D.
I bet Nvidia never intended the G92s to be as cheap as they were. If they'd had their way, they would have kept prices high, but they had no other cards to compete in the 3850/3870 range until the 9600GT. They've always favored big, expensive, monolithic GPUs.
The reason the G92 and ATi 3xxx series were cheaper in performance/$ terms than the previous series is mainly that they used a 256-bit bus and a smaller manufacturing process...
I don't think either of the two big competitors would sell their stuff cheaper than it needs to be...
Has Nvidia or any news site hinted at or reported on any kind of dual-GPU version of a GT200-based card? Like a GTX280 GX2? If the GT200 turns out to be faster than the RV770, which I have a feeling it will be, I want to get two of them in SLI. Unfortunately SLI does not work on Intel chipsets, and I have a top-end X48 board.
So my only option would be a dual-GPU solution such as the GX2 cards that nvidia has released in the past. Anybody have any info on this? I'd be willing to shell out $800+ for it, so long as it outperformed the 4870(X2).
Yeah well the dollar was worth a lot more back then.
So has ATI, until they realized that they could not compete with NV for max performance and competed the only way they could: selling their top end as midrange so they could look competitive again.
ATI would sell their cards as high end if they could.
The 9700 Pro started at $399, and this is as far back as most people want to go.
Boo hoo, stuff gets more expensive. The $300 price range still exists; it just means that there is a higher tier of card priced above it. Who cares if all you can afford is a $300 graphics card and you can't say your computer has a top-of-the-line graphics card in it?
What putting out expensive cards does is allow us to buy higher tech earlier. It allows companies to put out crazy tech even if yields are bad.
If you want top of the line, wait. It will get there when yields are better and they move to a more efficient manufacturing process.
I have always felt video cards were a bargain compared to CPUs. Top-of-the-line CPUs have always been around a thousand dollars, and there we never get what we pay for, e.g. the $1000 part won't smash their $400 part.
I don't really care how hot it is, I am on water. If nvidia does not officially make one, perhaps a partner will? ASUS unofficially made a 3850x3, and they also make nvidia cards. Perhaps they can make a GTX280GX2? :D
You left out a couple of key details: for one, the value of the dollar has tremendously depreciated; secondly, CPUs offered a lot more power and value even 5-6 years ago; and third, ATI and Nvidia got away with $1k cards because people were willing to spend the money. If they weren't, the pricing would have been lower.
I hope this post is a joke. In case it isn't, I'll respond. With the dollar depreciated, price matters a lot more. The line about "there's a higher level of card priced above that" is bull; marketing swallowed hook, line, and sinker.
As far as card prices? My 9700 pro ran me $320 on launch day. My Ti200 ran me about $220... my Voodoo 3 3000 ran $299, my V5 5500 ran about $350, my GF4 Ti4200 ran me about $200, I skipped the 5/6 series, 7900GT ran $250 for me at launch, 8800GTS 640 was the most expensive I ever bought at $380 and it seemed out of this world expensive. Since then I've bought a pair of 8800GT 512 cards for $250 total after a $30 rebate for SLI. The whole "but they've always been expensive!" thing is absurd and reeks of buying into marketing hype that spins things like a top.
Again, cards should not run $650... it's a *RECENT* trend with the 8800GTX and probably this GTX 280, not a long-term thing that should be coddled and appreciated for giving us tech "early" (yeah, right! early my arse). As consumers we should be appalled at such absurdly high prices... it's not like the economy is in a great time period, either, let alone the fact that it's basically ripping you off at those prices.
Try again with something reasonable, please, as to why video cards should run $650 for the high end when they never have in the past.
Well, that's a silly thing to do when Ti200s were around $220 in Black Friday deals near launch and performed nearly as well when overclocked :). It doesn't make it any less of a rip.
Do you think it's practical to release the GTX 280 at this point at the $300 price point, when the 4870 doesn't come close to its performance and isn't nearly as expensive to manufacture?
Consider, on top of this, that the US dollar has less buying power for supplies (NV is a US corporation), the price of everything has skyrocketed lately, and your competition doesn't have anything at the moment to challenge your top end.
It's not practical at all.
What do you want? To only release cards when manufacturing is cheap and the yields are fantastic, so they can sell them cheaply or at your desired price range?
Sorry, but I want my tech early, and those who can't afford it can wait. NV is a business, not a charity. And I highly doubt that the early high-end parts are what drive NV's pocketbook.
I also remember the X1950 XTX and X1900 XTX being $600+ on release, as well as anything with the Ultra moniker (and the 7800 GTX 512). I consider these marketing BS, because you get 10 percent more performance for $200 more. On top of that, we don't really get better hardware for the price increase (maybe a better cooler, or memory).
Don't put all the blame on marketing BS, though. I doubt NV makes most of its money on GTX sales (revenue was $3.7 billion in 2007; I doubt they sold a million GTXs at $600).
To me, $600 cards (not the double-card X2 and GX2 crap) are a sign of things to come in the near future for performance at the mainstream (or high-end mainstream). They are a sign of what will eventually arrive in the lower price segments once manufacturing costs have come down. They also provide marketing muscle by glorifying the brand.
So why is the card $600?
It is expensive to make a card that is 65nm (yields and heat), has 1.4 billion transistors (yields), has a 576mm² die (yields), produces nearly 250 watts of heat (needs a better cooler), and has a 512-bit memory bus (more complex PCB and memory controller). Let's not forget R&D.
You're not comparing prices of top-end cards; you are quoting prices of mid-range cards, so of course the price point is going to be lower. For instance, the 7900GTX was $550+ USD for the first 4 months, but in your post you note the GT model at $250, not the GTX. Again with the 8800 series: you are recalling the prices you paid for a GTS, not a GTX.
I understand your complaint, but your argument is not valid with the previous purchases you are citing... remember, you have to pay to play. I am scraping together money as well in order to purchase a GTX 280 on launch day, and I am going to have to delay other things or go without (the mortgage has to be paid though! :D)
One last thing... open box... :rofl: Of course that doesn't count; if you're bargain hunting, a top-end card on launch day is not the place to bargain hunt.
In addition, on the whole prices-going-up point, you make it sound like NV is the source of it because of their $600 8800GTX.
Let's look at the MSRPs of the top-end cards at their time of release.
9700 Pro $399
9800 XT $499
Radeon X850 XT Platinum Edition $549, or $499 for the non-Platinum edition
Radeon X1800 XT $549
Radeon X1900 XTX $649
Look these up if you want; these were the launch MSRPs of these cards.
ATI only got cheaper with the 2900 XT because it didn't offer performance as good as the competition's (and wouldn't have sold if it weren't priced well below the competition to reflect that performance deficit), and the chip itself didn't make them much money, so they got off the 2900 XT ASAP and onto the 3870 because it was so much cheaper to make.
And the guy above is right: you are cherry-picking deals (who finds a launch card 70 dollars below MSRP?), and the rest of your cards are midrange or deals.
My X1950 XT was $500 when it first came out, so ATI has gone to the high side too. But $600+ for a GTX 280? No, and not because of the price, but because of the heat the thing is going to put out.
Maybe after they go to 55nm.
Well, buying at full price is silly when you can spend 5 minutes looking online and get it much cheaper... your loss if you don't. I'm not being brand-specific, as you can see from my card history anyway. What I'm saying is that it's absurd in general how much they (both companies) are charging for their high-end products compared to historical pricing. It also happens to be just ludicrously expensive. No one can argue that, except people who rake in a million+ a year, and even then half a grand isn't pocket change.
An Akimbo 512MB 8800GT was $130 after rebate on frys.com a few days ago.
Do remember: even if we are all attracted to the fastest and best, these products are marketed towards, and intended for, niche crowds. They don't expect the average Joe to pay $600+ for a video card. They know the real earners are in the $200-300 range, and this is where the companies' strengths show (particularly the 8800GT and 3870 as of now).
I have no idea what the production cost of a GTX 280 would be, but the profit margins are probably not as high as one would think. For this reason, again, the profits are in the more affordable mid-to-upper range, which usually involves cards that cost less to make and that are also more widely distributed and in larger quantities. You can't honestly expect anyone to believe that they'll make more on the GTX 280 than they've made with G92. Although that card's yields haven't been ideal, they've sold like hotcakes and are still going strong.
Sure, I don't like the idea of paying this much for a video card, but that's the price of admission. If you want to pay it, it's totally up to you. If not, do what 95% of everyone else does and go for the more affordable alternative. Value is in the eye of the beholder, as I like to think. Even if it is steep, if these cards last as long as the GTX has, I'll gladly pay the asking price. Again, it's down to what you're willing to pay.
I also highly doubt anyone will manage to find a GTX 280/260 for less than MSRP any time soon after launch, unless availability is abnormally good. Remember what G92 pricing was like during the first month or so after its release.
I hate to say it, but:
6800 ultra was over $600
7800gtx was over $600
7800gtx 512 was over $700
7900gtx was close to $600
7950gx2 was over $600
8800gtx was over $600
8800 ultra was over $800
9800gx2 was about $600
Granted, the prices dropped (some fast), but at launch even the good e-tailers were at those prices. All newish Nvidia high-end cards have been over $600 at launch, and with the specs the GTX 280 is boasting, I will gladly pay over $600 for that beast of a card. Some people seem to be forgetting that it is rumored to have second-gen shaders, which are up to 50% faster clock-for-clock than the old ones. So you take way more shaders, faster clock-for-clock shaders, more TMUs, more ROPs, and a bigger bus width. That's one HECK of a card you have.