:welcome:
and also the 9800GX2 doesn't have 2x DVI side by side like in those photos ;)
regards
Would you all relax and stop calling this card a disappointment? I have the card and it's the fastest thing I've ever seen. S*it, what did you expect, 5x the speed of a 9800GTX? Not gonna happen. Have you not learned not to look at 3DMark numbers? Wow, I'm trying to hold myself back, but reading these comments is getting on my nerves. A 2900XT beats an 8800GTX in 3DMark, does that mean it's a better card? NO! This is the fastest single chip ever made, period. It's not going to be cheap, it's not going to be low power, it's not going to put out little heat. It's the POWER HOUSE, not the power saver.......................one word for this card: FAST!!!!!!!!!!! Now relax and wait for the release date.
It is a disappointment in its current rumored state, as its price/performance is terrible and outclassed by currently available offerings (and other than pure benchers, price/performance is what matters to EVERYONE else). No one gives a **** if it's the fastest single chip; what matters is how it performs for the money. It's very easy, and I can say this from many experiences, to say "wait for the release date" when it's you who has the product/software/item/service/etc. in-hand ;).
Actually I do care if it's the fastest single-card solution, for sure! Personally I would rather have a faster card even if the price/performance ratio wasn't that great; if a 7300GS had the highest price/performance ratio, it would still be a 7300GS that might not be great for gaming, benching, or whatever else you might use your card for. I see where you're coming from, but some people really don't mind paying a bit extra for higher performance, even if the price/performance ratio isn't that great.
I think everyone has a different tolerance, but personally I won't go below a 3.5:5 ratio of performance-to-price. I.e. if I'm choosing between a $250 card and a $500 card, and the $500 card is only 20% faster, that's a 20% performance gain for a 100% price difference, thus a 1:5 ratio of performance-to-price = no sale.
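That arithmetic can be sketched in a few lines (the function name is mine and the numbers come straight from the post above, so treat this as illustrative only):

```python
# Performance gained per price paid, as a quick illustrative sketch.
def perf_per_price_ratio(base_price, new_price, speedup_pct):
    """Ratio of performance gain to price increase, both in percent."""
    price_increase_pct = (new_price - base_price) / base_price * 100
    return speedup_pct / price_increase_pct

# $250 card vs. a $500 card that is only 20% faster:
print(perf_per_price_ratio(250, 500, 20))  # 0.2, i.e. a 1:5 ratio = no sale
```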
It looks like this site is selling a Nvidia GeForce GTX280 reference card with an estimated delivery time of 16-06-2008.
Nvidia GeForce GTX280 1024MB Marke
Attachment 80152
Thank you GAR.
@ Golden Tiger
You have a great attitude there, friend :rolleyes: I understand you're in it for the price/performance figure, which in truth is where 90% of the market is, but please remember, it's not those people who buy cards like this. It's the people who want the fastest, regardless of price, not those who are very price conscious. Sure, everyone wants the fastest card, but only a few buy them while the rest are happy with their sanely priced hardware; just remember that when you start making comments. I'm not trying to attack you here, I just think you're being somewhat pessimistic and hostile towards the whole concept of these cards. They ain't no 8800GT, plain and simple, and Nvidia knows that.
Exactly. The people who buy these are obviously not casual gamers, only people who want the best in performance no matter the cost. That is a very small percentage of us, but hey, this is our hobby, right? Or else what are we doing posting in these forums every day, reading about this stuff..............if you want price/performance, wait for the 9800GT or the 9900GTX........or you can always go with ATi; they have cheap and good-performing cards coming with the 4870/4850.
Running 2560x1600 pretty much means GTX280 for me (unless ATI pulls the rabbit out of the hat of all surprises on us with R700)... just hope to god that the waterblock doesn't cost me an arm and a leg
http://img.photobucket.com/albums/v5...lson-muntz.gif
4870/gtx260
yes goldentiger i agree with you, and most do, but it's obvious that the carrot on the stick will always be more expensive....and there will be yet-to-come cards that will supercede it.
gtx280 will probably be cheaper than the x850xt i bought 3+ years ago :rofl: even after adding inflation...
so in 3 years will the gtx280 be worth $80 :rofl:
that's depreciation for ya.
http://www.techbuy.com.au/p/51539/VI...T-256-PCIE.asp
x850xt@ $775 (one from the vaults :rofl:, dudes change your pricing:rofl:?)
http://www.skycomp.com.au/product.aspx?id=38750
x850xt@ $96:rolleyes:
but no thanks; i'll take a 9600gso/gt @ ~$115-145aud:lol:
http://www.ozdirect.com.au/index.htm...=730_1152_1403
gtx280 is not bread and butter or noodle congee; it is rather escargot with truffle sauce and gold leaf, with a ridiculous price to match.
I'm not trying to discredit someone, but this looks very familiar:
http://www.hartware.de/news_44980.html :rolleyes:(and was already posted in this thread. ;) )
The card seems to scale very well in high resolutions and DX10.
Running 1280x1024 3dmark06 will not show this card's strengths.
Run 3dmark Vantage instead, ;)
I've been looking for specs on the GTX260, but outside of Fudzilla they have been scarce. I don't know how accurate they are, but the Fudzilla article and quotes from it are all I can find. Maybe we'll find out how accurate they are before next Tuesday.
Slightly slower
Nvidia’s second GT200-based card will also be available for sale in about six days and here are the final details and clocks. Geforce GTX 260 is, as well, based on 65nm GT200 core that has 1.4 billion transistors, but this time clocked to 576MHz. Some of these transistors will sit disabled, as GTX 260 has one of eight clusters disabled.
The Shaders are clocked to 1242MHz and the card has a total of 192 Shaders (what used to be called Shader units are now called processor cores).
The card has an odd amount of memory: 896MB of GDDR3 clocked at 999MHz (1998MHz effective), which is enough for 111.9GB/s of bandwidth. The slower of the chips has 28 ROPs, 64 texture filtering units and a 36.9GigaTexels/second texture filtering rate.
If you look at these specs closely, you will see that GTX 260 is the same as GTX 280, but with one cluster disabled. If GTX 280 has eight clusters, GTX 260 ends up at seven.
The card has HDCP, HDMI via DVI, but this time two 6-pin power connectors; and launches next Tuesday.
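As a sanity check on the quoted GTX 260 figures, the bandwidth and texel-rate numbers follow directly from the bus width and clocks (the function names are mine; decimal GB/GTexels assumed, as the article uses):

```python
# Reproducing the quoted GTX 260 numbers from first principles.
def bandwidth_gbs(bus_bits, effective_mhz):
    # bits -> bytes, MHz -> GB/s (decimal gigabytes)
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

def texel_rate_gt(tmus, core_mhz):
    # one texel filtered per TMU per clock
    return tmus * core_mhz * 1e6 / 1e9

print(bandwidth_gbs(448, 1998))  # ~111.9 GB/s, matching the article
print(texel_rate_gt(64, 576))    # ~36.9 GTexels/s, matching the article
```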
http://www.azerty.nl/producten/zoek?...ING=prijs_desc
Zotac GTX280 549
Zotag GTX260 373
Doesn't seem any different compared to the usual new nVidia high-end cards :confused:
dudes, read the name of the person you are quoting. This is "Dynasty"; that guy doesn't need a screenshot to back up what he is saying. If you don't want to believe it, just don't read it. You don't need a high 3DMark score to make a good card.
Or perhaps he mixed up the "4" and "7" so it should be 17400. J/K :p:
I've never seen so much FUD in a single place in all my life.
Given the amount of hype many people will end up disappointed.
I think I'm gonna go with the GTX 260...(Xtreme Instincts slapped me in the face.."idiot, spend the money, go GTX 280")
Not under NDA :yepp:
As far as I know, with a 3GHz Yorkie:
GTS 512 13.7K
so it can be considered a bit faster for 3DMark06, which is pointless and extremely limited. Not to mention that Dynasty didn't say anything about filtering and so on. Anyway, this is a pointless argument; enough spamming, I will wait and see
This must've been on vista...even so
Q9450 @ 3.8GHz 8800GTX 648/1026
Q9450 @ 3.7GHz 8800GTX 648/1026
Again, you CANNOT compare cards across different systems via 3DMark scores... the CPU affects the 3DMark06 score way more than the actual card itself, and who knows how driver performance is atm
I agree, which is why I posted two results where the difference is 100MHz on the CPU clock. It seems 3DMark06 will give an additional 100 points for each 100MHz on a quad-core clock. My comparison, I thought, was showing the potential. If I were home, I would just clock my CPU to 3GHz and the core to 600, but then my score would barely break 13,000. I think 14,700 with a single card with no overclocking of card, CPU, or system is pretty good. I think most were just expecting something spectacular like 18,000 or so.
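The ~100 points per 100MHz observation can be turned into a rough back-of-the-envelope estimator. This is purely the rule of thumb from this post, nothing official, and the poster's ~13,000 figure additionally assumes dropping the card's core clock:

```python
# Rough 3DMark06 rule of thumb from the post: ~100 points per 100MHz
# of quad-core CPU clock, near the CPU-bound regime.
def estimated_score(base_score, base_mhz, target_mhz):
    return base_score + (target_mhz - base_mhz) / 100 * 100

# 14,700 at 3.8GHz would land near 13,900 at 3.0GHz by this rule alone
print(estimated_score(14700, 3800, 3000))  # 13900.0
```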
i want to see some 05 03 01 results ...
I remember seeing some charts on toms hardware vga comparison, a 8600GTS beating a 8800Ultra @ 1024x768 in some game, due to higher frequency clocks.
I am not surprised the card scores this low; I quite believe this card shines with filters, so a 1280x1024 test without filters doesn't seem to me the best playground for this card.
Compare it to driving a car with rain tires on gravel.
yeah that's the deal for me; i want 8xQ minimum :D, so that's what i'll be looking for in reviews, all the candy in game.
Quote:
I quite believe this card shines with filters
the only reason i have a gcard is for eye candy.
Consider the following statement:
If that's true, and I believe it is, then I'm going to use some reverse logic to theorize the following:
Quote:
Nvidia’s GeForce 9800 GTX features 16 ROPs. The ROPs are split into four clusters – each is able to process four pixels per clock and is also connected to a 64-bit memory interface. This means that there’s a 256-bit memory interface on the GPU and it connects out to eight 64MB GDDR3 DRAMs, making a total of 512MB of video memory.
Nvidia’s GeForce GTX280 features 32 ROPs. The ROPs are probably split into eight (dividing 32 by 4) clusters – each is able to process four pixels per clock and is also connected to a 64-bit memory interface. This means that there’s a 512-bit memory interface on the GPU and it connects out to sixteen 64MB GDDR3 DRAMs, making a total of 1GB of video memory.
Nvidia’s GeForce GTX260 features 28 ROPs. The ROPs are probably split into seven (dividing 28 by 4) clusters – each is able to process four pixels per clock and is also connected to a 64-bit memory interface. This means that there’s a 448-bit memory interface on the GPU and it connects out to fourteen 64MB GDDR3 DRAMs, making a total of 896MB of video memory.
I know that I've seen how many memory chips there are on these boards somewhere, but I didn't take the time to confirm it is 16 and 14 as I stated above.
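The reverse logic above reduces to simple arithmetic. A sketch, assuming (as the quoted 9800 GTX description does) 4 ROPs per cluster, a 64-bit memory channel per cluster, and 32-bit-wide 64MB GDDR3 chips:

```python
# ROP count -> memory bus width -> chip count -> total memory,
# following the reasoning in the post above.
def memory_config(rops):
    clusters = rops // 4            # 4 ROPs per cluster
    bus_bits = clusters * 64        # 64-bit channel per cluster
    chips = bus_bits // 32          # each GDDR3 chip is 32 bits wide
    return bus_bits, chips, chips * 64  # bus width, chips, total MB

print(memory_config(16))  # (256, 8, 512)   -> 9800 GTX
print(memory_config(32))  # (512, 16, 1024) -> GTX 280
print(memory_config(28))  # (448, 14, 896)  -> GTX 260
```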
Are we talking about the same clusters?
We're talking about different things.
I am talking about the SP clusters, of which the SPs and the TMUs are a part of. GT200 has 10 SP clusters, each with 24 SPs and 8 TMUs. The GTX 280 has all 10 enabled, so that gives it 240 SP / 80 TMU. The GTX 260 has 8 enabled, so that gives it 192 SP / 64 TMU.
You are talking about ROPs, which are separate from the SP clusters but are tied directly to the memory bus width. In terms of ROPs, 28 out of the 32 found on GT200 are enabled in the GTX 260 model. So that does work out to 7/8. This is why you see the 448-bit bus on the GTX 260; 28 * 16 = 448, just like 32 * 16 = 512
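To make the two meanings of "cluster" concrete, here is the SP-cluster arithmetic described above, using the GT200 figures given in this post (10 clusters of 24 SPs and 8 TMUs each):

```python
# SP-cluster math for GT200 as described in the post.
SPS_PER_CLUSTER = 24
TMUS_PER_CLUSTER = 8

def shader_config(enabled_clusters):
    """Return (shader processors, texture units) for a given cluster count."""
    return (enabled_clusters * SPS_PER_CLUSTER,
            enabled_clusters * TMUS_PER_CLUSTER)

print(shader_config(10))  # (240, 80) -> GTX 280, all clusters enabled
print(shader_config(8))   # (192, 64) -> GTX 260, two clusters disabled
```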
FUD just doesn't know what he is talking about. Nobody refers to ROPs as "clusters" so I doubt he meant to refer to them.
I'm gonna be picking up a GTX260 most likely as well, unless a 9800GTX shows up for less than $200. Nvidia hasn't mentioned any price cuts for it yet, but I have the feeling that even if they don't do it officially, the invisible hand of the market will bring its price down, just because it's so inferior to all these new products coming out.
so is the 9900gtx the gtx280? or is it a 55nm version of the 9800gtx???
http://publish.it168.com/2008/0611/20080611033801.shtml
new benches... lol, now the 280GTX only reaches 12.5k in 3dmark 06....
probably a driver issue... even if those are true, that is... seems like NV is focused on DX10 games this gen
Looking at the chart confirms what's already been said. The card performs much better than current offerings @ higher resolutions with the eye candy cranked up. Still, if the price of current hardware drops significantly, for something like the 9800GX2 I'd buy a second one before shelling out $700 on the GTX280
Do you believe this? If true, I'll go on a diet for the next year!
attention attention, inquirer news :D
http://www.theinquirer.net/gb/inquir...cores-revealed
Quote:
GT200 scores revealed
THANKS TO NVIDIA'S shutting us out, we are not handcuffed about the GT200 numbers, so here they are. Prepare to be underwhelmed, Nvidia botched this one badly.
Since you probably care only about the numbers, let's start out with them. All 3DMark scores are rounded to the nearest 250, frame rates to the nearest .25FPS. The drivers are a bit old, but not that old, and the CPU is an Intel QX9650 @ 3.0GHz on the broken OS, 32-bit, SP1.
http://images.vnu.net/gb/inquirer/ne...00_numbers.jpg
that's from charlie the infamous nvidia basher
Price cut anyone?
http://www.fudzilla.com/index.php?op...=7853&Itemid=1
If it's true then I'm so in for a GTX 280 :up:
At launch, yes, but the MSRP will drop 2~3 weeks after launch once the 4870 and 4850 decimate the performance segment, at least for the GTX 260. The 280 probably won't drop until the launch of the 4870X2, and probably only if its performance is within +10/-10% of the GTX.
regards
Quote:
Nvidia changes GT200 dates again
Also changes prices and adds a PhysX driver
NVIDIA IS CHANGING the GT200 launch date again, this whole 'let's prove them wrong when they leak' thing is getting tiring. That said, there are a few goodies here and there in the email that Ken Brown sent yesterday.
The meat of the email that went out to reviewers is that the launch date has moved from the 17th to the 16th at 6am PDT. They claim that the 280 will be available the next day, and the 260 comes later, on the 26th. One word to reviewers, make sure you check the retail prices with partners before you quote price/performance numbers, NV has a dirty tricks campaign lined up here, we told you they would have to drop the price when they saw the 770 number, and they did.
There are also a bunch of new things on the NV press FTP site, including 177.34 drivers, up from the 177.26 we tested with. We would be shocked if these were not special press-tweaked drivers, so beware of scores tested with these last-minute releases. Also included are a folding@home client, now possible due to the unbreaking of their architecture this time around, the Elemental "Badaboom!" encoding application, and a bunch of documentation.
Speaking of tweaked drivers, the next new one is coming next week, and it is a PhysX driver. Look for this one to pump 3DMark Vantage scores to the moon, you can do that when you own the API. Sigh.
The more things change, the more they are gamable. µ
INQ
mmh
Awesome news Mascaras :up:
I'm not too sure retail partners will change prices much from MSRP, though. But yes, good news both ways :D Nvidia is spooked after seeing the 4870 numbers and now wants their chips priced at the level we've always wanted them to be :up:
Perkam
The GTX 280 $499 launch price makes sense. The 9800 GX2 was $599 at launch. Judging by the prelim benchies, Nvidia knows the GTX 280 may be less of a performer and is priced accordingly.
About a 2% chance of $499 at launch :p:
Perkam
Launching at $499 makes more sense to me than dropping the price only 3 weeks after the launch. And, of course, in the absence of the new ATI products, the actual retail price will end up higher in the first 3 weeks. If they drop the price afterwards, it will look to everyone like they are worried about competition from ATI
So looking at that Inq report, I see the 280 with 5000 points in the Vantage Extreme preset. While it may not be genuine, how good is that? I see the top score on ORB is X9123, no doubt with some extreme cooling on a pair of 9800GX2s. Assuming that 5000 score is on stock air cooling for the 280, it seems OK.
35FPS on Crysis @ 1920X1200 without AA? :(
If that's true, I'll be keeping my 640MB GTS. And I'm on 1920X1200, so this is really disappointing if true.
8800GTX overclocked + busted old 90nm athlon = 20fps constant, everything on 'very high', 1680x1050, 4xAA. And that's the only game that challenges the setup. :shrug: Every other game on the horizon is a console port, even Far Cry 2, so I think I'll do fine.
I don't need this. Incremental improvements for €600? :rofl: I'll wait for the die shrink and see if it's lower priced and actually a generational leap..
Quote:
3-way SLI Action with NVIDIA GeForce GTX 280 and 3DMark Vantage
A little joy for today, one week before the official announcement of the NVIDIA GeForce GTX 280 cards. Here's a little sneak peek at what to expect from three NVIDIA GeForce GTX280 cards in tri-SLI configuration, with an Intel QX9650 processor overclocked to 4GHz, in the 3DMark Vantage Vista DX10 benchmark. Clock speeds of all three cards can be seen in the photo. The end result is 21350 marks.
http://www.techpowerup.com/img/08-06...X280_m_thm.png
VR-Zone
http://www.techpowerup.com/62774/3-w...k_Vantage.html
regards
No need, unless somebody has a 24 or 30 inch monitor and wants maximum AA in every game at native resolution, or if it is for work. A die shrink of it will do just fine, like the 7900GT, 8800GT and so on, if Nvidia still has that marketing strategy in place.
Well, features which make the image much clearer are a good thing for the people who want to spend their money on them. It is up to them to decide if it is worth it or not. My personal standpoint, which is to buy only if there is a general need for it, has not changed since I acquired my first graphics card.
I bought an 8800GT only because I needed a DX10 card at home for testing purposes, related to work in progress.
Metroid.
Well, I run everything now, everything I bother to play, including Crysis (I sacrifice frame rate for the juicy details; I guess I could get better framerates by lowering the details, but then it would be just another game ;)), so I have no need, general or otherwise, for it.
The physics completely choke the setup, though. Blowing up houses with the rocket launcher and such will at times grind it to a halt. We all know what part of the system runs the physics currently.. So Nehalem will cure that. :up:
Damn NDA and guys who can't_say_anything_but_will_increase_your_irritation_every_time_they_speak(TM) ;)
GAR, let's say you already have 2x 9800GX2 and a 30" monitor.
Would you change your setup, or is it not worth it?
Yes or No could work ;)
Let's hope these cards are exactly what we want them to be, because we can now confirm that the 4800 Series will kick some serious a$$: http://www.xtremesystems.org/forums/...d.php?t=190987
Perkam
3DMark06 (or any 3DMark, for that matter) isn't exactly a true indication of gaming performance. After all, ATI holds the top 3DMark06 spot presently, but they don't hold number 1 in actual games. ;)
I hope we see a serious battle this round though, as everything after the G80 has been boring. While it HAS been really nice to be able to max out everything (except Crysis) with a card bought almost 2 years ago, it really has been stale for a while now, technology-wise. As such, IF ATi can pull an upset (and not just in benchmarks, but in gaming, where it REALLY matters), then we'll finally see a real fight once again. :up:
Dilly, Why am I the one breaking this news: :stick:
Official GTX 280 Tri-SLI Vantage marks from VR-Zone (Note Official, not rumoured)
http://forums.vr-zone.com/showthread.php?t=287887
EDIT: Fixed. Ty. :)
Perkam
I think you posted the wrong link there, perky; that's to your thread about the HD 4800 marketing site being up, and the 15k 06 score for the 4800 (not sure if it's the 4850 or 4870).
The score doesn't impress me much, mainly because it's with a 4GHz quad. Its pricing, though, should make it very nice indeed.
:::edit:::
Link works now. I'm not sure how Vantage is about scores (I run Vista, but I don't benchmark, period), but it doesn't take a rocket scientist to know that they're still massively CPU-bottlenecked. Just bringing their CPU up to 4GHz jumped the score by 4k... 3 ultra-high-end cards at 1280x1024 are definitely going to be bottlenecked regardless of what CPU you throw at them presently, unless you throw some LN2 at it.
Damn, 3x 280s @ 1280x! I wonder if the score wouldn't actually rise if the res/settings were increased (didn't that happen with GX2s in 3D06?)
What kind of clocks on a C2D will we need to alleviate any bottleneck while running a 280GTX?
That depends on what you mean Yukon Trooper.
If you're talking about "logical" gaming, with high resolutions with AA/AF, then nothing.
A Core 2 Duo E6400 @ 2.6GHz is more than enough.
Yeah, Quad SLI is still not a good choice, and a big waste of money, sorry to say. An SLI of GTX 280s will be a much better choice; scaling will be much improved. I would eBay those cards ASAP while you can still get some value for them; once the GTX 280 comes out, prices will drop.
Froogle's starting to show some...
GTX 260 896MB $423 USD: http://store1.alrightdeals.com/Commo...d.96MB___73300
etc.
can someone give me a link to a gtx280 box ?
some idiot on YouTube is arguing that the GTX 280 is the 9900GTX and the GTX 260 is the 9900GT
http://imgs.xkcd.com/comics/duty_calls.png