Well, I was wondering: since you have the actual thing, could you please run 3DMark06? Vantage is quite new and I doubt the majority of us have used it, so that would give us a head start.
Thanks
This is my first post. I will keep it short.
This thread saddens me. I thought the g280x2 was going to be my messiah and free my 2560x1600 display :(.
Time shall tell my children. Until then, I shall continue to rock my 7900gt. AA FTW.
Call me gullible.:comp10::bows:
What I want to know is how a monthly magazine can get hold of a GTX 280, run their set of benchmarks, write it up, go to print and deliver it to their customers before any of our trusted sources can. I have never read PC Gamer so I don't know if they have a hardware section. They compare a GTX 280 to a GeForce 9800 GX2 and a Radeon 3870 X2, so they must have a benchmark computer they dust off occasionally.
I guess us gullible creatures will soon find out what the GTX 280 can do when this damn NDA expires. I'll be interested to see what kind of scores you get running 3DMark06, Vantage, Crysis, Half-Life 2: Episode One, Company of Heroes and World in Conflict.
I guess you realize that my buying decision may rest on your shoulders: whether I buy a GTX 280, its little brother, or jump ship!:D
Look forward to your review.:)
Maybe the magazine was supposed to go into circulation at a certain time, to coincide with the lifting of the NDA, and got out early? Or maybe the NDA on print material is different. Or they could have picked one up from an Asian source and are not under NDA; who knows.
Remember it takes time to do hardware reviews, most of them are probably already done or almost done and waiting to be published.
Or Nvidia supplied them with a card. :)
I briefly considered picking one up, but decided against it. I've got other things to buy :cool:
seems to be just a little faster than a gx2 or a little slower:down:
about the same as the 9800 GTX (29.1 fps) in the Sapphire 3870 X2 watercooled review
http://www.techpowerup.com/reviews/S...rcooled/9.html:shrug:
Well, it all depends on how the benchmark was made. In-game it often runs faster than many of the timedemos in the Crysis benchmark utility, and there's a wide range of slower- and faster-running timedemos. Remember, it's DX9 "High" settings; combine that with a not-so-demanding timedemo and it could be legit.
so 280 GTX final clocks are 700/1150/1400?
It's on a PCIe 1.1a motherboard, isn't it? C2D...
Maybe PCIe 2.0 and a C2Q would give the GTX 280 more of an advantage?
me like, but me canna afford.
But the test showing an 8800 GT at 4xAA/16xAF, 1600x1200 in Crysis getting 34 fps @ High settings is a farkin' joke... I'd say more like <20 fps at those settings.
20-25 fps @1280x1024 with 4xAA roughly.
Something with low core clocks will never be 75% faster at lower resolutions, where core clocks shine and CPU speeds actually matter. It isn't going to surprise me when this thing gets its @$$ handed to it by much cheaper solutions at 1280x1024 and the like. Fortunately, there are people who have a purpose for cards like this. (This implies it, but does not mean me specifically.)
They are using the same test system to benchmark each video card, so I would think you can use each group of benchmarks to judge how one card scales relative to another. There's no way you can map those benchmarks onto what we'd get on our own systems.
I ran the following Crysis test:
EVGA 780i
Q6600 @ 3150 MHz
EVGA 8800 GT SC (2x) SLI
Crysis Benchmarking on Vista 64
Timedemo: benchmark_gpu
DX10 1600x1200 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 13.15
Those settings wouldn't be playable on my system. The only component I have in common with the TechPowerUp test system is the PSU.
http://www.azerty.nl/8-160-80998/engtx280-htdp-1gb.html
516 Euro.
GTX 260 is 354 Euro
That's probably the only website with such low prices; everywhere else it's 550 Euro. Plus the POV edition with 700 MHz on the clocks will be 600 Euro.
Also, I have NO idea wtf you guys are talking about. The GTX 280 was never meant to be 2x the 8800 Ultra or the GTX :stick:
It's a monster of a card and will only get better from here on out. ATI is in NO WAY in the clear.
Perkam
It's just that for that price ($650 USD) people expect it to perform according to the price tag, and it seems like the HD 4850 might give the 9800 GTX a run for its money. So I bet the HD 4870 will perform somewhere between the 9800 GTX and the 9800 GX2; depending on the game it might even be very close to or on par with the 9800 GX2, and that with a price tag of $350. Now if the GTX 280 only performs about on par with the 9800 GX2, or slightly better, who would pay so much more money for so little extra performance?
I personally hope the GTX 280 will perform not just on par with or slightly better than the 9800 GX2 but significantly better, to make it worth that $650 price tag; otherwise a price drop is to be expected, and the INQ's $500 price tag would sound more reasonable. But yeah, I don't have much faith in those TechPowerUp numbers, faked or leaked or whatever, so to me GTX 280 performance is still a wide-open mystery.
Just saying it doesn't surprise me that people expect that much of a performance boost out of these cards. ;)
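To put the price/performance argument in rough numbers, here is a minimal sketch. The $650 and $350 prices come from the thread; the 9800 GX2 price and all FPS figures are hypothetical placeholders, not benchmark results:

```python
# Hypothetical fps-per-dollar comparison. $650 (GTX 280) and $350
# (HD 4870) are from the thread; the GX2 price and every FPS number
# are made up for illustration only.
cards = {
    "9800 GX2": {"price_usd": 500, "avg_fps": 40.0},
    "GTX 280": {"price_usd": 650, "avg_fps": 42.0},  # "par or slightly better"
    "HD 4870": {"price_usd": 350, "avg_fps": 38.0},
}

for name, card in cards.items():
    value = 100.0 * card["avg_fps"] / card["price_usd"]  # fps per $100 spent
    print(f'{name}: {card["avg_fps"]:.1f} fps at ${card["price_usd"]} '
          f'-> {value:.1f} fps per $100')
```

On placeholder numbers like these, the $650 card holds the outright lead yet delivers the worst value per dollar, which is exactly the complaint.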
Honestly, it could be that Nvidia wasn't expecting the 9800 GX2 to be so powerful when they were designing GT200 (in other words, they weren't expecting to have to worry about clocking the GPUs high), or that ATI forced Nvidia to play its hand with the 9800 GX2 and make it more powerful than they expected.
nda broken ppl
Saw 2 Zotac GTX 280s on sale at LYN in Malaysia.
That shop has sold 2 already.
Sheesh, the heatsink is bigger than the 9800 GTX's... the size and weight are similar to my 9800 GX2...
Even if it's just a little extra performance, consider:
the GTX 280 is a single GPU >> no compatibility problems with any game, no stuttering, 100% "clean" and smoother in all games!
The 9800 GX2 also has more performance than the 8800 GTX/Ultra, yet we see lots of gamers choosing the 8800 GTX/Ultra because it's a single GPU.
regards
hopefully others will follow suit now
People forget: yes, its numbers may be only a little faster than the 9800 GX2's, but the 9800 GX2 does have microstuttering. So yeah, the FPS is within 1-5 FPS of each other, but the real in-game feel is more like 15 FPS off, and at higher resolutions and AA settings the lead only grows. Not to mention the drivers that will be coming out.

And yes, 4850 CrossFire will give similar FPS to the GTX 280, but then you're limited to an X38 or X48 (maybe P45) chipset; you have microstuttering, so again the GTX 280 will run smoother and faster; and you have to deal with CrossFire, which IS NOT supported in all games and has issues in others with glitches and artifacting. Some reviews still show CrossFire in Windows XP yielding low gains in games like Crysis. Power consumption should be similar, but the 4850s may be a tad more hungry, and heat goes back into your case with the 4850 stock cooler, so that becomes an issue. The 4850s also don't have the features of the CUDA platform, so you get no Ageia PhysX support and less support for other programs like video-editing software and such. CrossFire and SLI are just not ideal solutions when you can get similar performance from a SINGLE card, as well as more features.

The way I see it, the GTX 280 is being downplayed by the success of the 9800 GX2. When the 8800 GTX came out it was fantastic, because the 7950 GX2 was a bad card and its drivers were poor. But Nvidia did things a lot better with the 9800 GX2, and now that the GTX 280 "only" has 100% more performance than the 8800 GTX, people are down on it?? Looks to be a good card to me. Another thing to remember is that most of these sites still use 3 GHz Core 2 Duos or 3.2 GHz quads (max), and some still even use PCI-Express 1.1 mobos. That's not fast enough, and both the CPU and mobo will bottleneck the GPU at most resolutions. But put a 3.8 GHz or higher quad with it and it will FLY.
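For anyone wondering how "same average FPS" can still feel much slower, here is a minimal sketch of the microstuttering argument. The frame times are invented for illustration, and the "perceived FPS" model (judging a run by its worst frame gap) is a deliberate simplification, not a metric from any review:

```python
# Minimal sketch: why average FPS hides AFR microstuttering.
# Frame times (in ms) are made up for illustration.
single_gpu = [25.0] * 8      # steady 25 ms per frame -> 40 fps
dual_gpu = [5.0, 45.0] * 4   # same 25 ms average, but uneven short/long pairs

def avg_fps(frame_times_ms):
    """Average FPS over the run: total frames / total seconds."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def perceived_fps(frame_times_ms):
    """Crude smoothness model: rate the run by its longest frame gap."""
    return 1000.0 / max(frame_times_ms)

print(avg_fps(single_gpu), perceived_fps(single_gpu))  # 40.0 and 40.0
print(avg_fps(dual_gpu), perceived_fps(dual_gpu))      # 40.0 but ~22.2
```

A Fraps-style average reports 40 fps for both runs, yet the uneven run's motion is paced by the 45 ms gaps, which is roughly the "feels 15 FPS slower" effect described above.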
Yeah, well, that alone isn't worth paying $150 more for, imo. For rich bastards, yep, but in general it wouldn't cut it; we have to remember that not everyone is that aware of the stuttering, or of how much the GX2's results vary depending on the game, etc.
EDIT: Yes, like hipno650 explained above, taking all that into account surely gives the GTX 280 a benefit, but NVIDIA can't expect the card to sell for those reasons alone. Many buying decisions are made from reviews where pure performance numbers are checked, not how the card behaves in reality; hence why I'd expect a price drop if performance is close to or on par with the 9800 GX2 and the HD 4870 comes closer than expected.
Honestly, I would have no problem with a price drop :D that just means I can get one sooner. :up: But I think this is going to look very similar to the 8800 GTX days: the 8800 GTX is now the X1950 XTX (people downplay it; I had mine running Crysis all on High, playable), the 7950 GX2 is now the 9800 GX2, just with better relative performance, the GTX 280 is now the 8800 GTX, and the HD 4870 will slot in as the HD 2900 XT, just with a lower price tag and worse performance compared to the GTX 280. And look what happened: the 8800 GTX is still the fastest single GPU for sale today (save the Ultra), is possibly one of the most popular high-end cards ever made, and has lasted how many years now on top?
Has anyone observed microstuttering on Vista 64 with 4+ gigs of RAM?
I have the feeling it's a RAM/OS issue rather than an inherent flaw of CrossFire/SLI.
Personally, I think you guys are overdoing the micro-stuttering issue; there are plenty of people with CrossFire or SLI systems who don't seem to notice it, or it isn't enough to really matter. But either way, until the 4870 X2 comes out, there's no way to know if the card will have the same amount of microstuttering as 4870 CrossFire, less, or none. Personally I feel the 4870 X2 will outperform the GTX 280 in terms of FPS; the 4850 is just too powerful from the results we've seen, and the GTX 280 doesn't seem to be a big improvement over the 9800 GX2, which will very likely be spanked by the 4870 X2. What matters is how smooth the gameplay is in games where FPS doesn't matter (as both high-end options deliver more than you'll notice), and smoothness could be affected by many things, especially ATI's drivers. Even if they eliminate the micro-stuttering to some extent, you never know what could go wrong with a dual-GPU setup from the driver standpoint.
Thank you, and oh no, I have not been paid, but that's the way I see things now. I just think single-GPU setups are far more attractive for almost all reasons over multi-GPU systems, be they single cards with 2 GPUs or multi-card. However, I will not deny that to get the extra, extra, extra performance, SLI or tri-SLI of the GTX 280 would be the way to go :D
I have not tested or seen microstuttering for myself. I plan on getting a pair of HD 4850s sometime after I get a GTX 280 and testing the microstuttering issue for myself. Most reports I have seen are on Vista 32-bit, and I would be interested to see XP 64-bit and Vista 64-bit with 4 GB or more of RAM. Come to think of it, it only shows up in multi-GPU systems, so it may be a driver issue, but for Nvidia and AMD to have the same issue is questionable. I will admit the theory behind microstuttering does make sense. But there's only one way to find out: test it myself.
Like I said, I have not tested it for myself, so I am just going by what others say. And yes, I have seen the videos of NFS ProStreet being played on an HD 3870 X2 on PCGH, and it looks brutal. The FPS is more than playable, but the game looks like it is running at 5 FPS :(
Another reason I did not highlight is AMD's drivers. From my personal experience they are not the best; I have used their new stuff (HD 2900 Pro and HD 3850) and I find using them a tad frustrating.
This is where the unknowns come into effect. We know most current single cards show no increase in performance on a PCIe 2.0 mobo compared to 1.1; however, the GTX 280 is a beast and might be reaching the limits of the 1.1 bus. Time will tell how much, if any, performance the GTX 280 gains going from 1.1 to 2.0, but I would like to see reviewers using 2.0 mobos so we can totally eliminate it as a variable in their reviews.
Edit: that's good to know, that microstuttering is noticeable more at lower FPS than at higher FPS. The problem, however, is that the most demanding games (like Crysis) run exactly in that envelope on the 9800 GX2 and the GTX 280 (from what GAR has told us), so the 9800 GX2 will show microstuttering, and even at the same FPS the GTX 280 will be more playable. This also opens up the discussion of what a playable FPS is and at what FPS you notice differences. That is a topic that has ruined too many threads, so I would prefer not to start it, but I can notice the difference between 60 Hz and 120 Hz TVs, and the difference between 50 FPS and 60 FPS (20 ms vs. 16.7 ms per frame) is noticeable to me in most games.
I'll take note of that :)
I was wondering why I got such terrible results with the above test. I noticed that some Vista 64 users were running DX9 and 32 bit so I decided to try it. I didn't know that the DX10 and 64 bit combination was such bad news.:mad:
DX10 1600x1200 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 13.15
DX9 1600x1200 AA=4x, 32 bit test, Quality: High, Overall Average FPS: 43.555
DX9 1680x1050 AA=4x, 32 bit test, Quality: High, Overall Average FPS: 33.905
DX9 1600x1200 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 42.78
DX9 1680x1050 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 33.3
Using 64-bit DX9 it takes a very slight hit. I'll have to see if I can find any tests that I ran using one 8800 GT at these resolutions using DX10, AA=4x, 64-bit and High. Is there a known DX10 problem using SLI?
This is the closest I could find.
SLI
DX10 1600x1200 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 13.15
Single
DX10 1680x1050 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 12.63
I've never tried DX9 32 bit or 64 bit with a single 8800 GT.
so
SLI :
DX9 1600x1200 AA=4x, 32 bit test, Quality: High, Overall Average FPS: 43.555
DX9 1600x1200 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 42.78
DX10 1600x1200 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 13.15
Single
DX10 1680x1050 AA=4x, 64 bit test, Quality: High, Overall Average FPS: 12.63
--------------------------------------
Like I suspected, it's not a 32-bit/64-bit problem, but DX10 mode @ SLI ;)
Looks like only 1 card is running in DX10 mode @ SLI.
regards
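A quick scaling check on the numbers above makes that point; this is just arithmetic on the posted averages (note the resolutions differ slightly, 1600x1200 for SLI vs 1680x1050 for single, so it is only approximate):

```python
# Rough SLI scaling check from the posted Crysis DX10 averages.
# Perfect SLI would approach 2.0x a single card; ~1.0x means the
# second GPU is effectively idle.
sli_dx10_fps = 13.15     # 8800 GT SLI, DX10 1600x1200, 4xAA, High
single_dx10_fps = 12.63  # single 8800 GT, DX10 1680x1050, 4xAA, High

scaling = sli_dx10_fps / single_dx10_fps
print(f"DX10 SLI scaling: {scaling:.2f}x")  # ~1.04x -> one card's worth
```

The DX9 runs can't be checked the same way here, since no single-card DX9 number was posted.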
Can anyone tell me the exact release date? If it's before the 27th I can step up.
sweet, thx
It's certainly different this time: if the 7950 GX2 hadn't been so bad, then the 8800 GTX wouldn't have been the omgwtfbbq performance leap that it was. This time they made a decent job of their GX2, so the next single GPU was always going to be labeled a disappointment by some people from the start.
I've been using 2x 9800 GX2 + E8500 @ 4+ GHz on Vista 64 (4x1GB DDR3), and I was hoping I'd be able to play Crysis @ 1920x1200 (since 2560x1600 is still a slideshow).
The framerate was decent, I think around 40 fps (according to Fraps), BUT the game was not playable at all. The in-game delay was about 300-500 ms when an explosion or gunshot occurred, just like lag in a multiplayer game. As you can understand, this effect cancels out the extra power of the second card; it actually delivered a worse "on screen" result than a single card.
I am not sure if this is what's called "microstuttering"; I had to read up on it while trying to tune my system. Correct me if I'm wrong.
So, I guess GTX 280 SLI will perform much better compared to 9800 GX2 SLI because of better synchronization.
screenshot available
I get a bit of lag with explosions, when I'm in the middle of them... in BF2142 :eek: ...with my GT @ 8xQ. But lag is very rare.
Quote:
The in-game delay was about 300-500ms when an explosion or gunshot occured
crysis is the dog.
Well, who's gonna be the guinea pig for this one? :lol:
Quote:
So,i guess the GTX280SLI will perform much better compared to 9800GX2 SLI because of better synchronization
Something's bottlenecking those GX2s, and I don't think it's fixable. (:slapass:) Would it be smooth at 5 GHz on the CPU, at those settings? Somehow I doubt it. It's an SLI/scaling/sync problem, I think, until proven wrong.
:rolleyes:
Lol, they'll get a lot of interest at that price.
http://img115.imageshack.us/img115/6...gefozr1.th.png
that's a 9800gtx price. $328
Yeah, I'll get 2 for that price :rolleyes: ...actually, no, I don't want the stutters :ROTF:
GTX 280 on eBay for $699; apparently they're the same seller offering the HD 4850 as well :rolleyes:
http://cgi.ebay.com/Brand-New-EVGA-0...d=p3286.c0.m14
When are the next-gen cards coming out after the GTX 280? These ones are quite poor.
D10U = 65NM GT 200
D11U = 55NM GT 200 with some minor enhancements mid next year
D12U = Completely new architecture.
That's what I've heard around the web. Plus, Nvidia has already stated that the G92b (9800 series on 55nm) will be around for another 6-12 months, so yeah, no new high end anytime soon.
Perkam
my god..
I guess it's true: nothing is holding back any games bar Crysis. Truly a bad sign; well, good for cost, but you know what I mean. I guess it's not all bad: at least newer games should run as smooth as butter (or almost) even on G80 cards.
Do you know the time gap between D11U and D12U? :shrug:
Aw hell, I should just pick up another 8800 Ultra for SLI.
I even saw an increase in performance over the 9800 GX2 in at least one game I play... even with the newest drivers.
Question is: is the GTX 280 worth the jump from 8800 GTX SLI?
From a performance standpoint, I doubt it. Performance will probably be close, as 8800GTX SLI should be a bit faster than a 9800GX2 and the GTX 280 should be a bit faster than the GX2 at high res.
Previous high-end SLI -> New high-end has never given you much of a performance increase, except 7900GTX SLI -> 8800GTX.
but what about 8800 gtx sli to gtx280 sli??? :D
I'm going with the most performance from a SINGLE-GPU card, as SLI and CrossFire/X2 cards have too many compatibility problems, not to mention the need for a certain mobo in most cases.
That puts the GTX 280 at the top of my list. It will give me 2x the performance of my current 8800 GTX with all the advantages of a single-GPU card ;)
In terms of pure FPS values, GTX 280 will probably be a bit higher than 8800 GTX SLI.
But in terms of fluidity/playability, GTX 280 will absolutely blow away any card to date, including 9800 GX2.
Nvidia managed to keep the Chinese sites contained this time around :down:
Great post by Mythbuster420 at the EVGA forums. Is the industry laughing WITH or AT PC gamers?
With $650 for an incremental change, gee, I wonder. :rolleyes:
Quote:
Well, actually... the 8800 GTX/Ultra is a different beast. It's the only card I can remember that has lasted (still lasting?) almost 1 1/2 years as KING of the crop. I just did not expect the same kind of monster to be released this time, simply because of the lack of competition from ATI.

I won't go into the whole Marketing/Business 101 crap, but they wouldn't release new hardware that they have in the wings unless they have to in order to make money; that would be just plain retarded from a business standpoint. I am fairly certain both companies have the technology already developed to keep up with demand for at least 3-4 years in advance. Meaning you can bet that right now, in some room at Nvidia, sits some freaking monster of a core / prototype card that will run Crysis at 100 fps MINIMUM. If you don't believe that, then you are in denial, lol. But the card makers don't release tech just so they can say "hey, look what we made". They do it for 1 reason: TO MAKE MONEY. It's always been my belief that the card makers have enough technology already developed to last at least 3-4 years in advance, but they only let it out AS NEEDED, or as they need to just to keep "ahead" of their competitors. Sometimes we see "stop gap" products released to get by without having to release their "next level" stuff if they don't really have to; the GX2 is exactly that. Intel does the same thing with CPUs all the time; another name for it is the "tick-tock" method.

In a perfect world, each new card would be twice as powerful as the last card, in EVERY ASPECT, at EVERY RESOLUTION, etc. But of course this isn't that world. I've been growing more and more depressed with the PC hardware (mainly VGA/GPU) makers' continuing efforts at what seems like trying to make PC gamers completely dependent on buying new hardware every 6-12 months. Now, with the birth of "SLI" and "Crossfire", these companies have the best possible avenue for milking every last hard-earned cent from us PC gamers.

I used to have a crappy Dell and always longed to be part of the "leet" PC gamer club, the ones with the uber PCs that could play any game at max settings. I always felt "left out" and that I was missing so much... It wasn't until I decided to build my first $2.5k PC that I realized it's just not all it's made out to be. There are so many dishonest, immoral and sometimes just plain outright illegal practices by software/hardware manufacturers that I really do believe PC gaming is currently the worst platform for entertainment (bang for buck) there is. And now with "SLI, Crossfire, TRI SLI"... these companies are going to the next level of gayness and trying to make it so (some) people have to buy 2-3 $600+ video cards just to play a $50 game "the way it's meant to be played".

Right now is THE WORST time for anyone to be into PC gaming or building/benching as a hobby. And if things do not take a turn soon, I think PC gaming as we know it (somewhat mainstream) will die, and the only ones that will be able to play a game in the first 3 years after its release will be people with way more money than sense.
That's only possible if there is NO competition, or if there's an agreement between the two companies. And if there were such an arrangement, AMD/ATI wouldn't willingly be losing so much money on their previous generations, and there wouldn't be a price war between AMD and Nvidia over their new-generation cards. At least if the rumours of the 4800 series cards being so cheap are true.
So no, I don't believe that bullcrap. A prototype that runs 100 fps in Crysis... someone who really believes that is the one in denial. :rofl:
Quote:
And now with "SLI, Crossfire, TRI SLI"... these companies are going to the next level of gayness and trying to make it so (some) people have to buy 2-3 $600+ video cards just to play a $50 game "the way it's meant to be played".
For 1.
Loved that bit. ;)
lol, this guy doesn't have the slightest clue what development cycles are. :ROTF:
Though he mentions Intel's tick-tock strategy, he fails to understand it...
He just assumes that the companies already have the architectures ready for the next 3 years and are just waiting to release them... this is utter BS.
And if 40-100% more performance is a marginal increase, then I don't know what people want.
What's up with Crysis @ Medium?
Why not High, Very High or Ultra High? And with some AA and AF?
I suggest Mythbuster420 buy a PS3... then he will have fewer hassles... I think he never once edited an autoexec.bat or config.sys in his life to make a game even install, like we had to do in the 386 days...
He seems to be missing the point that software pushes hardware forward, not the other way around... and there's still money that needs to be made...
Asus GeForce GTX 260 Unveiled
:rofl:
I'm glad there are companies like EVGA, XFX, etc. that put respectable-looking graphics on their cards.
Those Asus cards are still "heart touching" :rofl:
VR-Zone used an Intel Core 2 Extreme QX9650 (3GHz, 12MB L2) @ 4GHz.
I'm using the following:
8800GT 512 SC SLI (650/1900)
Q6600 @ 3.5GHz
My 1680x1050 results:
DX9 1680x1050 AA=No AA, 32 bit test, Quality: Medium, Overall Average FPS: 69.16
I don’t have 1920x1200, so these are my 1920x1080 results:
DX9 1920x1080 AA=No AA, 32 bit test, Quality: Medium, Overall Average FPS: 68.435
And my 1900x1200 results:
DX9 1900x1200 AA=No AA, 32 bit test, Quality: Medium, Overall Average FPS: 68.34
This was interesting: as the scenery moved by, I watched rocks pop up on the beach. I didn’t know rocks could grow so fast!
From this little exercise it looks like a GeForce GTX 280 is only slightly faster than a pair of 8800 GTs. Could that really be true? I want to see what happens Monday.
Their preliminary conclusions say they are hard at work on Crysis High settings along with AA, so we can hope it gets better.
Rocks keep popping up due to the default draw-distance setting. Tweaking your settings a bit would get rid of the popping objects. Hehehe
Geez some people really care too much about what their video card looks like :D
When you're sweaty-palmed in the middle of a session of Crysis and that Korean you swear you saw a moment ago is nowhere to be seen, where the hell did he go?
You look around, cursing your energy bar for being too low to engage the cloak. All of a sudden, BAM, the guy is on top of you, making you jump out of your seat but mash the mouse button to take him out at the same time. After a few clips his bullet-riddled corpse slumps to the ground, you enjoy a sigh of relief, and think, 'man... I wish I had a better-looking VGA.'
24 hours left before we know the absolute truth :p: :up: