The only reason to buy the new card is benching, and playing Crysis if you haven't done so already.
8800GTX will last you long enough to wait till the prices drop massively.
There are a bunch of new games out now, and more coming in the second half of the year, that benefit from a video card faster than the GX2.
Myself, I don't play Crysis, but I want higher framerates at 1920x1200.
'nuff said
Age of Conan beta. I can tell you about the final version next week.
Is this the 9900 GTX or the GTS? Or just a fake?
http://www.custompc.co.uk/images/fro...o_91374_26.png
http://www.custompc.co.uk/images/fro...o_91377_26.png
From DigiTimes, 5/16:
GPU makers to cross blades in June with the launch of Radeon HD 4800 and GeForce GTX 200
Monica Chen, Taipei; Joseph Tsai, DIGITIMES [Friday 16 May 2008]
AMD and Nvidia are ready to cross blades again in June with their next generation GPU families – the ATI Radeon HD 4800 and GeForce GTX 200, according to sources at graphics card makers.
AMD is planning to launch its ATI Radeon HD 4800 family in the middle of June. The first model to launch will be the ATI Radeon HD 4850 with a price set around US$229.
The company will follow with the 4870 in late June, with the same specification as the 4850 but adding support for GDDR5 memory. Expected pricing will be around US$349.
In the fourth quarter, AMD will launch the dual-GPU ATI Radeon 4870 X2 priced around US$529-549.
Nvidia will counter with its new high-end GeForce GTX 200 (D10U) family in the middle of June, featuring a second generation Unified Shader architecture. Launch products will come in GeForce GTX 260 and GTX 280 versions.
Link-A-Dink-A-Doo
DX10 or DX10.1 ???? :confused:
is this reliable?
10.1 sure better be there
And I know I'm asking for a lot, but a mobile version to go with the Montevina platform would be check writing time for me :)
fx57 is a reliable source, but what does "50% more gaming performance over 1st generation" mean? 50% over the 8800 Ultra, or over the 9800GX2?
1.- DX10.1 is not there.
2.- 2nd gen. unified architecture = 240SPs instead of 128. :clap:
Who cares about DX10.1, who cares about DX10 altogether? It's not faster like it was promised to be, and it didn't look "pre-rendered" either. DX9 looks amazing and runs amazing; DX10 never broke through.
I so agree, man. "DX10" was a marketing ploy for Vista. lol, faster..
:clap: :rolleyes: Me being a true die-hard gamer, I couldn't give a crap about Vista/DX10 for AT LEAST another year before it goes on ANY of my rigs. It took like 5+ years to get XP to run right, and it's still the preferred OS of choice for headache-free gaming/benching. DX10 and Vista are still very immature IMO for headache-free fun. I quietly read about so many people pulling their hair out trying to get this and that to run right on Vista. God, I couldn't imagine. And don't forget the poor souls running ATI cards on any OS and gaming, lmao.
The 9900 GTX will drop and run flawlessly in my DX9 XP Pro computers, and I can't wait..
We aren't going to see what DX10 can really do until it's a minimum requirement for games...
I like DX10.1, but I think I would rather have PhysX and OpenGL 2.1.
Yeah I can agree with that. The physx might be more of a teaser and "future" item than 10.1 of course :)
I guess the question that stands out in my mind is "if I don't really 'need' this much GPU power, shouldn't I wait for the refresh and 10.1?" It's not like the 8800 gets overwhelmed even at 2560 res in most games, even things that used to be GPU crushers like Oblivion. Crysis is really in a world all its own, though that makes it nice for testing limits.
If they bothered actually utilizing DX10 rather than just tacking on basic support afterwards just to say they 'support DX10' it would look amazing.
Time for XP to reach EOL, maybe we'll start seeing something then..
Some more DX10-only software please. Many people don't need DX10 to run 3D modelling and word processing. :lol:
Quote:
Time for XP to reach EOL, maybe we'll start seeing something then..
those specs look delicious!!! :D
Failing to include 10.1 support for now a second product in a row (the 9xxx series being the previous one that "could" have had it) will end up being a mistake. Given the strength of the 8800/9800, not a lot of folks are going to bite, because they won't need the extra power. And with the lack of additional features further souring the soup, initial sales will be good as the first-adopter wave hits and passes. After that things will stagnate, and they will be surprised when they see that mistake in action.
The holiday season and discounts may save things somewhat through Christmas. After Dec '08, sales of the 280/260 series will go into the tank.
$.02
Why would you need DX10.1 anyway? :confused:
If nVidia doesn't support it, I very much doubt you'll see it in many games, since TWIMTBP is very, very strong. ATI really dropped the ball there.
10.1 supposedly adds a lot of features, and FWIW, when MS pushes something, MS tends to get people to change.
Exactly why many people, including Alienware, believe that Microsoft should have released Vista as x64-only; software would have been forced to catch up, and just about every CPU on the market is 64-bit right now anyway. I'm pretty sure that eventually Microsoft will force its hand and DX10.1 will show up. It's a shame Nvidia caused Assassin's Creed to lose DX10.1; that brought a good performance boost that ATI users paid for when they bought their cards. Not to mention DX10.1 offers more than just extra performance, it also affects the minimum image quality, etc.
Even Microsoft aren't shouting about DX 10.1 as they don't really see the merit in it themselves.
Can you prove nVidia made 10.1 disappear? No, you can't but like the other "fanATIcs" (to coin an old term) you'll happily spout conspiracy theories & conjecture.
It will either be a GTX 280 or an HD4870X2 for me this summer. I wonder which of them will perform at the utmost top end?
dx10 is born to die ^^
Fudzilla - Geforce GTX 280 to launch on June 18th
AWESOME. It looks like GTX 280 launches on June 18th, exactly 3 months after I bought my 9800GX2 ;)
But the only thing I question from Fud is that they claim the launch will take place on a Wednesday (June 18), whereas Nvidia always tends to release its cards on a Tuesday. So perhaps they will correct the date to June 17?
http://www.xtremesystems.org/forums/...1&d=1211038256
If you actually read the image, it says that the extra performance is obtained through the use of 240 SPs. Nothing about them being optimized, though they have surely done some kind of optimization.
50% more performance per shader? SURE :rofl:
why?
Looking at the specs, it's not only possible, but certain
(when drivers are good enough right on launch day; after optimisation, sure)
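Just to put rough numbers on the shader argument, here's a quick back-of-the-envelope sketch. The G80 figures are the known 8800 GTX specs, but the GT200 shader clock is a pure guess since nothing official is out, and the 3-FLOPs-per-clock count is the usual MADD+MUL marketing math.
Code:
# Rough peak shader throughput. G80 numbers are the published 8800 GTX
# specs; the GT200 shader clock is an assumption, not a leaked figure.
def peak_gflops(shader_count, shader_clock_ghz, flops_per_clock=3):
    # flops_per_clock=3 uses the usual MADD+MUL marketing count
    return shader_count * shader_clock_ghz * flops_per_clock

g80   = peak_gflops(128, 1.35)   # 8800 GTX: 128 SPs @ 1.35 GHz
gt200 = peak_gflops(240, 1.30)   # assumed ~1.3 GHz shader clock

print(f"G80 peak:   {g80:.0f} GFLOPS")
print(f"GT200 peak: {gt200:.0f} GFLOPS (assumed clock)")
print(f"Ratio:      {gt200 / g80:.2f}x")
Point being, the SP count alone already gives close to 1.9x the raw throughput at similar clocks, so a "50% more gaming performance" headline doesn't actually require any per-shader improvement.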
I'm gonna wait for the new 55nm 300 Ultra :yepp:
I'm rather stunned that 10.1 has not been implemented yet, while ATI have done it... there must be some logic to this. Perhaps holding off for another gen?
Why not implement 10.1? Because the die shrink of the GT200 is going to need a selling point. It might be DX 11 or 10.1, but since 11 is a long way off they are hedging their bets. Remember the changeover from 9.0 to 9.1? Based on Nv's experience I think they are saving it as a nugget for a future product bullet point, hoping instead that pure performance will sell this model. I believe they are correct, this strategy will work, it will just be short term.
The other thought is an attempt to implement 10.1 via drivers. I have no idea if this is even possible any more, but it's an age-old approach if software can duplicate the needed commands.
they better do 1GB mem per gpu this time around
512MB: total joke @ high rez
i aint buying no freakn 512MB/gpu no more!!
Regardless of whether it's expensive or not, it's the way forward; we've had 256-bit buses since forever.
And I'm sure that the 512-bit bus on the R600 wasn't what made it so expensive; rather, poor yields had more to do with it, I reckon.
What does that have to do with anything? Both companies taped out chips for this gen obviously.. whether it's DDR3 or DDR5 just becomes a matter of cost then.
By the way, changing memory as a method of avoiding taping out an extra chip makes no sense at all.
You really think the chip is going to be the identical chip that ends up in a laptop?
Neckbreaker, with AMD it's usually referred to as economics. And since the RV670 is basically an R600 with half the bus, you should be able to put 2 and 2 together.
GDDR5 will be cheap enough for the cards needing it. And it's cheaper than going 512-bit. AMD went 512->256-bit, nVidia went 384->256-bit. Notice the trend?
That doesn't mean it's not necessary... we've seen cases where 256-bit hurts performance vs. 384-bit and 512-bit. It's not much of an obvious need-based trend when you're talking about each company needing to keep up with its competition on product costs. If they choose the costlier route, they automatically lose out in a price-cut war. No one wants that disadvantage, so they happened to match each other's 256-bit specs last round because it was the logical thing to do (offer a value card).
This time it's two different costly approaches to increase back to 512-bit bandwidth levels. So much for any trend..
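For what it's worth, the two "costly approaches" land in roughly the same bandwidth ballpark. A quick sketch; the per-pin data rates below are rumoured/typical figures, not confirmed specs:
Code:
# Peak memory bandwidth = bus width (bytes) x effective data rate (Gbps/pin).
# Data rates are rumoured/typical, not confirmed: ~2.2 Gbps for GDDR3,
# ~3.6 Gbps for GDDR5.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(f"512-bit GDDR3: {bandwidth_gbs(512, 2.2):.0f} GB/s")  # GTX 280 route
print(f"256-bit GDDR5: {bandwidth_gbs(256, 3.6):.0f} GB/s")  # HD 4870 route
So one side pays for a wide bus and a big pin count, the other pays for new, faster memory, and both end up well above last round's 256-bit GDDR3 numbers.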
http://www.pczilla.net/en/post/11.html
Quote:
GeForce GTX 280 got 7k+ in 3DMark Vantage Extreme Test
We learn from the pcinlife forum that the GeForce GTX 280 graphics card can score 7k+ in the 3DMark Vantage Extreme test, double the score of the 9800GX2. But we still don't know the specifications of the test hardware and software configuration.
We learn that the GeForce GTX 280 graphics card is equipped with 1GB of 512-bit DDR3 video memory; it has 240 stream processors and 250W power consumption. The GeForce GTX 280 graphics card uses a dual-slot cooling fan.
We also heard there’s no problem for GeForce GTX 280 graphics card to run 1920x1200 with AA and AF in Crysis.
Nice if it's true. xD
That would be a huge performance jump for today's GPU market. Well, if these rumored specs really are true then I don't see why it couldn't happen. Now, this wouldn't necessarily mean 2x performance in PC games, but it sure points to a nice performance boost in any case.
But that last sentence is a bit fishy to me. If Crysis runs fine at 1920x1200 with AA and AF, either AA & AF are handled far more efficiently now (yeah, SP numbers have increased a lot), or the card would have to deliver more than 2x current performance to run Crysis fine at 1920x1200 with AA and AF. The 1GB of vidmem and the high bandwidth could help a lot here, I know, but I dunno, I'll just believe it when I see it.
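For the memory side of that, here's a crude estimate of just the render targets at 1920x1200 with 4xAA. The surface formats are typical assumptions (RGBA8 colour, D24S8 depth), and the real memory hogs in Crysis (textures, shadow maps, post-processing buffers) come on top of this:
Code:
# Crude render-target memory estimate at 1920x1200 with 4x MSAA.
# Assumes RGBA8 colour and D24S8 depth (4 bytes/pixel each); textures and
# other buffers are NOT counted.
def surface_mb(width, height, bytes_per_pixel, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

w, h, msaa = 1920, 1200, 4
color   = surface_mb(w, h, 4, msaa)  # multisampled colour buffer
depth   = surface_mb(w, h, 4, msaa)  # multisampled depth/stencil
resolve = surface_mb(w, h, 4)        # resolved back buffer

print(f"4x MSAA colour: {color:.0f} MB")
print(f"4x MSAA depth:  {depth:.0f} MB")
print(f"Resolve target: {resolve:.0f} MB")
print(f"Total:          {color + depth + resolve:.0f} MB")
Roughly 80 MB just for the AA surfaces; add Crysis's textures and intermediate buffers and a 512MB card starts swapping over the bus at that resolution, which is where the extra memory and bandwidth would show.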
Ehh, that sounds like they are just posting what people want to hear to get hits. Double the 9800GX2? That's at least 3 times the performance of G80; not going to happen unless that setup was OC'd to the extreme or it was a dual-GPU setup.
Well, for starters I don't think 3DMark Vantage will be a good indicator of how PC games will run; games will probably see less benefit (when I hear a 2x higher 3DMark Vantage score, my mind automatically goes "OK, probably a 50~60% game performance advantage"), at least in today's popular titles, though upcoming PC titles might see a slightly bigger improvement.
I'd love to see some leaked PC game performance tests. :p:
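A rough way to see where the "2x Vantage = maybe 50-60% in games" intuition comes from: only the GPU-bound part of a frame scales with a faster card. The CPU-bound fractions below are made-up illustrations, not measurements of any particular game.
Code:
# Amdahl-style sketch: only the GPU-limited share of frame time speeds up.
# The cpu_fraction values are illustrative guesses, not measured data.
def game_speedup(gpu_speedup, cpu_fraction):
    return 1 / (cpu_fraction + (1 - cpu_fraction) / gpu_speedup)

for cpu_fraction in (0.0, 0.2, 0.3, 0.4):
    s = game_speedup(2.0, cpu_fraction)
    print(f"CPU-bound fraction {cpu_fraction:.0%} -> {s:.2f}x in-game")
With 20-40% of the frame stuck on the CPU/engine, a card that is twice as fast only shows up as roughly 1.4-1.7x in actual games.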
hmm 250W... includes H2O cooling. :D
250W against 159W for RV770? That is kind of a big difference there. I don't think it will be 250W unless it turns out to be a dual-GPU card, which is of course useless (unless you wanna brag about having the fastest card on the market) and a waste of electricity.
Bus size doesn't mean that much at all. My old G80 GTS has a 320-bit bus, more than the current GTS with its 256 bits, but it's a slower card. They need to make the card powerful enough to actually use all the bandwidth in order for the bus size to matter.
512bit MC
1GB DDR3
240 Core
250W
2X performance of 9800GX2 : 7K+ Vantage Extreme preset
Crysis at 1920x1200 with AA/AF no problem
source:
http://forums.vr-zone.com/showthread.php?t=277971
More than 7000 points in Vantage Extreme?!! :shocked:
If it's true then RV770 and R700 are history.
Not really. If the price of the RV770 is lower, then it's really a "you get what you pay for" situation.
(very) small pix of gt200 pcb
http://www.itocp.com/thread-8904-1-1.html
that chip is HUGE!
is it me or is that die HUGE?
linkie no worky ..
it's only the heatspreader
same size as G80 (at first glance)
8800 gtx:
http://forums.prophecy.co.za/attachm...ly-8800gtx.jpg
GT200 GPU located even further toward the back of the PCB --> lots of heatpipes
Could this mean anything? A little slip up by VR-Zone?
http://www.vr-zone.com/articles/H1_'...X2/5766-6.html
Quote:
AMD is coming up with their much-awaited Radeon HD 4870 X2 next-generation graphics card, while NVIDIA has responded with its GeForce GTX 260 and GTX 280 dual core GPUs.
It's in the Trinity review.
Not sure about that..
Going by this article it should be fine, and from my own experience (I have had an 8800GTX, HD3870X2 and 8800GTS 640MB in that PC) I didn't feel my E2160 @ 3.20GHz held the cards back on a 24"..
http://www.guru3d.com/article/cpu-sc...e-processors/3
I'm going to go for it :D
Am I the only one waiting for the new wave of graphic cards to buy a new rig ?
*bites nails*
It has no 8-pin, but it has a 6-pin; I have a 6-pin to 8-pin adapter..
Getting worried now. It did handle the 3870X2 fine, good sign?
Worried about my mobo too; it's a P31-DS3L with no PCI-E 2.0. Will my super video card be bottlenecked by the slot? (See the quick power/slot sketch below.)
Feck I will make a new thread :D
Gaming on a 24" samsung
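On the adapter and slot worries, here is a rough bookkeeping sketch using the standard PCIe power limits. The 250W figure for the new card is still a rumour, and a 6-pin-to-8-pin adapter still pulls all its current through the original 6-pin wiring, so treat this as spec arithmetic, not a guarantee it's safe:
Code:
# PCIe power-budget bookkeeping (spec limits; the card TDP is rumoured).
SLOT_W       = 75    # PCIe x16 slot, same for 1.x and 2.0
SIX_PIN_W    = 75    # one 6-pin PEG connector
EIGHT_PIN_W  = 150   # one 8-pin PEG connector
RUMOURED_TDP = 250

budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # assumed 6-pin + 8-pin card
print(f"Spec budget (slot + 6-pin + 8-pin): {budget} W")
print(f"Rumoured card TDP:                  {RUMOURED_TDP} W")
print(f"Headroom:                           {budget - RUMOURED_TDP} W")
As for the P31 board: a PCIe 2.0 card falls back to 1.x signalling in that slot, so you get roughly half the slot bandwidth (about 4 GB/s each way instead of 8), which reviews so far suggest costs only a few percent at most for a single card.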
This article says it all about the gpu...
http://www.guru3d.com/article/cpu-sc...e-processors/3
Sorry, but like most of the Guru3D stuff it's not tested extensively enough. From what I've read of their reviews lately it's wise to take all their conclusions with a large pinch of salt (not Inquirer-large, but large nonetheless). Besides, looking at WiC results paints a very different picture.
Thanks can I have a link to WiC please?
Lots of CPU-intensive games out there. I'm on 1920x1200 and I upgraded from a 3800+ X2 @ 2.7GHz to an E6420 @ 3.2GHz, both running an 8800GTS 640MB, and I saw huge improvements in my Source-based games (DOD:S, CS:S, HL2), Supreme Commander, and lots of RTS games. In most games there was little to no difference, but there were too many games that saw improvements for me to ignore. I went with the E6420 because of the extra cache, which CPU-intensive games really like.
AMD does not need the extra cache, Intel does. This is because AMD has an integrated memory controller and lower latency to the actual RAM than Intel CPUs have. So basically it doesn't matter if an AMD CPU has 1 MB or 24 MB of L2, but for Intel CPUs it makes a huge difference.
Going from my old X2 3800+ to a Q6600 I saw a MASSIVE difference at 1920x1080 (42" plasma) as well. Crysis went from chop-chop-chop to playable, UT3 lost its various slow-downs here and there in Warfare, and a few other titles had similar effects.
Generally speaking, faster CPUs do better in the minimum frame-rate department.
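A toy model of why the minimums improve: if CPU and GPU work overlap, whichever takes longer per frame sets the frame rate, and the worst-case frames are usually the CPU-heavy ones. The millisecond figures here are invented purely to illustrate the shape of the effect.
Code:
# Toy frame-time model: with CPU and GPU work overlapping, frame rate is
# limited by the slower of the two. Millisecond figures are made up.
def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

gpu_ms = 12                     # fast GPU: ~83 fps if never CPU-bound
for cpu_ms in (25, 16, 10):     # slow -> fast CPU on a heavy scene
    print(f"CPU {cpu_ms} ms/frame -> {fps(cpu_ms, gpu_ms):.0f} fps")
Average framerates may barely move, but the heavy scenes (big Supreme Commander battles, crowded Source maps) are exactly where the CPU term dominates, so that's where the upgrade shows.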
Specs confirmed
http://www.dailytech.com/Nextgen+NVI...ticle11842.htm
If the GTX 280 performs as good as it sounds, I will definitely be buying it, depending on whether the GTX or the HD4870X2 performs best. Do you think my Corsair 520HX PSU will handle it? I'm running an HD2900XT and a quad, which are all doing fine. Buying a new PSU is something I'm not tempted to do.
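A rough 12V budget for the 520HX question. The ~40A combined 12V rating is how I remember Corsair's spec sheet, the card's 250W draw is still a rumour, and the other per-component figures are ballpark assumptions:
Code:
# Rough 12 V rail budget: Corsair 520HX feeding a quad core plus the
# rumoured 250 W card. All per-component figures are ballpark assumptions.
RAIL_12V_W = 40 * 12            # ~40 A combined on 12 V, as I recall the spec

loads_w = {
    "GPU (rumoured TDP)":        250,
    "Quad core under load":      120,   # incl. VRM losses, assumption
    "Board, drives, fans (12V)":  50,   # assumption
}

total = sum(loads_w.values())
print(f"12 V capacity:       {RAIL_12V_W} W")
for name, watts in loads_w.items():
    print(f"  {name}: {watts} W")
print(f"Estimated 12 V load: {total} W (headroom {RAIL_12V_W - total} W)")
On paper it fits, but the margin is thin; actual draw rather than TDP, and how hard the quad is overclocked, will decide it.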