Do you think 2 x 8800 Ultra in SLI will run this game at the highest settings + high res?
Or do we have to wait till the 9 series for that ? :confused:
Any news on the 8700? I know it's supposed to be OEM only but still interested to see where it fits in...
Also, for all the ATI naysayers, let's not forget all the info suggests the launch date of the 8800GT was moved up (mid-November appears to have been the original plan) and the price was brought down (originally everyone was saying $299) because of the RV670... All in all, it appears some level of competition is back, which is good.
Tweak the settings...for goodness sakes.
Why do people get dumber as they get better and better hardware?? There IS an advanced video settings tab just for YOU.
If we don't use them, in the future companies will just put out games with zero video options.
Perkam
I have 'em tweaked as much as I can. I'm left with DX9 1280x1024, everything high except shadows and post-processing on medium; that's the best I can do. That's with a GTX and a Q6600 @ 3.6, mind you.
Tweaktown has faked results, and the proof is right there on their page:
http://www.tweaktown.com/reviews/121...k05/index.html
On a quad core at 3GHz, an HD2900XT with Catalyst 7.9 at 1024x768 scores 15534 points. Well, this is just a BIG LIE; everybody knows it would score at least 18000, so there you go, another paid article from Nvidia. And about Crysis, I don't think Nvidia is that good in a shader-intensive game like that. I think the HD2900XT, with a new driver, will beat the 8800GTS 320/640 and maybe even a stock 8800GT. It's just impossible for the HD2900XT to lose in Crysis, which is so hard on shader power. We'll see with the new driver.
Different computer setups produce very different benchmark scores.
You don't think Nvidia is any good in shader-heavy games? The 8800GT was built with shader-heavy games in mind. :ROTF:
And your argument about ATI drivers is invalid until we actually have this 'new' driver. :rolleyes:
I will run 3DMark 2005 with an HD2900XT & Q6600 @ 3GHz to see how it scores.
However, the HD2900XT 3DMark 2006 score (11200 marks) in that review is wrong >> http://www.tweaktown.com/reviews/121...k06/index.html
http://img98.imageshack.us/img98/2941/xtfakexz6.jpg
And I got 12500 marks with the HD2900XT at DEFAULT clocks in 3DMark 2006, also with a quad-core @ 3GHz and also on Windows XP (same as the review):
http://img150.imageshack.us/img150/2...viewsetpq0.jpg
But maybe in the Tweaktown review only the HD2900XT scores are wrong and the 8800GT ones could be OK, we don't know :D
:up:
We'll just have to wait and test it for ourselves! That's the only way we're gonna have real results.
A little lower clocks than mine? How??? The default HD2900XT clocks are 743/828 for all cards.................
His HD2900XT has exactly the same clocks:
>> http://www.hisdigital.com/html/product_sp.php?id=304
>>> http://www.overclockercafe.com/Revie...00XT/index.htm
they used catalyst 7.9
Yep, they used that new driver; that's why I said maybe only the HD2900XT scores are wrong and the 8800GT scores are OK, we don't know :p:
regards
Yeah, the default core clock for the HD2900XT is 740MHz (743MHz to be exact) :p:
Yep, maybe they used "old" HD2900XT tests done with a dual core ;) (not too old, because they used CAT 7.9)
Well, I'll go run 3DMark 2005 to see how it scores.
regards
Now where is that G92 GTS I heard so much about.... If it can beat the GTX or even the Ultra for ~$350, I'm in.
yep should be a great card!! ;)
Rumors say 19th November (1 week after the 780i release) >> http://www.xtremesystems.org/forums/...23&postcount=1
:up:
OK, I just ran 2k5 now.
Tweaktown review (15534 marks):
http://img81.imageshack.us/img81/6554/xtfake2sx7.jpg
My score (18857 marks):
http://img136.imageshack.us/img136/5...defaultky7.jpg
regards
Guys, the MSI 8800GT used in the Tweaktown review is an overclocked version, and they used Windows XP; that's why it scores 13.7K :p:
Quote:
The standard clocks come in at 600MHz on the GPU and 1800MHz DDR on the GDDR3. MSI have decided to mix it up a bit; the core gets a 10% increase to 660MHz while the memory gets bumped to 1900MHz DDR.
http://www.tweaktown.com/reviews/1210/3
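For what it's worth, the quoted factory overclock works out to roughly +10% on the core and about +5.6% on the memory. A quick sketch of that arithmetic (the clocks are from the quote above; the script itself is just illustrative):

```python
# MSI 8800GT factory overclock vs. reference clocks (MHz),
# using the numbers from the quoted review.
ref_core, ref_mem = 600, 1800
oc_core, oc_mem = 660, 1900

core_pct = (oc_core - ref_core) / ref_core * 100  # +10.0% on the core
mem_pct = (oc_mem - ref_mem) / ref_mem * 100      # about +5.6% on the memory

print(f"core: +{core_pct:.1f}%, memory: +{mem_pct:.1f}%")
```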
I got 700+ extra marks with a 60MHz overclock on the core and only 58MHz on the memory on the HD2900XT:
HD2900XT default (743MHz/1656MHz) = 12925 marks
http://img215.imageshack.us/img215/3075/2k6forumpz7.jpg
HD2900XT @ 800MHz/1714MHz = 13625 marks
http://img90.imageshack.us/img90/8314/2k6oc2vo9.jpg
The MSI 8800GT used in the Tweaktown review has an overclock of 60MHz on the core and 100MHz on the memory. I think that should give about +800 extra marks, so a non-OC 8800GT should score around 12.9K ;) (Windows XP)
With Nvidia cards, scores usually go down about 900 marks from Windows XP to Windows Vista, so it would probably score around 11900 marks on Windows Vista????
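The back-of-envelope estimate above can be written out like this (a sketch only: the +800-mark overclock gain and the ~900-mark XP-to-Vista drop are the rough figures assumed in the post, not measured values):

```python
# Rough estimate of a stock 8800GT's 3DMark06 score on Vista,
# starting from the overclocked MSI card's 13.7K on Windows XP.
oc_score_xp = 13700      # MSI OC 8800GT on Windows XP (from the review)
oc_gain_est = 800        # assumed gain from the factory overclock
xp_to_vista_drop = 900   # assumed XP -> Vista score drop for Nvidia cards

stock_score_xp = oc_score_xp - oc_gain_est             # ~12.9K stock on XP
stock_score_vista = stock_score_xp - xp_to_vista_drop  # ~12K stock on Vista

print(stock_score_xp, stock_score_vista)  # 12900 12000
```

That lands in the same ballpark as the ~11.9K figure guessed above, which is why the Vista score in the quote below looks plausible.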
So maybe these scores are correct:
Quote:
Intel Core 2 Duo QX6800 @ 2.93G
DDR2 800 4-4-4-12
NVIDIA nForce680i SLI
Windows Vista
http://img240.imageshack.us/img240/9...3d06thubx8.jpg
:up:
Well it's a factory overclock, it's not like Tweaktown was cheating.
There are 3 8800GT SKUs, so if anything this will be an average of the 8800GT performance. From price side you can just as likely find the stock one at the same price as an OC or OC2 with the wide variety of prices from etailers. This was the same MSI OC that Clubit was selling for $230.
If anything I'd be interested to see the OC2 performance, to see how much of a gap there is between the OC2 and the GTX, and to get an idea of how nVidia will manage to insert an 8800 GTS 640MB SKU that can really differentiate itself from the GT and GTX, unless we are all talking about minor percentage points. That reminds me: if the 8800GTS is going to be in the $350 range, shouldn't the GTX get dropped down to $399?
I didn't say that, did I???
I said that the card is overclocked (because nobody saw that, including me).
Tweaktown even said they used an overclocked version (not their fault, our fault we didn't see it).
The only thing I said was that the HD2900XT scores were wrong, and I proved it...........
Quote:
Originally Posted by mascaras
:up:
Microsoft presents Direct3D 10.1 at SIGGRAPH
http://www.istartedsomething.com/200...-101-siggraph/
Quote:
At the annual SIGGRAPH (Special Interest Group for Computer GRAPHics) conference - more exciting than it sounds, Microsoft presented a lecture on the new Direct3D graphics platform and even newer Direct3D 10.1 framework. The whole course has been subsequently published to Microsoft Downloads. Here is their Powerpoint presentation for a quick summary.
Direct3D 10.1 looks to be quite a significant release for DirectX. Some of the interesting points to note:
* Will require Windows Vista SP1
* Not “supported” by current-generation DX10 hardware
* “Extension” to Direct3D 10 - “complete” D3D10
* 5 new API
* Improves rendering quality
o Enforces floating point 32 rendering
o Enforces 4x multi-sample anti-aliasing (MSAA)
For argument’s sake, Direct3D 10.1 didn’t come out of the blue. Microsoft had always planned for a 10.1 update when they released version 10. However that is no excuse as to why is it not compatible with DX10 hardware? People who have to dish out hundreds of premium dollars to buy DirectX-10 hardware today will be outdated in less than 6 months.
http://www.next-gen.biz/index.php?op...=6824&Itemid=2
Quote:
Even though DX10.1 will support current DX10 graphics hardware, today's DX10 hardware will not be able to support all of the features of DX10.1, which includes incremental improvements to 3D rendering quality.
Date of article: 8th August
:up: