O yeah? PROVE IT!!1!
:rofl::ROTF::p:
But will the 4870X2 run out of memory? :shrug: Is it shared memory, or 2x with a bridge? Quote:
But regardless, if the new bridge chip works as planned, you're better off buying a 4870X2 over 2x 4870 for both price and performance reasons.
Are we talking nvidia or ati here? C'mon, people. So GAR, will I be able to play Crysis with 16xQ anti-aliasing and 16x anisotropic filtering at 2560x1600 with a GTX 280? If not, I wouldn't call it a monster :P! Come to think of it, I have never played a game with those settings :P
Yeah, based on the 3DMark numbers the performance seems a little disappointing. :(
So how fast is it? Is Crysis playable at 1920x1200 with 4xAA/AF? Is the performance jump comparable to G71->G80? :rolleyes:
Is the difference between GTX 280 and 8800 GTX performance in games greater than the 3DMark numbers show? :D
Chinese sites, we need you.
Of course I was making a joke, but to call it a monster I hope it has the same performance jump we had from the 7 to the 8 series :D Then it would be awesome, so GAR, you don't have to give us numbers :D
From here. Quote:
Allegedly, the Radeon HD 4870 X2 prototype may score about X5500 in 3DMark Vantage, while NVIDIA's flagship GeForce GTX 280 scores only X4800 points, a difference of nearly 15 percent.
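For what it's worth, the arithmetic on those figures checks out: (5500 - 4800) / 4800 = ~14.6%, i.e. "nearly 15 percent" when measured against the GTX 280's score (it works out to roughly 12.7% if measured against the X2's score instead).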
3DMark is only a very rough measure of performance, but it gives some idea. Hopefully this result will pull prices down.
Shouldn't 3DMark be the best tool for comparing the GTX 280 to the 8800 Ultra? 3DMark has never been a good tool for comparing cards from two different companies, but it is a decent tool for comparing performance between cards of the same architecture.
I would have thought that Vantage would scale very well, because it depends heavily on shading resources (which the GTX 280 should have plenty of, with 240 SPs that are supposedly more efficient than G80's).
And I don't think any card will ever last two years like the 8800GTX did... that was a once-in-a-lifetime thing, IMO. nVidia is supposed to come out with a 55nm refresh this year, and then next year I am sure AMD/nVidia will be doing wonders with TSMC's 40nm process.
:rofl::rofl::rofl::rofl:
QFT!
The latest news points to a bridge with 2x the memory, but I highly doubt it will run out of memory.
Thing is, last generation nvidia spanked ati in 3DMark Vantage, by a long shot in most cases. So unless ati's drivers suddenly became perfectly tuned specifically for 3DMark Vantage, I think this may mean the R700 could beat the GTX 280. But still, no one knows until we've seen the benchmarks.
And that includes you too, GAR; you also know absolutely nothing until you see the Chinese leaks :p: :D
A 3DMark Vantage driver fix came out from ATI in April that leveled the playing field, m8. Please look it up.
Secondly, using 3DMark Vantage to decide which card is better is about as effective as throwing both off a 10-storey building and seeing which one still works X_X
Only for people who believe 3DMark Vantage is actually useful right now; everyone else, please go to the next post :)
Please understand this: 3DMark 06, 05 and 03 each had to go through 2-3 revisions before we could trust their consistency and translate marks from one benchmark to another.
Right now, 3DMark Vantage is in its initial stages, and Futuremark will be trying to fix glitches and anomalies in the benchmark because it has just launched and they have JUST started to get feedback on it. It is far from being a finished product (this is why beta releases are so useful), and we have no idea with any certainty how a Vantage mark translates into a 3DMark 06 mark and back. Not to mention that the XIPs and XLs are still trying to work out which particular attributes (core clock, memory clock, CPU cache, TMUs, ROPs, shaders, memory bandwidth, bus width, etc.) provide higher marks and which have minimal effect on the benchmark.
My biggest complaint is the lack of proper CPU integration (the CPU mark accounts for less than 15% of the total mark), which gives a mark that is significantly different from what you would see on average in today's games. (I've cited this as the reason the Phenom is more competitive in 3DMark Vantage, but one benchmark cannot justify poor performance elsewhere.)
If benchmarks are going to decide this, it will be 3DMark 06, 05, 03 and 01 numbers that people will be waiting for, not Vantage numbers, when we know well that a second revision could come at any time and make all previous records useless (as has happened with every new 3DMark release).
Perkam
Correct, but only from a benchmarker's POV... Ultimately it's about in-game performance (for most). That's all I want to see out of both GT200 and RV770; jeez, both companies are killing me, I want to see some legit #s.
Big deal, in how many DX9/DX10 games does it beat an 8800? I guess that sounds like blasphemy to people who bench 24/7... :p: Quote:
The HD 2900 XT beats the 8800 in 3DMark Vantage now, I think.
Maybe the 2900XT does, but the 3870 loses to the G92 GTS by a large margin, and the 3870X2 loses to the 9800GX2.
Not at all; when did I ever say that? You need to read my posts better before you call someone out like that. Besides, you were the one who brought up the 2900XT beating the 8800 cards, not me, so I think you should be thinking about your own posts instead of mine. Quote:
AliG believes otherwise..
Since nobody knows what the actual 4xxx specs are, it's hard to say how they will perform in games. But judging by the X1800 -> X1900 transition, and how both major rumours have many supporters, I think we'll see a lot of power coming from ati. Even if the 4870 has only 480 shaders, it should have a much higher shader clock (compared to the 3870), since the 1.2 TFLOPs number seems to be a given now, which still means lots of performance coming our way.
And thus, I believe this battle for the high end will truly rest on how good ATI's drivers are for the 4870X2 and how well games are coded for both sides, as this should be much closer than the R600 vs G80 battle last year.
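Quick back-of-the-envelope math on that 1.2 TFLOPs figure (assuming the usual 2 FLOPs per ALU per clock, i.e. one MADD): 1.2 TFLOPs / (480 ALUs x 2) = ~1.25GHz shader clock, versus the 3870's 320 ALUs x 2 x 775MHz = ~0.50 TFLOPs. So if the 480-shader rumour is right, the shader clock really would have to be way up on the 3870's.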
One thing that I think will be interesting is how far third-party companies will be able to factory-overclock the GTX 280. Many might assume that it is already pushing its limits and won't have the typical headroom of cards from the past. Or perhaps it has been set to a lower stock clock with that idea in mind. Unlikely? Yeah... but possible? Yeah...
Overclockability is important to many here, I suspect (call me crazy for that assumption), as it is to me. We won't know until the official release who has the best overclocking potential. Common sense says the 55nm part has the advantage, but how far did each company have to push their cards to compete?
Overclocking is becoming more and more important to a lot of companies. It has become its own industry.
My bet is that it will be very similar to the 8800GTX in that respect. I'm kinda surprised that nvidia is clocking the GTX cards higher than the G80 high end, especially with how much bigger and hotter the GPUs are. We'll probably see a max vendor clock around 700MHz for the special-cooling versions, and probably around 650 tops for stock cooling.
3DMark is the ideal 3D application for getting the most out of a multi-GPU card, both because Futuremark has designed it to take full advantage of multi-GPU and because it is the first application that video card drivers get optimised for. Traditionally SLI and CrossFire have performed better in 3DMark than in games, so I expect the ATI 2-GPU design to be better favoured in 3DMark than in games. Quote:
Allegedly, the Radeon HD 4870 X2 prototype may score about X5500 in 3DMark Vantage, while NVIDIA's flagship GeForce GTX 280 scores only X4800 points, a difference of nearly 15 percent.
Regardless of the patch, all scores from 1.0 to the latest patch of every 3DMark have been comparable; in other words, if you run Vantage today and again within a year on the same hardware with the same drivers, but with, say, patch 1.5, you'll get the same results!
First of all, the worst thing about 3DMark 06 and Vantage is the integration of the CPU score into the final number! Quote:
My biggest complaint is the lack of proper CPU integration (the CPU mark accounts for less than 15% of the total mark), which gives a mark that is significantly different from what you would see on average in today's games. (I've cited this as the reason the Phenom is more competitive in 3DMark Vantage, but one benchmark cannot justify poor performance elsewhere.)
Second, try running Crysis at DX10 Very High at only SVGA resolution, and you'll find that the Phenom is not a single fps behind the competing CPU! Quite the contrary ;)
If the CPU is relatively recent it has little effect on gaming performance; you won't see a difference at the resolutions people actually play at. The CPU score had way too much sway in 06; they fixed that in Vantage.
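Just to illustrate why a sub-15% CPU weight barely moves the needle, here's a rough toy calculation. The 85/15 weights and the harmonic-mean shape are my own assumptions for illustration, NOT Futuremark's published Vantage formula: doubling the CPU sub-score nudges the composite up only a few percent, while doubling the GPU sub-score nearly doubles it. Code:

def composite_score(gpu_score, cpu_score, gpu_weight=0.85, cpu_weight=0.15):
    # Weighted harmonic mean of the two sub-scores (hypothetical weighting).
    return (gpu_weight + cpu_weight) / (gpu_weight / gpu_score + cpu_weight / cpu_score)

baseline   = composite_score(gpu_score=5000,  cpu_score=10000)
faster_cpu = composite_score(gpu_score=5000,  cpu_score=20000)   # CPU sub-score doubled
faster_gpu = composite_score(gpu_score=10000, cpu_score=10000)   # GPU sub-score doubled

print("baseline: %.0f" % baseline)                                               # ~5405
print("2x CPU:   %.0f (+%.1f%%)" % (faster_cpu, 100 * (faster_cpu / baseline - 1)))  # ~+4%
print("2x GPU:   %.0f (+%.1f%%)" % (faster_gpu, 100 * (faster_gpu / baseline - 1)))  # ~+85%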
3DMark is actually the worst way to test: an overclocked 8800GT used to beat an 8800GTX in 3DMark, but in games it was a different story. Same goes for the GTX 280: it has a wider bus and more memory, so in order to see its full potential you need a benchmark that uses its full power. 3DMark at 1920x1200 with AA/AF is a good comparison, but not at stock settings; this card really shines at high resolutions with AA/AF, where the 1GB of memory and the wide bus really help it achieve some good numbers. I don't really feel the need to show off my card; in due time the numbers will be revealed. If I show numbers, all I'm going to get is "it's not real, it's fake" and this and that, so I'll let the official review sites give you the numbers so you can believe them... Great product by all means.
You've got a believer in me, Crysis very high @ 1920 x 1200 numbers will do me..
C'mon GAR, give us some numbers plx!
I would like to know the numbers too, but I'm also interested in whether there are any waterblocks currently available that fit the GTX 280.
http://tcmagazine.com/comments.php?s...=20024&catid=2
take a look m8
give us numbers or give us death!
http://www.imagehost.ro/pict/08211400484c2168ad54b.jpg
Just saw this at VR-Zone, which got it from http://we.pcinlife.com/viewthread.php?tid=946259&page=1
Doesn't look as good as promised in 3DMark06. I only hope that real-world gaming benches pop up soon :)
Let's start a collection to pay someone to break the NDA. I'll donate $100.
Are there any air-cooling heatsinks already on the market that fit the GTX 280?
It really depends on the mounting holes. If they line up with anything from the past (like how the 8800GT lined up fairly well with the 7900GT but not the 8800GTX), then perhaps you can get one to work. But it should have a pretty good stock cooler because of the large die size; I doubt you'll find better short of paying ~$50 for something like an HR-03 Uber Extreme GTX 280 edition, or whatever Thermalright plans to make.
I won't have time to put up any graphs or take crazy pictures (which I will soon), but I will tell you this: Crysis is fully playable at 1920x1200 DX10 Very High, but without AA. With AA it's still playable, but I chose to go without AA to have a smoother game... This is on Vista x64 with the rig in my sig, CPU at 3.6GHz (450FSB), memory at 1:2 at 1800.
Meh, according to many reviews, without any AA, DX10 Crysis doesn't really tax your graphics card all that much more than DX9 Crysis, and "playable" is up to the user's discretion. You don't need graphs; numbers are just fine.
If by "not taxing it much more than DX9 High" you mean 15 FPS less, then you would be right. I remember at CES 2008 Nvidia had a Tri-SLI demo with three 8800 Ultras running Crysis on Very High at 1920x1200 with no AA, and it was unplayable. Most PCs can't handle Crysis at 1920x1200 on DX9 High, not to mention Very High. I'm still waiting for an official review, though.
A review by [H] showed only a 2 FPS difference, but that was for Medium, and I doubt Medium -> High would cause that big of a difference.
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
He is talking about the difference between High and Very High, which is a lot bigger than 2 FPS. You can run High DX9, High DX10 and Very High DX10. The difference between High DX9 and High DX10 is about 3-5 FPS (my own testing), and the difference between High DX10 and Very High DX10 is about 10-15 FPS, last time I checked.
The NDA lifts on 16.6, not 17.6!
For 100 bucks I can tell you everything...
There is this one for water cooling: http://www.xtremesystems.org/forums/...p?t=188698 :up:
The Japanese IT Media website today brings us a Crysis 1920x1200 Very High test result for NVIDIA's next-generation flagship, the GeForce GTX 280 graphics card.
According to IT Media, NVIDIA and an anonymous motherboard manufacturer held a private presentation to show off the performance of the GTX 280 outside of Computex 2008.
Visitors said that the demonstration room was very dimly lit. In addition to showing the performance of the GTX 280 graphics card, the private presentation also showed some of the motherboards compatible with the GTX 280; they had simply been placed on the windowsill of the room.
The IT Media site had the opportunity to run GPU-Z, CPU-Z and the Crysis benchmark on the GTX 280 demo system. From the photos, we can clearly see that NVIDIA's GTX 280 presentation system used an Intel Core 2 Quad four-core processor at 2.66GHz, and the Crysis benchmark at 1920x1200 Very High settings showed the GTX 280 reaching an average of 36.81 fps!
source
http://www.nordichardware.com/news,5560.html Quote:
Despite an overall powerful machine (Intel QX6700, 4GB DDR2-800 and a BFG PhysX physics accelerator), the results show that Crysis is a big challenge for today's hardware.
With the highest image quality settings (though no AA/AF), they [8800GTX SLI] averaged 43.9 fps at a resolution of 1024x768. When they cranked the resolution up to 1280x1024, performance went down to 36.8 fps. Hardly impressive results for a machine with some of the hottest hardware around.
I typed "8800gtx sli crysis" into Google and it was the first article that popped up :shrug:
http://images.tomshardware.com/2007/...ches-chart.jpg
http://www.tomsgames.com/us/2007/11/20/crysis_sli/
November 20, 2007 11:08
So the GTX 280 is 2x an 8800GTX?
nice. :)
GTX 280 > 2x 8800GTX, I think, at the highest settings in Crysis, anyway.
An additional 8800GTX gives +20% more Crysis fps, according to some reviews...
http://www.expreview.com/img/news/07...drivers_02.png
http://en.expreview.com/2007/11/01/s...0-performance/
:eek:
Very High @ 1920x1200 was unplayable (15-26 fps) for me on a 2x 8800GTX, E2180 @ 3.40GHz system.
SLI 8800GTXs only gained ME about 5-7 fps more than a single card.
If the game is playable (30-45 fps) @ 1920x1200, then the card is a monster; 9800GX2 Quad SLI gets about 45 fps...
http://www.maxishine.com.au/document..._quad_sli.html
Yeah, just look at some scores at ComputerBase -> http://www.computerbase.de/artikel/h...schnitt_crysis
2x 9800GTX with a 4GHz quad scores just 26.8 fps @ 1920x1200. :eek:
If this is really true, this card will fly in every other game on the market and everything coming out in the next year. :D
interesting.
Those Expreview scores look dodgy, Adam: 25 fps @ 12x10? The game was chugging along well enough on my 2x 8800GTX at that res...
more waiting for more info.
At what settings? Quote:
Those Expreview scores look dodgy, Adam: 25 fps @ 12x10? The game was chugging along well enough on my 2x 8800GTX at that res...
This is utterly ridiculous. DX10 performance in Crysis is on par with DX9 performance (+/-10%) and you can run much higher settings than that article states. 10x7? Are they retarded? That's completely CPU-bound.. how could SLI help you there?
Their numbers are too low even for a single 8800GTX, though, so I think their test is broken somehow...
What kind of bogus comment is this? Of course I'm aware of SpeedStep... but I've never seen it on by default, and I was pretty sure CPU-Z read the clock in the spec line as the "stock rated" clock, the way some BIOS screens do, i.e. as a description.
What a bunch of douchebags you are to peruse the forums looking for people to mock when they make a simple mistake... pretty sad really.
"LOLZ EPIC FAILZ OMFRG LOL"
Are you guys preteens? :banana::banana::banana::banana:ing snobs:rolleyes:
ohhh noezzz my QX9650 only runs at 2ghz. :(
It is a Q6700 running idle... I don't know where you got the idea that it runs Crysis at 1.6GHz...
I've never used speedstep. Simple as that.. it's always been disabled as I've now said *twice* (but it doesn't seem to register for you, maybe it takes 3 times for people in third world countries).
I know of games where SpeedStep was identified as the cause of people showing up as "speedhacking" in certain MMOs I've played. They narrowed it down to SpeedStep in laptops at the time. People I know were getting hassled by the GMs for it.
It's on by default on every Intel mobo I've used... and it's one of the first things to go as soon as I start OC'ing.
And besides, it only drops the multiplier when the CPU is lightly loaded, so it certainly runs at the correct speed when you're playing a game.
As zerazax said, it's on by default, and that's why everyone ignores/laughs at your statement.
On every PC I have encountered so far, C1E was on. Hence the comment Luka_Aveiro made about "noobs" with their 1.6GHz problem. :p:
If you disable it, fine for you, but that doesn't mean it's off by default.
The screenshot at PCZilla shows a stock Q6700 @ 2.66GHz, and according to certain people here Crysis is only "half multithreaded", so a quad should be more than enough to satisfy it.
I have just tested Crysis at 1920x1200, no AA, Very High, 64-bit DX10 on my rig, and I get around ~59 FPS max, ~24 FPS min and ~37 FPS avg.
So I guess the GTX 280 is 2x an 8800GTX.
That's useful to know. Quote:
I have just tested Crysis at 1920x1200, no AA, Very High, 64-bit DX10 on my rig, and I get around ~59 FPS max, ~24 FPS min and ~37 FPS avg.
Forget about it, man; the fact is that MOST boards have EIST enabled by default, and that's why we found your comment so strange :p:
You know, we're like a community of friends here; you had a slight glitch, we made fun of it, so let's grab another beer and keep the conversation going :up:
QX9650 @ 4.06GHz
9800GX2 Quad SLI
DDR3 @ 1900
http://www.maxishine.com.au/document..._quad_sli.html
Insane bit of gear; still, I would expect 2x GTX 280 to get over 60 fps.
Oh yeah, that explains the fps. :D
I didn't use the Crysis benchmark tool; I just played Act 1 with EVGA Precision and recorded the max and min fps I saw on paper :rolleyes: :p: Let me download the Crysis benchmark, try it, and get back to you.
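If anyone wants to skip the pen-and-paper step, here's a minimal sketch that computes min/avg/max FPS from a per-frame frame-time log. The file name and the one-millisecond-value-per-line format are just assumptions, so adapt the parsing to whatever your capture tool actually writes out. Code:

def fps_stats(path):
    # Read one frame time in milliseconds per line (assumed log format).
    with open(path) as f:
        frame_ms = [float(line) for line in f if line.strip()]
    per_frame_fps = [1000.0 / ms for ms in frame_ms]     # instantaneous FPS per frame
    avg_fps = len(frame_ms) * 1000.0 / sum(frame_ms)     # true average over the whole run
    return min(per_frame_fps), avg_fps, max(per_frame_fps)

lo, avg, hi = fps_stats("frametimes.txt")                # hypothetical file name
print("min %.1f / avg %.1f / max %.1f FPS" % (lo, avg, hi))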
rofl, I've never seen a board with it turned off by default. It's part of the CPU that's promoted as a good feature, so why would it be turned off by default?
"Oh hey, I just bought this neato car - it does 250mph and 250 miles to the gallon, but I only got 3 wheels because the 4th wasn't attached by default."
With SpeedStep, YMMV. Some people have reported instability with it while running an overclocked CPU; others haven't. Right now, my daily OC on my Q6700 is 3.4GHz, and I'm running the machine with EIST on but not C1E, so my voltage remains constant but the CPU throttles itself down to 2.04GHz at idle. No instability so far.
For people who run distributed computing apps like Folding@home that load the CPU at 100%, EIST/C1E won't do anything anyway, because the CPU never gets a break (I run it at school, where my tuition has already paid for the electricity). But for people who run their computers 24/7 and don't fold or crunch, there's really no reason not to use them if your machine doesn't lose stability with either of them on, and you'll only find that out by testing your own system. The general rule is that C1E should only be turned on if EIST is; someone with a 4GHz overclock probably won't be stable if their voltage drops too far from the set value.
And yes, SpeedStep in one form or another is on by default... otherwise there wouldn't be so many threads titled "WHY IS MY CPU ONLY AT 1.6GHz?!!?!!? HELP Pl0x!!!!one!"
These Crysis scores are authentic... I got very similar FPS on my GTX 280...
The GTX 280 is not a brutally powerful card! Performance is good, but I expected much more...
Based on testing Radeon HD 4850s in CrossFire, I can say the Radeons are better than the GTX 280!
Wow, the Crysis benchmark is really different from Act 1.
http://img403.imageshack.us/img403/8512/21hw5.jpg
http://img403.imageshack.us/img403/8...841ad11883.jpg
Anyway, sorry for the first post, and yeah, if the GTX 280 can score 38 FPS it really is a monster!!
I hope the GTX 260 can at least beat two 8800 GTS cards in SLI or a 9800GX2, as both are about the same price.
As for the 280, Nvidia is currently in a bit of a dilemma over what to do with the G92b and might cripple it just to preserve the 260's and 280's superiority. (If they make it too slow, they widen the gap between the G92 and the 2x0 series but also lose out to the 4870; if they make it too fast, the 260 could be threatened, as could the 280 if you put two of them in SLI.)
In my opinion, the 9900 GTX will be at most about 10% faster than the 9800 GTX, but it will run cooler and OC higher thanks to the new process. Beyond that, it won't be worth upgrading over the 9800 GTX :S
Perkam
crysis in the "snow part" is a lot heavier than the gpu test...
test there.
ORB please stop teasing................ post some benchamrks please.. :p:
so will there still be a 9900GTX or will it just be the GTX 280/260?
whats the use of the 9900GTX if we already have the GTX280/260?