Hi
Can you post all the previews in the 1st post??
http://www.xtremesystems.org/forums/...4&postcount=66
:up:
So what now, AMD enables SidePort?
I love when new GPU previews come out. Free entertainment baby. I absolutely LOVE how worked up some of you people get.
http://www.pcgameshardware.com/aid,6...viewed/?page=9
Based on this review, it seems the GTX 295 is the card to get if you own a Core i7.
Well, that one looks extremely biased, unlike the Guru3D review.
Prediction: ATI enables SidePort and cuts the price of the X2.
If SidePort actually provided any meaningful gains, it would've been enabled from the start.
That Guru3D review is crazy! If the 295 had more memory, like a solid 1 GB for each core, it would pwn at higher res. Overall, that's one fast card.
As a 2560x1600 gamer, seeing it die with AA at that res vs. the 4870 X2 is disappointing. Looks like I'll be keeping my 4870 X2 a bit longer.
Dang, I was gonna snag a 4870 X2, but it looks like I'll order this instead!
Thanks for this :up:
What was refreshing here was seeing a good old 8800 Ultra slugging it out and surprisingly not falling that far behind the GTX 260-216.
Considering the volt mod and silly clocks my 8800GTX has been happily running at for the last year or so, I think I'm gonna hold out for the ATi/Nvidia next gen before jumping.
Judging by those benchmarks, I reckon I must be somewhere around stock GTX 260 performance, which'll do me for the time being :)
When will they review the new single core cards?
Why doesn't it have a full gig like the X2 again?
The 4870 X2 in Canada is 700 bucks... so this thing should bring it down to 499 if not lower, right? I have to get rid of my 1950!
$560 CAD ... link: price canada
I suspect with the 295 launch in Jan it'll drop to maybe $500-ish, since $499 US is around $600 CAD.
I'm wondering what will be the fastest setup for Folding: triple GTX 295s OC'd or quad GTX 285s OC'd. It doesn't seem like the GTX 295 will have much OC headroom, but it'd be difficult for the 285s to overcome the shader count. A more general question is: one GTX 295 equals how many GTX 285s?
Considering it doesn't exist yet, we don't know...
Maybe comparing it to 280s, since those do exist, would work a little better.
FiringSquad overclocked the GTX 295 pretty well.
http://www.firingsquad.com/hardware/...nce/page13.asp
OBR, maybe it's just a typo but it seems a little odd that you are using 6GB DDR3 with a 32bit (and hence 4GB limited) operating system for your benchmarking. Once you subtract the GPU memory from the 4GB aren't you left with a fairly small amount of memory (relatively speaking)?
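Back of the envelope, the numbers look something like this (the aperture and MMIO sizes below are illustrative guesses, not measured values):
Code:
# Rough 32-bit address-space math. The aperture/MMIO sizes are assumptions
# for illustration only.
ADDRESS_SPACE_GB = 4.0   # hard addressing limit of a 32-bit OS
GPU_APERTURE_GB = 1.0    # video memory mapped into the address space (assumed)
OTHER_MMIO_GB = 0.5      # chipset, PCI devices, etc. (assumed)

usable_ram_gb = ADDRESS_SPACE_GB - GPU_APERTURE_GB - OTHER_MMIO_GB
print(f"Of the 6 GB installed, the OS can address roughly {usable_ram_gb:.1f} GB")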
looks like a monster.
That's weird, because a 55nm revision seems kind of new... not to mention Nvidia is going to let manufacturers make their own designs for the cards, like new heatsinks and PCB colours, and Zotac is rumored to be making a 1.5 GB version of the 260. I also heard that Nvidia is dropping the 192 SP version and 216 SPs will be standard. I guess we'll see what happens next month.
Not sure it's worth stepping up from a BFG GTX 260-216 and paying £200 for an extra 10-15 FPS in games like Crysis Warhead. Hmmm, I'll hold out and see what other cards they bring out.
I think a lot of people are wondering the same thing. The core 216 was a buying point for many people. Step up was also a big consideration since many have that option now. Question is, is it worth it?
I think it's kinda bogus how the GTX 295 reviews are out, but GTX 285 reviews (and 265, if it exists) are not. I was hoping to step up, but not till I have the info.
Dan7777, I'm in the same boat. I don't see the point of stepping up AND spending 150+ USD on a new card for a 10% gain. If these cards are not 25% better than the ones we have now, I don't know if I could justify it.
Until we see microstuttering tests, all these comparisons are useless. How many of you guys are liking the taste of crow now that Nvidia is releasing a dual GPU card WAY before spring/summer 09?
Looks like the new fastest card. It's not a terribly compelling upgrade for me though considering all the new stuff right around the corner.
Also, Nvidia should have taken FO3 off their list.
Where are the ****ing GTX 285 reviews? Stats, even? This is getting ridiculous.
And how many games did we see in the previews to make it the fastest card in the world? Let's see: Dead Space, Far Cry, Crysis, Fallout, Left 4 Dead. That's it? And let's look at the in-game benches:
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
Lost in two, won in one.
Expreview said there will be a new 55nm GTX 260 from Zotac.
http://en.expreview.com/2008/12/05/f...html#more-1550
regards
When is this out at retail?
8th Jan, I think...
Truth.
And Fallout 3, CoD: WaW, Far Cry 2, and L4D are all very heavily played by most gamers, especially FPS enthusiasts. Dead Space is relevant, and it's always fun to see what cards score in Crysis, so I'm not sure the choices for the preview were that irrelevant. These are all popular titles, versus things like Call of Juarez etc. that I always felt were a little obscure.
I know average and lowest framerate with lots of AF and AA are where it's at. A better comparison would be to look at these cards on a 22-24" monitor. At that point it really doesn't matter, as they will both pretty much max out a 1920x1200 LCD anyway. A very small portion of people are running a 30" monitor.
So how much are two gtx280s?
Let's see how it compares to 4850 X2s in CrossFire, with the same drivers and a few other games (GRID?) ;)
Dead space isn't first person.
Out of those games I am playing L4D, Dead Space, and FO3 regularly. Yes, it does annoy me that ATI can't deliver proper dead space performance. But after seeing the performance of the GTX295 I won't be losing any sleep over sticking with my x2.
To me it looks like current x2 owners shouldn't bother with the GTX295. But those looking for a new card out of the current lineup should seriously consider a GTX295 over the x2. For me though, the economical choice would be a water block and soldering iron for vmods. :cool:
No need to sugar coat anything, reality is sweet enough already.
Is it worth the jump from an overclocked GTX 260-216 to one of these? Not so sure.
I agree totally :yepp:
I think the majority of serious gamers are probably using either 1680x1050 or 1920x1080 so yeah, I agree that should be where the focus is.
Of course, there are those that'll be using the 'ideal' 30", 2560x1600 monitors and it is still nice to see what sort of performance they put in there, despite the fact it is a very niche market. :up:
FPS games are called FPS for a reason, lol. Frames per second matter much more in first-person shooters than in, say, Supreme Commander; as long as the threshold is met, say 30-40 FPS in an RTS, anything above is eye candy. In an FPS, for example, if I'm playing at 125 FPS in CoD4 and you're playing at 90, my bullets have a better chance of hitting you first, reacting faster, etc.
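Quick frame-time arithmetic for anyone curious (nothing game-specific, just the math):
Code:
# Frame time shrinks as FPS rises: at higher frame rates the game samples
# input and redraws the world more often.
for fps in (30, 60, 90, 125):
    print(f"{fps:3d} FPS -> {1000 / fps:5.1f} ms per frame")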
LOL, long time since I've seen a graph this lame (NHF, OBR):
http://pctuning.tyden.cz/ilustrace3/...w/cod_noaa.png
I'm waiting for clocks to be confirmed. My core 216 doesn't OC worth a damn compared to many guys here. I'm getting 653/1484/1057 vs stock @ 576/1242/999.
Like others have said, all these numbers at 2560x1600 are pointless for many of us. 1920x1200 and 1680x1050 stats are the most important to many gamers. I guess they aren't being shown since there's no need for this card at those resolutions, and ATI's card doesn't get destroyed at those res's either.
Let's just cross our fingers that prices on the 55nm line will be cheap and not super inflated like usual.
I'd like to see power consumption and temps for this card. January 8th is just too late for this card.
When frame rates dip that low it should cause some stutter. But lets reflect on history starting with the 7950:
The 7950 was released on (or about) June 2006.
Next gen GPU (G80) was released on (or about) November 2006
-----------------------------------------------------------------
7950 had roughly a 5 month selling period before next gen gpu entered market
The 9800 GX2 was released on (or about) March 2008.
Next gen GPU (GTX 200 series) was released on (or about) June 2008
-----------------------------------------------------------------------
9800 GX2 had roughly 3 month selling period before next gen GPU entered market
The 295 will be released on Jan 2009 (roughly).
Next gen GPU (300 series) will be released on (or about) XXX XX, 2009???
------------------------------------------------------------------------
What will be the selling period before the next gen GPU enters the market?
This is something I'd ponder if I were interested in this video card. As with the other incarnations: they performed well, yet had the market to themselves for only a few months.
Side note:
I believe the dates are accurate but if they are not let me know, thanks!
Hmm, I almost forgot the GT212 GX2 rumored for a Q2 2009 release. Updating the last entry of the timeline above:
The 295 will be released on Jan 2009 (roughly).
Next gen GPU or the GT212 GX2 will be released on (or about) Q2 2009 (or sometime after)???
------------------------------------------------------------------------
What will be the selling period before the next GX2 or next gen GPU enters the market?
Again, that's if the GT212 GX2 rumor is true. Honestly, when you spend that kind of money it's something to think about.
Yea... I remember when I bought my 7950 GT, then the 8800 series came out, then my 8800 GTS/8800 GTX a few months after that at a slight price drop. Then I lived off that till I got my GTX 280. So I'm guessing I'm waiting for a GT300-based card for my next one, because a 3-month step-up isn't enough. =/
IMO most people buy the best hardware any manufacturer can offer to date; they don't sit and wait for rumors to come true. Even if it is true, while they wait for that "GT212 GX2" (for example), by the time it arrives another rumor or announcement of another powerful card will come out. It's a vicious cycle: if you follow it, you never buy anything, always waiting for new stuff... :shrug:
Quote:
Again, if the GT212 GX2 is true. Honestly when you spend that kind of money it's something to think about.
It's strong, but not enough of a performance gap over current models to call for an upgrade.
If the vertical axis started from ZERO, you wouldn't be able to see the difference; the points would be in the same place... that's why in some games the vertical axis starts above zero... don't read the axis, read the NUMBERS!
dinos: 3DMark Vantage (Performance) scores, with a Core i7 @ 3600 MHz:
X2 - 16 927
280 - 12 276
295 - 18 971
260 - 10 715
4870 - 10 084
OBR, with respect, the graph is very misleading.
The mathematical difference in FPS is 1.1%, yet your graph spacing shows a 20% distance (~63.5 pixels of space in a ~317-pixel graph). That is a 19-point divergence between what the actual spacing should be and what your graph shows.
You can't just say "You're stupid if you don't read the numbers" as the graph is misleading no matter what.
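To make it concrete, here's a quick matplotlib sketch (the FPS values are made up, roughly 1% apart like the disputed result) showing how a truncated y-axis inflates a tiny gap:
Code:
# Two bar charts of the same ~1% FPS gap: one with a truncated y-axis,
# one starting at zero. The data is invented for illustration.
import matplotlib.pyplot as plt

cards = ["4870 X2", "GTX 295"]
fps = [88.0, 89.0]  # hypothetical averages, ~1% apart

fig, (ax_trunc, ax_zero) = plt.subplots(1, 2, figsize=(8, 3))

# Truncated axis: the 1 FPS gap fills most of the plot height.
ax_trunc.bar(cards, fps)
ax_trunc.set_ylim(87.5, 89.5)
ax_trunc.set_title("Axis starts near the data")

# Zero-based axis: the same gap is barely visible, as it should be.
ax_zero.bar(cards, fps)
ax_zero.set_ylim(0, 100)
ax_zero.set_title("Axis starts at zero")

plt.tight_layout()
plt.show()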
Lol, most misleading graph I've seen in a while. Which is why you always have graphs start from 0.
Misleading Graphs
Others disagree ;)
I'm very surprised that the GTX 295 only just beats the 4870 X2 by a few frames. I think people with their 4870 X2 cards will be happy to see this...
Yeah, just ONE FPS faster - which is within benching error margin - and the graph makes it seem like it got beaten to a pulp and left 4 dead (pun intended). :rofl:
289W TDP card! :(
Did anyone check this?
http://www.hardwarecanucks.com/forum...preview-3.html
They were using the ATI Catalyst 8.12 WHQL driver for the ATI HD 4870 X2, yet the GTX 295 performed better in almost all cases.
Guru3D was using the 180.88 beta while Hardware Canucks was using 180.87 for the GTX 295.
It'd be nice to see.
Yeah, if the graphs are gonna be drawn like that, they might as well have put it in a table.
I don't look at graphs for the values and work out the difference myself; I expect the graph to show it, and fairly :)
What's the big deal with the graph? The numbers are clear to see, so it's not lying, and if you're worried that "average Joe" will be "tricked", don't be. Just because average Joe is not a computer geek doesn't mean he can't read a graph, and he won't even be looking at these graphs anyway, because he's not a computer geek.
Some of you guys need to get over yourselves.
A graph exists to give a graphical representation of the data. If you say "well, only the numbers matter" and completely discard the accuracy of the graph, then there is no need to make a graph in the first place.
That is like saying, when documenting a crime scene, that only the data is important, so instead of taking a picture with a digital camera you finger-paint the crime scene.
Now there's some people who know how to make a graph :ROTF:
Nice looking one too ;)
I agree that the graph is horrible. There is a mere 1 fps difference and yet the lines are miles apart when they should be practically on top of each other in a realistic visual representation of the performance difference.
But with that said, it's kinda hard to entertain complaints of intentional misleading. He not only clearly labeled the axes but labeled the individual data points as well. Seriously, who only looks at the colored lines and bars without looking at the words and numbers? Idiots?
lol. :rolleyes:
I never accused anyone of making graphs for whatever reason, others may have, I was merely saying they are very misleading.
Yep, misleading for no apparent reason. Well, it seems to me they put nV on top by one point - only for that specific graph, though. I just hope that 1 point is really worth it, but it's obviously in the margin-of-error zone, as others stated.
Most powerful card on the market, but no maximum game settings?
Bit-tech review
Fallout 3: 2560x1600, 8xAA + 16xAF, max detail
4870 X2: 73.6 avg, 23.0 min
GTX 295: 48.3 avg, 14.0 min
Far Cry 2: 2560x1600, 4xAA, DX10 Ultra High (versus Very High for HCanucks)
4870 X2: 41.4 avg, 29.0 min
GTX 295: 44.8 avg, 29.0 min
Left 4 Dead: 2560x1600, 8xAA + 16xAF, max detail
4870 X2: 67.8 avg
GTX 295: 72.3 avg
Funny you use that particular graph for Crysis Warhead :rolleyes:
I can see why :D
Not many of us have a 30" monitor, not because we can't afford it but because of its sheer size and lag. 1920x1080 is the most important resolution. Those of you who want 30" monitor resolutions need to opt for something different.
The whole reason Nvidia kept the frame buffer down, instead of going with the 280's memory size and bus, was to keep the power draw down. The extra memory would have put this card at over a 320 W TDP.
This card would have been nice 5 months ago. Way too late, and not a whole lot better than the X2. So basically it's a fail. I'll wait for the next gen from NV and ATI, thank you.