Uhm... I think higher, much higher.
Aren't there situations, though, where even in GPU-bound scenarios SLI'd GX2s have benefited from a CPU clock boost?
Depends on the game, and I've seen no proof of whether the CPU plays a factor in that particular game.
SLI'd GX2s aren't my concern :hehe: if you have 2x GTX 280s then chances are you'll have a 4GHz CPU anyway.
3DMark Vantage of Single GTX 280 with QX9770 @ 4GHz
http://forum.amtech.com.vn/attachmen...1&d=1213188657
That's an old version. The GPU-Z v0.2.3 revision history says that GTX 260/280 support has been fixed and tested. Victor, download it and don't cover up all that info. :D
http://www.techpowerup.com/downloads...-Z_v0.2.3.html
Just ran 3DMark Vantage on my system:
QX6700 @ 3.2GHz
8GB Kingston DDR2
ASUS Striker Extreme
BFG 9800GX2
& I score 10050; my GPU score is 9982 & my CPU score is 10258.
On Extreme I score X3956, with a GPU score of 3832 & a CPU score of 10276.
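For anyone wondering how the overall number relates to the two sub-scores: Vantage combines the GPU and CPU results as a weighted harmonic mean. A minimal sketch, assuming the commonly quoted weightings (roughly 0.75/0.25 GPU/CPU for the Performance preset, 0.95/0.05 for Extreme), which reproduces the 10050 and X3956 totals above almost exactly:
Code:
def vantage_overall(gpu_score, cpu_score, gpu_weight):
    # Weighted harmonic mean of the GPU and CPU sub-scores
    cpu_weight = 1.0 - gpu_weight
    return 1.0 / (gpu_weight / gpu_score + cpu_weight / cpu_score)

# Performance preset (assumed 0.75 GPU / 0.25 CPU weighting)
print(round(vantage_overall(9982, 10258, 0.75)))  # -> 10050
# Extreme preset (assumed 0.95 GPU / 0.05 CPU weighting)
print(round(vantage_overall(3832, 10276, 0.95)))  # -> 3956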
Those GTX 280 numbers are accurate. However, a 4850 with a 4GHz Quad does X2800 in 3DMark Vantage at 700/1100, and the 4870 will most likely do around X3200 at stock and close to X4000 at 800/2000.
Perkam
You're right, an 800-point increase is a tad high for that low an OC, but the 4870 should still be around X3600 with a mild OC.
Which other part did you find wrong? Comparatively, a GTX 280 OC'd should reach 5.5k and a GTX 260 should reach 4.5k with a mild OC, with a 4GHz Intel quad that is.
Of course, the above means NOTHING at all in games :p:
Perkam
I don't know, man. First you come with "unleashing performance" for HD4850s to counter G92b cards, then you come with predictions about Vantage scores, when no one knows how the HD4870 will really turn out and how GDDR5 will affect it... I mean... I've seen better posts from you :)
Well, you can ask anyone, I've been a fan of guesstimation once actual results begin to leak, and you have a good point, the GDDR5 will affect it as well, but you have the results now for the GTX 280 and the 4850 (both of which will be released first), so I don't see why it would be too hard to figure out their respective scores.
Because benchmarks are usually more linear in their scoring system than games, it's usually easier to estimate scores :)
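As a back-of-the-envelope illustration of that kind of guesstimation (purely a sketch, assuming the GPU sub-score scales roughly linearly with core clock, which real cards only approximate):
Code:
def scaled_gpu_score(stock_score, stock_clock_mhz, oc_clock_mhz):
    # Naive linear extrapolation of a GPU-limited benchmark score with core clock
    return stock_score * oc_clock_mhz / stock_clock_mhz

# Hypothetical numbers: a card scoring X4800 at 600 MHz, mildly overclocked to 650 MHz
print(round(scaled_gpu_score(4800, 600, 650)))  # -> 5200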
The point I was making was that though dual 4850s and 4870s will beat the GTXs in benchmarks, dual GTXs should smash 10k in Vantage with only two GPUs, which will be an industry first :) And it will require two 4870 X2s to beat the GTXs, which isn't even on the roadmap till later on.
Also, we can say with confidence now that the new cards from both sides are truly evolutionary :up:
Perkam
http://en.expreview.com/2008/06/13/g...cially-priced/
"Official" pricing... ?
http://www.evilavatar.com/forums/showthread.php?t=54223
Crysis Very High w/ 4x AA ????
Quote:
GeForce GTX 280 2 and 3 Way SLI card photos
http://we.pcinlife.com/attachments/f...7QJlWkHjNi.jpg
http://we.pcinlife.com/attachments/f...H1z6o3ORR9.jpg
http://we.pcinlife.com/attachments/f...B3TXyxy14W.jpg
http://we.pcinlife.com/attachments/f...gURmiIVb3c.jpg
http://we.pcinlife.com/thread-948441-1-1.html
Quote:
GTX 200 finally officially priced
We received an official message from NVIDIA which provides the final official pricing of the GTX 200 series, and some information about the sale dates.
The official suggested pricing to manufacturers is the following:
o GeForce GTX 280 - $649
o GeForce GTX 260 - $399
The embargo time for the GeForce GTX 280 and GeForce GTX 260 is on Monday, June 16th, at 6:00 a.m. PDT.
The GeForce GTX 280 will be available in quantity on Tuesday, June 17th, 2008. The GeForce GTX 260 will be available the following week on Thursday, June 26th, 2008.
http://en.expreview.com/2008/06/13/g...cially-priced/
regards
http://madpixelz.net/nv/ :D:toast:
$399 is an awesome price. The $50 price cut will REALLY help their sales and undercut the 4870 in price/performance.
The cards look beautiful, but the Tri-SLI setup looks hideous... you couldn't shove a sheet of A4 paper between those things at that distance!!
http://we.pcinlife.com/attachments/f...gURmiIVb3c.jpg
Perkam
GTX280 & GTX260 >>> 2560x1600 tests
http://bbs.chiphell.com/viewthread.p...a=page%3D1
A few shops here have already announced the GTX 280 at 565€.
:up:
erm, the top two cards will get 0 airflow....
Silly design
Yep, according to the latest rumors, CF HD4850 should be about the same or better in 3DMark Vantage, but I also want to see CrossFire in games (not the FPS number, but whether there is any stuttering, etc.). When I had 2x HD3870 they scored very well in 3DMark, but in a few games the CrossFire :down:
Let's wait for CrossFire reviews in games.
regards
3-Way SLI GTX 280 3DMark Vantage score, check this out: http://www.tweaktown.com/news/9672
3-way SLi >> http://www.xtremesystems.org/forums/...postcount=1579
and single card >> http://www.xtremesystems.org/forums/...postcount=1606
and single card GTX260 & GTX280 >>> http://www.xtremesystems.org/forums/...postcount=1625
:up:
Medusa stars in NVIDIA's upcoming GT200 tech demo
Always at about this time before the launch of a major new GPU, AMD or NVIDIA will begin distributing a technology demo designed to showcase the power of its new graphics chips. It creates some early excitement for the upcoming product.
IT168 in China has got hold of the GT200 tech demo which is called "Medusa".
They ran the demo at 1680 x 1050 with eight times AA enabled and if you look closely at the screenshots, you can see that the frame rate ranges between 7 FPS and 17 FPS. Can you say highly intensive? By the look of the screenshots, we are in for a treat!
Full Article Here:http://www.tweaktown.com/news/9668/m...200_tech_demo/
http://images.tweaktown.com/imageban...o-09l_full.png
Why do people keep posting 3DMark benches? It doesn't mean anything in games and just gets people (myself included) confused.
Don't mind me...
regards
Great find, mascaras. Where did you pick that up?
Damn fast Oo
No need for anybody that bought a 9800GX2 to be kicking themselves :)
Looks like a beast of a card! Keeping up with / beating a 9800GX2 means it's almost 100% faster than G92 :o
The PC Gamer article says that it runs Crysis at 1920X1200 and high settings at 34FPS. At the bottom it says it runs Crysis once again at 34FPS and high settings, but the resolution is at 1600X1200. WTF? And I'm guessing this is with no AA, correct? Does anybody else find these scores disappointing? I was hoping for a card that could do Crysis at 1920X1200, 2X AA, 4X AF, at 60FPS constant. Bollocks.
At 1920x1200 the FPS should be pretty much the same as at 1600x1200...
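For context, a quick pixel-count comparison (just arithmetic, nothing measured): 1920x1200 pushes about 20% more pixels than 1600x1200, so in a purely GPU-bound case you'd expect the frame rate to be in the same ballpark, give or take that margin.
Code:
# Pixel counts of the two resolutions being compared
pixels_1920x1200 = 1920 * 1200   # 2,304,000
pixels_1600x1200 = 1600 * 1200   # 1,920,000
print(pixels_1920x1200 / pixels_1600x1200)  # -> 1.2, i.e. ~20% more pixels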
2560x1600 @ Crysis @ High settings = 15 fps (9800GX2 & hd3870X2 = 1 fps ) !
Note: beta drivers were used, the only ones available for now!
next week :
Quote:
Nvidia changes GT200 dates again
and adds a PhysX driver
NVIDIA IS CHANGING the GT200 launch date again, this whole 'let's prove them wrong when they leak' thing is getting tiring. That said, there are a few goodies here and there in the email that Ken Brown sent yesterday.
The meat of the email that went out to reviewers is that the launch date has moved from the 17th to the 16th at 6am PDT. They claim that the 280 will be available the next day, and the 260 comes later, on the 26th. One word to reviewers, make sure you check the retail prices with partners before you quote price/performance numbers, NV has a dirty tricks campaign lined up here, we told you they would have to drop the price when they saw the 770 number, and they did.
There are also a bunch of new things on the NV press FTP site, including 177.34 drivers, up from the 177.26 we tested with. We would be shocked if these were not special press-tweaked drivers, so beware of scores tested with these last-minute releases. Also included are a folding@home client, now possible due to the unbreaking of their architecture this time around, the Elemental "Badaboom!" encoding application, and a bunch of documentation.
Speaking of tweaked drivers, the next new one is coming next week, and it is a PhysX driver. Look for this one to pump 3DMark Vantage scores to the moon, you can do that when you own the API. Sigh.
The more things change, the more they are gamable. µ
INQ
regards
You were hoping for too much with 60fps constant. Remember when Crytek said their engine wouldn't be tamed for a good 2-3 years? I don't think any single GPU will do 1920x1200, 4x AA, Very High, DX10 (in the 30-60fps range) for a year at the very least (and I do mean single PCB, single GPU). Tri-SLI 280s probably will, and quad 4870 X2s may as well.
Crossfire 700's will probably leave the 280 choking dust. And since many, many folks won't touch a 790 chipset, and SLI being what it is, I think Nvidia is in for a rude awakening in the sales dept. First adopters will always be the first wave, but I'm taking a long, long look at dual 770's or dual 700's this time around.
Good find on the PCG article, mascaras.
Seriously, why does everyone care so much about Crysis? It's just one poorly coded game, and there aren't any titles in the immediate future that will require so much graphics power to run at the highest settings anyway. This new range of cards is still a massive jump in performance over what we currently have.
It's not poorly coded... it does a LOT more than any other engine out there and looks a heck of a lot better to boot. At similar visual fidelity levels to other games, it performs about as well as they do. There are plenty of games that require Crysis-level performance out there both released and coming soon:
Crysis
Crysis: Warhead
Company of Heroes
The Witcher
STALKER Clear Skies
Assassin's Creed
Age of Conan
Warhammer Online
Devil May Cry 4
Unreal Tournament 3
Fallout 3
Project Origin
Farcry 2
Oblivion with mods such as QTP3/Parallax
HL2 with mods such as Cinematic Mod 7
World In Conflict
Team Fortress 2
Grid
Mafia 2
Empire: Total War
Alan Wake
Left 4 Dead
among others. 2560x1600, 1920x1200, really even 1680x1050, with any AA + 16x HQ AF and higher/highest in-game settings all demand more graphical power than an 8800GTX or 2x 8800GTS G92 can deliver, unless you consider 20-30FPS "acceptable".
Yup. I don't think we should use Crysis as the next-generation hardware benchmark. It doesn't even use multi-threading heavily like Alan Wake baby! And I bet multi GPUs scale way better.
CPU multithreading in games may never matter as much as you think, because the CPU needs to wait for GPU info; games are not like Folding@Home or the unlaunched Alan Wake you refer to (Intel marketing, tbh), which scale very well.
CPUs may be used for physics, but you will have physics on the GPU as well, so "a quad-core is good for games" may only be a myth.
And Crysis IS multithreaded; there was a thread a week ago or so that pointed that out clearly. :)
*whispers* UT3 *whispers*
Quote:
so a quad-core is good for games may only be a myth.
They require more horsepower than we have to run smoothly at high res, as I said... not as much as Crysis, but enough. Even Team Fortress 2 maxed out with 4x AA and 16x HQ AF drops my 2x 8800GT SLI setup at 1680x1050 to 30-40FPS in a good firefight, which is not acceptable to me. The Witcher is more demanding than it...
Grid may not be as demanding then... probably should have left that off. As far as Assassin's Creed, UT3, TF2, they DO indeed take a heavy toll on a rig at highest settings + AA + AF as I said. I really don't call 20, 30, even 40 FPS "runs perfectly" but whatever floats your boat :).
Who on earth originated this whole 'crysis is poorly coded' FUD? I can't believe it's still lingering on. Like current hardware not being able to max it out means it's "poorly coded" ? :shrug:
The claim might not be so insubstantial if anyone ever went into more detail than the 'poorly coded' catchphrase.
I totally agree with you there. I've been using SLI 8800GTS 512s since December and a lot of the games on that list, maxed out, don't run well enough for my taste at 1920x1200 and even 1680x1050 in some cases. In some of those, with 4xAA we are talking drops into the low 30s frequently, and for me that just doesn't cut it.
I still think a single GTX 280 will be more consistent than CrossFire, and to be fair, in the end it's down to what games you play. If you plan on investing a lot of time in a particular game and CrossFire/SLI don't deliver in it, then it's really a big waste of money at that point. I'll be playing AoC a good bit, so whoever offers the best performance in this game will get my money.
The Crytek engine isn't tamed because they did not optimize it very well. They should take notes from Max Payne 2.
Heh, I forgot to mention 'unoptimized'; it's almost as popular as the 'poor coding'.
Max Payne 2? Correct me if I'm wrong, but that's a five-year-old game :shrug: what's it got to do with this?
That Medusa tech demo doesn't look very impressive... why such a low FPS for it?
How Dare Thou Speaketh Ill of UE3 :eek: :eek: Be Ready To Digest Mine Sword Traitorous Knave !!! :mad::mad::mad:
http://img73.imageshack.us/img73/743...kknightrx6.jpg
Justice Shalt Be Served !! :stick:
Perkam
UT3 is good, but in comparison to UT2k4 it's a massive, massive letdown :(
Here's a listing of EVGA's GeForce GTX 200 series cards. I'm not sure, but I believe this is a Turkish site, so give us a heads-up if there's anything else we should know.
Well, I can't see this having already been posted... a scan of an article from PC Gamer.
http://img175.imageshack.us/img175/8...1320082sg7.jpg
From past experience, PC Gamer should be shot several times, but I guess the benchmarks might be interesting.
It was posted several hours ago :).
EVGA article, lol. I didn't know SSC stands for Super Superclocked, it's ridiculous! I'm guessing the 280 BP with Hydro Copper will cost at least $1000 :?
http://img233.imageshack.us/img233/4402/gt200cc2.jpg
The names of these cards are quite... dull. They might as well change to something like "Uber Super Duper Clocked" instead of "Super Superclocked". :ROTF:
And LOL @ FTW, might as well have the GTX WTF :rolleyes:
lol the performance increase is rubbish
and look at the benchmarks in the mag review, and 97%!! ROFL
NVIDIA had all this time to come out with something great, and all they do is increase performance in games that already run speedily on current cards and bring Crysis to near-playable levels.
http://img.photobucket.com/albums/v2.../Capture-3.jpg
Just came across this
Being sold for only $704.99 USD :rofl:
$704!!!
launch prices=:down:
I'd say from the given results the card doesn't taste anywhere near like a can of whoop-ass, for sure.
Go suck on a stick, NVIDIA. I'll never give you my jar savings!
$704 can't be right. Who is going to pay that? At that price point it limits the volume to next to zero. Maybe NVIDIA expects to seed all the review sites with the thing so they can have the biggest line on the graph, and hope people will buy the $300 version because of it.
I mean it works for NASCAR, people buy Monte Carlos. :shrug:
I'll stick with my 3870 X2 until a single card comes out that's at least 50% faster. Pricing is just...
...waits for e-lamers... I mean e-tailers, to list them with "preorder", "no ETA", "ETA unknown", "no stock", "$999", etc. :ROTF:
I believe I've found the part number for the EVGA GTX 260. It's 896-P3-1280-AR. A search turned up three sites selling them for $733.80 to $746.45, but all are out of stock. The part number for the EVGA GTX 280 is 01G-P3-1280-AR. I found one site selling it for $760.78, also out of stock.
I would bet when these cards come into stock the price will come down. At least I hope so.
Added:
I found a second site selling the 01G-P3-1280-AR for $704.18. Did you notice below they list it as an 8500GT!
Here's the first one.
Quote:
MfrPart#: 01G-P3-1280-AR
EVGA 01G-P3-1280-AR E-GEFORCE 8500GT 256MB HDCP 512BIT DDR3
Price: $704.18
OutStock
Quote:
MfrPart#: 01G-P3-1280-AR
VDO 1GB PCI-E GeForce GTX 280 DDR3
Price: $760.78
OutStock
Does the 260 have the same profile as the 280, or will it be shorter like the GTS in comparison with the GTX?
Wow, $700+ for the GTX 280... I thought stepping up to my Ultra was expensive, but dayum.
Are there any reports of "stuttering" on the GTX 280 under SLI?
It will be the fastest single-GPU card. But I do agree, the price is a f* joke. A hard thing to sell when 2x 4850 are supposed to beat it for $400.
That PC Gamer review is a joke. Since when does anyone care about a hardware review from PC Gamer? Some people are so gullible.