Eh. While it was a pretty rough article, he did accurately point out the whole honoured tradition of 'milking the stupid.'
It's true.
E7200 @ 3.4 ; 7870 GHz 2 GB
Intel's Atom is a terrible chip.
Same here; I suspected it was theinq after the "NV is in deep doo-doo right now..." part. So I checked the "source" and stopped reading.
The initial gtx280 price was due to the absence of anything faster at the time... it was aimed at early adopters. Nvidia probably knew ATI was going to launch a dual-GPU card, and likely expected/planned the price drops, just as they happened. I may or may not be wrong, but theinq's stance/interpretation of the facts annoys the hell out of me. Very subjective and unprofessional.
Last edited by Tonucci; 10-09-2008 at 06:30 AM.
The GX2 card won't be based on the GTX280 due to heat and power. In the best case, it'll be based on a 55nm 9-TPC GTX270-core216. As we can all probably guess, a GTX270-core216 would only beat the 4870 by a little (and lose with 8xAA), so I don't think that a GX2 card would be much of a threat to ATI (a marginal win for NV).
Tech Report just posted this: http://www.techreport.com/articles.x/15651 . It's the GTX 260 Core 216 vs a 1GB HD 4870.
AMD's ATI Radeon HD 4870 with 1GB of GDDR5
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
The Radeon HD 4870 1GB: The Card to Get
http://www.anandtech.com/video/showdoc.aspx?i=3415
To sandwich and cool a dual 460mm^2 GPU solution is probably going to require water cooling... if that's what you mean by engineering. Or maybe it will have to be done on a single 12-inch-long PCB so that a dual-slot cooler can be used, like the 4870X2? Or maybe it will need special external heatsinks connected with heatpipes? Just so that it can beat the 4870X2 by 10%?
The only way Nvidia could keep using the same 9800GX2-style sandwich cooler is if the new GTX270 cores do not use any more power than the 9800GX2 cores, which is certainly possible. However, unless they are cherry-picked, as the INQ said, they would not be any faster than a single 4870 core. The 4870X2 already has a dual-slot cooler, so Nvidia could try to match it at best, just like the 9800GTX+ managed to match the single-slot HD 4850.
What I'd be glad to see is the GTX 290 further extending the single-GPU performance lead (I'm a retired SLI veteran, and prefer to avoid all the compatibility and microstuttering headaches).
--two awesome rigs, wildly customized with
5.1 Sony speakers, Stereo 3D, UV Tourmaline Confexia, Flame Bl00dr4g3 Fatal1ty
--SONY GDM-FW900 24" widescreen CRT, overclocked to:
2560x1600 resolution at 68Hz! (from 2304x1440@80Hz)
Updated List of Video Card GPU Voodoopower Ratings!!!!!
I have no idea what kind of crazy inq ish you are smoking, but a gtx 350 would absolutely dominate the 4870x2. Why? The bandwidth of a gtx 280 w/ gddr3 is already more than a 4870/4870x2 with gddr5, thanks to the 448/512-bit bus. Put two of those together with some gddr5, and you will have a card that will leave the 4870x2 face down in the mud. (The power consumption at load would be high, but it would be 55nm, so I doubt it would be that much more than the 55nm 4870x2, which runs hotter than the two 65nm g92's in a 9800gx2.)
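For anyone who wants to sanity-check the bandwidth claim, the math is just bus width times effective data rate. A quick sketch using the commonly quoted reference clocks (treat these as ballpark figures):

Code:
# Back-of-the-envelope memory bandwidth: bus_bits / 8 * MT/s / 1000 = GB/s.
cards = {
    "GTX 280 (GDDR3, 512-bit)": (512, 2214),  # 1107 MHz x2 (DDR)
    "GTX 260 (GDDR3, 448-bit)": (448, 1998),  # 999 MHz x2 (DDR)
    "HD 4870 (GDDR5, 256-bit)": (256, 3600),  # 900 MHz x4 (GDDR5)
}

for name, (bus_bits, mtps) in cards.items():
    print(f"{name}: {bus_bits / 8 * mtps / 1000:.1f} GB/s")

That works out to roughly 141.7, 111.9, and 115.2 GB/s respectively: the 512-bit GTX 280 really does out-bandwidth a single 4870, though the 448-bit GTX 260 actually lands slightly below it.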
As Los Alamos director J. Robert Oppenheimer watched the demonstration, he later said that a line from the Hindu scripture the Bhagavad Gita came to mind:
"I am become Death, the destroyer of worlds."
Test director Kenneth Bainbridge in turn said to Oppenheimer, "Now we are all sons of b**ches." wiki
Bandwidth >100GB/s matters very little.
See this: [benchmark graph: Race Driver: GRID results, including the HD 4870 512MB and 1GB]
Where did the fillrate, bandwidth, and texturing advantages go? You have a chip half the size of the other, yet it's the bigger one that ends up unplayable.
Future games will be built to this kind of specification, starting with Operation Flashpoint 2, and it'll become widespread before long.
ATI can spam a LOT of shader units and even TMUs inside their chips without using much die area. The same does NOT go for nVidia.
Wow...
People who love companies and defend them to the end are funny.
On both sides.
"This one is going to kick that one's butt... one day"
"This one is better than that one"
"This one has one GPU and that one has two, so it's not as good"
The same damn arguments over and over and over and over. The responses are very often predictable and you can tell what brand is going to be in the sig before you even get to the bottom of the post.
Give it a rest... holy crap.
Single fastest card on the market = 4870 X2
Single fastest GPU on the market = GT200
That is what we know right now. All the rest is just speculation for the future.
Either card is awesome... don't worry, your penis won't change size, no matter which brand you own.
RIG 1 (in progress):
Core i7 920 @ 3GHz 1.17v (WIP) / EVGA X58 Classified 3X SLI / Crucial D9JNL 3x2GB @ 1430 7-7-7-20 1T 1.65v
Corsair HX1000 / EVGA GTX 295 SLI / X-FI Titanium FATAL1TY Pro / Samsung SyncMaster 245b 24" / MM H2GO
2x X25-M 80GB (RAID0) + Caviar 500 GB / Windows 7 Ultimate x64 RC1 Build 7100
RIG 2:
E4500 @ 3.0 / Asus P5Q / 4x1 GB DDR2-667
CoolerMaster Extreme Power / BFG 9800 GT OC / LG 22"
Antec Ninehundred / Onboard Sound / TRUE / Vista 32
Dude, did you even look at the 4870 512 vs 4870 1gb in the above graph?
Take a look at that graph you posted when you get a chance.
That's where the fillrate, bandwidth, and extra gddr5 advantages went.
Grid is not the most graphically demanding game. All the above shows is that it scales well on a 4870x2, arguably due to its gddr5.
We'd have to have a 4850x2 to really know.
Last edited by fragmasterMax; 10-09-2008 at 07:11 AM.
E8400 @ 4.0 | ASUS P5Q-E P45 | 4GB Mushkin Redline DDR2-1000 | WD SE16 640GB | HD4870 ASUS Top | Antec 300 | Noctua & Thermalright Cool
Windows 7 Professional x64
Vista & Seven Tweaks, Tips, and Tutorials: http://www.vistax64.com/
Game's running choppy? See: http://www.tweakguides.com/
Thanks for showing me who you really are. I'm more than glad to block you (the first user I've ever blocked on any forum)! Go, you king of maturity! After all, what are you doing here on these forums with your rotten attitude, failing to see the humor? Aren't you too mature to be talking about GTX 290 rumors in the first place? I reported you for your rudeness.
Last edited by Bo_Fox; 10-09-2008 at 07:28 AM.
What about in the summer time?
How much is it per kWh where you live?
Man, I wish we would take you guys' lead and build some ultra-efficient heavy-water reactors.
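For anyone actually curious about the bill impact, the math is trivial. A rough sketch, where the extra wattage, hours, and rate are all made-up illustration numbers, not measurements:

Code:
# Yearly cost of a card's extra power draw; all inputs are assumptions.
extra_watts = 50        # assumed extra draw vs a competing card
hours_per_day = 4       # assumed daily gaming time
rate_per_kwh = 0.12     # assumed electricity price, $/kWh

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * rate_per_kwh:.2f}/year")

With those assumptions it comes to about 73 kWh, under $10 a year, so the cost angle is small unless your rates are much higher.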
What's crazy is that the 256-bit 55nm 4870 uses more than 50% more power than a 448-bit 65nm gtx 260, with both cards at idle.
WHAT'S CRAZY is that the 55nm 256-bit 4870 uses more power than a 448-bit 65nm gtx 260 at load, and you all are thinking it would be IMPOSSIBLE to put two die-shrunk 55nm gtx 280's on a single card.
:ROFL:
Stop being tards.
Mark my words: in a few months the 4870x2 will be old news, just like the 9800gx2 is now.
http://www.techreport.com/r.x/gtx260...power-idle.gif
Last edited by fragmasterMax; 10-09-2008 at 07:30 AM.
Rofl
You guys are silly!
There are a lot of misconceptions on this forum. I'm not trying to come off as an nvidia fanboy, but the fact is the only thing that differentiates the 4870 from a 4850 (8800gts 512 performance) is GDDR5, which nvidia will soon get.
When it does, the tables will be turned.
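To put a number on that GDDR5 point: same 256-bit bus, but GDDR5's quad data rate nearly doubles the bandwidth. A quick sketch with the reference clocks (ballpark figures):

Code:
# What GDDR5 buys the 4870 over the 4850 on the same 256-bit bus.
bus_bits = 256
hd4850_mtps = 1986   # GDDR3, 993 MHz x2
hd4870_mtps = 3600   # GDDR5, 900 MHz x4

bw_4850 = bus_bits / 8 * hd4850_mtps / 1000   # GB/s
bw_4870 = bus_bits / 8 * hd4870_mtps / 1000   # GB/s
print(f"HD 4850: {bw_4850:.1f} GB/s, HD 4870: {bw_4870:.1f} GB/s "
      f"({bw_4870 / bw_4850:.2f}x)")

Roughly 63.6 vs 115.2 GB/s, about 1.8x, which is the gap being described above (to be fair, the 4870's higher core clock contributes too).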
Apparently the "new" 260 is not as much crap as had been thought.
http://techreport.com/articles.x/15651
This is quite the reversal from techreport compared to their earlier RV770 vs GT200 reviews. If the new 260 is capable of this, then the new 55nm versions may actually be worthwhile. From their conclusion:

Despite the fact that these are tremendously complex chips with hundreds of millions of transistors, AMD and Nvidia have achieved a remarkable amount of parity in their GPUs. In terms of image quality, overall features, performance, and even price, the Radeon HD 4870 1GB and the GeForce GTX 260 "Reloaded" are practically interchangeable. That fact represents something of a comeback for Nvidia, since the older GTX 260 cost more than the 4870 and didn't perform quite as well. If anything, the GTX 260 Reloaded was a smidgen faster than the 4870 1GB overall in our test suite.
The GTX 260 is based on a much larger chip with a wider path to memory, which almost certainly means it costs more to make than the 4870, but as a consumer, you'd never know it when using the two products, so I'm not sure it matters much for our purposes. Even the GTX 260's power consumption is lower than the 4870's, and its noise levels are comparable.
Stop spreading lies, fanboy. Replying to fragmasterMax's claims above:
The 1GB HD 4870 uses 10% more power at idle than the GTX260, and it uses less power at load than the GTX260:
http://www.bit-tech.net/hardware/200...tx-sli-pack/11