Finally, my plan to conquer the world by stealing everyone's electricity is about to happen!
I take back my "Me want" statement. I hate video card solutions such as these.
"To exist in this vast universe for a speck of time is the great gift of life. Our tiny sliver of time is our gift of life. It is our only life. The universe will go on, indifferent to our brief existence, but while we are here we touch not just part of that vastness, but also the lives around us. Life is the gift each of us has been given. Each life is our own and no one else's. It is precious beyond all counting. It is the greatest value we have. Cherish it for what it truly is."
i5 2500k @ 4.6GHz - Corsair A70 | Biostar TP67B+ | G.Skill RipJaws DDR3-1600 2x2GB | MSI HD7950 TF3 | X-Fi Titanium | WD 750GB Black | CM 690 II - Corsair TX850 | 2xDell 2407WFP A04/A03 | Win 7 Pro x64
http://www.heatware.com/eval.php?id=48222
Why do they need 2 separate GPUs?
Can't they put 2 or 4 GPU cores on a single chip, like CPUs do?
jc
FX-8350(1249PGT) @ 4.7ghz 1.452v, Swiftech H220x
Asus Crosshair Formula 5 Am3+ bios v1703
G.skill Trident X (2x4gb) ~1200mhz @ 10-12-12-31-46-2T @ 1.66v
MSI 7950 TwinFrozr *1100/1500* Cat.14.9
OCZ ZX 850w psu
Lian-Li Lancool K62
Samsung 830 128g
2 x 1TB Samsung SpinpointF3, 2T Samsung
Win7 Home 64bit
My Rig
true, this got me thinking...
It would be super funny if they came out with Crossfire for mobos, not PCI-E slots but mobos connected to each other... so I could have a setup with 4 x Phenom CPUs and, let's see, quad Crossfire x 4, so 16 video cards, haha.
Anyway, I'll stop thinking now...
It seems like they have really hit a roadblock with video cards, in terms of running Crysis on super-high settings at 1080p. There is nothing out now that can run it at frame rates respectable enough that you wouldn't notice it in gameplay.
AMD shows off Radeon HD 3870 X2
Working pairs: clockspeeds and pricing revealed
By Theo Valich: Sunday, 18 November 2007, 3:20 PM
AMD'S ATI Radeon HD 3870 X2 marks the first real dual-GPU card supported and manufactured by AMD.
Previous attempts were mostly made by third party manufacturers such as ASUS, Sapphire and MSI, but this time AMD is coming into the frame.
With R700 looking quite interesting, it is clear that AMD wants to get as much experience with multi-chippery on a single PCB as possible.
Two Radeon 3870 X2s working as a pair. Note the longer Crossfire connector
As you can see in the picture above, the boards are connected via a single bridge. The reason this is the first single-bridge arrangement on ATI cards (the HD 2600 X2 has two bridges, just like regular parts) is that one link is wired through the PCB and connects the two GPUs locally, so no "black magic" was needed to connect the two GPUs. Just logic and available resources.
This arrangement of four memory chips on the back and four on top brings back memories of the Radeon 9700. Of course, the 9700 relied on old DDR1-style memory working at 310MHz, while the 3870 X2 comes with GDDR3 memory almost three times as fast.
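A quick sanity check on the "almost three times as fast" claim, as a sketch. The 901MHz GDDR3 figure is taken from the OverDrive readout mentioned later in the article; since both DDR1 and GDDR3 are double data rate, the clock ratio equals the transfer-rate ratio:

```python
# Memory speed comparison: Radeon 9700 (310 MHz DDR1) vs.
# Radeon HD 3870 X2 (901 MHz GDDR3, per the OverDrive readout).
old_clock_mhz = 310
new_clock_mhz = 901
# Both memory types transfer twice per clock, so the clock ratio
# is also the effective transfer-rate ratio.
ratio = new_clock_mhz / old_clock_mhz
print(round(ratio, 2))  # 2.91 -- just shy of three times
```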
A look at the cooler reveals that there are no visible heat-pipes
Beneath the cooler sit two chips, each with its own 512MB of memory. Even though this board produces a decent amount of heat, the cooler uses no visible heatpipes. From the outside it is just a longer version of the design we saw on the 2900XT; inside, there is nothing but copper fins.
This board will consume roughly the same power as a single 2900XT, and ATI opted to use one six-pin and one eight-pin PEG connector. With the 8-pin connector specced as a PCIe Gen2 requirement, it's no wonder that both AMD and Nvidia will use this connector in the future.
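A rough sketch of the power headroom that connector combination implies, assuming the standard PCIe limits (75W from the slot, 75W from a six-pin, 150W from an eight-pin):

```python
# Maximum board power implied by the connector choice, assuming
# standard PCIe power limits (these are spec limits, not measurements).
SLOT_W = 75        # power deliverable through the PCIe slot itself
SIX_PIN_W = 75     # one six-pin PEG connector
EIGHT_PIN_W = 150  # one eight-pin PEG connector
max_board_power = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(max_board_power)  # 300 watts of headroom
```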
Catalyst 7.11 shows that Radeon HD 3870 X2 is already supported in the driver
Software support is already there. While drivers have to be significantly optimised for different applications, there are still around three Catalyst releases to go before the product is ready to hit the market.
Overclocking two GPUs on the same PCB
ATI OverDrive is supported, and you can see that the two GPUs work at 777MHz each, while the 1GB of on-board memory runs at 901MHz, yielding a combined total of 115.32 GB/s.
The temperature tool will probably have to be tweaked to recognise the number of GPU cores on the board itself, but the interesting part will be just how much power savings RV670 can achieve.
It turns out that AMD is dead serious about taking the RV670 to new heights. The firm is promising a whole lot, and seeing a system with two prototype boards running Call of Duty 4 with all the bells and whistles in a Quadfire combination only leaves you wondering how great 2008 will be. It all started with three great products, the GeForce 8800GT and the Radeon HD 3850 and 3870, and as soon as the GeForce 8850GX2 (or whatever Nvidia decides to call its dual-G92 series) and the Radeon HD 3870 X2 make an appearance, we'll be ushered into a new era of affordable high-end computing.
According to AMD, the time of big and expensive high-end cards is over; everything is now about scalability. Nvidia is starting to sing the same tune.
It seems that both Nvidia and AMD have finally learned that it is far better to create a monster of a mainstream chip that can be scaled with as many GPUs as you want. An eight- to 16-GPU setup is a possibility for both ATI and Nvidia, but don't think about games here; think about medical imaging, videowalls and so on.
When the board debuts (the current target is February, alongside higher-clocked B3-revision Phenoms), a pair of these cards will set you back anywhere between 800 and 1000 US dollars or euros, meaning those who buy four Radeon HD 3850s today will not see their investment lose its value when these two come along. µ
http://www.theinquirer.net/gb/inquir...eon-hd-3870-x2
Last edited by mascaras; 11-18-2007 at 12:12 PM.
[Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
[Review] ASUS HD4870X2 TOP » Here!! « .....[Review] EVGA 750i SLi FTW » Here!! «
[Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
[Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «
Will be interesting to see how a 3870 X2 does against a future single high-end GPU from Nvidia... my 8800 GTS (old version, lol) is waiting to make way for something better...
AMD Announces R680, RV620, RV635 Graphics Cores
Get ready to enter 2008 with a bang: AMD has a bunch of GPUs on the way
AMD's newest R680 graphics processor might look a whole lot like the ill-fated R600 GPU, but the reality couldn't be more bizarre. Instead of one 80nm behemoth-of-a-GPU, the R680 consists of two 55nm processor cores.
Representatives from AMD would not confirm that the R680 is essentially two RV670 GPU cores on the same board, though the company did confirm that each core has the same specifications of an RV670 processor.
The RV670 graphics core, announced last November with the Phenom processor, is the first 55nm desktop graphics adaptor. AMD does not target this card as a high-end adaptor, though reviewers were quick to herald the RV670 as AMD's best product of 2007.
The company also made quick mention of the RV620 and RV635 GPU cores. These cores are nearly identical to the previous RV610 and RV630 processors, but will be produced on the 55nm node instead.
All three of AMD's new GPUs are scheduled to launch next month.
Dual-GPU technology is not new. 3dfx's flagship Voodoo 5 family also resorted to multiple processors to achieve its relatively high performance. ASUS, Gigabyte, Sapphire, HIS and PowerColor all introduced dual-GPU configurations of just about every graphics processor on the market, though these were never "sanctioned" ATI or NVIDIA projects. Ultimately, all of these projects were canned due to long development times and low demand.
Cross-town rival NVIDIA isn't sitting on idle hands, either. The company publicly announced plans to replace all 90nm G80 graphics cores with G92 derivatives by the end of the year. The G92's debut, the GeForce 8800 GT, met wild support from reviewers and analysts alike. Its second outing, the GeForce 8800 GTS 512MB, was met with similar but less enthusiastic acceptance during Tuesday's launch.
NVIDIA's newest roadmap claims the DirectX 10.1 family of 65nm processors will also hit store shelves this Spring. The chipsets -- codenamed D9E, D9M and D9P -- are architecturally different from the G80/G92 family.
http://www.dailytech.com/article.aspx?newsid=10033
Would be nice to see them turn the power connectors toward the side edge of the card; kind of a PITA if the final revision is like the pictures.
when's the release date?
The real question is how will the availability be?
System Specs: * i7 2700K @ 4.8 Ghz * Zalman CPNS9900-A LED * Asus Maximus IV Extreme -Z * 16 GB Corsair Dominator GT CMT16GX3M4X2133C9 * Sapphire HD7970 crossfire * Creative X-Fi Titanium Fatality Pro [PCI-E] * Corsair AX 1200W * WDC WD1002FAEX + WDC WD1002FAEX * Optiarc AD 5240S * Dell U3010 @ 2560 x 1600 [DVI-D] * Steelseries 7G * Logitech G9 * Steelseries SX * Coolermaster Stacker STC T01 * Logitech Z-5500 * Sennheiser HD598 * Windows 7 Ultimate x64 SP1*
2nd gen UVD on R680?
Leadership? I would like to see that happen,
although ATI is still the king of the latest 3D benchmarks
I like it a lot!
Just good cards.
And still I will not own everyone
Wow, just $800 for 2 x 3870 X2 sounds good.