finally my plan to conquer the world by stealing everyone's electricity is about to happen !
I take back my "Me want" statement. I hate video card solutions such as these.
why do they need 2 separate gpus?
can't they put 2 or 4 gpus on a single chip like the cpus do?
jc
true, this got me thinking...
it would be super funny if they came out with Crossfire for mobos, like not pci-e slots but mobos connected to each other... so I could have a setup with 4 x Phenom CPUs and, let's see... quad Crossfire x 4, so 16 video cards, haha
anyways I'll stop thinking now...
it seems like they have really hit a roadblock with video cards... in terms of running Crysis on super high settings at 1080p, there is nothing out now that can run it with respectable frame rates where you wouldn't notice it in gameplay.
This is already implemented in multi-socket servers
http://www.v-t.co.jp/en/products/hpc...v8000_open.jpg
http://www.mikerohard.de/iwill-h8502.jpg
:up:Quote:
AMD shows off Radeon HD 3870 X2
Working pairs: clockspeeds and pricing revealed
By Theo Valich: Sunday, 18 November 2007, 3:20 PM
AMD's ATI Radeon HD 3870 X2 marks the first real dual-GPU card supported and manufactured by AMD.
Previous attempts were mostly made by third party manufacturers such as ASUS, Sapphire and MSI, but this time AMD is coming into the frame.
With R700 looking quite interesting, it is clear that AMD wants to get as much experience with multi-chippery on a single PCB as possible.
http://img45.imageshack.us/img45/205...ogetherap6.jpg
Two Radeon 3870 X2s working as a pair. Note the longer Crossfire connector
As you can see in the picture above, the boards are connected via a single bridge. The reason this is the first single-bridge appearance on ATI cards (the HD2600 X2X has two bridges, just like regular parts) is that one bridge is wired through the PCB and links the two GPUs locally, so no "black magic" was needed to connect the two GPUs. Just logic and available resources.
This arrangement of four memory chips on back, four memory chips on top brings back memories of the Radeon 9700. Of course, the 9700 relied on old, DDR1 style memory working at 310MHz, while the 3870 X2 comes with memory almost three times as fast (GDDR3).
http://img45.imageshack.us/img45/702...oling01bm0.jpg
A look at the cooler reveals that there are no visible heat-pipes
Beneath the cooler there are two chips, each with its own 512MB of memory. Even though this board produces a decent amount of heat, the cooler does not use any visible heatpipes. From the outside it is just a longer version of the concept we saw with the 2900XT, but inside there is nothing but copper fins.
This board will consume, give or take, the same power as a single 2900XT, and ATI opted to use one six-pin and one eight-pin PEG connector. With the 8-pin connector being spec'ed as a PCIe Gen2 requirement, it's no wonder that both AMD and Nvidia will use this connector in the future.
http://img102.imageshack.us/img102/4...alyst01ls5.jpg
Catalyst 7.11 shows that Radeon HD 3870 X2 is already supported in the driver
Software support is already there. While drivers have to be significantly optimised for different applications, there are still around three Catalyst releases to go before the product is ready to hit the market.
http://img102.imageshack.us/img102/2...alyst02gi7.jpg
Overclocking two GPUs on the same PCB
ATI OverDrive is supported, and you can see that two GPUs work at 777MHz each, while 1GB of on-board memory is working at 901MHz, yielding a combined total of 115.32 GB/s.
The temperature tool will probably have to be tweaked to recognise the number of GPU cores on the board itself, but the interesting part will be just how much power savings RV670 can achieve.
It turns out that AMD is dead serious about taking the RV670 to new heights. The firm is promising a whole lot, and seeing a system with two prototype boards running Call of Duty 4 with all the bells and whistles in a Quadfire combination only leaves you thinking how great 2008 will be. It all started with three great products, the GeForce 8800GT, Radeon HD 3850 and 3870, and as soon as the GeForce 8850GX2 (or whatever Nvidia decides to call its dual-G92 series) and Radeon HD 3870 X2 make an appearance, we'll be ushered into a new era of affordable high-end computing.
According to AMD, the time of big and expensive high-end cards is over; everything is now about scalability. Nvidia is starting to sing the same tune.
It seems that both Nvidia and AMD have finally learned that it is far better to create a monster of a mainstream chip that can be scaled with as many GPUs as you want. An eight- to 16-GPU setup is a possibility for both ATI and Nvidia, but don't think about games here. Think about medical imaging, videowalls and so on.
When the board debuts (current target is February) alongside higher-clocked Phenoms (B3 rev), getting two of these cards will set you back anywhere between 800 and 1000 US dollars or euros, meaning buyers of four Radeon HD 3850s today will not lose their value when these two pop along. µ
http://www.theinquirer.net/gb/inquir...eon-hd-3870-x2
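Quick sanity check on the 115.32 GB/s figure from the OverDrive screenshot above: assuming each RV670 core keeps the single HD 3870's 256-bit memory bus (the article doesn't state the bus width), the quoted 901MHz GDDR3 clock works out to exactly that combined number. A minimal sketch:
Code:
# Sanity check of the 115.32 GB/s figure reported by ATI OverDrive.
# Assumption (not stated in the article): each RV670 core keeps the
# single HD 3870's 256-bit GDDR3 memory bus.

MEM_CLOCK_MHZ = 901      # memory clock shown in the screenshot
BUS_WIDTH_BITS = 256     # assumed bus width per GPU
DDR_FACTOR = 2           # GDDR3 transfers data on both clock edges
NUM_GPUS = 2             # two RV670 cores on the X2 board

per_gpu = MEM_CLOCK_MHZ * 1e6 * DDR_FACTOR * (BUS_WIDTH_BITS / 8) / 1e9
total = per_gpu * NUM_GPUS

print(f"per GPU:  {per_gpu:.2f} GB/s")   # 57.66 GB/s
print(f"combined: {total:.2f} GB/s")     # 115.33 GB/s, matching the quoted 115.32
In other words, the tool appears to simply sum each GPU's local memory bandwidth; the Crossfire bridge itself adds nothing to that figure.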
will be interesting to see how a 3870 X2 does against a future single high-end GPU from Nvidia... my 8800 GTS (old version lol) is waiting to make way for something better...
:up:Quote:
AMD Announces R680, RV620, RV635 Graphics Cores
Get ready to enter 2008 with a bang: AMD has a bunch of GPUs on the way
AMD's newest R680 graphics processor might look a whole lot like the ill-fated R600 GPU, but the reality couldn't be more bizarre. Instead of one 80nm behemoth-of-a-GPU, the R680 consists of two 55nm processor cores.
Representatives from AMD would not confirm that the R680 is essentially two RV670 GPU cores on the same board, though the company did confirm that each core has the same specifications of an RV670 processor.
The RV670 graphics core, announced last November with the Phenom processor, is the first 55nm desktop graphics adaptor. AMD does not target this card as a high-end adaptor, though reviewers were quick to herald the RV670 as AMD's best product of 2007.
The company also made quick mention of the RV620 and RV635 GPU cores. These cores are nearly identical to the previous RV610 and RV630 processors, but will be produced on the 55nm node instead.
All three of AMD's new GPUs are scheduled to launch next month.
Dual-GPU technology is not new. 3dfx's flagship Voodoo 5 family also resorted to multiple processors to achieve its relatively high performance. ASUS, Gigabyte, Sapphire, HIS and PowerColor all introduced dual-GPU configurations of just about every graphics processor on the market, though these were never "sanctioned" ATI or NVIDIA projects. Ultimately, all of these projects were canned due to long development times and low demand.
Cross-state rival NVIDIA isn't sitting idle, either. The company publicly announced plans to replace all 90nm G80 graphics cores with G92 derivatives by the end of the year. G92's debut introduction, the GeForce 8800 GT, met wild support from reviewers and analysts alike. G92's second introduction, the GeForce 8800 GTS 512MB, was met with similar but less enthusiastic acceptance during Tuesday's launch.
NVIDIA's newest roadmap claims the DirectX 10.1 family of 65nm processors will also hit store shelves this Spring. The chipsets -- codenamed D9E, D9M and D9P -- are architecturally different from the G80/G92 family.
http://img99.imageshack.us/img99/164...403fb09ff2.png
http://www.dailytech.com/article.aspx?newsid=10033
Would be nice to see them turn the power connector toward the side edge of the card; kind of a PITA if the final revision is like the pictures.
when's the release date?
The real question is how will the availability be?
2nd gen UVD on R680?
Seems like it won't appear on the R680.
http://img.clubic.com/photo/00699084.jpg
Leadership? I would like to see that happen :)
although ATI is still the king of the latest 3D benchmarks :)
I like it very much!
Just good cards :).
wow, just $800 for 2 x 3870 X2s sounds good