Not true.
They improved OpenGL performance quite a lot in the past, and depending on your card there are improvements across the board in various games.
I think the main thing we can take from the information we have so far is that, with the right cooling, these cards have a lot of overclocking headroom.
I will reinstall Windows tomorrow and re-run it. Please give me an extra day.
But all the other people got better results, so it's just a driver issue.
I believe the R600 will be a really good card once the new driver is out.
Hi Denny ....
Could you post some photos of your HD2900 card .... or are you still under NDA? :slap:
THX :D
What's the temp of the card?
Regards

Quote:
Radeon HD 2900XT
http://aycu17.webshots.com/image/151...5869153_rs.jpg
http://aycu07.webshots.com/image/164...4250356_rs.jpg
http://aycu28.webshots.com/image/139...9974738_rs.jpg
http://aycu31.webshots.com/image/156...8063797_rs.jpg
http://aycu16.webshots.com/image/171...7714818_rs.jpg
http://aycu16.webshots.com/image/171...3437993_rs.jpg
Radeon HD 2900XT Technical Specifications:
- GPU Clock: 757MHz
- Memory Clock: 1656MHz
- Memory Bus: 512-bit
- Memory Size: 512MB
- Memory Type: GDDR3
- Total Transistor Count: 720 million (the 8800GTX has roughly 675 million)
- ROPs and TMUs: 16 and 32
- RAMDAC: 2×400MHz
- Memory Bandwidth: 105.98 GB/s
- Unified Shader Pipelines: 128
- DirectX and OpenGL: DirectX 10 / Shader Model 4.0, with OpenGL 2.0 support
- FSAA: Smoothvision HD + Adaptive AA
- HDTV, HDMI and HDCP: Yes, HDMI 1.2 compliant, Yes (HDMI modes: 480p, 720p, 1080i)
- PCB: 12-layer
http://www.fx57.net/?p=637
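The quoted 105.98 GB/s follows directly from the 512-bit bus and the 1656MHz effective memory clock. A quick sanity check of that figure (my own arithmetic, not from the spec sheet):

```python
# Memory bandwidth = bus width in bytes x effective memory clock (transfers/s).
# 1656MHz is the effective (double data rate) clock for the GDDR3 here.
bus_width_bits = 512
effective_clock_hz = 1656e6

bandwidth_gb_s = (bus_width_bits / 8) * effective_clock_hz / 1e9
print(f"{bandwidth_gb_s:.2f} GB/s")  # 105.98 GB/s
```

The same formula with the 8800GTX's 384-bit bus at 1800MHz effective gives 86.4 GB/s, which is why the number above stands out.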
757 / 828 MHz (core/mem)? Hmmmm
It says 105 GB/s bandwidth; that's more than the 8800GTX :slobber:
It would appear that the HD 2900XT may be in competition with the GTX. We'll have to wait and see the final reviews. Can ATI supply enough batches to the retail chains to control price gouging?
Also, I would like to know if there is any tangible difference in performance when two different types of PSU are used (regular 6-pin PCIe vs PCIe 2.0). I know this is more for an official review, but it would be interesting to know whether:
PSU A, with two 6-pin PCIe cables, performs better, worse, or the same
AS
PSU B, a PCIe 2.0 unit with two 8-pin PCIe cables.
It's not supposed to matter, but I wonder if the video card draws more power, if it's made available, when a heavy, intensive 3D application is running.
From my understanding (someone help me here):
75 watts from the PCI-E video card slot
75 watts from the PCI-E 6 pin cable
150 watts from the PCI-E 8 pin cable
------------------------------------------------------
300 watts available for use
75 watts from the PCI-E video card slot
75 watts from the PCI-E 6 pin cable
75 watts from the PCI-E 6 pin cable
-----------------------------------------------------
225 watts available for use
Is this correct?
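Those per-connector numbers match the PCIe spec limits, so the two totals above can be tallied like this (a sketch of the spec budget only; the constant names are mine, and actual draw is whatever the card pulls, not these ceilings):

```python
# PCIe power budget limits per the spec (watts); these are ceilings, not measured draw.
SLOT = 75        # PCIe x16 slot itself
SIX_PIN = 75     # each 6-pin PCIe connector
EIGHT_PIN = 150  # each 8-pin PCIe 2.0 connector

budget_8pin = SLOT + SIX_PIN + EIGHT_PIN  # slot + 6-pin + 8-pin
budget_6pin = SLOT + SIX_PIN + SIX_PIN    # slot + two 6-pin

print(budget_8pin)  # 300
print(budget_6pin)  # 225
```

So the 8-pin configuration raises the available budget by 75W; whether the card ever uses that extra headroom is exactly the question being asked above.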
So how much are these going to cost? When are they coming out, and would a 550W power supply be alright for this?
When these come out on May 14th, there may not be sufficient supply to prevent price gouging. In that case, they will be overpriced.
However, after that initial period, I have read that they could be released as low as $399 or as high as $599. I'm not sure which is correct, but I'm hoping for the former :D
And yes, a 550W should be enough to power this bad boy.. perhaps only just, though. It really depends on your other system components.
My 8800GTS, which is a monster in overclocking (675/1566/2160 core/shader/memory), scores 13045 in 3DMark06. Nice to see my card won't be obsolete. But anyway, isn't ATI the company that just improves 3DMark scores and doesn't improve gaming with their drivers? So I don't think it would be fair to compare, since nVidia has stated they stopped optimizing for 3DMark in their drivers. Let's see some gaming comparisons!!!! :slap:
Wait, where did ATi ever say they optimise only for 3DMark?
Do you remember that whole fiasco about nVidia putting driver tweaks for 3DMark into the 79GTX vs X1900 race?
I'm pretty sure both companies do it ;)
That's why the 88GTS can score so well in 3DMark, yet doesn't fare terribly much better than the last-generation X1950XTX. Not by the percentage increase shown in 3DMark, anyway.
03 was nVidia, 05 was ATI, 06 is nVidia. The benchmark seems to balance out with every other release: one card might do better in one version, but in the next release the other company's card does better. If I understood correctly, DX10 changes this, making it much easier for developers to program for, since both companies' cards will share the same main commands instead of vendor-specific ones.
Ahh yes, the ATi cards were always good in 05.
At any rate, I don't let 3dmark be the be all and end all. It's simply another benchmark to help you gauge performance. It's a tool, and a useful one at that.
Anyhow, I am really looking forward to this card. I am probably going to pick one up after the price gouging.
hi
3 new X2900 series cards on the 14th:
-x2900xtx 1gb gddr4
- x2900xt 512mb gddr3
- x2900XL 512mb gddr3
>>> http://www.xtremesystems.org/forums/...&postcount=653
:toast:
Dammit, the 1GB card will be out only for OEMs... daamit!! damned daamit!
you can read about it here: www.fudzilla.com
Quote:
UNFINISHED R600 REVIEW @ VR-ZONE
http://img182.imageshack.us/img182/299/r6000ocfq0.jpg
http://www.vr-zone.com/?i=4946&s=12
Look at these titles:
- Overclocking...
- UnReal Overclocking!!
UNREAL OVERCLOCKING :D
....
The link doesn't seem to work for me.. anyone else?
unfinished R600 Review @ Vr-zone
hehe *hi my name is mr no-eyes* my bad!
It does not work for me either!! Daamit, I wanna see this new card in action!!