eeee???
TMU = texture mapping unit??
It was even in the GeForce 1.
Good Morning..........argh.....stretch
Guess we will have to wait and see before we know for sure.
What did VR-Zone do? Why would it keep numbers from coming out on launch day?
Well, that ruins my Wednesday...
Quote:
Originally Posted by deathman20
What I think they mean by that is ATI cards can use CrossFire on any board. nVidia can't, since they require their board unless you get modified drivers.
Not any board, just any CrossFire-enabled board, such as one with an Intel or ATI chipset. We can't run ATI CrossFire on SLI chipsets. The same is true for Nvidia cards needing an nForce chipset to run SLI; you couldn't run SLI on an Intel or ATI chipset.
All these ppl undermining the cards lol...I seriously need to get some popcorn.
Though I'll be saving that for the Xtreme 3d Section in a couple of days :)
Perkam
As I said, blame VR-Zone for spreading that crap rumor of the date moving. The launch date has been set for well over a month and hasn't changed.
But we saw how well that worked for the 8600 GT and GTS. They have great 3D Mark 06 scores but are terrible in games.
I'm not going to say anything about how it performs, I'm simply saying that 3DMark can paint a shady picture. ;)
I think the 8800 Ultra will perform about the same as the R600 XTX; as before, R600 might win in some apps and the Ultra might win in others. I don't think there will be a huge gap between the two.
Does anyone know if there's supposed to be software voltage control on the HD series cards?
Both are going to be able to do physics, and as with graphics in general, their respective algorithms will likely see better realism in different circumstances with each. The issue is getting to a point where games are able to begin offloading the work to the GPU rather than the CPU, and to experiment with newer algorithms. It'll be a little while before both DX10 and Physics become more of a battle ground for ATI and Nvidia I suspect...
Both GPUs can do physics, but they're both "vaporware" if no one is taking advantage of their capabilities in games... :(
Ageia PhysX has been on the market for a year or more, and how many games really take advantage of its capabilities?
Most games use in-house physics engines...
PhysX has Rainbow Six Vegas, SC: DA, Ghost Recon, Bet on Soldier, Infernal, City of Villains, and a few others. At least there are supporting titles.
Vegas and SC: DA don't mention support, but both install the PhysX driver when installing the game.
Gears of War also uses PhysX tech...although how...is up to you to find out.:lol2:
The new Alpha Prime, which was just released yesterday in English, supports it, as does the new CellFactor: Revolution, which is out next week.
Yeah, the titles are finally starting to roll in... I've had my PhysX card since they were released.
I could see the PhysX card being nice, but it's a slow add-on to the market. I might look into it more this summer. Has anyone with the games out now seen a difference between the 128MB and 256MB models?
It's a nice added feature; surely the GPU will start taking on more of it when Havok starts releasing some of their newer versions as well.
Yes, I know that. I've played the games, but I said really take advantage of the card. In Ghost Recon, for example, there aren't many differences between having a physics card and not having it. For ~$150-200 you should get more than this, not to mention that IMO the screens without physics look nicer, but that's just me...
The game that I hope will take advantage of physics is CellFactor, but it will be released ~1 year after the card was launched...
I have had my card since launch as well. Does anyone even have the 256MB cards? I know they were rumored, but I've talked to no one who actually has one.
Cell Factor is released on the 8th of this month for free. I will surely be putting that game to work, hopefully on a new HD 2900XT if it seems fit enough!
I guess that last sentence was a hint to get back on topic lol.
Is the NDA over yet? Isn't today the launch date?!
All ASUS cards are 256MB (I have ASUS cards), but 128MB is disabled. Apparently the PPU is not capable of addressing that amount of RAM in its current format.
What I'm interested to see is, if the R600 can do physics at the same time as 3D, and a PhysX card can do the same work, which one will get the work? Is the R600 capable of communicating with Ageia's card?
So does the NDA lift tomorrow, or is that just a rumor?
2900 series will be launched today. :D
EDIT: It was reported in VR-Zone but the article is "gone" :o
http://sg.vr-zone.com/?i=4931
I hope you have the cash for the power bill, or don't pay for your electricity. That said, I'm feeling hopeful for the R600 again. Not that I'll be able to afford it, but I'd like to see awesome performance at a lower price and a good battle between the Nvidia and AMD/ATI camps. Really looking forward to some great reading shortly, in the quiet time at work.
I don't know about you, but these R5xx cards are such a headache when working with ATITool. It's probably due to their implementation of the 2D/3D clock switch.
When overclocking and loading perfectly stable clocks/volts, the screen goes garbled for 5 seconds then back to normal, and 10% of the time it will garble and then lock up, forcing a reboot. Is ATITool or the R5xx being a :banana::banana::banana::banana::banana:?
I don't want that with R6xx. I hope they go back to something similar to R4xx, or just improve it a tonne.
Talk about VR-Zone...
You guys seen these slides?
Saw these at VR-Zone and thought I'd put them up here.
I'm going over to an ATI event in a few days... my bet is it will be about these slides heheheh
Quote:
Originally Posted by newzhunter
HIS Radeon HD 2900XT 512MB GDDR3 VIVO PCIe
http://www.hisdigital.com/newimages/...2900XT_250.jpg
http://www.hisdigital.com/html/product_ov.php?id=304
any updates on the price yet?
http://www.fudzilla.com/index.php?op...d=784&Itemid=1
3 games bundled: HL2 Ep2, TF2 & Portal. If it's true then it's a really strong selling point.:)
That's a nice bundle; it's enough to push me over to buy this thing when it comes out.
regards
Quote:
HIS HD 2900XT :
http://i21.photobucket.com/albums/b2...79b190eac6.jpg
- Superscalar unified shader architecture
- 320 stream processing units
- 512-bit 8-channel memory interface
- Comprehensive DirectX® 10 support
- Integrated CrossFire™
- High-speed 128-bit HDR (High Dynamic Range) rendering
- Up to 24x Custom Filter Anti-Aliasing
- ATI Avivo™ HD video and display technology
- Built-in HDMI and 5.1 surround audio
- Dynamic geometry acceleration
- Game physics processing capability
Cables
VIVO Cable
HDTV Output cable
DVI to VGA Dongle x 2
DVI to HDMI Dongle x 1
Crossfire™ Cable x 1
CrossFire cable? Hope that just means the internal connector, not an actual big cable on the outside.
Dang, too bad we can't get it yet.
Is the NDA over or what? No performance update?
Looks like the May 2 NDA was for the 8800 Ultra and not the R600/RV630/RV610.
so we have to wait till the 14th now
Wish the waiting game would end !!!!
the focus on HDR lighting in those slides is interesting
Quote:
HIS Radeon HD 2900XT Information
As we reported to you earlier today, HIS Digital recently updated their website with the online product presentation of their upcoming HIS Radeon HD 2900XT 512MB GDDR3 VIVO PCIe graphics card. It seems like HIS took the website down now but we still have all the pages for your viewing pleasure.
http://www.techpowerup.com/img/07-05...GB_250_thm.jpg
http://aycu34.webshots.com/image/140...7687134_rs.jpg
http://aycu37.webshots.com/image/161...7015708_rs.jpg
http://aycu13.webshots.com/image/150...6225444_rs.jpg
Source: HIS Digital
http://forums.techpowerup.com/showthread.php?t=30504
regards
They have the memory and core clocks blanked out as "--", which is odd. If this was a final product page, why not list those parts of the specs? Perhaps in the next two weeks the cards' BIOSes will be updated with the real and final clocks. Maybe we will see higher than 742/825?
My only question is whether they are going to supply a 6-pin to 8-pin converter with the card, because I really don't want to go out and buy another PSU for this card (yes, I want to have the Overdrive option).
Yes they will. It will come with a 2x Molex to 1x 6-pin PCIe adapter as well as a 2x 6-pin PCIe to 1x 8-pin PCIe adapter.
Wow, they need 2x 6-pin to power a single 8-pin PCIe??? Wow...
Why can't they just make a Molex + 6-pin > 8-pin adapter?
A 6-pin PCIe can provide 75W, so the Molex would only need to provide the other 75W to reach the 150W rating of 8-pin PCIe... and on 12V that's about 6.3A.
Hell, you can power a 6-pin PCIe with 2x Molex... and that's 75/2/12, about 3.1A over each Molex.
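For anyone who wants to sanity-check those amps, here's a quick back-of-the-envelope sketch. It assumes the standard connector ratings (75W for 6-pin PCIe, 150W for 8-pin PCIe) and that the Molex contribution comes entirely off its single +12V wire; it's purely illustrative, not a measurement of what the card actually draws.

Code:
# Back-of-the-envelope current math for the adapter options discussed above.
# Assumes spec ratings: 6-pin PCIe = 75 W, 8-pin PCIe = 150 W, all on +12 V.
PCIE_6PIN_W = 75
PCIE_8PIN_W = 150
V12 = 12.0

# Proposed adapter: Molex + 6-pin -> 8-pin
molex_share_w = PCIE_8PIN_W - PCIE_6PIN_W        # 75 W left for the Molex
print(f"Molex share: {molex_share_w} W = {molex_share_w / V12:.1f} A on +12 V")

# Common adapter: 2x Molex -> 1x 6-pin PCIe
per_molex_w = PCIE_6PIN_W / 2                    # 37.5 W per Molex
print(f"Per Molex: {per_molex_w} W = {per_molex_w / V12:.1f} A on +12 V")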
The adapter appears to be needed just so you can turn on the video card, i.e. to get it running at standard clocks, at 225 watts.
Really? I heard you can have only 2x 6-pin PCIe plugged in and it will still work, but you can't OC with 2x 6-pin (the Inq and many others say you can't... so flame them if that's false).
Without benchmark numbers this thread is a waste of storage and bandwidth.
From what I've read, that one /\
Which is silly, as there is no real power difference between a 6-pin and an 8-pin plug, just an increase in rating. Both have three 12V conductors; the 8-pin just has two extra commons. It wouldn't take much to modify a 6-to-8-pin adapter. In fact, some of the modular PSUs with 8-pin plugs have them running back to 6-pin outlets, with the two common wires jumped over.
http://www.jonnyguru.com/review_details.php?id=103
Quote:
One nice thing about the X3 is that it has native support of the upcoming 8 pin PCI-e standard. No adaptors needed for this baby although as of now the 8 pin cables feed off a 6 pin plug on the unit. A little birdie tells me that Ultra has a redesign in the works which will put the 8 pins on native 8 pin plugs on the PSU body and two of the 6 pin cables will adapt down from them instead of it being the inverse as it is now. Does it make much of a difference? Not really owing to the fact that as of now the extra two pins on the 8 pin plugs are grounds. There's all this hoopla over the 8 pin handling more wattage when in fact there are still three 12V power leads and five grounds rather than three +12V and three grounds. The new standard will eventually implement a 12V sense where the new grounds reside. How this equals more power capacity I really don't understand but I suppose that better minds than mine have decided it will work so that's all that matters.
JonnyGURU has a good thread on his forum if you want more info.
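To put the "it's just a rating bump" point in rough numbers: both plugs carry three +12V conductors, so the copper itself isn't what limits them. A small illustrative sketch, assuming a typical figure of about 8A per terminal (that per-pin number is an assumption; the real limit depends on the terminal type and wire gauge):

Code:
# Illustrative only: the 6-pin -> 8-pin jump is a rating change, not a wiring change.
# Both plugs have three +12 V conductors; ~8 A per terminal is an assumed figure.
V12 = 12.0
pins_12v = 3
amps_per_pin = 8.0                               # assumption, varies by terminal/wire

conductor_capacity_w = pins_12v * amps_per_pin * V12   # ~288 W through the same 3 wires
print(f"~{conductor_capacity_w:.0f} W possible through three 12 V pins")
print("vs. spec ratings: 75 W (6-pin) and 150 W (8-pin)")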
Damn, then you need 6x 6-pin PCIe to power 2x HD 2900 XTX (if you don't have 8-pin PCIe)... I have the Thermaltake 750W PSU, which only has 2 PCIe connectors; might as well get the new 1200W when it comes out.
Some news on real-world pricing :banana:
A local dealer/retailer has the HD 2900 XT 512MB on preorder,
and it's just HUF 85,145 (~$468) with 20% tax - that's $390 without tax! :eek: :banana:
It's lower than the GeForce 8800 GTS 640 (~HUF 90,000), which has been on sale for half a year - and it's still a preorder, meaning it has a price premium :banana:
Not to mention the overpriced/underperforming 8600 GTS at around HUF 50,000.
:woot:
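Just to show the arithmetic behind that net price (the ~182 HUF/USD rate is simply what the poster's own 85,145 HUF ≈ $468 figure implies, not an official rate):

Code:
# Sanity check on the quoted Hungarian preorder price.
price_huf = 85_145
huf_per_usd = price_huf / 468          # ~182, rate implied by the post itself
vat = 0.20                             # Hungarian VAT quoted in the post

gross_usd = price_huf / huf_per_usd    # ~ $468 including tax
net_usd = gross_usd / (1 + vat)        # ~ $390 excluding tax
print(f"gross ~ ${gross_usd:.0f}, net of 20% VAT ~ ${net_usd:.0f}")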
8800 Ultra = $799
Two HD 2900XTs in Crossfire = $799
Wait for the pwnage May 14th.
Heh, yeah, $399.99... two of them is actually $799.98... it's actually cheaper!!!
Back it up a bit.
1. They're moving graphics to 65nm, so power will go down.
2. If a card needs 2x 8-pin and you CrossFire, that's 4; you still don't have enough without adapters.
3. I would think the reason it has three of each is to power three current GPUs. One maybe used for physics? Like the RD600 motherboard, for example.
4. PCIe 2.0 will be out soon and the slot will be 150W. By the time any cards with 2x8 connectors appear (if ever), the new PCIe V2 motherboards will be out and have lots more power.
5. Even if a card needs 2x 8-pin, a quad-core CPU with CrossFire will still be around 725W/60A. Maybe 8-core CrossFire needs a 1200W PSU. Good luck with that!!
Buy what you need, when you need it. The specs change too much to guess that far ahead. Personally I'll be making/buying 6-to-8-pin adapters. What's the combined 12V output for your 750W Thermaltake, 60A? I'd bet that'll be quite happy running 2x 2900 XTs.
Yeah, good point... a QX6700 + 8800 GTX SLI doesn't use more than 550W under load... even a Corsair 620W can power that.
Still, native 8-pin PCIe is nice, and what I was thinking was to get a PSU with as many 8-pin PCIe connectors as possible (and only the 1000W+ PSUs have 2+ of them).
And yeah, it has 60A, but with adapters I will need 4x (2x Molex > 6-pin PCIe) > 2x 8-pin PCIe plus the 2 native 6-pin PCIe to power HD 2900 XTX CrossFire.
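Here's a rough +12V budget for that kind of build. Every per-component wattage below is an assumption pulled loosely from numbers floated in this thread (225W per card at stock, plus guesses for the CPU and the rest), so treat it as a ballpark, not a measurement:

Code:
# Rough +12 V budget for a hypothetical HD 2900-class CrossFire build.
# All wattages are assumptions for illustration, not measured figures.
loads_w = {
    "GPU 1 (stock, per thread figure)": 225,
    "GPU 2 (stock, per thread figure)": 225,
    "Quad-core CPU (overclocked)":      150,
    "Motherboard, drives, fans":         75,
}

total_w = sum(loads_w.values())
amps_12v = total_w / 12.0              # assume everything above draws from +12 V
print(f"total ~ {total_w} W, ~ {amps_12v:.0f} A on +12 V")
# ~675 W / ~56 A: close to the 60 A a 750 W unit supplies, with little OC headroom.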
If you intend to run your PC at stock... sure...
But if you're OCing at the level of the average XS member ("I needs ma QX @ 3.6 Fashizzle!" :p ), then you'll need at least a 750W PC Power & Cooling, Zippy, or Etasis-based Silverstone, or preferably an 850W if you'll be OCing both cards or if you intend to mod or water-cool them.
Perkam
You all are forgetting something. In the conference video AMD had about their new products (the video is somewhere around here), when a reporter asked about power usage, they said (loose quote):
"The R600 is the maximum power card we are most likely going to produce. We are aiming for efficiency similarity to that of how CPUs are decreasing power with each generation."
This was the same conference video in which they accidentally let slip that an R650 would follow the R600 shortly, with 1/3 lower power requirements.
So putting the R600 XT at a power requirement of 250 watts, we can expect the R650 to be around 165 watts @ 65nm.
I don't think we will ever see those killer PSU requirements, and I hope they stick to their guns and provide efficient solutions in the future that run faster and suck up less power. Mainstream buyers are too cheap when buying PSUs anyway; spending over $100 on one is out of the question for most people.
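That ~165W estimate is just the 1/3 reduction applied to the assumed 250W figure; a one-line check (the 250W starting point is the poster's assumption, not an official spec):

Code:
# Check of the estimate above: 1/3 less power than an assumed 250 W R600 XT.
r600_xt_w = 250                        # assumed, per the post above
r650_w = r600_xt_w * (1 - 1/3)         # ~167 W, close to the ~165 W quoted
print(round(r650_w), "W")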
They better... I thought 2x 6-pin PCIe for a single card was a lot... 8800 GTX.
I apologize for not keeping up with the thread...
But when is the release date for ATI's new stuff?
I'm going completely berserk... all this info and it's impossible to tell what's true and what's horse-s***t.
Wow, this thing is going to be dirt cheap for me then when it's available through ATI. When the 1900 and 1950 launched they were both $299. I'm hoping to see this thing around $250. I have a feeling ATI is desperate though, due to all the negative PR lately, so there probably isn't too much margin attached to this card.