Color blind? :p:
MSAA sample readback is possible on DX10.0.
MSAA Z (?) buffer access IIRC is only possible through 10.1 though.
And nVidia's obviously not going to encourage devs to use shader AA in ANY kind of situation ever. So "they're supporting it" is a kinda weak excuse at best.
This was my initial thought at the idea that the 4870X2 has the lead. A dual-chip, single-card Crossfire setup vs. a single-chip, single-card nVidia solution will likely favor the 4870X2.
If nVidia decided to do a dual-chip approach, then I'm not sure what would happen. Though, thinking about it... with the monstrously high power draw and the massive die, it will be difficult for nVidia to go the dual-chip route in this incarnation.
G80 had a smaller die, used less power, and generated less heat, and we didn't see a GX2 card with that core until the G92 at 65nm, so I doubt we'll see GT200 in GX2 form until 40nm at least.
I did read the link and the entire article when it came out.
What Nvidia is actually saying is, we CAN support DX10.1 features by coding drivers to "expose" these features in hardware. In other words, get the same result by exploiting specifics in the hardware. But Nvidia does not have a DX10.1 part.
The entire page needs to be read for proper context. If NV could actually exploit DX10.1 features from a performance standpoint, does anyone actually think they would force the removal of 10.1 support in Assassin's Creed?
Quote:
We know that both G80 and R600 both supported some of the DX10.1 featureset. Our goal at the least has been to determine which, if any, features were added to GT200. We would ideally like to know what DX10.1 specific features GT200 does and does not support, but we'll take what we can get. After asking our question, this is the response we got from NVIDIA Technical Marketing:
"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."
The policy decision that has led us to run into this type of response at every turn is reprehensible. Aside from being blatantly untrue at any level, it leaves us to wonder why we find ourselves even having to respond to this sort of a statement. Let's start with why NVIDIA's official position holds no water and then we'll get on to the bit about what it could mean.
The statement that multisample readback is the only thing some developers are interested in is untrue: cube map arrays come in quite handy for simplifying and accelerating multiple applications. Necessary? No, but useful? Yes. Separate per-MRT blend modes could become useful as deferred shading continues to evolve, and part of what would be great about supporting these features is that they allow developers and researchers to experiment. I get that not many devs will get up in arms about int16 blends, but some DX10.1 features are interesting, and, more to the point, would be even more compelling if both AMD and NVIDIA supported them.
Nope, not me. I haven't believed NV's marketing in years; not since the viral marketing initiative was exposed. Assassin's Creed seals the deal on how NV is holding back progress, IMO.
From techreport:
http://techreport.com/r.x/geforce-gt...ssinscreed.gif
http://techreport.com/articles.x/14934/13
Nah. I'm not buying that. You think Nvidia is just going to stand by while AMD takes the lead until they get to 40nm? No way. The 7950GX2 was a 7900GT sandwich at 90nm; there was no die shrink necessary to make that. Sure, cooling will be a problem, but it always is with the GX2 cards. It's not like Nvidia has to even engineer a new card. They just need to stick two cards together and put them on a single PCIe slot. Nvidia is going to take back their lead no later than spring 2009, and a GX2 card is precisely how they are going to win it back.
Quote:
Originally Posted by zerazax
The fact is you can only get so much out of CF/SLI before diminishing returns kicks in. It's basically a hack introduced by 3DFX. We're lucky it works at all. Now maybe the 4870x2 is going to revolutionize the world of GPUs by changing that. But I'll believe that when I see it. Even in a post 4870x2 world Nvidia is still going to have the advantage in a sense because I doubt the 4870 is going to be faster than a GTX280 at 65nm and it's even less likely when the GTX280 is shrunk down to 55nm. We can all enjoy AMD's victory this summer, but they are going to have to pull out quite a few rabbits if they want to keep it.
Isn't ATI dropping to 45 or 40nm soon, like end of year or Q1 '09?
Yes... 2 fps at a resolution no one plays.
What about this :
http://techreport.com/r.x/geforce-gtx-280/cod4-1680.gif
sorry i play at 25x16
OK, maybe 1% of the world's population will play at 2560x1600 with max details :up:
The 9800 GX2 is still a better card compared to 2900 XT ..
With a large heatpipe cooler like the TRUE or the Scythe Orochi, but engineered to attach to the video card. You have almost the same problem with GTX280 SLI, but I haven't heard of cards melting just from the stock cooling.
Steam shows 88,856 widescreen users, with 19.38% running over 24". Most of that is probably 30" LCDs; the 27" is cheaper, but not by much.
Omg LOL. 21 inches eh??? Are you an uber leet pwnzor as well?
You would be wasting your time (and oh so much money) buying a card to run a game at 90 FPS when the most your monitor will display is 60. This is XtremeSystems; half the ppl here, including myself, play at 19x12 or more.
I love e-fights. :rofl:
Crysis won't play smooth for me at 1680x1050 with Very High settings, so yes, bring on the next generation of video cards. :D
Great performance from the HD 4850, and it seems like a very efficient card, the minimum frame rates are high, even equaling the GTX260 in a couple of games. It beats the 8800Ultra/9800GTX overall.
Can't wait for the 4870 reviews. I think I'm getting an HD 4870 1GB GDDR5, hope they won't cost more than $350 :)
BTW, Sapphire HD Radeon 4850 in stock selling for $189
AMD Radeon HD 4850 512MB Preview - RV770 Discovered
http://www.pcper.com/article.php?aid=579
So this card performs evenly with an 8800 GT and costs up to $50 more. :down:
I've said it before I'll say it again, to all those claiming 8800GT superiority:
4850 > 9800GTX
4850 CF > GTX 280
So if 8800GT > 4850 :p:
then you are saying 8800GT > 9800GTX and
8800GT SLI > GTX 280
Choose your words correctly, this is one fast card, but yes, as is the 8800GT :)
Perkam
wait for cf stutter reports? :D...always looking for the negative.
but anyway 4850cf looks good in the graphs.
http://images.tweaktown.com/imageban...f4ghz_g_04.gif
...not so much this one however.
Point of fact: in the Tweaktown review, the GTX 280 from Zotac is an overclocked card, 700 core / 1400 shaders / 2200 memory instead of 600/1300/2200 :rolleyes:
Wow oh wow X__X
These cards are the budget builder's heaven. I'm now waiting for the micro stuttering of death threads.
The big question now [for me at least] is min FPS: two 4850s vs. one 4870.
Wow, they use a 680i board and an OC'd 8800 GT OCX at 700/1500/2000 (if memory serves me correctly) and a 9800GTX, and the 4850 still bests them at stock clocks! I know of no higher-overclocked 8800GT from BFG; this is their "best of the best". Impressive... Looks like no highly OC'd nv cards are going to hamper the 4800 series so far.
Quote:
CPU
Intel Core 2 Extreme X6800 - Review
Motherboards
EVGA nForce 680i Motherboard - Review
Intel 975XBX Motherboard (for CrossFire testing)
Memory
Corsair TWIN2X2048-8500C4
Hard Drive
Western Digital Raptor 150 GB - Review
Sound Card
Sound Blaster Audigy 2 Value
Video Card
MSI Radeon HD 4850 512MB
BFG 8800 GT 512MB OCX
BFG 9800 GTX 512MB
Video Drivers
NVIDIA Forceware 175.16
AMD Catalyst RV770 Beta
Power Supply
PC Power and Cooling 1000 watt
DirectX Version
DX10 / DX9c
Operating System
Windows Vista Ultimate 64-bit
Their testbed needs a major upgrade ;)
[Good find btw]
Yeah I'm leaning towards the 4870 although I'm not exactly following what you are saying [S/M???]. :D
not sure if it's been posted but...
Whatifgaming 4850 review
Thanks for the link Motiv
Looks like 4xAA is when the card really starts to fly and pulls away from the G92s.
The 4850 performs much better than an 8800 GT at its regular clock rate, and it looks like it beats the GTS 512 in this review.
Quote:
Test System
CPU
AMD Phenom X4 9850 (Running @ 2.6GHz)
Motherboard
MSI K9A2 Platinum (AMD 790FX)
ASUS M3N-HT Deluxe (NVIDIA 780a)
Memory
4GB (2×2GB) Corsair XMS2 DDR2-800 (5-5-5-18)
Looks like edge detect is doing a much better job. Hmm, first time I've seen edge detect reviewed though.
Have ya self another on me
Tweaktown Review
(probably already done!)
Yeah who knows or cares :p:.
How long do we have to wait for the 4870 results again? Oh I'm so excited and it's weirding me out.
5 days I think
Tweaktown's review isn't .. biased .. gasp!
Some 4850 vs 4870 numbers, not sure if they're real though.
http://forum.donanimhaber.com/m_24110474/tm.htm
Under the pic with the charts, it's written that the test results were sent in by Sapphire (translated).
So the 4870 is about 1.2 times better for an extra 100 buckaroos. . . .. Hmm not bad.
Core clocks being 20% higher than the 4850's and performance being 20-30% higher means the improved memory bandwidth is in effect... 4850s will probably be starving for bandwidth with AA and AF on at 1920x1200 and higher resolutions. Shouldn't be a problem for the 4870, though, and it looks like the 4850 is targeted at users in the 1280x1024 to 1680x1050 range, while the 4870 is at the 1920x1200 range and the 4870X2 at us folks with 2560x1600.
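A quick back-of-envelope way to see the bandwidth argument above: if performance scaled on core clock alone, the 4870 would only be as much faster as its clock is higher. This little sketch uses the commonly cited reference clocks (625 MHz for the 4850, 750 MHz for the 4870, an assumption on my part, not from any review in this thread):

```python
def clock_scaling(base_clock, new_clock):
    """Expected speedup if performance were purely core-clock bound."""
    return new_clock / base_clock

# Assumed reference core clocks: 4850 at 625 MHz, 4870 at 750 MHz.
expected = clock_scaling(625, 750)          # 1.2x from clocks alone
observed_low, observed_high = 1.20, 1.30    # the 20-30% gains quoted above

print(f"clock-only scaling: {expected:.2f}x")
print(f"observed range: {observed_low:.2f}x to {observed_high:.2f}x")
# Any observed gain above ~1.2x can't come from the core clock,
# which points at the GDDR5 bandwidth doing real work,
# especially with AA at high resolutions.
```

So the top of the observed range (1.3x) exceeding the 1.2x clock ratio is exactly the bandwidth signature the post describes.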
Dual slot cooler, dual 6 pin power connectors, and higher volts on the core should mean high overclocks
Only a hit from 9x to 8x? I thought there were performance INCREASES!
;) :D
Yeah no more ATI AA/AF hits of death :D
OK, it was one thing for ATI to 'fix' the AA/AF hit, but now the performance hit is actually less than nVidia's.
If I remember correctly, before R600 that was ATI's strongest point: they didn't take much of a performance hit when AA/AF were enabled. But if I'm wrong, someone correct me :)
TechPowerUp review is up
http://www.techpowerup.com/reviews/P...HD_4850/1.html
Just you :p:
Oh snap! Now me too.
Not getting the images either :(
Nvm, pls look at the note guys :D
PLEASE NOTE: This review is not finished, the benchmark numbers are missing. Since the NDA is gone I thought I'd give you at least a sneak peek of the rest of the card. Keep checking back for the benchmarks today. All the information that is included in the review so far is final though.
Look at the final score!
9.X LOOOOOL, without benchmarks :P
Not that I'm dissing them or anything :)
Man, it seems the launch of the HD4xxx was a total PR flop... the card performs, is already available, and yet no decent review from the big review sites is online?
AMD fails at marketing... but a good thing they don't fail at performance this time. :up:
http://i3.techpowerup.com/reviews/Po...mages/temp.gif
Hopefully we can crank up the fan speed a little using ATI Tool or RT.
Temperatures are nothing to worry about
Stock profiles are set for quietness, not for best temperature. The fan usually won't spin up unless it's absolutely necessary.
I think it was JimmyZ who noted this as well and re-applied thermal paste and received lower temps (from what I remember). Also, I am not sure if the drivers are mature enough to allow for a lower 2d clock. If not that would explain the higher temps as well.
So what do the people whining over 8800GT cards say now? I remember a lot of uneducated "90C!!!" comments.
The point is the 4850, like the 8800GT, runs in ultra-quiet mode even under load, and only speeds up if the temperature reaches critical.
My 8800GT still hasn't run above 30% fan speed.
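The "quiet until critical" behavior described above amounts to a simple threshold fan curve: sit at a low duty cycle, then ramp only once the GPU gets hot. Here's a toy sketch of that policy; all thresholds and duty cycles are made-up illustrative numbers, not actual ATI or NVIDIA firmware values:

```python
def fan_speed(temp_c):
    """Return fan duty cycle (%) for a given GPU temperature.
    Thresholds are hypothetical, chosen only to illustrate the policy."""
    if temp_c < 80:
        return 30            # stay at the quiet baseline, even under load
    if temp_c < 95:
        # ramp linearly between the quiet floor and full speed
        return 30 + (temp_c - 80) * (100 - 30) / (95 - 80)
    return 100               # critical: full speed

for t in (50, 75, 88, 100):
    print(t, "C ->", round(fan_speed(t)), "%")
```

This is why the cards idle (and even game) at alarming-looking temperatures: the profile deliberately trades heat for noise until the critical threshold forces a ramp.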
I have no idea whether you were criticizing the 4850 or the 8800GT in your last post, or whether you were criticizing them for overheating or for the ability to withstand higher temperatures. Remember the architectures are different, so you can't naively use the term "old days" to assume that certain GPUs are bound to run 100C+ temps 24/7.
Perkam
All this talk about GPU Durability, and yet, if you took a vote to see how many people actually take the time to replace the stock compound that comes with the card, you would be very surprised to see how many people just don't (especially air users).
@Shintai, kk got your point. Ty :)
Perkam
But there are a lot of people buying cards with non-reference cooling, direct from the factory :). That's what I'm thinking of doing :).
If the 4850 dies off like a certain batch of 8800GTs do, we'll talk.
Currently, they don't.
another review pops up...
http://www.legionhardware.com/document.php?id=755
http://www.hexus.net/content/item.php?item=13919
We've lots more to add, including CAT 8.6 numbers for the HD3870 and HD3850 (which should be done today), as well as 9800GTX numbers on our midrange tests (tomorrow morning). Results are interesting to say the least (HD3870 scoring higher than HD4850 in crysis:confused:) - all the ATi provided benches are run at higher settings where the HD4850 pulls ahead, but it seems at more modest settings the HD4850 isn't much of an improvement.
I'm guessing those idle temps will drop even further once the power management stuff is implemented correctly. Didn't ATI say its idle power consumption was apparently < 25W?
Those Hexus numbers look very odd indeed; in fact they are almost the exact opposite of pretty much every other review so far.
Hexus and PCPer are getting incredibly predictable. *eyeroll*
AMDzone has put out their take on 4850:
http://amdzone.com/index.php/reviews...radeon-hd-4850
An interesting detail is the improvement list for the RV770 architecture (a long one).
Also interesting are the last 4 rows in the list:
Edit: I agree with other people, HEXUS' results are very odd and in contrast with other reviews...
Quote:
ATI CrossFireX Multi-GPU Technology
Scale up rendering performance and image quality with two GPUs
Integrated compositing engine
High performance dual channel bridge interconnect
They are indeed unexpected - I'm guessing the HD4850 doesn't shine at medium settings/no AA/low-to-medium resolutions. If we tested the card as a high-end card (with higher resolutions, details maxed, etc.) the results might look different, but as it's a sub-£150 card we're using our mid-range benches.
In the near future we're going to look at redoing our benchmark suites to account for the performance levels expected from a particular segment of the market - as clearly some people looking at the HD4850 would like to see some 1920x1200, high AA numbers etc.
I think I'll get a 4850 :)
It sounds crazy, but by the reviews 4850 > 9800GTX :\
The price range is good too, so IMHO it might be one of the best-selling cards :o
Wait for Anands, Xbit, Tech Report, and Hard OCP before making a conclusion.
I, myself, would add Firingsquad reviews, but that's just me. Hardware Canucks has recently got on the bandwagon by doing their reviews Techpowerup style with overclocked CPU and the latest hardware (3.5Ghz Q6600 + 4GB DDR3 to be exact :eek: ) :)
Perkam
163€ in Spain. Peanuts :yepp:
My mind was made up months ago :P Having been using a 2900XT since launch and skipping the 3x00 range, it's time for an upgrade :P The rest of the system is fine (Q6600@3.8GHz, 4GB etc.); I just need a new gfx card to really make my 24" monitor worthwhile.
Looking like it's going to be the 4870. I would love Crossfire, but my mobo doesn't support it :( I could always wait it out till the 4870X2 though.
http://www.techpowerup.com/reviews/Powercolor/HD_4850/
http://www.techpowerup.com/reviews/MSI/HD_4850/
HD4850 beats the 9800gtx in most benchmarks, and especially at higher resolutions and with AA/AF.
All using 8.6 drivers.
http://www.computerbase.de/artikel/h...ting_qualitaet
German one, with AA in most tests. Even approaches or beats GTX260 in some games.
Hope these haven't been already posted:
http://www.pcgameshardware.de/aid,64...enchmark-Test/
(A great review that uses Catalyst 8.6 for the most part and overclocks the 4850.) Amazingly, at points the single 4850 beats even the GTX 280!!!! Here are a couple slides where the 4850 just spanks the 280:
http://www.pcgameshardware.de/aid,64...&show=original
http://www.pcgameshardware.de/aid,64...&show=original
Another review (although not as good) here:
http://www.computerbase.de/artikel/h...schnitt_crysis
Again don't know if these have been posted.
I couldn't care less if they overheat. If anything, it comes down to the buyer's pick of companies' warranties. And that is why EVGA has the best in the business right now, covering any kind of damage. <3 They gotta make ATI cards!
Hurrah! They actually use Qarl's Texture Pack 3 for their Oblivion tests - a good memory subsystem test, as it can easily load more than 500 MB of textures in one scene. The 4850 sure can hang with the GTX 280 in Oblivion.
http://www.pcgameshardware.de/aid,64...iew=yes&page=3