@ Hornet: That means after GT200, but that still says absolutely nothing about the specific launch date. Besides, December of 2011 is also after August, so once again we still know just about nothing
I'm not talking about stock Vgpu.
I believe you also know that 8800GT has lower stock Vgpu than 8800GTS512 and 9800GTX.
Nothing also dictates that nVIDIA will be using the same stock voltage they're using now with the 8800GT ( which is rather low ).
ATi dominates the GDDR4 supply channel ( and not only the channel ).
I said that nVIDIA will never use GDDR4, never said they won't be using GDDR5.
The yields weren't that bad, and there was also another reason to "stop" the G92 flow when they introduced the 8800GT 512MB ;)
800MHz core is very possible, and unless they keep the voltages really low to maybe allow single-slot cooling, that's what it'll be.
And there'll be more headroom as well; they'd never push the chips close to their limits, not even on the pre-overclocked vendor editions.
The 1GB of RAM already upped the cost, and using GDDR5 would push it even higher, so they decided to go with a wider bus and GDDR3 instead ( "GT200" ) - rough bandwidth math below.
Like I said, after "GT200" and more than likely in August.
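On the wider-bus + GDDR3 point above, here's a rough back-of-envelope sketch of why that trade-off can make sense. The transfer rates below are just numbers I picked for illustration, not confirmed specs for "GT200" or anything else:
Code:
# Peak theoretical memory bandwidth = effective transfer rate (MT/s) x bus width (bits) / 8
# The transfer rates below are illustrative guesses, not confirmed specs.

def bandwidth_gb_s(transfer_mt_s, bus_bits):
    """Peak theoretical bandwidth in GB/s."""
    return transfer_mt_s * 1e6 * bus_bits / 8 / 1e9

# Wide-bus GDDR3 setup (hypothetical "GT200"-style): 2200 MT/s on a 512-bit bus
print(bandwidth_gb_s(2200, 512))   # ~140.8 GB/s

# Narrower GDDR5 setup: 3600 MT/s on a 256-bit bus
print(bandwidth_gb_s(3600, 256))   # ~115.2 GB/s
If those kinds of numbers are anywhere near right, a fat GDDR3 bus can match or beat narrower GDDR5 on raw bandwidth; the trade-off is a more expensive PCB and package versus more expensive memory.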
http://www.hardware.info/en-US/news/...launch_HD4800/
Looks like no shared memory controller, it's still going to be 256-bit x2.
Quote:
The final card in the HD4800 line-up will be the Radeon HD4870 X2, which will have 512MB GDDR5 memory per core, busting earlier myths on shared memory.
good, finally this rumor is dead (for sure now).
if the X2 has 512MB per core then it looks like I'll be going for a 4870 1GB * 2 :D
Why don't you just stop with it and wait until the REAL benchmarks come???
NV and ATI can burn in hell with their GX2 and X2 crap.
Of course it is a great card and it has excellent performance... when it works properly. It's the same with CF or SLI. The drivers you call good are good only for certain games; if your usual games are on that list, GX2 or X2 will kill any other card. But if they're not... and I don't know why the games I usually play have close to 0 (if not 0) support in your "good drivers", so I call them crap drivers and a crap card. I feel kinda insulted when my $600 card does weird things or doesn't work at all in the games I want to play, and you?
Until you hear it from the horse's mouth, it's all horse crap! A few more weeks and the world will know.
Well I've always said there had to be some bridge for the two to talk to each other, since putting the two dies on one chip just wasn't going to happen, not this soon at least.
Also, the memory config doesn't tell us whether the memory is independent per core or shared.
Anyways, ATI kept the internal ring bus for both R600 and RV670, and I highly doubt they're leaving it around just to waste resources (in which case, if they are, people need to be fired) - I expect it to be put to use one way or another.
Anyways, with all the hush hush about R700 (that "leaked" AIB notice, whether fake or not, didn't even mention R700 specs), concluding decisively one way or another on what little evidence we have is pointless. We'll know soon enough.
Well, looks like we asked for pics of the 48xx's and we shall receive (need translator!):
http://www.tomshardware.tw/549,news-549.html
Some tidbits:
-Second generation 55nm process
-TeraScale engine (?)
-GPU can do physics, ray tracing, AI, stream computing
-2 x 6-pin PCIE connectors for the 4870 SKU
Too bad no other specs but the cards themselves look good!
The 4870 looks really like a sports car ;p
Tomshardware got the real pictures of 4870 and 4850
http://www.pczilla.net/en/post/23.html
AMD castrated the speed of RV770Pro (2008-5-26 18:36:25)
An AIB graphics card maker told us that AMD has castrated the core speed of the RV770Pro.
We've reported that the core of the RV770Pro graphics chip can reach 990MHz (http://www.pczilla.net/en/post/4.html), but AMD has now set the retail speed of the RV770Pro at 625MHz. The AIB graphics card maker told us this is a marketing strategy, meant to create a performance gap between the RV770Pro, RV770XT and HD3870.
According to AMD's reference design documents seen by AIB graphics card makers, the RV770Pro's design frequency goes beyond 900 MHz; in other words, 900 MHz is the frequency the chip was designed for, and the 625 MHz retail clock is clearly downclocked from that.
Because of the PowerPlay and energy-saving features integrated in the RV770Pro, in some cases the core speed of the Radeon HD 4850 will even drop below 400 MHz.
The Radeon HD 4850 will be equipped with 512 MB of 1ns GDDR3 memory. If its core ran at 900MHz it would clearly outperform the current Radeon HD3870, so AMD has limited the first batch of reference Radeon HD4850 cards and will not allow AIBs to ship factory-overclocked HD4850s. This decision clearly positions the HD4850 as the cost-effective product of this summer.
http://www.pczilla.net/en/post/24.html
Looks like the 4850 is the price/performance card to get lol. So AIBs can't overclock the card... but we can do it for sure! 900MHz + 480sp for $240 might be gold!
Thing is, the 4850 might actually be starving for bandwidth unlike the R600, as we're talking nearly 500 shaders here, so you'd either need some uber-fast GDDR3, or GDDR4/5 (hopefully AIBs will release GDDR5 versions the same way there are GDDR3 3870s)
Yeah, this time the card might starve for bandwidth. I think AMD has to find the right balance between the performance of the GPU and its bandwidth.
So the 4850 should handle at least a 50% OC on the GPU, nice! This will give it a very nice performance edge over the 3870.
That's easier said than done: you don't want too little bandwidth, but you're wasting resources with too much. It's not as easy as you think to get right. Better to offer more than necessary than not enough imo
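To put some very rough numbers on the bandwidth-per-shader worry (a sketch using the 3870's memory config plus the rumoured 4850 figures from this thread, so treat it as speculation, not fact):
Code:
# Back-of-envelope peak bandwidth per shader. HD 3870 numbers are its memory
# config; the HD 4850 figures (1ns GDDR3, ~480 SPs) are just thread rumours.

def bandwidth_gb_s(transfer_mt_s, bus_bits):
    return transfer_mt_s * 1e6 * bus_bits / 8 / 1e9

hd3870_bw = bandwidth_gb_s(2250, 256)   # GDDR4 @ 2250 MT/s, 256-bit -> ~72 GB/s
hd4850_bw = bandwidth_gb_s(2000, 256)   # 1ns GDDR3 (~2000 MT/s), 256-bit -> ~64 GB/s

print(hd3870_bw / 320)   # ~0.23 GB/s per shader (320 SPs)
print(hd4850_bw / 480)   # ~0.13 GB/s per shader (rumoured 480 SPs)
If those rumours hold, each shader gets roughly 40% less bandwidth than on the 3870, which is exactly where the "starving" worry comes from.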
So Sexy!!!!:up:
1ghz core? :shocked:
I think ATI is intentionally starving the 4850 this time so it is indeed the Pro equivalent to the XT (like X1950PRO vs. X1950XT). The 3850's seemed to be basically lower-binned 3870s, but this time around the 4850's are underclocked and starved intentionally to create a larger difference (to justify the $100 price difference at least). Nevertheless, overclocking can win the day again :)
who leaves things stock here?
Quote:
we’ve reported that the core speed of RV770Pro graphics chip can be reached at 990MHz (http://www.pczilla.net/en/post/4.html).
And that's the RV770PRO GPU... if the RV770XT is the higher-binned one, a true 1GHz+ stock on air could just happen
:up:
Quote:
First ATI Radeon HD 4870 and HD 4850 Pictures Emerge
The Taiwanese website version of Tom's Hardware is hosting a brand new first look at ATI's Radeon HD 4800 series cards. Unfortunately, if you want to read something more than the posted pictures and the "Redefine HD Gaming" marketing charts, you'll need someone or Google to translate from Taiwanese for you, as you might have already guessed. What's interesting are the pictures below, that show the upcoming cards in all their glory. They also confirm our previous story about the cooling solutions that will be used on both cards. The first card pictured is ATI Radeon HD 4870, the second one is Radeon HD 4850.
http://www.techpowerup.com/img/08-05-26/07RV770_thm.jpg http://www.techpowerup.com/img/08-05-26/08RV770_thm.jpg http://www.techpowerup.com/img/08-05-26/10RV770_thm.jpg http://www.techpowerup.com/img/08-05-26/11RV770_thm.jpg
Tom's Hardware.tw
http://forums.techpowerup.com/showpo...44&postcount=1
is this true:
http://www.fudzilla.com/index.php?op...=7511&Itemid=1
Quote:
RV770XT to challenge 9800GTX
??? man this sounds bad for RV770
Anyone notice that the Radeon HD 4000 series supports physics and ray tracing?
http://media.bestofmicro.com/9/J/105...al/05RV770.jpg
They also stated three times that the RV770 would come with a 512-bit bus. I would take that with a grain of salt.
Any gpu supporting GPGPU technically supports physics and ray-tracing. The R600 could technically do the same, as could the G80. It's just that with intel making such a big stink about Ray Tracing, you're going to end up hearing about gpus already being able to process R.T.
Hard economic times - bad for over sized chips. :D
(Throw out those SUVs!)
:rofl:
I don't know why ATI doesn't invest more money in a better single chip, instead of little updates to older tech... I know they are going for price/performance, but I'd really like to see them come out hard with a great chip, something like the 9700/9800 PRO days... that was a monster card, but since the 9800 PRO, ATi has not been the same.
with what money? they are struggling to survive with sensible architectures instead of grasping for the performance crown.
AMD has fallen hard with the rather complex R600 and I guess they've learned their lesson. now they go for less complex GPUs with smaller structures. imho, that's quite clever.
Fud has no clue about ATI hardware, much like the Inq has no clue about Nvidia hardware
lol@ the 512-bit referenced 3 times
Whoever is feeding Fud his info sure has him good
I bet he was thinking of the rv770pro, the xt will be at least 10-15% faster than the 9800gtx
Considering the RV770XT is MSRP'd at $349 from more than a few sources, I'd say Fud has no clue what he's talking about yet again
Will the 4800 series actually beat the GTX260/280? It's really hard to say right now. But I have to ask something: are we still living in a day/age where we must OC our top-end video cards to get acceptable frame rates in current games? I honestly believe we are in a day/age where OC'ing high-end video cards appears to be more of a luxury than a necessity when playing games. Therefore, what are you really gaining as a customer if you don't have to OC to play your games? Yes, there will be 3DMark and Vantage, and it's fun to benchmark IMO. But if you are a gamer at heart (and some of you are), this gen of video cards will only raise the standard of performance, so there is little incentive to upgrade from a performance POV.
if you're taking that view on things, why overclock at all?
it's the bang-for-buck factor of buying a less powerful graphics card and pushing it to beat something costing a lot more.
cards like the 9800 and 3870(x2)? will handle anything you throw at them today, except maybe Crysis at crazy settings or games with 4X+ AA?
:eek: piccies!
http://www.techpowerup.com/img/08-05-26/07RV770_thm.jpg
not far away now.
The situation isn't as black and white as to overclock or not. The situation is that today's high-end video cards don't require the same OC attention as they once did. OC'ing now is a luxury, not a necessity, to play games at acceptable frame rates on a typical-size monitor. :yepp:
Well the bigger issue is that games in general haven't come out that have made it worth it as well as the fact that technology has outpaced games in general.
I remember when FEAR came out and only SLI rigs could really play it maxed out. Then a single G80 crushed it utterly.
Granted though, I run at 2560x1600 and so having a top of the line rig, including multi-GPU will potentially help. But at the same time, seeing how the next generations multi-GPU configs aren't so appetizing (the high TDP's of the GTX280's and the joke SLI boards out there, and potential performance crown issues with ATI + CF) means I'm in a bit of a rut and will certainly wait and see for which rig becomes the best.
i assume 4xxx will beat 9800gtx.
so, what do nvidia have to offer in the same price range :)?
oh yeah, the 9800GT / G92 revision...? no crazy GTX280 for me, but waiting is prudent; have all cards on the table so to speak before choosing.
i'll go the super shrunk version of gtx280 in a year or 3, or whenever it happens :hehe:
And the amount of corruption issues and instability the 790i's have had on this forum (granted, they get pushed hard) isn't exactly appealing to me
well if the 4870 can perform about on par with the GTX 260 in Crysis, for $100 less MSRP, good enough for me,
i am getting tired of having to buy nv's boards for SLI... can't wait for the 4870's benchmarks
probably not going to happen though; that would mean the 4870X2 would be miles ahead of the GTX 280, which doesn't happen when you're playing catch-up
considering how nvidia is trying to reduce the GPU's dependence on the CPU for better graphics, I doubt the GTX 2xx will scale as well as the ATI cards with a faster processor.
LOL from the same website it says $329 for the XT:
http://www.fudzilla.com/index.php?op...=7520&Itemid=1
:shrug:
I always thought FUD was pro-ati.
Anyways, at least in terms of Crysis, it's actually possible that the 4870 merely matches 9800GTX performance, since current rumors put it at 20 percent faster than the 9800GTX. However, that's probably an average, and Crysis is one game where it has a lot of catching up to do.
http://www.tomshardware.com/reviews/...ew,1800-7.html
Crysis has always run better on NV hardware than ATI hardware. In some cases the lead of the 9800GTX is 50 percent. So a lot of people thinking the 4870 has to perform better than the 9800GTX shouldn't feel so secure about that. It has a lot of catching up to do.
This gives us another lesson: don't base a purchase of this card strictly on Crysis performance, because it's not necessarily a reflection of everything else.
Radeon HD 4870 GDDR4 ES On Sale Now!
http://www.pczilla.net/en/post/25.html
There's nothing that can really be done at the present time to make a GPU less CPU-dependent or scale better with a faster CPU ( unless we're talking about a card that's 8 times as powerful as the 8800 Ultra was ).
In real-life gaming ( normal resolutions with AA/AF ) the CPU plays only a tiny part, and it'll be the same story with the GPUs already on sale.
I remember GX2 in SLI needed a 5-6Ghz Core 2 before really starting to show its power. I wonder what GTX280 SLi/TriSLI/4870 Tri/Quadfire will need.
hmmm, maybe a 5.5 ghz skulltrail setup:D
....it all depends on the game, the settings used and the speed of the CPU + components. You CANNOT say that "you need a 5 GHz CPU to make the gfx card show its power".
Read that article too, but we would need a total reprogramming of games, and multiple quad-cores would be required to run games like e.g. Quake 4 at 60FPS...
I think what Shintai suggests is that the cards only show their true colours at extreme resolutions and detail, aided by a hefty CPU clock... at low resolutions (e.g. 1280x1024) these cards don't have to sweat much, but at extremely high ones the whole subsystem has to keep up...
Modern high-end cards scale well with extra CPU power... for me it's worthless to spend 500 dollars on a gfx card and bundle it with a Core 2 Duo at 2.4GHz or an AMD X2 4800... a gaming system is not solely based on GPU or CPU, but on a mix of CPU/GPU/RAM/MOBO/HD....
And sorry, if you don't feel any improvement going from a 2.8GHz Opteron to an E8400, something else is bottlenecking your system.... I even felt an improvement going from a 2.6GHz Opteron to a 2.8GHz E6300... and not only in benchmarks...
Quote:
4870X2 to be Released in Q2 2008, Will be Cheaper Than GTX280
I'm sure all of you guys know all the goodies that will be coming with the launch of the next series of high-end cards from AMD. In case you need a recap of the awesomeness headed our way, please check out the source link. Anyways, enough propaganda. For those of you who are interested in buying the latest high-end offerings from AMD, the current rumor says that we should see new cards in August 2008 for a price far below the current rumored prices of the high-end NVIDIA cards. NVIDIA cards have traditionally been more expensive, and the GTX280 is supposedly going to cost nerds somewhere in the ballpark of $500USD. There is no confirmed pricepoint for the new cards from AMD. However, we can be fairly confident that it will be quite a bit less than $500USD. Another interesting rumor to note is that there will be a much larger difference between the HD4870 and the HD4850, as opposed to the comparatively small difference between the HD3870 and the HD3850.
Nordic Hardware
http://www.techpowerup.com/61300/487...an_GTX280.html
regards
Since price usually mimics performance, it means the 4870X2 will most likely be slower than the GTX280
Where have you seen prices on the 4870X2? The 4870 will most likely go for around $300-$350 and the X2 will be 60%-80% more, so I think it's reasonable to expect the 4870X2 around $480-$630.
Some new slides from Toms:
http://www.tomshardware.com/cn/382,news-382.html
Try the same game at 3.6 or even 4GHz and you will tell the difference with one of the high-end cards... I never play above 1600 res as my monitor doesn't support it... there will always be a res and detail setting where the GPU caps out... even with 10GHz of CPU power... but all these cards should handle games fine at 1600x1200... and I'm pretty sure at this res CPU power is still important in many games... (plus it increases the overall performance of the subsystem too)
Don't talk to me about bottlenecks: I hate them :rofl:
I still mostly play online games at only 1280x1024, and generally the faster my CPU goes the better the game runs (I didn't say higher fps, but the fps stay way more stable), which is crucial for online shooters... and to keep my feel...
I was more referring to the person that didn't notice any improvement going from the Opteron to the Core 2 at the same clocks... I think his vid card was the big bottleneck there, as I noticed a big leap forward even with my 7800GT in those days...
What's with the 512 MB in the tomshardware slides?
I would rather the price be $349 with 1GB than this $329 with 512MB crap.
RV770 vs G92
Quote:
Originally Posted by Mascaras
Yes it is, Crysis is far from being the only game that eats all those 512MB, even without AA. All the reviews out there comparing the same models with 512 and 1024 only focus on FPS, not on the slowdowns when loading new areas caused by insufficient graphics memory.
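Just to give a rough feel for how fast raw buffer memory alone adds up at higher resolutions with MSAA (textures, extra render targets and driver overhead come on top and usually dominate, so this is only a lower bound, not a measured figure):
Code:
# Rough size of the colour + depth buffers at 1920x1200 with 4x MSAA, plus a
# resolved back buffer. Ignores textures and driver overhead, which dominate.

def buffer_mb(width, height, bytes_per_pixel, samples=1):
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

w, h = 1920, 1200
color   = buffer_mb(w, h, 4, samples=4)   # 32-bit colour, 4x MSAA
depth   = buffer_mb(w, h, 4, samples=4)   # 32-bit depth/stencil, 4x MSAA
resolve = buffer_mb(w, h, 4)              # resolved back buffer

print(round(color + depth + resolve))     # ~79 MB before a single texture is loaded
Add double buffering, HDR formats or more AA samples and that climbs quickly, which is why a 512MB card starts hitching when new areas stream in.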
Does anybody know if it is possible to run dual 3870X2s under windows XP/XP x64?
I have seen 1680*1050 show problems with 512MB when AA and AF are being used.
With 1920*1200 becoming more and more common as 24" monitors become more affordable (got one this Christmas), 1GB will become more and more needed.
I don't have Vista 64 yet, but is it a greater video memory hog than Vista 32 when playing games in 64-bit mode?
Edit: anyways, I looked at another slide and it looks like the 1GB GDDR5 version will show up, but I can imagine a price of $399 if the 512MB version is $329. At that price it looks way too expensive, as it is way too close to the GTX 260.
It also looks like the 1GB version will be showing up later. This is a bad idea, because by then the price gouging on the GTX will be over and it will no longer have as much of a price advantage.