look back a few pages and look very hard at the supposed R700 reference cooler. :ROTF:
Again, he directed that quote at the claim that R700 is just CF on-board, like the 3870X2. And again, there's no proof it is or isn't.
A bridge chip is a given anyways if two discrete GPUs are going to communicate - how they communicate, however, is a different question.
exactly what I'm talking about, we have no idea what this new bridge chip will do to how the chips communicate
sure we're talking about the same thing here? "micro-stuttering" is always apparent because of AFR, there is no in between. results differ from game to game but it isn't just some temporary thing--if you're experiencing stuttering during game play, it's caused by something else.
everlast, i think you're confusing micro-stuttering with input lag (responsiveness of your LCD).
some people see it and some don't, at 2560x1600 with 3870 CF i don't
maybe just old eyes
It depends on the game. With my 3870s I experienced micro-stuttering in almost every game, but a few were smooth as silk :up:
I wonder if that has to do with the game or the drivers
Both. Depends on the app. Many issues at play in this situation. Hopefully this gen will fix the problem.
Launch was postponed to 26.6?
I have been hearing June 23...
http://www.tgdaily.com/content/view/37795/135/
I don't understand. Fudzilla reports that the HD4850 runs 78 degrees at 0% activity. I thought the HD4850 used around 10W when idle.
http://www.fudzilla.com/index.php?op...=7701&Itemid=1
http://www.vr-zone.com/articles/Rade...sted/5829.html
625mhz@78 degrees
[QUOTE=biohead;3037610]sure we're talking about the same thing here? "micro-stuttering" is always apparent because of AFR, there is no in between. results differ from game to game but it isn't just some temporary thing--if you're experiencing stuttering during game play, it's caused by something else.
everlast, i think you're confusing micro-stuttering with input lag (responsiveness of your LCD).[/QUOTE]
Aahhh possible. I experienced a noticeable stutter every two minutes or so that was unbearable. I tried the usual suspects like drivers etc and as I stated earlier, simply increasing the refresh rate from 60 to 75Hz made it smooth as duck poo.
If you are experiencing even smaller stutters, may I suggest consuming 2 to 3 beers when playing games; you won't notice them with beer goggles on.:toast::rofl:
I would really like to see 3DMark06 scores for these puppies :D
There's no delay so far.
AMD HD 4850 – RV770 GPU die shot
http://images.bit-tech.net/news_imag...ot/rv770-1.jpg
For more pics:
http://www.bit-tech.net/news/2008/06...gpu-die-shot/1
http://images.bit-tech.net/news_imag...die-shot/5.jpg
Hmm, nice, that should be easy to cool!
Holy :banana::banana::banana::banana: that GPU is huge.
It should be a bit over 250mm^2.
RV670 vs RV770 in-scale:
Attachment 79787
"Yep, that's AMD's forthcoming mid-range GPU, the Radeon HD 4850. It's a single-slot design, and it doesn't look all too dissimilar from the current 3800-series. Our sources state that AMD aren't allowing its partners to adjust the reference design at launch. If indeed that is the case, every Radeon HD 4850 will initially look like the above, with the various partner-provided derivatives arriving later on.
The card provides 512MiB of GDDR3 memory and features dual DVI-I outputs. Unfortunately, GIGABYTE weren't willing to cough-up any further details, despite showing the card. Nonetheless, there's support for CrossFire X and our favourite feature is undoubtedly the sticker that says "FAST"."
for some reason the back side of the PCB looks more complex than the R600 cards, like there is "more" on the card.
source; hexus.net
Same cooler mounting dimensions as HD3850?
If so, HR-03GTs and Accelero S1s will fit fine
most likely the same dealer as the card from the last page, Gigabyte
wasn't there a 4870 pic last week?
this one?
http://media.bestofmicro.com/9/9/105...al/07RV770.jpg
http://media.bestofmicro.com/9/A/105...al/08RV770.jpg
" In 3DMark06 reached the HD 4850, according to "Fudzilla" a score of 11,760, while the Nvidia card compared to 10,800 3DMarks came - two values that only by a factor of 1088 separately. At this point not tested 9800 GTX would probably come around 12,500 points.
Similarly, results in the colleagues of "ITOCP" emerged that the HD 4850 P5847 points in 3DMark Vantage has achieved. At the same system, a 9800 GTX P5816 points obtained. So far so good: It is interesting, however, that the X-mode, in 1920x1200 instead of 1280x1024 and there is also the default 4-times anti-aliasing and 16 times anisotropische filtering active, the score against a 9800 GTX with points instead X2609 X2104 points, 24 percent higher."
translated with google, source; http://64.233.179.104/translate_c?hl...%3Fnews%3D2124
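For what it's worth, the two figures in that mangled translation do check out; a quick sketch using nothing but the scores from the quote:

```python
# Scores as quoted: 3DMark06 and 3DMark Vantage Extreme preset
hd4850_3dm06, gtx_3dm06 = 11760, 10800
hd4850_xmode, gtx_xmode = 2609, 2104

factor = hd4850_3dm06 / gtx_3dm06                        # the "factor of 1.088"
lead_pct = 100 * (hd4850_xmode - gtx_xmode) / gtx_xmode  # the "24 percent"

print(round(factor, 3))   # 1.089
print(round(lead_pct))    # 24
```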
Those are some pretty amazing scores for 1920x1200 4xAA and 16xAF. It shows that it has high shader performance and that AF is fixed (granted, the 9800GTX starts getting hurt at those resolutions and settings).
Either way, sub $200 for a card with 8800Ultra performance would be amazing
So how do you guys think the 4870 will do in comparison with the 280 GTX?
If it beats the 260GTX when OC'd, I'm sold. I'm more into price:performance ratios
So how do you guys think the VW Golf will do in comparison with the Mercedes S-Class?
GTX 280 has to be at least 2x faster.
I want to know about the physics part, is it actually better, how does it work, like will an X2 be crossfire physics?:up:
If you meant PhysX, then the answer is no. Nvidia did a smart thing by swallowing Ageia to get their hands on PhysX and make it available only to Nvidia users.
The only way for you to have PhysX capabilities is to grab one of the old PCI PhysX cards. And again, that is if you meant PHYSX, not PHYSICS.
We are at the point where the fastest card isn't really as important as in days past. Having the fastest of the fastest is becoming more and more a novelty, not a necessity IMO. Because of this, people are looking for more from their video cards than just 100 gimillion frames per second in Crysis. What is becoming more important as video cards perform better is:
-Price
-Power Consumption
-Image Quality
-Feature Set
-Better Performance
-Heat Output
-Reliability
-Overclock Ability
Now this is not saying that better performance isn't welcomed, it is. But the urgency of it isn't as great IMO.
As for the VW vs Mercedes S-Class, with gas at over $4 a gallon the value of such a car has decreased at an increasing rate. Yes, its performance is better than most other cars. Yes, it's more comfortable and popular. However, in this economy vehicles like that are taking a back seat. The same logic that applies to an S-Class applies to anything else.
imo, the performance just needs to be enough that you can play at your max resolution with some detail. Honestly my 6600GT can play quite a few fairly new games on my 24" widescreen, and even the 8800GTS G92 is generally enough to give at least 30 fps for my needs. Honestly, 300 fps looks and feels no different than 61 (as the eye can't detect the difference above 60 anyway), so what matters most to me is minimum framerate (which has shown to be a problem for the 3870X2 and even occasionally the 9800GX2). If either the 4870 or 9900GTX (not GTX 280) can do that for me, then I have no need for anything better
Actually, if the green team is to be believed they are in the process of making the physx API an open one, so that even AMD can implement it in their cards.
Can't find the quotes right now, but this definitely came from the horse's mouth.
No harm to ya but so not true! I'd be surprised if there isn't some old thread on this forum about it, but the human eye does not work in a way that facilitates the statement "can't detect above [number] fps".
Yep, I saw what they were going to do when they bought Ageia so I quickly snagged a used PhysX card on e-bay for cheap.
Wrong.
http://amo.net/NT/02-21-01FPS.html
Good article, pretty accurate. It goes into motion blur which is very important :) An example i like to use is that if you played a game at avg 500fps for a few days, then suddenly played it at 250fps, you would very much notice the difference. Obviously this has to be taken in context, we don't play modern games at that high an avg fps so it's arguably not important, but the point is there is a difference and the whole "anything above 60fps / 100fps cannot be perceived" is wrong.
Perhaps I was wrong (as you said there are tons of people and threads that talk about what I said)
It also must be taken into consideration that only the very best CRT monitors (nearly impossible to find these days) are able to display ~240 FPS, limited by the refresh rate. Although technically a higher FPS than the refresh rate can be shown, there will be very noticeable tearing.
240hz CRT? Man I'd love to have one of those.
Faster LCDs are coming as well. 120Hz are out and samsung has a 240Hz prototype as well.
Let's not go OT over the FPS thing. Yes 61 fps vs 120 fps isn't all that different, but a card that can render 120 fps will have a much higher minimum frame rate, and thus a lower probability of lag in high poly situations (explosions, etc).
As for the 4870, of course it won't beat the GTX 260. If it did, it would be $449 as well. It is, instead, $329 and is meant to compete with the 9800GTX and the 9900GTX (G92b) that will be released this summer as well.
Only the 4870X2 is competing with the GTX 260 and 280. Remember that the D10U architecture will not be available in the mainstream market until it moves to 55nm, which won't happen till 2009, as the G92b's will be released for that segment this summer.
Perkam
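To put rough numbers on the average-vs-minimum point: a toy sketch, with frame times invented purely for illustration (not from any real benchmark):

```python
# Two hypothetical 1-second runs with identical average fps:
steady = [16.7] * 60               # ms per frame, evenly paced
spiky = [10.0] * 54 + [77.0] * 6   # same frame count, occasional long frames

def avg_fps(gaps_ms):
    return 1000.0 * len(gaps_ms) / sum(gaps_ms)

def min_fps(gaps_ms):
    # instantaneous fps during the single worst frame
    return 1000.0 / max(gaps_ms)

print(round(avg_fps(steady)), round(avg_fps(spiky)))   # 60 60
print(round(min_fps(steady)), round(min_fps(spiky)))   # 60 13
```

Same average, very different worst case, which is the lag you notice in explosions and the like.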
a movie in a movie theater is what 24 fps
Source
Quote:
Originally Posted by Wiki
120Hz != 120 FPS. That means the monitor is refreshed at double the rate, and the only visible benefit you will see is when you go to the mirror and find that your eyes are still healthy after hours of using the computer.
So even if they launch a 240Hz model, that only means 4 x 60 = 2 x 120 = 240Hz worth of refreshes, not more frames.
It depends, it varies from 24 to 30 frames per second; with these monitors they could create movies at 60 FPS.
Metroid.
I'm not sure if all movie theaters use it or not. I am sure that Imax-type theaters do. What I've understood 24p to mean is that it complements fast-moving objects with less micro blocking, stuttering, and artificial blur. It's not something you would care about in drama-type movies. It's something you can see here and there in action-type movies. However, it's something you really, really, really want in sports events. This is my understanding of it. This is where places like the AVS forums come into play. They really give you the low down on stuff like this. Check their HDTV and home theater sub-forums on the subject.
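The reason 24 fps film cares about the refresh rate mostly comes down to divisibility; a little sketch (simplified, ignoring interlacing and fancier pulldown processing):

```python
def cadence(refresh_hz, film_fps=24):
    """Describe how a display must repeat 24 fps film frames."""
    repeats = refresh_hz / film_fps
    if repeats == int(repeats):
        return f"each frame shown {int(repeats)}x (even cadence, no judder)"
    lo = int(repeats)
    return f"frames alternate {lo}x/{lo + 1}x (uneven cadence, judder)"

print(cadence(60))    # 60/24 = 2.5 -> classic 3:2 pulldown, uneven cadence
print(cadence(120))   # 120/24 = 5 -> every frame shown 5x, even cadence
```

That's why 120Hz sets can show film without the 3:2 pulldown judder a 60Hz set has.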
"While we are away for Computex, CJ let us know that the Radeon HD 4870 X2 R700 prototype card is out and it beats the NVIDIA GeForce GTX 280 in 3DMark Vantage. The R700 card is basically made up of two RV770 GPUs with 2x256-bit memory interface to either GDDR3 or GDDR5 memories. We asked our sources about R700 in Computex and apparently AMD is going to let AIB partners to decide the specs themselves. Therefore, the partners will set their own clock speeds, PCB design, memory types, cooler solutions etc. and there will be Radeon HD 4850 X2 and 4870 X2 cards differentiate by the memory type. The R700 card apparently doing pretty well at this stage scoring about X5500 in 3DMark Vantage Extreme preset while the GeForce GTX 280 card is scoring X4800."-vr-zone
sorry, posted too late
If the 4870 X2 scales as efficiently as claimed (with the new crossover chip), I could see it outdo the GTX 280. I'm still not passing judgment until I see official benchmarks, however. So much info flying around right now, and it's not even always that consistent.
I want a 4870 so bad! i have cash set aside and all, i guess now all i have to do is watch and wait (and obsessively search e-tailers for them in stock :P) they could release them faster i think :roll:
Can't wait for the 4870X2, goddammit. When does the NDA get lifted??
Multi-GPU is a sack of :banana::banana::banana::banana:, don't let its numbers fool you, the FPS you see in 9800GX2, 3870x2 and any SLI/CF system IS NOT REAL and IS ABSOLUTELY NOT COMPARABLE to a single-GPU system.
Do some research before you blow $600 on a graphics solution.
My guess is R700 has a built-in PCIe 2.0 intercore interface connecting the two chip carriers without any external bridgechips.
That would allow higher efficiency onboard CF.
But still CF onboard using AFR, exactly the same as the 3870X2. Sorry but no. This should help ATI's ego.
"We have a card capable of rendering 500 frames per second. It looks like it's only rendering 250, but who cares, the only thing that matters for reviews is numbers. We win." :down:
to be honest im not really convinced by this whole microstuddering thing. like, it makes sense, but why has this only come up in the last half a year?? why did no one comment about it back in the 6800 Ultra SLI days??? it has all seemed to come out of nowhere, and most of it seems to be focused on the HD3870X2 and Windows Vista. is this new or has it always been a multi-GPU problem? and if it has always been there, why did no one bring it up for years? also why are all the big names (magazines and review sites) not talking about it and still continuing to use SLI and CF systems in their "dream machines" and whatnot.
I can only speak about my CF experience, which is limited to 3870 CF. The cards are dirt cheap, so CF is a possibility. Such a thing was not possible before because of prices. You can now say CF for the masses; it's obvious the more people having it, the more people complaining about its problems. The main problem is not CF or SLI, the problem is AFR and the way it's handled by today's drivers.
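The AFR complaint is easy to see in numbers. A toy model (frame gaps invented for illustration): two GPUs alternating frames can report the same fps as a single GPU while the frames actually arrive in short/long pairs, which is exactly what the counter hides:

```python
# Frame-to-frame gaps in ms over one second, 60 frames in both cases:
afr = [5.0, 28.3] * 30   # alternating short/long gaps (AFR micro-stutter)
single = [16.65] * 60    # one GPU, evenly spaced frames

def reported_fps(gaps_ms):
    return 1000.0 * len(gaps_ms) / sum(gaps_ms)

def felt_fps(gaps_ms):
    # crude model: smoothness is limited by the longest gaps, not the average
    return 1000.0 / max(gaps_ms)

print(round(reported_fps(afr)), round(reported_fps(single)))   # 60 60
print(round(felt_fps(afr)), round(felt_fps(single)))           # 35 60
```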
Anybody here have a Samsung F8 LCD TV and a CF/SLI setup to test and report back?.... :D :yepp:
I'm not sure about this microstuttering thing, so i'll request a Bottom Line here:
If i purchase another 8800GTX for my current system will there be any performance gains or won't there?
Thanks.
http://www.vr-zone.com/articles/Rade..._280/5851.html
Quote:
While we are away for Computex, CJ let us know that the Radeon HD 4870 X2 R700 prototype card is out and it beats the NVIDIA GeForce GTX 280 in 3DMark Vantage. The R700 card is basically made up of two RV770 GPUs with 2x256-bit memory interface to either GDDR3 or GDDR5 memories. We asked our sources about R700 in Computex and apparently AMD is going to let AIB partners to decide the specs themselves. Therefore, the partners will set their own clock speeds, PCB design, memory types, cooler solutions etc. and there will be Radeon HD 4850 X2 and 4870 X2 cards differentiate by the memory type. The R700 card apparently doing pretty well at this stage scoring about X5500 in 3DMark Vantage Extreme preset while the GeForce GTX 280 card is scoring X4800. Both sides are still working hard on optimizing their drivers for the new architecture so probably we will see the performance to improve over time.
I see, perhaps i misunderstood from what i'd read, it seemed like people were saying that although there was an fps increase in programs like fraps there was actually no increase!
So there is an actual juddering of the image? I guess i'll just get it and see for myself.
Check page 59 mate!
Already posted.
And tbh a bit hard to believe.
If true, then it better be reflected in games with AA/AF.
That'll be a miracle and a nice comeback.
The R700 doesn't use software or hardware to do crossfire, it's two beautiful GPUs working in harmony to kick some serious ass! SOO there won't be any of this "micro-studdering"
Very hot for HD4850 :shock2:
Surely it must use hardware if not software? Which if so is good! Doubt anyone knows but what about if you put two of them together? "Micro-studdering" then?
I think this whole "micro-studdering" thing is blown out of proportion anyway. I ran 7800GTXs and 8800GTXs in SLI and I fully plan on running a multi-GPU system with whichever card wins the GPU wars. Micro-studdering be damned! SLI has always proved to be an enjoyable gaming experience for me.
15% faster !!!!!!
maybe AMD's gonna ask $800 for it !!!
I've always noticed the microstutter, but after running multi-GPU for 6 months I'm used to it and it doesn't bug me. Sure it isn't ideal, but I'll still take the increased performance any day. To say multi-GPU doesn't help is completely ignorant. If ATI does manage to remedy this with the 4870X2, or better yet with multi-card crossfire, that would be a push in the right direction.
Erm, it's already confirmed that the R700 has the same basic structure as the 3870X2. We can see from the cooler that it is two separate dies, and it was confirmed that the PCI Express bridge is now 2.0
So it uses similar magic / hardware / software to the 3870X2 :p
Not really. A bridge of some sort is required regardless of whether it's on-board CF or not - the chips have to communicate somehow anyway!
The real question is whether the communication is like CF today or if the GPUs themselves communicate differently - the bridge is just the medium for it to pass through
Ummm... R700 is due out in July/Aug.
Nvidia will need 45/40nm to even think about doing a GX2 and that won't happen for them until sometime in '09.
Overclocked card with the fan and GPU activity at 0%...
As far as I know (and things might have changed in the last ~20 hours, but as of when I went to bed last night), nothing was "confirmed" about R700.
We are not 100% sure that black cooler is R700, not 100% sure about a PLX 2.0 chip, nor 100% sure about it being the same as the 3870X2. (AMD/ATi made a statement saying it was NOT going to be two RV770s on a PCB like the R680 was.)
Now if you have a source verifying any of that please post it.
rv770pro: 625mhz
rv770xt: 750mhz
& there's no shaderclk
3870X2 can beat a 9800GTX all day in benchmarks, but the 9800GTX kicks ass in most games, especially with AA/AF turned on. 3DMark is the most garbage and misleading piece of software ever; I wish people would stop using it, it's not representative of the true performance of the chip.
So @ stock clocks are we looking at a 15-25% improvement from 4850>4870 in all tests? I think I'm gonna pick up an ATI card this time, mainly cuz I play EQ2 and nvidia cards are :banana::banana::banana::banana: for eq2.
Who knows really - supposedly R700's been in the works for some time and R600 was just the testbed for the architecture. Whether or not it actually "works" (it could always have been scrapped for sucking) or was delayed to another generation (after all the delays of R600 for instance) is hard to say. All we do know is that R700's design was rumored quite some time before and R600 had a lot of things in it that weren't fully utilized that people have hinted may have been used in a grander scheme, but nobody knows what's changed since then