Great card. Totally agree that 1gb is what it should have been at the start, but better late than never.
Competition rules.
With each new driver, more and more code optimization is done, so I expect even better scores with upcoming Catalysts for 1 GB cards, especially because ATI is determined to establish 1 GB cards as the choice of performance users.
P.S.
Man, you have the best avatar on the forum ;)
I would have looked for the link, but I posted just before I went to work so I didn't have the time.
When I came back, somebody had already posted it; I don't see a big issue :S
Unless I am missing something: everyone keeps saying the 1GB is the way to go, but this is in the context of gaming higher than 1920x1200, correct? Even looking at the chart for GRID, there isn't even a 1 FPS difference between the 4870 1GB and the 4870 512MB at 1920x1200. It still seems that, depending on your viewing limitations, the choice of card can be determined by the gaming resolution. This is, of course, XtremeSystems, so maybe going overboard is the standard here :D
How you could come to the conclusion that this is trolling is beyond me.
I already knew beforehand that the sideport was not active.
Example:
Basically, both parties should put links forward to back up their claims when they make them, because too often, when something doesn't follow the expected line, it would have been read from some credible source and not just from someone waffling on a forum, whereas the common expected line would have come from memory of what was originally expected from the product.
So if I say the sideport is not active yet, it is likely that I read it from a credible source, because we all expected things to be fully functional; I would have no reason to say otherwise unless I had read it somewhere credible, just like Earzz, and I can't remember the exact link either.
You assume the sideport is functioning purely because it was an originally advertised feature, with no confirmation that it is actually working. But asking me for proof that the sideport is not active amounts to claiming that it is, as if you know and have read it to be so, when in fact you have not at all... so I'm just saying you should also have a link to back up your claims, not just ask the other person for one.
It always seems to be the ones who put in the work and effort to really know what's going on, when things are not what they seem, who then have to put in even more work to re-find the links for the people who didn't put in the work in the first place.
Too often, "I heard" becomes enough for many to take things as given and spread them.
BTW, I'm not suggesting this is how you are, as no one knows everything.
The fact is, most people would not bookmark such a link and would likely have to Google to find it again anyway, which the people who didn't know could do themselves to see whether it's fact or not. So all that the people asking for proof are really doing is letting the guy who did know do the Google work for them.
I think Anandtech should've included video memory usage graphs for each test, as the simple framerate figures don't tell the whole story. Texture swapping is horribly irritating, but it might not show up well in FPS figures.
On a side note, I plan on trying 1GB cards next year along with some tasty 100Hz 1920x1080 LCD.
True, but Vista does not allow monitoring of video memory usage. :shrug: It only works in WinXP. That's one of the many, many reasons why I hate Vista. (Plus it limits my choice of resolutions if my monitor can't be identified, unlike WinXP.)
About memory swapping: the same holds true for Company of Heroes. If I play it at max settings, sometimes it feels like 10-15 fps but it is reported as a sustained 27-29 fps. It's not always the video memory but also the system memory. I have only 2GB, and the video shows what the swapping feels like (strangely, it does not affect the frame rates).
http://www.youtube.com/watch?v=b3ihuROhXtc
edit: I think it's because during the memory swapping, the video card is STILL producing frames (while the physics is temporarily paused for re-buffering).
Any compelling reason why they used Vista for such a comparison?
STEvil,
Isn't 100Hz better than 60Hz?
:x
100hz LCD? You know something I don't?
Viewsonic has already shown a 22" LCD monitor capable of 120Hz
http://www.viewsonic.com/companyinfo...s_release=2094
I'm not sure about full HD monitors, but it's only a matter of time (and most certainly next year)
I'm not that cynical, and I would LIKE to believe that true 120Hz LCD panels will be released someday soon. Nvidia is strongly hinting at that by revealing plans to manufacture stereo 3D glasses.
Ack!
I meant 120Hz 1920x1200.
:D
Tempting... this seems like a card that will last for a long time. I mean a lonnnggg time.
But hell, my X1900 XT is still playable in COD4 and Crysis, so it's hard to talk myself into a new video card except for benching... ugh, I can't wait until I have money to blow on EVERY generation of video cards and PCs lol.