Quite so. It's only 5pm on Sunday here in NSW, Aus, and we're on DST as well, so really 4pm in this time zone.
Facebook NVIDIA's Photos - PDX LAN 2010
http://photos-b.ak.fbcdn.net/hphotos..._3043626_n.jpg
http://photos-a.ak.fbcdn.net/hphotos..._7473292_n.jpg
http://www.facebook.com/photo.php?pi...&id=8409118252
Did you miss it on purpose? :ROTF: The sign is hilarious: ATI = Always Trouble Inside? :rofl::ROTF:
http://img16.imageshack.us/img16/999...8252840911.jpg
never seen so many nerds together :P
People are saying it's 9pm pacific time. :\ That would suck.
It's the typical time they give... Nvidia is on PST after all
Oh, I was thinking those boxes were GF100s, but when I read 250 at the end my heart broke :(
G92 when will you die...
Oh yeah, did you see this, the Nvidia eyefidelity :) So does Nvidia have anything to counter ATI's 6-screen tech?
http://photos-f.ak.fbcdn.net/hphotos..._3337817_n.jpg
http://media.bestofmicro.com/,V-4-235696-3.jpg
Lol@nFinity
http://photos-f.ak.fbcdn.net/hphotos..._8382839_n.jpg
restricted to 1920x1080 displays.
Single link DVI.
rofl patchjob
I don't think so. NVIDIA 3D requires a 120Hz refresh rate, which doubles the bandwidth needed versus the standard 60Hz. So dual-link DVI wouldn't have enough bandwidth to drive 2560x1600 in 3D.
I wouldn't trade higher resolution for 3D though. Given that 3D glasses also hurt the monitor's image quality, I don't think 3D is anywhere near ready for prime time.
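Back-of-the-envelope math on the DVI bandwidth point, for anyone curious. This is a rough sketch: the ~25% blanking overhead is an assumption (real CVT/CVT-RB timings differ), and the link limits are the usual TMDS pixel-clock figures.

```python
# Rough DVI bandwidth check. Assumes ~25% blanking overhead on top of the
# visible pixels; actual timing standards give somewhat different numbers.
SINGLE_LINK_MHZ = 165   # commonly cited single-link TMDS pixel-clock limit
DUAL_LINK_MHZ = 330     # dual-link doubles it

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    """Estimated pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * blanking / 1e6

for mode in [(1920, 1080, 120), (2560, 1600, 60), (2560, 1600, 120)]:
    clk = pixel_clock_mhz(*mode)
    verdict = ("single-link OK" if clk <= SINGLE_LINK_MHZ
               else "dual-link OK" if clk <= DUAL_LINK_MHZ
               else "exceeds dual-link DVI")
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clk:.0f} MHz -> {verdict}")
```

By this estimate 1920x1080@120Hz and 2560x1600@60Hz both fit inside dual-link, but 2560x1600@120Hz needs roughly double what dual-link can carry, which matches the claim above.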
too high clocks i think...
http://img44.imageshack.us/img44/5599/223dmvantg.png
5k above a 5870, which is around 25% faster...
http://img96.imageshack.us/img96/8576/guru3d.png
as fast as 285sli and a 5970... makes sense... i guess...
but isn't a P score kinda pointless for such a high-end card?
how about X?
295 scores more than 22K? where? :confused:
cheers :toast:
so 100=360 and 104=380?
with this kind of performance... why would anybody want sli or a dual card unless its for benching? :D
but those are nvidia pr numbers...
the real numbers are probably notably lower... there's a reason so many people said ~25% faster than a 5870, and I don't think it can beat the 5970 consistently. If it were 48% faster than a 5870 it would have to score over 25K in Vantage, and a card performing better against the competition in games than in 3DMark? That's not going to happen... 3DMark has turned into a 3D SiSoft Sandra benchmark, showing maximum theoretical performance, which is always higher than real-world performance, and the gap between theoretical max and actual max grows with every new card. So a new card that is 30% faster in 3DMark than another card is most likely only 25-29% faster in games.
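For reference, here's the arithmetic behind that 25K figure, assuming a 5870 P-score of roughly 17K (not stated outright above, but it's what the ">25K for +48%" claim implies; actual scores vary a lot by rig).

```python
# Sanity-check the Vantage extrapolation: what score would a card need
# to be X% faster than a 5870? Assumes hd5870_p ~ 17K (a rough figure).
hd5870_p = 17000

def implied_score(base, pct_faster):
    """Score a card would need to be pct_faster percent above base."""
    return base * (1 + pct_faster / 100)

print(implied_score(hd5870_p, 48))   # ~25.2K needed for +48%
print(implied_score(hd5870_p, 25))   # ~21.3K for the more common ~25% claim
```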
amazing visual realism and advanced dynamic realism... terrible marketing speak... :D
What tech is that? And have you seen it with your own eyes? I saw all the 3D stuff at the display tradeshow here in Taipei over half a year ago. CMO, CTM and AUO showed their panel tech back then, which Samsung, Sony, LG etc. used to build the displays they're now showing at CES... and the 3D tech that didn't require glasses was terrible. It was distorted, it flickered, it was low-res, and when you moved your head or even just your eyes slightly, the image changed...
100 = 360/380 104 = 350
X score was 15K
but it seems bogus anyway
How many hours back? :)
~7.5h
yeah that makes more sense...
15K X is very good... that is indeed 48% faster than a 5870 then...
btw, remember this?
everybody said it was fake, but the deep dive numbers seem to be about the same? :D
http://www.youtube.com/watch?v=kKbq6...eature=related
3d mark numbers don't translate to real world
How much did GTX295 with physx hack score anyways??
Sascha sign in on msn dude!
It was a new company; I saw it over at AVForums. I'll find the link when I finish my uni work :) (JavaScript stuff :down:). The people who saw it said that although it's not as "immersive" as the other "way" of doing it, it's a lot better in that there's just no need for silly glasses :up:
i never got what all the fuss with the glasses was about tbh...
there are very well designed glasses that arent bothersome at all...
FFS, how many people with eyesight issues wear glasses 18 hours a day? :D
btw, just saw this clip of water tesselation on GF100, looks pretty cool...
http://www.youtube.com/watch?v=QLu8DyzoVMs
oh and rofl :D :up:
Quote:
thermonuclear meltdown!
Hmm, I have +1/+0.75 eyes and can't see much without my glasses, so that means in order to play in 3D I'd need contact lenses :(
Oh ya Nvidia may have the tessellated hair but AMD has the tessellated face lol
I wonder if 1.5GB of VRAM is worth it for games. ATI might release 5870-SIX to counter Fermi soon I guess.
Edit: typo fixed.
Grand Theft Auto IV
S.T.A.L.K.E.R Complete 2009 with High resolution Texture mods
S.T.A.L.K.E.R Clearsky with mods
Half-Life 2 Cinematic Mod with High Resolution Textures.
Resolution 1920*1200, ideally with at least 4X FSAA too; however, I can compromise on the FSAA if performance is low.
High resolution textures look really nice, just they tend to choke the VRAM somewhat.
John
^good point, I have been struggling with Stalker too
It is annoying isn't it Bodkin
Without drifting too far off topic, I am willing to bet your Radeon 5870 is more than capable of handling Stalker... it just needs more VRAM!
I was quite excited when my News Letter said that the GF100 cards will come in 1.5GB and 3GB variants and a 6GB card (presuming dual GPU?) coming in Q3.
However those were all Quadro cards :down:
It looks like consumer cards might max out at 1.5GB (maybe 3GB if someone does a MARS or some special edition, like the few 2GB GTX 285s).
I would like to see the base model Fermi have more than 1GB and the top end have 3GB.
Not only is it useful for games, but OpenCL and CUDA applications love the extra VRAM too
John
15h left... :rolleyes:
Good list there; I only wish I could try them out, but my 512MB card would probably catch fire if I did. 1GB on the 5870 felt like a slap in the face for those of us who expected to see a new age of textures. I for one do not like multi-GPU setups (too risky for the perf increases), and I also won't get anything less than 2GB; I've had 512MB since 2007. And we've had 5870s for how many months now? And not one of them has 2GB, such a waste. I'm almost wondering if third-party vendors weren't allowed to up the RAM so they wouldn't compete with a 5870-SIX or a 5890, if one ever comes out.
goddammit don't make non-news or non-rumors posts here, whenever I see a new post I immediately jump here thinking it's maybe a new rumor :D
I agree on the multi-gpu setups. Anyways, Saaya posted this review a while back which has the 4870 512 vs 1gb and the 285 1GB vs 2GB and the Asus Mars. It also has the cinematic mod 9.51 as well as FO3 with NMC's high res textures.
I was starting to wonder if I'm the only one here who uses mods since thats where more power comes in really handy.
if the glasses are designed well they will just sit over your regular glasses... idk... i dont wear glasses, so i dont know how annoying it is... :shrug:
more like thats where the additional power of highend hw is actually used :D
who added the thermonuclear meltdown tag, my monitor is full of saliva because I burst out in laughter :D
AMAZING NEWS: Not many hours left now.
What? I thought it was only 2 hours :(
What exactly is gonna be announced?
It's 9am PST now. So not 5 hours left, 12 hours :(
Sorry, I'm not following the thread lately, 5 hrs left until what??
People have received vacations in the past for playing around with thread tags.
There actually are many serious drawbacks of 3D shutter glasses. First of all, they reduce the contrast/brightness of the monitor, and they can affect color accuracy.
Secondly, "crosstalk" might be an issue when your screen doesn't respond fast enough. Crosstalk happens when the pixels don't switch fast enough, so your right eye sees residual images meant for the left eye, and vice versa. To minimize crosstalk a very fast panel is required, so 3D monitors today have to use TN panels out of necessity.
Thirdly, shutter glasses work by rapidly blocking your eyes. So ambient light gets blocked repeatedly which can cause a "flickering ambient light" problem. It's best if you use 3D shutter glasses in a very dark environment to minimize this.
Fourthly, your left and right eyes never really see an image together; they alternately see an image for each eye. I suspect this is why 3D shutter glasses give me a headache in any fast-moving scene. For this reason alone, I refuse to use 3D shutter glasses; only polarized glasses are comfortable for me.
So, I'd wait until 3D polarized monitors show up (no idea how they can technically do this though). 3D shutter glasses are a deal-breaker for me.
Quote:
Design Article releases tomorrow 7PM CST with complete Whitepaper info.
New Features, new cache, new Memory setup, and yes it's about 100% performance increase over GTX-2xx so figure single GTX-285 vs 5870 then double the GTX-285 performance.
Then it handles triangles differently; triangles on any given frame can number in the hundreds of thousands, so that's very important.
It will fold a lot better.
Increased efficiency in several areas.
It's a revolutionary new design oriented toward tessellation (those pesky triangles) and geometric programming. Problem being, every wireframe is made up of triangles; tessellation takes those triangles and breaks them down into many smaller triangles. This core is uniquely designed to handle that, so in geometry- and shader-heavy games you will see more than the 100% raw-power increase.
520USD might handle it. At 2x GTX-285 performance that puts it above GTX-295 performance, and it's DX11-ready and designed specifically for that. Current ATI offerings are really good, but they're basically double the hardware on the same core design to provide more raw power. GF100 is a core designed to take advantage of what the industry needs today and for some time in the future.
Read the article tomorrow, cause that's about all I can say tonight.
Source: Here
:up:
omg sounds great!
hmm sounds interesting.... really just want a respected site to review it though...
Very impressive, sounds worthy of the wait.
Yay now we'll be able to play Crysis as smooth as half life.
meh, didn't see that 100% performance increase in the FC2 bench, sounds like marketing BS
Dude, 5870-SIX has 2 GB of VRAM...
i think the 5870six should also be a 1ghz chip. just give it every damn thing possible.
LOL@ the tags of this thread
"gf100 napalm edition"
http://www.tomshardware.com/news/ATI...5870,9031.html
Sorry to break your heart, but it has been rumored to be 850MHz; maybe they upped it to 1GHz since it is supposed to come with 8+6... :)
I can only hope and wish that it's the GTX 360 and that Nvidia will sell it at a reasonable price
I thought Nvidia dropped the GTX.
Everyone stop with the "I thought"s; in a couple hours all will be revealed ;) just wait :)
Guys, take a deep breath
Whenever nVidia releases a card (well, usually), the x4 suffix has been used for the budget range.
e.g
G84
G94
and I think G214?
John
First whitepaper video is up
[yt]http://www.youtube.com/watch?v=dlgzfwIrmIc[/yt]
http://www.youtube.com/watch?v=dlgzfwIrmIc
edit: okay, so.. XS didn't have money for youtube embedding, understandable.. but PLEASE stop F'ing with the thread tags!
:rofl:
:ROTF:
Even if I do have some proof, would you accept it ?
Most people won't unless they saw things go on live in front of their eyes, so no, I'm not going to bother.
I'm a regular and not so... eco-friendly boy, so no again.
The question was ( you don't have to answer, but don't try to play dodgeball with me :p: ) if you're working for a hardware manufacturer... not if you're registered on other sites.
I wonder if these cards will have new FSAA techniques which work on games like GTA IV?!
so um, has there been any news?
Yeah, like the price for example. I want something no higher than $300.
5 and a half hours to go. Might as well go sleep, lol.