
Thread: The official GT300/Fermi Thread

  1. #176
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by Helloworld_98 View Post
    I don't think Eyefinity is really a major plus for it, since you need a DP monitor for it, and to make it worthwhile by using 3 monitors of the same model it's going to cost you £1200+, and then you also have to pay another £310 for another card for CF to make sure you get good performance.

    Also, gaming on a 1080p plasma is probably better than on an LCD monitor, since you get a bigger screen and better contrast, and you don't really get pixelation.
    I think the Eyefinity-versus-big-HDTV fight is endless.
    It comes down to user preference.
    I don't think all games are suited to being played on a big HDTV.
    At the same time, I don't think all games are suited to being played on Eyefinity.

  2. #177
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by Helloworld_98 View Post
    I don't think Eyefinity is really a major plus for it, since you need a DP monitor for it, and to make it worthwhile by using 3 monitors of the same model it's going to cost you £1200+, and then you also have to pay another £310 for another card for CF to make sure you get good performance.
    Why would I need three monitors of the exact same type? I'm aiming at one good 22", which I already bought, and two slightly worse TN displays for my peripheral vision, also 22" with 1680x1050 res. I want to see more, not just bigger.

  3. #178
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by flippin_waffles View Post
    DilTech, no soup for you. First, I understand that you'd be tickled pink to convince as many people as possible to wait for nv's silicon to finally be ready, whenever that may be (judging from what Charlie has to say, 6 months isn't a guarantee either. And yeah, his track record on nv is an order of magnitude more accurate than anything nv has said). That argument you are using is the oldest in the book, and it's maybe time to update your way of thinking. The fact of the matter is, there are MUCH more compelling reasons to upgrade now than there were for either G80 or GT200. True, GT200 did flop, because the ATi 4800 hit the sweet spot and was a worthwhile upgrade for a minimal investment. The 5800 gets you DX11, Eyefinity, and the best-performing card on the market.
    And as for gaming on a 1080p plasma, how is that a PC again? Where is the immersion in that? You might as well be running a console! lol. Yeah, Eyefinity is where the immersion is at, and that is reason enough to pick up a 5800 series card. It'll probably last a good 3 years without the need to upgrade.

    The only advantage nv has, and its only answer to Eyefinity, is that there is no way to even come close to reproducing that immense level of immersion through a video over the internet. Marketing it will be tough, but the real enthusiasts will know how cool this is.
    So while I don't doubt you have no intention of placing a 5800 series in your console, I think most will definitely have reason NOT to wait.
    So let me get this straight....

    First, a card that can't max out any game that a last-gen card couldn't already max out is a reason to upgrade, and more of a reason than the 8800GTX was? Last time I checked, gaming performance is the number one reason to upgrade, and when a card doesn't bring you anything you couldn't already do, it just becomes pointless.

    DX11? I'll care about that when we have a worthwhile game that runs DX11... AvP, which comes out in February.

    Playing on a plasma TV makes my PC a console? Why would anyone even say that? Most monitors these days are 1080p, which is the same resolution as my TV. At the same time, a good plasma has better black levels, better color reproduction, no ghosting, perfect color uniformity, less video delay, and more, compared to most consumer-level LCDs. On top of that, it's a much bigger screen than could be had with a computer monitor. If that makes my computer a "console", then I am much happier with my console than I would be with your computer. Think about it: a 42" screen sitting on your desk, covering your entire vision in all directions... How do you get more immersive than that?

    Now, about that Eyefinity issue... I really don't care too much about it, especially since most games won't even let you set the FOV wide enough for it to make sense. Plus, you'd be stuck buying at least one new monitor in most cases, and dealing with the borders from the multiple monitors, which is a massive turn-off to me. Maybe when a triple-monitor display in a single frame comes out at a decent price it'll attract my attention, but in the meantime my 42" plasma is much better for me, because I have no bars and the quality of the image is like looking through a window, which is something you can't get no matter how many LCDs you put together, because you still have parts interrupting your image.
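    (For what it's worth, there is real math behind the FOV complaint: spanning three 16:9 panels roughly triples the horizontal extent of the view frustum, and many engines cap the FOV well below what that requires. A minimal sketch of the "Hor+" scaling convention that wide setups rely on; the function name, the 90° base FOV, and the 48:9 target are illustrative assumptions, not any particular engine's code:

```python
import math

def horizontal_fov(base_hfov_deg, base_aspect, target_aspect):
    """Scale a horizontal FOV from a game's design aspect ratio to a
    wider one, the way "Hor+" rendering does: keep the vertical FOV
    fixed and widen the view frustum proportionally."""
    half_width = math.tan(math.radians(base_hfov_deg) / 2)
    half_width *= target_aspect / base_aspect
    return math.degrees(2 * math.atan(half_width))

# A game designed for 90 degrees at 16:9, stretched across three
# 16:9 panels (48:9) -- the FOV a triple-wide setup actually needs:
print(round(horizontal_fov(90, 16 / 9, 48 / 9)))  # ~143 degrees
```

    So a game hard-capped at, say, 100° simply can't use the side panels sensibly, which is the point being made above.)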

    So yes, I can honestly say I see no reason not to wait for NVidia's part, because even if it turns out not to be worth buying, it will at least drop the price of the HD5870, which is still a win, considering that right now there's nothing I could do with it that I can't do with my current setup.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  4. #179
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    393
    Quote Originally Posted by flippin_waffles View Post
    DilTech, no soup for you. First, I understand that you'd be tickled pink to convince as many people as possible to wait for nv's silicon to finally be ready, whenever that may be (judging from what Charlie has to say, 6 months isn't a guarantee either. And yeah, his track record on nv is an order of magnitude more accurate than anything nv has said). That argument you are using is the oldest in the book, and it's maybe time to update your way of thinking. The fact of the matter is, there are MUCH more compelling reasons to upgrade now than there were for either G80 or GT200. True, GT200 did flop, because the ATi 4800 hit the sweet spot and was a worthwhile upgrade for a minimal investment. The 5800 gets you DX11, Eyefinity, and the best-performing card on the market.
    And as for gaming on a 1080p plasma, how is that a PC again? Where is the immersion in that? You might as well be running a console! lol. Yeah, Eyefinity is where the immersion is at, and that is reason enough to pick up a 5800 series card. It'll probably last a good 3 years without the need to upgrade.

    The only advantage nv has, and its only answer to Eyefinity, is that there is no way to even come close to reproducing that immense level of immersion through a video over the internet. Marketing it will be tough, but the real enthusiasts will know how cool this is.
    So while I don't doubt you have no intention of placing a 5800 series in your console, I think most will definitely have reason NOT to wait.
    GT200 flopped? Really? Just because it may have sold less, it's a flop? PC gaming on a big HDTV = console gaming? What?

    And I thought some of the NV fanboys were bad.

  5. #180
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by flippin_waffles View Post
    The fact of the matter is, there are MUCH more compelling reasons to upgrade now than there were for either G80 or GT200.
    LOLWUT

    This is one of the MOST inaccurate posts I have read here lately.

    There are much more compelling reasons to upgrade (to a 5870) now than there were for G80? To say that, you must have completely missed the G80 launch and the times before it.

    Fact is, G80 was a HUGE, HUGE leap. Before G80 it was a pain in the ass to even play Oblivion maxed out at higher resolutions, and super-high resolutions like 2560 (and in some cases even 1920) were out of the question for the previous generation (7000s/x1000s).

    With G80, suddenly nearly all games were playable at super-high resolutions with max settings. THREE years after its launch, an 8800GTX is still enough to max out a lot of modern games.

    Whereas today, Nvidia GT200s and ATI 4000s are able to play nearly all games at all resolutions (save Crysis and poorly coded games like Stalker), and there is absolutely no compelling reason to upgrade to something else right now. DX11? There are no DX11 games except Battleforge, and by the time important DX11 games (AvP2) hit the shelves, GT300 will be either ready or very close to launch.

    Why oh why would someone today need to upgrade from the previous generation to the 5800s? No reason, nada.

    Whereas the 8800GTX was a huge, huge leap, and there was EVERY reason to upgrade from a 7000- or x1000-series card to the 8800GTX. What was that reason? Not actually being able to play games maxed out on the previous generation.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  6. #181
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    I only upgraded my 2900XT because of power consumption; it was too much on my electric bill, but no game I played had any issues. And a 4850 for $100 is hard to pass up. I bet it will be just fine until I see a 5850 for under $200.

  7. #182
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    115
    Quote Originally Posted by Clairvoyant129 View Post
    GT200 flopped? Really? Just because it may have sold less, it's a flop? PC gaming on a big HDTV = console gaming? What?

    And I thought some of the NV fanboys were bad.
    It did, quite badly for Nvidia.

  8. #183
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    Alright, fine, DilTech. I didn't realize you sat 3 feet away from your TV when you're on your PC. In that case, yes, it probably would offer some immersion. Nowhere near what Eyefinity offers, but that's your choice.

    And you know what I find pointless and very irrational? Waiting 6 months or more for a card that may or may not come, to play Crysis at slightly higher settings than what the most powerful card on the market now can handle, provided it actually can. Do you want to lay down any guarantees? How many times do you think people want to play through Crysis anyway? I'm willing to bet people are sick to death of it. You must be one of the only ones willing to spend $600 on a card to play a single game at slightly higher settings. Yup, makes sense.

    For $600 I could get two 5850s in CrossFire, mop the floor with a GT300, have it now, and have Eyefinity support, where the real immersion comes from. Isn't that what gamers and enthusiasts have been asking for, for around a decade now? Thought so.

    So yes, I can honestly say it's silly to wait for GT300 simply for what you guess might be a better Crysis experience.

    To annihalator, bloviating about G80: are you trying to say that G80 could play every game on the market at 2560x1600 at max quality settings? If not, then by DilTech's logic, what was the point of upgrading? Surely you could have waited until there was a card on the market that could do so. It'd make little sense to buy something otherwise...

  9. #184
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    I enjoy a good ole Nvidia bashing as much as the next guy, but it really does make sense to wait and at least see what Nvidia has to offer.

    If it's crap, whatever, buy ATI. If it's good, buy it. What's the problem here?

  10. #185
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by Sly Fox View Post
    I enjoy a good ole Nvidia bashing as much as the next guy, but it really does make sense to wait and at least see what Nvidia has to offer.

    If it's crap, whatever, buy ATI. If it's good, buy it. What's the problem here?
    But you don't know how long that will take. According to Anandtech it's at least three months, with indications of even longer (my guess is late Q1). By then the 5870 will be six months old, and we'll probably be hearing rumours about something new from ATI to counter GF100.

    When GT200 launched it was reasonable to wait and see, as the 4000 series was only weeks away; but now it's at least three months, probably nearly double that.

  11. #186
    Xtreme Addict
    Join Date
    Oct 2008
    Location
    The Curragh.
    Posts
    1,294
    I won't be buying a new ATI card or an nVidia one. I always skip a gen or two after I buy a card.

    I went from a 4800Ti SE to a 6600GT to an 8800GTX to a 4870X2. I'll be on my current card until the 6xxx series from ATI or the GT400 from nV is out.

    By then there should be a lot of DX11 games, not to mention new and better CPUs.

    I do find Eyefinity interesting, though, and I'm looking forward to seeing nV's new cards.

  12. #187
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    Quote Originally Posted by marten_larsson View Post
    But you don't know how long that will take. According to Anandtech it's at least three months, with indications of even longer (my guess is late Q1). By then the 5870 will be six months old, and we'll probably be hearing rumours about something new from ATI to counter GF100.

    When GT200 launched it was reasonable to wait and see, as the 4000 series was only weeks away; but now it's at least three months, probably nearly double that.
    Good point.

    I guess in my case it's a bit different since the only game I'd possibly play that would require more GPU power than I have is Crysis. I don't mind waiting if I have to.

    For more active gamers or people using higher-res LCDs, I think you're right, though.

  13. #188
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by Sly Fox View Post
    Good point.

    I guess in my case it's a bit different since the only game I'd possibly play that would require more GPU power than I have is Crysis. I don't mind waiting if I have to.

    For more active gamers or people using higher-res LCDs, I think you're right, though.
    Well, that's always a choice you have to make. I mean, I haven't ever owned a single top-end card, always waiting for something better and cheaper, but that never seems to end.

    I don't think people with 4870X2s or GTX295s should upgrade unless they see something they want more, like Eyefinity or better power consumption. Performance is roughly the same anyway... Still, these are the customers that upgrade most frequently, so for them it might not be that hard a decision (buy 5870s now and GF100 later if it performs better).

  14. #189
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by marten_larsson View Post
    But you don't know how long that will take. According to Anandtech it's at least three months, with indications of even longer (my guess is late Q1). By then the 5870 will be six months old, and we'll probably be hearing rumours about something new from ATI to counter GF100.

    When GT200 launched it was reasonable to wait and see, as the 4000 series was only weeks away; but now it's at least three months, probably nearly double that.
    Guess people missed it... You see, NVidia were talking about the Tesla cards, which have always launched later than the desktop variants. That would still give the desktop graphics card a chance at launching in Q4 of this year.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  15. #190
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by DilTech View Post
    Guess people missed it... You see, NVidia were talking about the Tesla cards, which have always launched later than the desktop variants. That would still give the desktop graphics card a chance at launching in Q4 of this year.


    That^^ sounds more like wishful thinking than reality, though...

    Incidentally, you should read Shimpi's article more closely. He knows a great deal more than he's allowed to tell us, but he hints that Nvidia sacrificed some performance for greater sales in other markets. Looking over Fermi's architecture, I tend to agree.

    Nvidia has moved its business model from gaming to scientific computing. Reading through the article and seeing the "highlights" of the new architecture on Nvidia's own web site, the only things that are great for 3D rendering are the stacking and the efficiency. The rest is to aid Nvidia's move into other markets.

    Nvidia got pushed out of the chipset business and has been looking to expand; C++ and CUDA are their new co-processor, which they'll market heavily as something "everyone needs". They tease us with price, but I highly doubt Nvidia will break the sub-$199 barrier, so they will NEED to be able to sell these HUGE chips as "co-processors" to the scientific community, etc.

    Lastly, what makes you think the GT300 will be worthy of an upgrade over an HD5890, etc.? Nothing in Fermi's architectural changes suggests that games will play 3x better than on a GTX285... or am I missing something?

  16. #191
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by DilTech View Post
    Guess people missed it... You see, NVidia were talking about the Tesla cards, which have always launched later than the desktop variant. That would still give the desktop graphics card a chance at launch Q4 of this year.
    Umm... G200/G200b?
    The GeForce parts were released afterwards... by a good few months.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  17. #192
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by Xoulz View Post
    That^^
    Lastly, what makes you think the GT300 will be worthy of an upgrade over an HD5890, etc.? Nothing in Fermi's architectural changes suggests that games will play 3x better than on a GTX285... or am I missing something?
    The whitepapers said there will be future versions with less double precision for gaming. That probably won't happen this gen, though. No one is expecting 3x performance in games; 2x faster could be possible.

  18. #193
    Xtreme Addict
    Join Date
    Aug 2008
    Location
    Hollywierd, CA
    Posts
    1,284
    Quote Originally Posted by Chumbucket843 View Post
    The whitepapers said there will be future versions with less double precision for gaming. That probably won't happen this gen, though. No one is expecting 3x performance in games; 2x faster could be possible.
    [rumor]I am hearing a 2.4x increase over GTX280 performance[/rumor]. And while Nvidia have added a lot of FP performance, why would that hurt gaming performance? It might take up extra room on the die and consume a bit more power, but I don't understand how it would hurt gaming performance. It seems to me that there are A LOT of people on here who want the chip to fail hard. Why? Why does a perceived lack of competition in the market give you joy? Do you wish to pay more for gfx cards? It also seems to me that some here are forgetting that yesterday's show-and-tell was all about Tesla. Everything Nvidia is talking about in terms of Fermi right now is related to Tesla; they have said that they will not talk about gaming performance because they don't want to tip their hand.

    I am an artist (EDM producer/DJ), pls check out mah stuff.

  19. #194
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by 570091D View Post
    [rumor]I am hearing a 2.4x increase over GTX280 performance[/rumor]. And while Nvidia have added a lot of FP performance, why would that hurt gaming performance? It might take up extra room on the die and consume a bit more power, but I don't understand how it would hurt gaming performance.
    Exactly. People are being very thick with regard to the GT300. For example, consider this excerpt from Ars Technica:
    But Fermi marks the point at which NVIDIA has officially begun making its discrete GPU tradeoffs favor the HPC market at the expense of gamers. ... and quite possibly leaving the single-chip gaming GPU crown in the hands of AMD's more specialized Evergreen this time around.
    Seriously? Who is writing this garbage? In games, GT300 will be at least as fast as the GT200 architecture with its SPs increased by 2.13x. In reality it will be a bit faster, due to the increased efficiency of the shaders.

    While it is true that a number of the HPC-tailored features won't necessarily benefit game performance very much, they also will not hurt performance in any way.

    Regardless, when the GTX380 is finally released, all this garbage information will be laid to rest.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  20. #195
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Quote Originally Posted by flippin_waffles View Post
    The fact of the matter is, there are MUCH more compelling reasons to upgrade now than there were for either G80 or GT200.
    HA! GT200? Sure. G80? Not a chance. G80 was like the Jesus of video cards.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.


  21. #196
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    Quote Originally Posted by DilTech View Post
    That would still give the desktop graphics card a chance at launching in Q4 of this year.
    You mean a paper launch?

    Anyway, according to the specs and the few details described in the articles (like the one from Anand), GT300 (or whatever it will be called) will kick ass. I think the same as you, DilTech; it should be 40%-60% faster than the 5870.

    As for playing games with a 42" screen on your desk, IMO it's stupid. You'll have to turn your head instead of moving your eyes to locate things on the screen. I play games on my 37" full-HD TV, sitting on the sofa. It makes you feel like you are playing on a console, but with much better graphics. Anyway, I am still 3~4m away from the TV.

  22. #197
    Xtreme Addict
    Join Date
    Jul 2007
    Posts
    1,488
    Quote Originally Posted by 003 View Post
    Seriously? Who is writing this garbage? In games, GT300 will be at least as fast as the GT200 architecture with its SPs increased by 2.13x. In reality it will be a bit faster, due to the increased efficiency of the shaders.
    Scaling isn't exactly linear. Some games will see around a 2.13x speedup over GT200; some will see less. In cases where the new arch removes bottlenecks, there may be a few games with more than a 2.13x increase. But of course we will need to see benchmarks to know how it works out in reality.
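    (For context, the 2.13x figure being tossed around is just the ratio of the rumored shader counts, assuming equal shader clocks; clocks had not been announced at the time, so the clock-parity part is an assumption:

```python
# Where the oft-quoted 2.13x comes from: rumored Fermi shader count
# over GT200's, assuming shader clocks stay the same (an assumption --
# NVIDIA had not announced clocks when this thread was written).
fermi_sps, gt200_sps = 512, 240
print(f"{fermi_sps / gt200_sps:.2f}x")  # -> 2.13x
```

    Any change in clocks or per-shader IPC moves that number up or down accordingly.)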

    While it is true that a number of the HPC-tailored features won't necessarily benefit game performance very much, they also will not hurt performance in any way.
    They might not hurt performance, but they do cost die space and increase power consumption. Only time will tell if these end up being useful features for most customers, or just wasted space/electricity. It really depends on how the GPGPU market evolves.

  23. #198
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by Solus Corvus View Post
    Scaling isn't exactly linear. Some games will see around a 2.13x speedup over GT200, some will see less. In cases where the new arch removes bottlenecks, there may be a few games with more then 2.13x increase. But of course we will need to see benchmarks to know how it works out in reality.
    True. I'm referring to the people who run around like chickens with their heads cut off, screaming that the GT300 is going to suck for games and will be beaten by RV870. Honestly, in a WORST-case scenario it will be roughly twice as fast as the GTX285, which will trump a 5870 easily.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  24. #199
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    107
    You sit 12 feet from a 37" TV? You have damn good eyes. The standard convention in the TV industry is that the optimal viewing distance is about 1.5 times the screen diagonal, so a 37" TV should be watched from about 4.6 feet away, or 1.4m.

    By the way, even at this distance the perceived size of the screen is much smaller than that of a 22" monitor (which is considered small for PCs) on your desk. Trying to read text in a typical PC game on a 37" screen from 1.4 meters away will cause noticeable eye strain over time. I have 20/20 vision, and my eyes start to hurt if I sit too far from the screen while reading text in a PC game. Console gaming is completely different, since the text size is normalized for typical TV subtitles.
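    (A quick sketch of that arithmetic, for anyone curious; the 0.6 m desk distance and 16:9 panels are assumptions for illustration. It reproduces the 1.4 m figure above and shows the 22" desk monitor actually subtending a wider angle than the 37" TV at its recommended distance:

```python
import math

INCH = 0.0254  # metres per inch

def viewing_distance_m(diagonal_in, factor=1.5):
    """Recommended viewing distance using the 1.5x-diagonal rule of thumb."""
    return diagonal_in * factor * INCH

def angular_width_deg(diagonal_in, distance_m, aspect=16 / 9):
    """Horizontal angle a 16:9 screen subtends from a given distance."""
    width_m = diagonal_in * INCH * aspect / math.hypot(aspect, 1)
    return math.degrees(2 * math.atan(width_m / (2 * distance_m)))

print(viewing_distance_m(37))       # ~1.41 m -- the 4.6 ft quoted above
print(angular_width_deg(37, 1.41))  # ~32 degrees: 37" TV at that distance
print(angular_width_deg(22, 0.6))   # ~44 degrees: 22" monitor at a desk
```

    Hence the eye-strain point: the same game text covers fewer degrees of your vision on the distant TV.)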
    Last edited by astrallite; 10-01-2009 at 09:41 PM.

  25. #200
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Chumbucket843 View Post
    The whitepapers said there will be future versions with less double precision for gaming. That probably won't happen this gen, though. No one is expecting 3x performance in games; 2x faster could be possible.
    Please quote or tell me what page that is on. I have read through the whitepaper 3 times and haven't seen ANY mention of that.

    Quote Originally Posted by 003 View Post
    True. I'm referring to the people who run around like chickens with their heads cut off, screaming that the GT300 is going to suck for games and will be beaten by RV870. Honestly, in a WORST-case scenario it will be roughly twice as fast as the GTX285, which will trump a 5870 easily.
    ~2x as fast is the BEST-case scenario. The shaders are being doubled but also completely overhauled, which should bring IPC improvements, though not necessarily.
    Also, there are many other parts of GF100 that could possibly bottleneck the architecture.
    Last edited by LordEC911; 10-01-2009 at 09:59 PM.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

