
Thread: The official GT300/Fermi Thread

  1. #776
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Quote Originally Posted by LordEC911 View Post
    All fanbois said all that about GF100? Generalizations...
    You don't think it's a little weird that most GT300 rumours are negative? Not saying it's specifically fanboys.
    Anyways, not sure where you think you read about a 16-core Larrabee @ 1GHz being faster than a GTX280... Certainly not at gaming, since they will need 32-64 cores on the final product.
    It was in a white paper published by none other than Intel. Still, that's the first real performance figure for Larrabee.
    http://techresearch.intel.com/UserFi...2009_FINAL.PDF

  2. #777
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by orangekiwii View Post
    Bring it on! I want a 5870 X2 (it will probably release alongside GF100/GT300, whatever it's called) or Fermi... my GTX 285 stumbles in half my games (Fallout 3 is nearly unplayable, Risen lags like crazy).
    Fallout 3 unplayable on a GTX 285? Are you playing on IMAX or something?
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  3. #778
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by annihilat0r View Post
    Fallout 3 unplayable on a GTX 285? Are you playing on IMAX or something?


    But I was thinking the same thing. Fallout 3 is not exactly a GPU killer.

  4. #779
    Xtreme Enthusiast
    Join Date
    Dec 2008
    Posts
    752
    Well, with mods it is totally unplayable.

    And FO3 with mods is way better.


    Also, benchmarks don't tell the whole story... the average frame rate in FO3 with a GTX 285 is supposedly around 70 at max details with 4x AA.

    I can't get over 50 fps at max details with no AA in the unmodded game at 1920x1200, and 50 fps in FO3 feels laggy... it's often closer to 30-40 too.

  5. #780
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Chumbucket843 View Post
    You don't think it's a little weird that most GT300 rumours are negative? Not saying it's specifically fanboys.

    It was in a white paper published by none other than Intel. Still, that's the first real performance figure for Larrabee.
    http://techresearch.intel.com/UserFi...2009_FINAL.PDF
    Again, not at gaming. A 16-core Larrabee is faster than a GTX280 on one specific algorithm, which seems to be aimed at medical imaging, etc.
    Originally Posted by motown_steve
    Every genocide that was committed during the 20th century has been preceded by the disarmament of the target population. Once the government outlaws your guns your life becomes a luxury afforded to you by the state. You become a tool to benefit the state. Should you cease to benefit the state or even worse become an annoyance or even a hindrance to the state then your life becomes more trouble than it is worth.

    Once the government outlaws your guns your life is forfeit. You're already dead, it's just a question of when they are going to get around to you.

  6. #781
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by saaya View Post
    That's not the point. To be able to run it at all, the architecture has to be working 100% like it should; all instructions and combinations of instructions need to work exactly as they should... There are almost always some bugs, and yes, you can work around them at the compiler level AFAIK, but it takes time to figure that out... you need to know about a bug before you can work around it, and do it in a way that doesn't cost you a lot of performance...

    They showed GT300 silicon that was supposedly so fresh out of the oven it was still steaming, yet they had it running highly complex maths, pounding every transistor of the new pipeline like there's no tomorrow, at very high performance and without any bugs... I'm not saying it's impossible, but it's definitely something that raised my eyebrow... especially because it's not the only thing they showed supposedly running on GT300... According to those demos it seemed GT300 was 100% done, no bugs, no driver issues, nothing... just waiting for lame old lazy TSMC...
    That surprised me too.
    How rarely do you ever get a complex chip like that working flawlessly... never. Many people agree that Core 2 Duo was a stunning breakthrough in performance for Intel, but just like the P4 and P3 before it, it came with a long errata list. Thankfully, many of those bugs can be fixed in microcode and don't require any OS patches.

    Although GPUs don't have an ISA as complex as x86 with its 800+ instructions, all the "programmability" features added with DX9, DX10 and DX11 add more logic and thus more chances for bugs. Fermi **MIGHT** have been running only a limited set of instructions in whatever demos were real.
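    To make the "work around it at the compiler level" idea from saaya's post a bit more concrete, here is a tiny CUDA sketch. The --fmad switch and the __fmul_rn/__fadd_rn intrinsics are real nvcc/CUDA features, but the erratum here is completely made up for illustration; I'm not claiming Fermi actually has a bug like this.

    // detour.cu -- hypothetical sketch of a compiler-level workaround.
    // The "buggy" fused multiply-add unit is invented purely for illustration.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void dot3(float3 a, float3 b, float* out, bool detour)
    {
        if (!detour)
        {
            // Default path: the compiler is free to contract these multiplies
            // and adds into fused multiply-add (FMAD/FMA) instructions.
            *out = a.x * b.x + a.y * b.y + a.z * b.z;
        }
        else
        {
            // "Workaround" path: __fmul_rn/__fadd_rn forbid contraction, so the
            // (hypothetically buggy) multiply-add path is never used. The same
            // thing can be forced globally with "nvcc --fmad=false" and no
            // source changes -- but either way it costs extra instructions,
            // and you have to know about the bug in the first place.
            *out = __fadd_rn(__fadd_rn(__fmul_rn(a.x, b.x), __fmul_rn(a.y, b.y)),
                             __fmul_rn(a.z, b.z));
        }
    }

    int main()
    {
        float* d_out;
        float  h_out = 0.0f;
        cudaMalloc(&d_out, sizeof(float));
        dot3<<<1, 1>>>(make_float3(1, 2, 3), make_float3(4, 5, 6), d_out, true);
        cudaMemcpy(&h_out, d_out, sizeof(float), cudaMemcpyDeviceToHost);
        printf("dot = %f\n", h_out);   // expect 32
        cudaFree(d_out);
        return 0;
    }

    It's exactly the kind of fix that is cheap once you know what to avoid, and impossible before you do.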

    Finally, here's why it's much easier for AMD to get DX11 out the door: they have already had 40nm chips for months, they have had GDDR5 for years, they have had DX10.1 since the 3870, and of course they have had a tessellation engine for years. The block diagram for the 5870 says it all... just double the 4870. Even the 5-way SIMD stayed the same. For nVIDIA, each of the items listed is a hurdle still to be overcome.
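    Just to put numbers on the "double the 4870" point, here is the back-of-the-envelope math using the public reference specs (800 ALUs @ 750 MHz for the 4870, 1600 ALUs @ 850 MHz for the 5870); trivial host-side arithmetic, shown in the same CUDA/C++ style as above:

    // doubling.cu -- sanity check of the "just double the 4870" claim.
    // Peak single-precision rate = ALUs x 2 FLOPs per multiply-add x clock (GHz).
    #include <cstdio>

    int main()
    {
        const double rv770   = 800  * 2 * 0.750;  // HD 4870: 160 VLIW-5 units = 800 ALUs @ 750 MHz
        const double cypress = 1600 * 2 * 0.850;  // HD 5870: 320 VLIW-5 units = 1600 ALUs @ 850 MHz
        printf("RV770:   %.0f GFLOPS\n", rv770);                              // ~1200
        printf("Cypress: %.0f GFLOPS (%.2fx)\n", cypress, cypress / rv770);   // ~2720, ~2.27x
        return 0;
    }

    Everything doubles, and the clock bump does the rest.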

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  7. #782
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by saaya View Post
    That's not the point. To be able to run it at all, the architecture has to be working 100% like it should; all instructions and combinations of instructions need to work exactly as they should... There are almost always some bugs, and yes, you can work around them at the compiler level AFAIK, but it takes time to figure that out... you need to know about a bug before you can work around it, and do it in a way that doesn't cost you a lot of performance...

    They showed GT300 silicon that was supposedly so fresh out of the oven it was still steaming, yet they had it running highly complex maths, pounding every transistor of the new pipeline like there's no tomorrow, at very high performance and without any bugs... I'm not saying it's impossible, but it's definitely something that raised my eyebrow... especially because it's not the only thing they showed supposedly running on GT300... According to those demos it seemed GT300 was 100% done, no bugs, no driver issues, nothing... just waiting for lame old lazy TSMC...
    I'm seeing a lot of conjecture on your part, but not much else. While I'm willing to give them the benefit of the doubt, you're simply doubting, based on nothing more than what you assume the application demands. GPGPU in general demands very little from a substantial portion of a graphics chip, particularly the texture units and ROPs. To claim all transistors need to be pumping at full throttle at all times is a bit silly. They might also have had to clock it way down, use crazy cooling, high voltage, whatever, just to get the compute-related transistors into working order. Who knows?
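    As a concrete example of how narrow a GPGPU workload can be, here is a minimal CUDA sketch (my own toy example, not whatever Nvidia actually demoed): it keeps the shader cores and the memory path busy, while the texture filtering units and ROPs that games hammer never get touched.

    // saxpy.cu -- a minimal compute-only workload: ALUs and load/store only,
    // no texture filtering, no ROP blending, no rasterization at all.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void saxpy(int n, float a, const float* x, float* y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            y[i] = a * x[i] + y[i];   // pure arithmetic plus memory traffic
    }

    int main()
    {
        const int n = 1 << 20;
        float *hx = new float[n], *hy = new float[n];
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;
        cudaMalloc(&dx, n * sizeof(float));
        cudaMalloc(&dy, n * sizeof(float));
        cudaMemcpy(dx, hx, n * sizeof(float), cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, n * sizeof(float), cudaMemcpyHostToDevice);

        saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);
        cudaMemcpy(hy, dy, n * sizeof(float), cudaMemcpyDeviceToHost);

        printf("y[0] = %f\n", hy[0]);   // expect 4.0
        cudaFree(dx); cudaFree(dy);
        delete[] hx; delete[] hy;
        return 0;
    }

    A demo built out of kernels like this can look perfectly healthy even if large chunks of the graphics-specific silicon are still broken or fused off.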

    But neither of us is going to get anywhere with this. Like I said, debating it is a waste of time.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  8. #783
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by LordEC911 View Post
    Again, not at gaming. A 16-core Larrabee is faster than a GTX280 on one specific algorithm, which seems to be aimed at medical imaging, etc.
    Haha, that's like saying I can run faster than a Ferrari because, on the completely jammed roads of Taipei's city center, I can get from A to B faster than a Ferrari. :P

  9. #784
    Xtreme Addict
    Join Date
    Apr 2005
    Posts
    1,087
    But gaming is still the main reason to buy a brand new video card.


    All systems sold. Will be back after Sandy Bridge!

  10. #785
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    Provided it (Larrabee) is marketed as a "high-end gaming SKU", then I would agree. However, if Intel, not unlike Nvidia, caters to the whole CUDA/OpenCL type of market, then obviously that is their target. Just don't say that these products should be "only" for gaming, as that is an ignorant assumption given the direction GPUs are going. At the end of the day, Nvidia and Intel don't care whether the product is bought for gaming or for compute purposes. If they can meet a demand and be competitive, their product sells, they create revenue and shareholders are happy. Simple as that.
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  11. #786
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    So this is gonna "hard launch" in one month? And we still know nothing? yeah, right
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  12. #787
    Registered User
    Join Date
    Oct 2009
    Location
    Houston,TX
    Posts
    22
    Too much talk from Nvidia and no action. I want some new cards.

  13. #788
    Xtreme Enthusiast
    Join Date
    Jun 2005
    Posts
    960
    Quote Originally Posted by annihilat0r View Post
    So this is gonna "hard launch" in one month? And we still know nothing? yeah, right
    More like hard _to_ launch in one month.

  14. #789
    Xtreme Member
    Join Date
    Nov 2007
    Location
    Planet Express HQ, US
    Posts
    385
    It does support DX11, right?
    "Thanks for the f-shack. Love, Dirty Mike & The Boys" - from Dirty Mike & The Boys

  15. #790
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Quote Originally Posted by annihilat0r View Post
    So this is gonna "hard launch" in one month? And we still know nothing? yeah, right
    I wouldn't say we know nothing. In fact, we probably know more about it than we did about the 5870 at the same point, especially about the low-level stuff. Rumors currently suggest 128 TMUs and 48 ROPs, and that pretty much covers the gaming side of the hardware. We're just missing tidbits at this point, and of course clocks, which are always the last thing to be finalized.
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  16. #791
    Xtreme Member
    Join Date
    Sep 2008
    Posts
    449
    At this rate, would it be unreasonable to say that we might only see real availability of Fermi early Q2 2010?
    --lapped Q9650 #L828A446 @ 4.608, 1.45V bios, 1.425V load.
    -- NH-D14 2x Delta AFB1212SHE push/pull and 110 cfm fan -- Coollaboratory Liquid PRO
    -- Gigabyte EP45-UD3P ( F10 ) - G.Skill 4x2Gb 9600 PI @ 1221 5-5-5-15, PL8, 2.1V
    - GTX 480 ( 875/1750/928)
    - HAF 932 - Antec TPQ 1200 -- Crucial C300 128Gb boot --
    Primary Monitor - Samsung T260

  17. #792
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by LiquidReactor View Post
    At this rate, would it be unreasonable to say that we might only see real availability of Fermi early Q2 2010?
    Who really knows? I can't remember a launch this clouded in terms of when the thing will actually be available. Isn't Nvidia sticking to the "end of 2009" release date?

  18. #793
    Xtreme Mentor
    Join Date
    Jul 2008
    Location
    Shimla , India
    Posts
    2,631
    GTX380 vs 5950 will most likely be a fight that goes down the way GTX285 vs 4850 X2 did. In games that are not CF-optimized I expect the GTX380/Fermi to win (a single 5850 can't destroy a GTX380/Fermi no matter how badly Nvidia made the card; it has to at least kill a 5850 in games, otherwise it's a failure), and in games that are CF-optimized I expect ATI's 5950 to win.

    But the Nvidia card would offer more than just gaming; you get a partial/pseudo CPU that can, in theory, do most of the work an ARM CPU can do. It's totally up to the end user which card to buy. I'm on the value-for-money bandwagon, so how much the GTX380/Fermi costs and how much the 5950/5870 cost are the important factors for me.
    Coming Soon

  19. #794
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by Philip_J_Fry View Post
    It does support DX11, right?
    Yeah, of course.
    Quote Originally Posted by ajaidev View Post
    In games that are not CF-optimized I expect the GTX380/Fermi to win (a single 5850 can't destroy a GTX380/Fermi no matter how badly Nvidia made the card; it has to at least kill a 5850 in games, otherwise it's a failure), and in games that are CF-optimized I expect ATI's 5950 to win.
    Speculations, speculations...
    The 5870 is 2x a 285. Are you saying that the 5970, being a 5870 X2, is slower than Fermi? Delusional much? It would take such a massive die for a single chip that the card would cost $1000...
    Last edited by zalbard; 10-24-2009 at 01:05 AM.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  20. #795
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by ajaidev View Post
    GTX380 vs 5950 will most likely be a fight that goes down the way GTX285 vs 4850 X2 did. In games that are not CF-optimized I expect the GTX380/Fermi to win (a single 5850 can't destroy a GTX380/Fermi no matter how badly Nvidia made the card; it has to at least kill a 5850 in games, otherwise it's a failure), and in games that are CF-optimized I expect ATI's 5950 to win.
    Well, we don't know yet whether the 5950 is dual 5850s or dual 5870s, and we don't know Fermi's gaming performance either, including how it performs in DX11, since it doesn't have a hardware tessellator.

  21. #796
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    466
    All speculation, folks. From what I do know, I think ATI's idea of what a graphics card should be is better than Nvidia's. I want a graphics card, not a pseudo-CPU. Bottom line, though, we really need new games; nothing out there is going to push the last, current, or next gen.

  22. #797
    Xtreme Addict
    Join Date
    Jul 2009
    Posts
    1,023
    Quote Originally Posted by takamishanoku View Post
    All speculation, folks. From what I do know, I think ATI's idea of what a graphics card should be is better than Nvidia's. I want a graphics card, not a pseudo-CPU. Bottom line, though, we really need new games; nothing out there is going to push the last, current, or next gen.
    well, we haven't seen any benches on Crysis 2 yet

  23. #798
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Yeah, can't wait for the adoption of tessellation.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  24. #799
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    466
    Quote Originally Posted by Helloworld_98 View Post
    well, we haven't seen any benches on Crysis 2 yet
    Crysis 2 was built with consoles in mind, so I'd assume it would be less demanding.

  25. #800
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Crysis 2 was already demoed on 5870s in Eyefinity, so if that's any indication, it might not be that demanding and/or it's better optimized.

