
Thread: HD 4850 Previews

  1. #101
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by Clairvoyant129 View Post
    Awesome card for an awesome price. Depending on how the HD4870X2 does, I may go back to the green team this year.
    Color blind?
    Dark Star IV
    i5 3570K | Asrock Z77 Extreme4 | 8GB Samsung 30nm | GTX 670 FTW 4GB + XSPC Razer 680 | 128GB Samsung 830
    Apogee HD White | 140MM UT60/120MM UT45/120MM ST30 | XSPC D5 V2

  2. #102
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by DilTech View Post
    Yes, in hardware, it just doesn't have total dx10.1 support. It supports some features in hardware. Read the link.
    MSAA sample readback is possible on DX10.0.

    MSAA Z (?) buffer access IIRC is only possible through 10.1 though.

    And nVidia's obviously not going to encourage devs to use shader AA in ANY kind of situation, ever. So their "supporting it" is a weak excuse at best.
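    For anyone wondering what that difference looks like at the API level, here's a rough D3D10-style sketch (resource descriptors only; the device pointer, formats and resolution are placeholder assumptions, not code from any real engine):

    Code:
    #include <d3d10_1.h>

    // Sketch: what "MSAA readback" means at resource-creation time.
    // Assumes 'device' is an already-created ID3D10Device1*.
    void CreateMsaaReadbackResources(ID3D10Device1* device)
    {
        // 1) MSAA *color* readback -- fine on plain DX10.0. A multisampled
        //    render target can also be bound as a shader resource
        //    (Texture2DMS in HLSL) and read back per sample.
        D3D10_TEXTURE2D_DESC color = {};
        color.Width            = 1920;
        color.Height           = 1200;
        color.MipLevels        = 1;
        color.ArraySize        = 1;
        color.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
        color.SampleDesc.Count = 4;   // 4x MSAA
        color.Usage            = D3D10_USAGE_DEFAULT;
        color.BindFlags        = D3D10_BIND_RENDER_TARGET |
                                 D3D10_BIND_SHADER_RESOURCE;
        ID3D10Texture2D* colorTex = nullptr;
        device->CreateTexture2D(&color, nullptr, &colorTex);

        // 2) MSAA *depth* (Z-buffer) readback -- the 10.1 part. Creating a
        //    multisampled depth buffer that is also bindable as a shader
        //    resource is only guaranteed to work on a 10.1-level device.
        D3D10_TEXTURE2D_DESC depth = color;
        depth.Format    = DXGI_FORMAT_R24G8_TYPELESS;
        depth.BindFlags = D3D10_BIND_DEPTH_STENCIL |
                          D3D10_BIND_SHADER_RESOURCE;
        ID3D10Texture2D* depthTex = nullptr;
        device->CreateTexture2D(&depth, nullptr, &depthTex);
    }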

  3. #103
    Xtreme Member
    Join Date
    Apr 2006
    Posts
    393
    Quote Originally Posted by WrigleyVillain View Post
    Color blind?
    I should have said red team, but then again red team is a subsidiary of the green team.

  4. #104
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by jas420221 View Post
    Until GTX280x2 comes out.
    This was my initial thought too, at the idea that the 4870X2 takes the lead. A dual-chip, single-card CrossFire setup versus a single-chip, single-card nVidia solution will likely go to the 4870X2.

    If nVidia decided to do a dual-chip approach, I'm not sure what would happen. Though, thinking about it... with the monstrously high power draw and the massive die, it will be difficult for nVidia to go the dual-chip route in this incarnation.
    One hundred years from now It won't matter
    What kind of car I drove What kind of house I lived in
    How much money I had in the bank Nor what my clothes looked like.... But The world may be a little better Because I was important In the life of a child.
    -- from "Within My Power" by Forest Witcraft

  5. #105
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    G80 had a smaller die size, used less power, and generated less heat, and we still didn't see a GX2 card with that core until G92 at 65nm, so I doubt we will see GT200 in GX2 form until 40nm at least.

  6. #106
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by DilTech View Post
    Yes, in hardware, it just doesn't have total dx10.1 support. It supports some features in hardware. Read the link.
    I did read the link and the entire article when it came out.

    What Nvidia is actually saying is, we CAN support DX10.1 features by coding drivers to "expose" these features in hardware. In other words, get the same result by exploiting specifics in the hardware. But Nvidia does not have a DX10.1 part.

    The entire page needs to be read to get proper context. If NV could actually exploit DX10.1 features from a performance standpoint, does anyone actually think they would force the removal of 10.1 support in Assassin's Creed?

    We know that both G80 and R600 both supported some of the DX10.1 featureset. Our goal at the least has been to determine which, if any, features were added to GT200. We would ideally like to know what DX10.1 specific features GT200 does and does not support, but we'll take what we can get. After asking our question, this is the response we got from NVIDIA Technical Marketing:

    "We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."

    The policy decision that has led us to run into this type of response at every turn is reprehensible. Aside from being blatantly untrue at any level, it leaves us to wonder why we find ourselves even having to respond to this sort of statement. Let's start with why NVIDIA's official position holds no water and then we'll get on to the bit about what it could mean.

    The statement that multisample readback is the only thing (some) developers are interested in is untrue: cube map arrays come in quite handy for simplifying and accelerating multiple applications. Necessary? No, but useful? Yes. Separate per-MRT blend modes could become useful as deferred shading continues to evolve, and part of what would be great about supporting these features is that they allow developers and researchers to experiment. I get that not many devs will get up in arms about int16 blends, but some DX10.1 features are interesting and, more to the point, would be even more compelling if both AMD and NVIDIA supported them.
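    (Not part of the quoted article -- just to illustrate what "separate per-MRT blend modes" means in practice, here's a minimal D3D10.1 sketch. The render-target roles and blend settings are made-up examples, and it assumes an already-created ID3D10Device1*.)

    Code:
    #include <d3d10_1.h>

    // Sketch of per-render-target blending, a DX10.1 addition: on 10.0 all
    // bound render targets share one set of blend factors, while 10.1 lets
    // each MRT have its own blend description.
    void CreatePerMrtBlendState(ID3D10Device1* device, ID3D10BlendState1** outState)
    {
        D3D10_BLEND_DESC1 desc = {};
        desc.IndependentBlendEnable = TRUE;   // the 10.1-specific switch

        // MRT 0 (e.g. an albedo buffer in a deferred renderer): no blending.
        desc.RenderTarget[0].BlendEnable           = FALSE;
        desc.RenderTarget[0].RenderTargetWriteMask = D3D10_COLOR_WRITE_ENABLE_ALL;

        // MRT 1 (e.g. a light-accumulation buffer): additive blending.
        desc.RenderTarget[1].BlendEnable           = TRUE;
        desc.RenderTarget[1].SrcBlend              = D3D10_BLEND_ONE;
        desc.RenderTarget[1].DestBlend             = D3D10_BLEND_ONE;
        desc.RenderTarget[1].BlendOp               = D3D10_BLEND_OP_ADD;
        desc.RenderTarget[1].SrcBlendAlpha         = D3D10_BLEND_ONE;
        desc.RenderTarget[1].DestBlendAlpha        = D3D10_BLEND_ONE;
        desc.RenderTarget[1].BlendOpAlpha          = D3D10_BLEND_OP_ADD;
        desc.RenderTarget[1].RenderTargetWriteMask = D3D10_COLOR_WRITE_ENABLE_ALL;

        device->CreateBlendState1(&desc, outState);
    }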

  7. #107
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    Quote Originally Posted by eleeter View Post
    I did read the link and the entire article when it came out.

    What Nvidia is actually saying is, we CAN support DX10.1 features by coding drivers to "expose" these features in hardware. In other words, get the same result by exploiting specifics in the hardware. But Nvidia does not have a DX10.1 part.

    The entire page needs to be read to get proper context. If NV could actually exploit DX10.1 features from a performance standpoint, does anyone actually think they would force the removal of 10.1 support in Assassin's Creed?
    Nope, not me. I haven't believed NV's marketing in years, not since their viral marketing initiative was exposed. Assassin's Creed seals the deal on how NV is holding back progress, IMO.

    From techreport:
    http://techreport.com/articles.x/14934/13

  8. #108
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by flippin_waffles View Post
    And imagine what the 4870 could push in AC, never mind the 4870X2. No wonder Nvidia saw it as a top priority to deep-six 10.1 support.

  9. #109
    Xtreme Member
    Join Date
    Jun 2008
    Location
    British Columbia, Canada
    Posts
    227
    Quote Originally Posted by flippin_waffles View Post
    Nope, not me. I haven't believed NV's marketing in years, not since their viral marketing initiative was exposed. Assassin's Creed seals the deal on how NV is holding back progress, IMO.

    From techreport:
    http://techreport.com/articles.x/14934/13
    lol at the 2900XT, beating the GX2.
    Antec 900
    Corsair TX750
    Gigabyte EP45 UD3P
    Q9550 E0 500x8 4.0 GHZ 1.360v
    ECO A.L.C Cooler with Gentle Typhoon PushPull
    Kingston HyperX T1 5-5-5-18 1:1
    XFX Radeon 6950 @ 880/1300 (Shader unlocked)
    WD Caviar Black 2 x 640GB - Short Stroked 120GB RAID0 128KB Stripe - 540GB RAID1

  10. #110
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350
    Quote Originally Posted by Clairvoyant129 View Post
    I should have said red team, but then again red team is a subsidiary of the green team.
    Come again?
    Dark Star IV
    i5 3570K | Asrock Z77 Extreme4 | 8GB Samsung 30nm | GTX 670 FTW 4GB + XSPC Razer 680 | 128GB Samsung 830
    Apogee HD White | 140MM UT60/120MM UT45/120MM ST30 | XSPC D5 V2

  11. #111
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by zerazax
    G80 had a smaller die size, used less power and generated less heat and we didn't see a GX2 card with that core until the G92 at 65nm so I doubt we will see GT200 in a GX2 card form until 40nm at least
    Nah. I'm not buying that. You think Nvidia is just going to stand by while AMD takes the lead until they get to 40nm? No way. The 7950GX2 was a 7900GT sandwich at 90nm. There was no die shrink necessary to make that. Sure. Cooling will be a problem, but it always is with the GX2 cards. It's not like Nvidia has to even engineer a new card. They just need to stick two cards together and put them on a single PCIe slot. Nvidia is going to take back their lead no later than spring 2009 and a GX2 card is precisely how they are going to win it back.

    The fact is you can only get so much out of CF/SLI before diminishing returns kick in. It's basically a hack introduced by 3DFX; we're lucky it works at all. Now maybe the 4870X2 is going to revolutionize the world of GPUs by changing that, but I'll believe it when I see it. Even in a post-4870X2 world, Nvidia is still going to have the advantage in a sense, because I doubt the 4870 is going to be faster than a GTX280 at 65nm, and it's even less likely once the GTX280 is shrunk down to 55nm. We can all enjoy AMD's victory this summer, but they are going to have to pull out quite a few rabbits if they want to keep it.
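    To put some rough numbers on the diminishing-returns point, here's a toy AFR scaling model; every efficiency figure in it is an illustrative assumption, not a benchmark result:

    Code:
    #include <cstdio>

    // Toy multi-GPU (AFR) scaling model: each added GPU contributes a
    // progressively smaller fraction of one GPU's throughput. The
    // per-GPU gain numbers are assumptions for illustration only.
    int main()
    {
        const double singleGpuFps = 40.0;                        // assumed single-GPU frame rate
        const double addedGain[]  = { 1.00, 0.75, 0.45, 0.25 };  // assumed gain from each GPU

        double total = 0.0;
        for (int gpus = 1; gpus <= 4; ++gpus) {
            total += singleGpuFps * addedGain[gpus - 1];
            std::printf("%d GPU(s): ~%.0f fps (%.2fx scaling)\n",
                        gpus, total, total / singleGpuFps);
        }
        return 0;
    }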

  12. #112
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by gojirasan View Post
    Nah. I'm not buying that. You think Nvidia is just going to stand by while AMD takes the lead until they get to 40nm? No way. The 7950GX2 was a 7900GT sandwich at 90nm. There was no die shrink necessary to make that. Sure. Cooling will be a problem, but it always is with the GX2 cards. It's not like Nvidia has to even engineer a new card. They just need to stick two cards together and put them on a single PCIe slot. Nvidia is going to take back their lead no later than spring 2009 and a GX2 card is precisely how they are going to win it back.
    How do you propose dissipating 450W+ of heat with a dual-slot cooler?
    The limit for current dual-slot coolers is obviously right around a 250W TDP.
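    As a back-of-the-envelope check on why that limit exists (the temperatures below are assumptions, not vendor specs): the cooler's thermal resistance has to stay under (max GPU temp - case air temp) / TDP, so roughly doubling the TDP halves the thermal resistance you're allowed.

    Code:
    #include <cstdio>

    // Required heatsink thermal resistance (degrees C per watt) for a
    // given TDP. The temperature figures are illustrative assumptions.
    int main()
    {
        const double maxGpuTempC  = 100.0;  // assumed safe upper GPU temperature
        const double caseAirTempC = 45.0;   // assumed air temperature inside the case

        const double tdpWatts[] = { 250.0, 450.0 };
        for (double tdp : tdpWatts) {
            const double maxThetaCPerW = (maxGpuTempC - caseAirTempC) / tdp;
            std::printf("%.0f W TDP -> cooler must stay under %.3f C/W\n",
                        tdp, maxThetaCPerW);
        }
        return 0;
    }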

  13. #113
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Isn't ATI dropping to 45 or 40nm soon, like end of year or Q1 '09?

  14. #114
    Registered User
    Join Date
    Apr 2008
    Posts
    46
    Quote Originally Posted by nr2134 View Post
    Another sensitive NV fanboy. Its a ATI 4800 review thread. Keep your pointless crap out.
    Yeah, just move along and pray for NV to be smarter next time.

  15. #115
    Registered User
    Join Date
    Oct 2004
    Posts
    50
    Quote Originally Posted by Dark-Energy View Post
    lol at the 2900XT, beating the GX2.
    Yes... 2 fps, at a resolution no one plays.
    What about this:
    Something old ..

  16. #116
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Sorry, I play at 25x16.

  17. #117
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by Joor View Post
    What about this :
    You gonna buy a 280 to play at 1680x1050?

  18. #118
    Registered User
    Join Date
    Oct 2004
    Posts
    50
    OK, maybe 1% of the world's population will play at 2560x1600 with max details.
    The 9800GX2 is still a better card than the 2900XT.
    Something old ..

  19. #119
    Registered User
    Join Date
    Oct 2004
    Posts
    50
    Quote Originally Posted by eleeter View Post
    You gonna buy a 280 to play at 1680x1050?
    Hmm... yes, since I have a 21" widescreen monitor.
    Something old ..

  20. #120
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Quote Originally Posted by LordEC911 View Post
    How do you propose dissipating 450W+ of heat with a dual-slot cooler?
    The limit for current dual-slot coolers is obviously right around a 250W TDP.
    With a large heatpipe cooler like the TRUE or the Scythe Orochi, but engineered to attach to the video card. You have almost the same problem with GTX280 SLI, but I haven't heard of cards melting just from the stock cooling.

  21. #121
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Steam shows 88,856 widescreen users running over 24" (19.38%). Most of that is probably 30" LCDs; the 27" is cheaper, but not by much.

  22. #122
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    Quote Originally Posted by Joor View Post
    Hmm... yes, since I have a 21" widescreen monitor.
    Omg LOL. 21 inches eh??? Are you an uber leet pwnzor as well?


    You would be wasting your time, and oh so much money, buying a card to run a game at 90 FPS when the most your monitor will display is 60. This is XtremeSystems; half the people here, including myself, play at 19x12 or more.

  23. #123
    Xtreme Addict
    Join Date
    Sep 2006
    Posts
    1,038
    I love e-fights.

    Crysis won't play smoothly for me at 1680x1050 with Very High settings, so yes, bring on the next generation of video cards.
    ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ Intel i7 3770k
    ░░░░░░▄▄▄▄▀▀▀▀▀▀▀▀▄▄▄▄▄▄░░░░░░░░░ ASUS GTX680
    ░░░░░░█░░░░▒▒▒▒▒▒▒▒▒▒▒▒░░▀▀▄░░░░░ ASUS Maximun V Gene
    ░░░░░█░░░▒▒▒▒▒▒░░░░░░░░▒▒▒░░█░░░░ Mushkin 8GB Blackline
    ░░░░█░░░░░░▄██▀▄▄░░░░░▄▄▄░░░█░░░░ Crucial M4 256GB
    ░░░▀▒▄▄▄▒░█▀▀▀▀▄▄█░░░██▄▄█░░░█░░░ Hitachi Deskstar 2TB x2
    ░░█▒█▒▄░▀▄▄▄▀░░░░░░░░█░░░▒▒▒▒▒█░░ FSP 750W Gold
    ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ Fractal Arc Mini

  24. #124
    Registered User
    Join Date
    Jun 2008
    Posts
    35
    Great performance from the HD 4850, and it seems like a very efficient card; the minimum frame rates are high, even equaling the GTX260 in a couple of games. It beats the 8800 Ultra/9800GTX overall.

    Can't wait for the 4870 reviews. I think I'm getting an HD 4870 1GB GDDR5; I hope they won't cost more than $350.

    BTW, the Sapphire Radeon HD 4850 is in stock, selling for $189.

  25. #125
    Xtreme Mentor
    Join Date
    Jul 2004
    Posts
    3,247
    AMD Radeon HD 4850 512MB Preview - RV770 Discovered
    http://www.pcper.com/article.php?aid=579

