Page 27 of 91
Results 651 to 675 of 2268

Thread: The ATI Radeon 5XXX Thread

  1. #651
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by Hornet331 View Post
I'm not talking about xxxhead2go (that's the cheap consumer stuff), but their graphics cards.
They offer up to 4 displays per card (and have for several years now), with the option to use 2 cards in one PC, making it possible to drive 8 displays.
All of that was already available back in 2007.




Love it when people just pop in with no clue what the discussion is about and say "lulz, Matrox, ze suck"...
    Matrox cards cannot span 3D across multiple displays.

    hm.. 3x TH2G + 5870?

    All along the watchtower the watchmen watch the eternal return.

  2. #652
    Xtreme Cruncher
    Join Date
    Nov 2002
    Location
    Belgium
    Posts
    605
How much shorter is the HD 5850 estimated to be compared to the 5870?


    Main rig 1: Corsair Carbide 400R 4x120mm Papst 4412GL - 1x120mm Noctua NF-12P -!- PC Power&Cooling Silencer MK III 750W Semi-Passive PSU -!- Gigabyte Z97X-UD5H -!- Intel i7 4790K -!- Swiftech H220 pull 2x Papst 4412 F/2GP -!- 4x4gb Crucial Ballistix Tactical 1866Mhz CAS9 1.5V (D9PFJ) -!- 1Tb Samsung 840 EVO SSD -!- AMD RX 480 to come -!- Windows 10 pro x64 -!- Samsung S27A850D 27" + Samsung 2443BW 24" -!- Sennheiser HD590 -!- Logitech G19 -!- Microsoft Sidewinder Mouse -!- Fragpedal -!- Eaton Ellipse MAX 1500 UPS .





  3. #653
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    107
    Quote Originally Posted by cegras View Post
    Just to quote some 'facts' (since I'm sure people will question the validity):

    Eyefinity transparently handles the combinations of monitors as one virtual monitor.

They allow economical stacking of monitors instead of paying exponentially more for screen area. The human eye is more responsive to horizontal change than to vertical, and with an odd-by-odd grid of monitors the bezels at the edge of your vision will vanish with play, since it has been shown that static objects in your periphery fade from perception over time.

    http://www.silentpcreview.com/article746-page3.html

A 19" monitor consumes under 30 watts, so adding two more brings the total to ~90 W. I'm not sure how that is significant. By comparison, a 30" Dell consumes:

    http://support.dell.com/support/edoc...P/EN/about.htm

    147 W.

    HM.
The Dell is a 30" H-IPS panel. A typical 26" IPS panel consumes 135 W, and a typical 20" IPS panel around 80 W. You are comparing a cream-of-the-crop panel of the sort seen in very high-end televisions against a low-end computer display, built on a panel technology so bad nobody would ever consider selling it in the television market.
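For what it's worth, the arithmetic both sides are leaning on is simple enough to write down. The wattages below are just the figures quoted in this exchange (rated, not measured):

```python
# Wattage figures as quoted in the posts above (rated, not measured).
small_19in_w = 30      # one 19" monitor: under ~30 W
dell_30in_w = 147      # 30" Dell H-IPS, per its user guide

triple_19in_w = 3 * small_19in_w        # three 19" panels together
print(triple_19in_w)                    # 90
print(dell_30in_w - triple_19in_w)      # 57 W in favor of the triple setup
```

So even tripling up on small panels undercuts a single 30" IPS on paper; real draw varies with panel type and brightness.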
    Last edited by astrallite; 09-13-2009 at 12:27 AM.

  4. #654
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Quote Originally Posted by CrimInalA View Post
    How much shorter is the HD 5850 estimated compared to a 5870 ?
Most likely, both the 5850 and 5870 use the same circuit board.
    Last edited by largon; 09-13-2009 at 12:43 AM. Reason: typo
    You were not supposed to see this.

  5. #655
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by largon View Post
    I spotted this on Wiki - dunno where this pic came from and if it's valid, but FWIW;



    edit:
Looks like a fake, tbh.
    I guess it doesn't let you link to quotes in closed threads.

    Largon, Here is what I said about it:
...similar to dual-core, yet not actually two dies in one package a la MCM). Perhaps there is no longer a real-time compiler in the driver and it's all handled at the hardware level by the scheduler. Because the core of the chip is so modular and scalable, with certain areas sharing parts of the die (ROPs + memory controller logic), you can divide the specs in half (1600/256/80/32 to 800/128/40/16) and get two parts (RV870/RV830), and RV870 appears as two RV830s, yet it is still only one die. Hence the term "dual-core", though "modular" fits better. Some people think: yes, but aren't all GPUs multi-core, since each shader is like a core by itself? Well, here we are dealing with two large arrays of 800 shaders, along with other standalone logic that communicates with either of the two identical arrays. More specifically, one RV870 die is composed of two RV830-like 160x5 clusters/shader arrays (like two current-gen revamped RV770s in one die) sharing certain features but connected via "internal crossfire", working in unison, and the entire design is a continuation of the R600 architecture. It is load-balanced, efficient, and requires no CrossFire software optimization (because the communication happens at the hardware level); it works via SFR 8x8-32x32 and is bandwidth-intensive. The board uses next-gen 5 GHz GDDR5 to provide the required bandwidth. So, apparently they've slapped together two 40nm RV770s... it's easy to see where the "dual-core" confusion comes from. Cypress is like a 40nm 4890X2 in one die! So, it is like a "native dual-core" CPU...
To me it sounds like a "dual-core GPU", because even though there are 1600 shaders/stream processors, they are in two groups of 800: 10 SIMDs x 2. That's all. A guy over at the HardOCP forums called it "dual shader mega clusters", CJ at Tweakers called it "dual shader", and Theo at BSN called it a "dual shader engine". Dozens of posts over at Chiphell call it a "native dual-core GPU". Occasionally someone comes into the Beyond3D thread and declares it to be "native dual-core". One guy brought up a point earlier and showed a GT200 die shot, asking "well, I guess you would call this quad-core?" (because it looks like 4 quadrants of unequal size, or two equal halves). Well, you could cut it in half down the middle and seem to have two identical parts, but are those 2 parts rendering via SFR internally, and designed to be seen as a single GPU? Probably not. There are lots of threads on "multi-core" and "interconnect" that are way more technical than what I can understand. Either these people are completely mistaken, or it is actually using SFR internally and sharing the MC. Remember, we still have a bet
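For what it's worth, the AFR/SFR distinction everyone is arguing about is easy to sketch. This toy Python is purely illustrative (it is not AMD's scheduler; the two "units" just stand in for the rumored twin 800-shader arrays):

```python
def afr_assign(frames, n_units=2):
    """Alternate Frame Rendering: each unit renders whole frames in turn."""
    return {f: f % n_units for f in frames}

def sfr_assign(frame_height, n_units=2):
    """Split Frame Rendering: one frame is cut into horizontal bands,
    one band per unit, all rendered in parallel."""
    band = frame_height // n_units
    return [(i * band, (i + 1) * band if i < n_units - 1 else frame_height)
            for i in range(n_units)]

print(afr_assign(range(4)))   # {0: 0, 1: 1, 2: 0, 3: 1}
print(sfr_assign(1050))       # [(0, 525), (525, 1050)]
```

Under AFR each unit owns alternating whole frames (adding latency); under SFR both units work on the same frame at once, which is why hardware-level SFR would sidestep the per-game driver profiles CrossFire needs today.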

Here is a quote from the technical discussion of the core at Chiphell (translated, so a bit rough, but see what you make of it):
Since the thread is past 100 posts, to summarize:

We all know that RV870 is a native dual-core with shared memory, shared TMUs, and so on. These hardware innovations are just the beginning; beyond the new hardware, RV870 also brings in-depth software updates, for example the new CrossFire mode. For details, refer to savage's major technical paper; I won't repeat it here (http://bbs.chiphell.com/viewthre ... & extra = page% 3D1).
With native dual-core hardware support, RV870's biggest benefit is that CrossFire can evolve from today's AFR to the more sensible SFR (if you don't know what AFR and SFR are, see savage's great article). SFR thoroughly gets rid of the PCI-E bandwidth bottleneck that has kept CF and SLI from using SFR, so CF no longer relies on driver optimization, and 1+1 comes infinitely close to 2. This is where RV870's native dual-core architecture deeply impresses; for me at least, it is the first time in years I have felt graphics card technology take a major step forward.
Coming back to the Crysis results: RV870's strong showing there is partly luck, because neither CF nor SLI is efficient in Crysis, and RV870's SFR simply exploits that loophole to play the role of Crysis terminator. So look at RV870's performance calmly: in games where CF or SLI already scale well, RV870 cannot expect to be as lucky as in Crysis. That said, as the pioneer of dual-core technology combined with its many shared components, RV870's performance per die area and per watt will be the highest of comparable products; that is beyond doubt.

I have hastily rambled on; I had wanted to write about RV870 testing, but I got so fascinated by the architecture that I couldn't resist spilling a bit early. More content will have to wait for my future RV870 test article; let's share it then.
A quiet little question: AFR to SFR doesn't really seem like "evolution"; it is better called an inevitable trend once performance develops to a certain point.
    Last edited by jaredpace; 09-13-2009 at 12:47 AM.
    Bring... bring the amber lamps.

  6. #656
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Yet, the die is nearly a perfect square...
    You were not supposed to see this.

  7. #657
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by tajoh111 View Post
There is no hate for this technology, just skepticism about whether it will catch on. No one wants this technology to fail; some people, including myself, just don't see the benefits outweighing the costs associated with it. The biggest application of this technology, or at least how AMD is trying to present it, is gaming, since business environments, where cost is hardly an issue, already have cards or systems that can drive 3 screens or more.


Multiple-display gaming is something of a novelty: we have been introduced to something similar in the past, and it just hasn't caught on because the consumer hardly sees it as an essential part of the gaming experience.
    Snip!

The costs of monitors are a lot lower now than before.
Also, the tech is different now, so multi-screen would be much easier for PC gamers.

Before, multi-monitor was aimed more at productivity than anything else; then TripleHead2Go came on the scene, costing as much as a graphics card itself, with heavy limitations requiring a lot of planning, matching monitors, and the right resolution, which only the hardcore would bother to go through.

Hearing about it never compares to seeing it in action. Nobody would turn down having no bezels, but the experience makes up for them.

Very little ever starts off as being essential. A PC is not required at all for gaming.
The NV 3D comparison is apples to oranges, as it requires another piece of hardware between the PC and the screen that costs nearly as much as 2 new screens, is no use for anything else, and requires specific monitors.

As time goes on people go through monitors and replace them, but now they may hang onto the old ones and try some multi-screen gaming, since it will cost them nothing, plus they can use them for a multi- or extended desktop without needing UltraMon to fix the taskbar. Even second-hand monitors are cheap as chips, and many are thrown away in office refits.

At the end of the day it will be the marketing and ease of use that affect its popularity.
    Last edited by Final8ty; 09-13-2009 at 02:09 AM.

  8. #658
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by jaredpace View Post

    Largon, Here is what I said about it:


To me it sounds like a "dual-core GPU", because even though there are 1600 shaders/stream processors, they are in two groups of 800: 10 SIMDs x 2. That's all. A guy over at the HardOCP forums called it "dual shader mega clusters", CJ at Tweakers called it "dual shader", and Theo at BSN called it a "dual shader engine". Dozens of posts over at Chiphell call it a "native dual-core GPU". Occasionally someone comes into the Beyond3D thread and declares it to be "native dual-core". One guy brought up a point earlier and showed a GT200 die shot, asking "well, I guess you would call this quad-core?" (because it looks like 4 quadrants of unequal size, or two equal halves). Well, you could cut it in half down the middle and seem to have two identical parts, but are those 2 parts rendering via SFR internally, and designed to be seen as a single GPU? Probably not. There are lots of threads on "multi-core" and "interconnect" that are way more technical than what I can understand. Either these people are completely mistaken, or it is actually using SFR internally and sharing the MC. Remember, we still have a bet
So to simplify, it's RAID 0 internally with the output doubled,
allowing for better IQ and, further on, a path to Fusion for AMD.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  9. #659
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by largon View Post
    Yet, the die is nearly a perfect square...
It could be square and look like this (the SIMD sections highlighted in red are the "cores" of the "dual-core"):



    Call it, "Native dual-SIMD section gpu". HAHAHAHA
    Last edited by jaredpace; 09-13-2009 at 03:38 AM.
    Bring... bring the amber lamps.

  10. #660
    Xtreme Member
    Join Date
    Dec 2008
    Location
    Sweden
    Posts
    450
    Quote Originally Posted by Final8ty View Post
    Snip!

The costs of monitors are a lot lower now than before.
Also, the tech is different now, so multi-screen would be much easier for PC gamers.

....

As time goes on people go through monitors and replace them, but now they may hang onto the old ones and try some multi-screen gaming, since it will cost them nothing. Even second-hand monitors are cheap as chips, and many are thrown away in office refits.

At the end of the day it will be the marketing and ease of use that affect its popularity.
I agree. For most high-end gamers a screen for 400-600 dollars is a no-brainer, so how could three screens at 200 dollars each be extravagant? As tajoh said himself (although not literally), you don't need the most expensive screens for your peripheral vision, since it's nowhere near the quality of your central vision.

This technology lets users combine three different screens (even different kinds) and still have it work. For those like me who have been thinking about going multi-monitor, this is making me seriously consider three screens instead of two... And sure, the bezels on my 19" CRT are wide, but on most LCDs it's not that big of a deal.

Some here are focusing on the limitations while others are seeing the possibilities...
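The possibilities mostly come from Eyefinity presenting the group as one large surface. A hypothetical sketch of how the combined resolution works out, assuming three identical 1680x1050 panels in a row (an assumed layout, not a confirmed spec):

```python
def virtual_surface(panels):
    """Combine side-by-side panels into a single virtual desktop resolution.

    panels: list of (width, height) tuples; a single row needs equal heights.
    """
    widths, heights = zip(*panels)
    if len(set(heights)) != 1:
        raise ValueError("mixed panel heights would need letterboxing")
    return sum(widths), heights[0]

print(virtual_surface([(1680, 1050)] * 3))   # (5040, 1050)
```

The game just sees one 5040x1050 display, which is why mismatched heights (like mixing a CRT with LCDs) are the awkward case.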

  11. #661
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    Quote Originally Posted by STEvil View Post
    Matrox cards cannot span 3D across multiple displays.
    Even if it could span 3D across multiple displays, what 3D can you do with a Matrox card?

  12. #662
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
How about putting ALL 6 monitors horizontally, encircling the gamer with 360-degree vision? Holy crap, it would be awesome in zombie games to be able to run and look behind you to see the zombies chasing you. Sure, a lot of pixels would go to waste, but it'd still be cool.
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  13. #663
    Xtremely Kool
    Join Date
    Jul 2006
    Location
    UK
    Posts
    1,875
    Quote Originally Posted by marten_larsson View Post
I agree. For most high-end gamers a screen for 400-600 dollars is a no-brainer, so how could three screens at 200 dollars each be extravagant? As tajoh said himself (although not literally), you don't need the most expensive screens for your peripheral vision, since it's nowhere near the quality of your central vision.

This technology lets users combine three different screens (even different kinds) and still have it work. For those like me who have been thinking about going multi-monitor, this is making me seriously consider three screens instead of two... And sure, the bezels on my 19" CRT are wide, but on most LCDs it's not that big of a deal.

Some here are focusing on the limitations while others are seeing the possibilities...
I have seen many posts across forums from people thinking of going multi-screen purely because of ATI Eyefinity, where they never even considered it before because of needing to fork out for a TripleHead2Go and its limitations.

Forking out for the monitors and putting up with the bezels weren't really raised as reasons not to go multi-screen, because those are taken as a given.

  14. #664
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by annihilat0r View Post
How about putting ALL 6 monitors horizontally, encircling the gamer with 360-degree vision? Holy crap, it would be awesome in zombie games to be able to run and look behind you to see the zombies chasing you. Sure, a lot of pixels would go to waste, but it'd still be cool.
    Updated idea:

You put the gamer in the center and give him a rotating chair so that he can turn a full 360 degrees at will. In an FPS, for example, when you change direction, the monitor with the crosshair changes too. You place a little motor on the chair (the idea is going a bit off the map here) so that when you turn 30 degrees to the right in the game, you turn 30 degrees in real life with the chair as well, and the crosshair is ALWAYS directly in front of you. It's a whole lot more realistic: when you simply want to look back, you literally look at your back monitor, and when you want to TURN back, your chair turns too. Holy crap, it'd be amazing.
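The mapping this setup needs is just in-game yaw to a monitor index. A hypothetical sketch for a full circle of monitors (nothing here is a real API; 6 panels and a front-centered monitor 0 are assumptions):

```python
def facing_monitor(yaw_deg, n_monitors=6):
    """Which monitor of a full circle the crosshair currently sits on.

    Monitor 0 is centered at yaw 0; each monitor spans 360/n degrees.
    """
    span = 360.0 / n_monitors
    # Shift by half a span so monitor 0 is centered rather than edge-aligned.
    return int(((yaw_deg + span / 2) % 360) // span)

print(facing_monitor(0))     # 0 (straight ahead)
print(facing_monitor(180))   # 3 (directly behind)
print(facing_monitor(-45))   # 5 (over your left shoulder)
```

The same function could drive the chair motor: command it to `span * monitor` degrees so the crosshair monitor stays in front of the player.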
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  15. #665
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
Wow, I think you guys are on to something. Sounds like virtual reality. You'd need some kind of Wii-style head-tracking unit to keep track of your crosshair on the map.

Imagine a circle of 12 displays, 2 high: 24 monitors in CrossFire! Of course your headset/controller/keyboard would have to be wireless if you plan on doing a lot of 180s, haha!
    Bring... bring the amber lamps.

  16. #666
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    Quote Originally Posted by annihilat0r View Post
How about putting ALL 6 monitors horizontally, encircling the gamer with 360-degree vision? Holy crap, it would be awesome in zombie games to be able to run and look behind you to see the zombies chasing you. Sure, a lot of pixels would go to waste, but it'd still be cool.
I want bunnies; I just hate the idea of a zombie behind me. A bunny would be much better, preferably a Playboy one.

We will see some innovation with Eyefinity as it becomes available to the mainstream.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  17. #667
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by STEvil View Post
    Matrox cards cannot span 3D across multiple displays.
That was never the point anyway... but people just like to bash Matrox for the "lulz".

  18. #668
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by Hornet331 View Post
That was never the point anyway... but people just like to bash Matrox for the "lulz".
    Well, what else do they deserve?

  19. #669
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
Back in the day Matrox cards used to be THE consumer cards for video editing and image quality... Anyway, back on topic: the 2GB HD5870 just can't come out soon enough. I'm hoping it has a decent OpenCL and DirectX Compute implementation, as I am still wary that nVidia will have the upper hand where that is concerned.
For the record, I don't really give a fig for multi-monitor capabilities, and that is what concerns me. Just get one phat 30" 2560x1600 (or whatever it is) display and give us some performance figures in Crysis, S.T.A.L.K.E.R.: Clear Sky and World in Conflict DX10 already!
    John
    Stop looking at the walls, look out the window

  20. #670
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Istantinople
    Posts
    1,574
    Quote Originally Posted by jaredpace View Post
    Wow, i think you guys are on to something. sounds like virtual reality. You'd need some kind of Wii head tracking unit on to keep track of your crosshair on the map.

    Imagine a circle of 12 displays, 2 high. 24 monitors in crossfire! of course your headset/controller/keyboard would have to be wireless if you plan on doing a lot of 180's, haha!
The greatest thing about my unique trademarked (TM) design is that you can't just snap 180 degrees back and forth in milliseconds like in every FPS, and you won't need to. Just like in real life, you won't have to change the direction you're facing in order to look back. To look back, you just look back. To turn back, you and your chair turn back along with your control device. Great or what!
    Has anyone really been far even as decided to use even go want to do look more like?
    INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"

  21. #671
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
I think there's a tiny flaw in your design: what happens when you want to look up or down?

  22. #672
    Xtreme Member
    Join Date
    Dec 2007
    Location
    Karachi, Pakistan
    Posts
    389
    Quote Originally Posted by trinibwoy View Post
I think there's a tiny flaw in your design: what happens when you want to look up or down?
    That will be solved with the Holodeck

  23. #673
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Found this closeup at HKEPC, but they shopped an ATi logo over the die...




    http://global.hkepc.com/4007
    Bring... bring the amber lamps.

  24. #674
    Xtreme Enthusiast
    Join Date
    Jan 2008
    Posts
    743
Man, I'm so excited for these new cards, and AMD has been so tight-lipped about them. Grr, in past years there would have been multiple leaked benchies by now. Makes me wonder what kind of monster the 5870X2 is going to be... an MCM chip, maybe?

  25. #675
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
But the real question is: how many out there already have a second monitor that they haven't put to good use yet? I.e., no need to actually buy another monitor, just use the spare they already have. The Steam survey already shows it's not uncommon to use more than one monitor.

