Thread: Why are the top of the line ATi cards so cheap??

  1. #1
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Quote Originally Posted by SoulsCollective View Post
    Display Port should die in a pit of burning fire.
    Why?
    HDMI = DVI repackaged.
    Legacy technology with a bundle of useless features like audio channels and... nothing else, really. You can't ignore the fact that DP is technically more advanced and more future-proof than HDMI. So, again, which is it that should die?
    Quote Originally Posted by ELItheICEman View Post
    The reason they don't usually use it is that 512-bit memory chips are really expensive.
    Of course 16 memory chips cost more than just 8.

    But then again, when using 16 chips (512-bit bus) they can use lower-density chips to reach the same capacity as an 8-chip solution. More lower-density chips means the cost per chip is lower, and as such the total cost is not actually that much more expensive.
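
    A rough back-of-the-envelope sketch of that trade-off, in Python. The per-chip prices and the 32-bit-per-chip interface width are illustrative assumptions, not real GDDR figures:

    # Rough sketch of the chip-count vs. chip-density trade-off described above.
    # Per-chip prices are hypothetical placeholders, not real GDDR pricing.
    CHIP_IO_WIDTH_BITS = 32  # assumed per-chip interface width

    configs = [
        # (description, chip_count, density_per_chip_MB, assumed_price_per_chip_usd)
        ("512-bit bus, 16 low-density chips", 16, 64, 2.50),
        ("256-bit bus,  8 high-density chips", 8, 128, 5.50),
    ]

    for desc, chips, density_mb, price in configs:
        bus_bits = chips * CHIP_IO_WIDTH_BITS
        capacity_mb = chips * density_mb
        total_cost = chips * price
        print(f"{desc}: {bus_bits}-bit bus, {capacity_mb} MB, ~${total_cost:.2f} in memory")

    Either configuration lands at 1 GB; whether the 16-chip board ends up cheaper or pricier depends entirely on how per-chip prices scale with density.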
    Last edited by largon; 08-23-2009 at 01:44 AM.
    You were not supposed to see this.

  2. #2
    L-l-look at you, hacker.
    Join Date
    Jun 2007
    Location
    Perth, Western Australia
    Posts
    4,644
    Quote Originally Posted by largon View Post
    Why?
    HDMI = DVI repackaged.
    Legacy technology with a bundle of useless features like audio channels and... nothing else, really. You can't ignore the fact that DP is technically more advanced and more future-proof than HDMI. So, again, which is it that should die?
    As I said - DisplayPort.

    HDMI has more than adequate "bandwidth" to carry digital data at resolutions even higher than what we consider "extreme" now (i.e. 2560×1600), and I didn't think anyone would call the ability to carry full 7.1 audio data over the same cable "useless" (DisplayPort has the same functionality anyway - would you call it useless in that context?). Furthermore, HDMI supports xvYCC, which we should all be using anyway.
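
    For anyone who wants numbers behind the "adequate bandwidth" claim, here's a quick sketch. The 10% blanking overhead is a rough reduced-blanking approximation, and the clock limits are the commonly quoted HDMI 1.3 and dual-link DVI figures:

    # Does 2560x1600 @ 60 Hz fit within HDMI 1.3's TMDS clock limit?
    # The 10% blanking overhead is an approximation, not an exact CVT timing.
    def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.10):
        return width * height * refresh_hz * blanking_overhead / 1e6

    HDMI_1_3_MAX_TMDS_MHZ = 340   # HDMI 1.3 per-link pixel clock limit
    DUAL_LINK_DVI_MAX_MHZ = 330   # dual-link DVI, 2 x 165 MHz

    clock = required_pixel_clock_mhz(2560, 1600, 60)
    print(f"2560x1600 @ 60 Hz needs roughly {clock:.0f} MHz of pixel clock")
    print(f"Within HDMI 1.3:      {clock <= HDMI_1_3_MAX_TMDS_MHZ}")
    print(f"Within dual-link DVI: {clock <= DUAL_LINK_DVI_MAX_MHZ}")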

    But this isn't about HDMI vs. DP, this is about DP itself - the supporters are advocating it as "HDMI for your PC", which sounds neat, but the problem is the DRM - the number of consumer-level displays which are HDCP certified is tiny compared to the number of otherwise perfectly good displays that will be needlessly and pointlessly rendered useless if DP gains wide acceptance. If Big Content (R) and Big Beige Boxes (TM) get together and enforce this standard on us, we all lose.

    The point is, it isn't needed, it isn't even desirable when ordinary DVI works just fine for even the highest resolutions available today, and really it has no purpose other than to restrict consumer rights and choice.
    Rig specs
    CPU: i7 5960X Mobo: Asus X99 Deluxe RAM: 4x4GB G.Skill DDR4-2400 CAS-15 VGA: 2x eVGA GTX680 Superclock PSU: Corsair AX1200

    Foundational Falsehoods of Creationism



  3. #3
    Registered User
    Join Date
    Jul 2009
    Location
    Rochester, NY
    Posts
    48
    Quote Originally Posted by largon View Post
    Of course 16 memory chips cost more than just 8.

    But then again, when using 16 chips (512-bit bus) they can use lower-density chips to reach the same capacity as an 8-chip solution. More lower-density chips means the cost per chip is lower, and as such the total cost is not actually that much more expensive.
    I'd have to think they'd use eight chips for a 512MB card and sixteen chips for a 1GB card - eight on the front and eight on the back. Maybe I'm wrong, but that seems to make a lot more sense to me - that way, to differentiate between the two variants in production, all you'd have to do is solder memory onto the front of the card for the 512MB cards instead of onto both sides, as you would for the 1GB cards. Otherwise you'd need two entirely different PCB designs, and that's extra project time and too much trouble from a design standpoint. But those are just my thoughts.
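
    Just to illustrate the one-PCB-two-populations idea (the per-chip density here is an assumption, not taken from any actual board layout):

    # Sketch of the "one PCB, two SKUs" idea: the same board has 16 memory pads
    # (8 front, 8 back), and the 512MB SKU simply leaves the back side unpopulated.
    # The per-chip density is hypothetical, not from an actual HD 4870 layout.
    PADS_FRONT, PADS_BACK = 8, 8
    CHIP_DENSITY_MB = 64  # assumed per-chip density

    def sku_capacity_mb(populate_back: bool) -> int:
        chips = PADS_FRONT + (PADS_BACK if populate_back else 0)
        return chips * CHIP_DENSITY_MB

    print("Front side only:", sku_capacity_mb(False), "MB")  # 512MB SKU
    print("Both sides:     ", sku_capacity_mb(True), "MB")   # 1GB SKU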

    Anyway, without looking at the layout I really don't know, but I'd have to assume they use 512-bit memory chips instead of twice as many 256-bit chips. It's a better (albeit more expensive) solution.
    Intel Core i7 920 @ 4.4GHz w/HT
    Gigabyte EX58-UD3R v1.6
    3x2GB OCZ XMP DDR3 1600MHz 8-8-8-24 1T
    2x Sapphire 4870 1GB CF @ 840/1030
    4x OCZ Vertex 30GB SSD (RAID0)
    WD VelociRaptor 300GB HDD
    2x WD Caviar Black 1TB (RAID1)
    Enermax Galaxy EVO 1250W PSU
    Custom Swiftech Water Cooling
    Antec 1200 (Custom Mod)

  4. #4
    Registered User
    Join Date
    Dec 2005
    Posts
    50
    The prices of the GTX 275 (£150), GTX 285 (£230+) and GTX 295 (£330+) are all jokes compared to the 4850 (sub-£100), 4870 (£105+), 4890 (£150+) and 4870X2 (£230+). On price/performance, ATI wins hands down versus Nvidia. I'm surprised Nvidia is still managing to charge as much. I have nothing against Nvidia - I actually prefer having an Nvidia card in my system - but you cannot justify the prices.

    I was actually surprised yesterday to see a GTX 285 at sub-£200 at MicroDirect. Of course it was a typo, as it's priced at £247 today. *sigh*

  5. #5
    Xtreme CCIE
    Join Date
    Dec 2004
    Location
    Atlanta, GA
    Posts
    3,842
    Another reason for more expensive nVidia cards is the additional features and R&D that go into them. nVidia's cards are not just about video performance; they also put R&D money into developing, maintaining and debugging CUDA and such. I don't know exactly how much they fund that, but considering the market it opens up to them (i.e. scientific computing), I would expect it to be a fair amount.

    Probably not the #1 reason, I expect, but it's another factor for sure.
    Dual CCIE (Route\Switch and Security) at your disposal. Have a Cisco-related or other network question? My PM box is always open.

    Xtreme Network:
    - Cisco 3560X-24P PoE Switch
    - Cisco ASA 5505 Firewall
    - Cisco 4402 Wireless LAN Controller
    - Cisco 3502i Access Point
