Page 2 of 3 FirstFirst 123 LastLast
Results 26 to 50 of 60

Thread: ATI’s Dual-Chip Hemlock Due Late in Q4'09+ Possible Prices

  1. #26
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    I'm pretty sure Hemlock will be a 5850x2 part, not a 5870x2 part, considering:

    2x5870 @ $399 = $798

    2x5850 @ $299 = $598

    Perkam
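The arithmetic behind this guess is just the naive "two singles" cost; a quick sketch, taking the rumored $399/$299 launch prices from this thread as assumptions, not confirmed figures:

```python
# Naive "two singles" cost for the rumored 5800-series launch prices.
# The $399 / $299 MSRPs are rumors quoted in this thread, not confirmed.
singles = {"HD 5870": 399, "HD 5850": 299}
for name, price in singles.items():
    print(f"2x {name} = ${2 * price}")
# 2x HD 5870 = $798
# 2x HD 5850 = $598
```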

  2. #27
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by ajaidev View Post
    *NEW*

    "....The top dog carries the name Radeon HD 5870X2, and we are talking about a single-PCB, dual-GPU card that will retail for a cool $599. This is still $50 cheaper than the GTX280 at the time of its debut [do you remember the outrageous $649?], but bear in mind that this is a top dog part.

    For some odd reason, the $499 bracket will remain without a card. We expect that slot will be filled with a water-cooled edition of the 5870, or more likely a 5870X2 once nVidia launches their competing products. The aforementioned Radeon HD 5870 is set to go on sale for $379-399, while the cheapest entry into the 5800 series, the Radeon HD 5850, is priced in the $279-299 bracket....."

    Source
    those prices sound way weird...
    5870x2 at $549?
    5870 at $399?

    this would be the first time in vga history that a dual gpu card doesn't cost 2x or more the price of a single gpu card...

    let's look at the current pricing:

    GTX295 500$
    GTX285 350$
    GTX275 225$
    GTX260 175$
    GTS250 125$

    4870x2 400$
    4890 200$
    4870 175$
    4850 125$
    4770 125$

    we can expect the 5870 and 5850 to beat the 285... but that doesn't mean they will cost more than a 285... i doubt ati will launch their cards at such a high price...

    Quote Originally Posted by LordEC911 View Post
    You are massively confused with your codenames...
    Xbitlab's is closer than yours but they got the lowend mixed up.
    well, thanks for clearing it up! :P

    yes, that's what those damn code names are all about, isn't it? to confuse people

    who cares what you call it; it's pretty obvious that there will be a dual rv870 at the top, then the rv870 in 2 variants, mainstream and slightly higher clocked, and then rv840/830/810 or whatever you wanna call the mainstream/entry-level parts...

    ati may change their naming strategy, but i highly doubt they'll change their design strategy...

  3. #28
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by azza21 View Post
    I very much doubt there will be an x3 gpu anytime soon. Remember, only 6 months ago microsoft made it publicly known that multi-chip gpus are not the way to go, as they make their operating systems less stable/reliable, or something along those lines. They recommend people only use single gpu solutions and that manufacturers should concentrate on making them faster.

    There is a thread about it on this forum somewhere. Now the question I've got is: if the top dog is saying such things, why would ati/nvidia want to ignore the big daddy and go for more and more gpus on a pcb... it just brings more issues.
    In fact I guarantee you guys there will be NO top-end ati/nvidia card with x3 gpus or more out before Q3 2010. some of you lot get far too carried away, and the sites don't help, as 80% are full of bull****.
    that's basically microsoft telling nvidia and ati: hey, go make better gpus... they will ignore comments like that for sure, or reply with: hey, go make a better os

  4. #29
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by saaya View Post
    that's basically microsoft telling nvidia and ati: hey, go make better gpus... they will ignore comments like that for sure, or reply with: hey, go make a better os
    CPUs and GPUs have been getting 2x as strong every 18 months

    Operating Systems have been consuming 20x the resources every 48 months

    guess which one has the problem with their laws?
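Taking the two rules of thumb above at face value (the "20x every 48 months" figure is the poster's hyperbole, not a measured number), the implied annual growth rates can be compared directly:

```python
# Annualized growth factors implied by the two rules of thumb above.
# Both inputs are the poster's rough figures, not measured data.
hardware = 2 ** (12 / 18)   # performance doubles every 18 months
os_usage = 20 ** (12 / 48)  # resource use grows 20x every 48 months
print(f"hardware: {hardware:.2f}x per year")  # ~1.59x
print(f"OS usage: {os_usage:.2f}x per year")  # ~2.11x
```

By that back-of-the-envelope math, OS resource demand outgrows hardware every single year.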

  5. #30
    Xtreme Member
    Join Date
    Jun 2006
    Location
    Long Island, NY
    Posts
    387
    Quote Originally Posted by Manicdan View Post
    CPUs and GPUs have been getting 2x as strong every 18 months

    Operating Systems have been consuming 20x the resources every 48 months

    guess which one has the problem with their laws?
    Ohhh I know!! Pick me! I know the answer!!!!
    Core i7 940:Asus Rampage II:HIS ATI 5870:6GB (3 x 2GB):2 x 1.5TB:2 x 750GB: assorted 200-500GB drives:CoolerMaster HAF932:Thermalright IFX-14

    MacBookPro:2.33 Ghz Core2Duo:2GB DDR2:250GB 7200RPM

    Mac Mini 1.5 core solo with a 2.33Ghz Merom, 2GB's of Muskin ram and a 250GB WD HDD

    Home Theater:
    Marantz SR7002:KEF iQ9's, iQ5, and iQ8's:No Sub

    2-Channel:
    KEF XQ-20 Khaya Mahogany:McIntosh Labs MA-6100:Monster M Series Bi-wire 2-4

  6. #31
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    Quote Originally Posted by Manicdan View Post
    CPUs and GPUs have been getting 2x as strong every 18 months

    Operating Systems have been consuming 20x the resources every 48 months

    guess which one has the problem with their laws?
    How do you figure this?

    Vista wasn't obscenely slower than XP, considering the time gap. And 7 is actually faster than Vista by most accounts.

    The Mac OS X updates also don't seem to have any major impact on performance from update to update.

  7. #32
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    Quote Originally Posted by Sly Fox View Post
    How do you figure this?

    Vista wasn't obscenely slower than XP, considering the time gap. And 7 is actually faster than Vista by most accounts.

    The Mac OS X updates also don't seem to have any major impact on performance from update to update.
    it's just some mean math to say that windows used to work fine on 32MB of ram just a decade ago and now needs 2GB. from Win98 to XP was a big jump (around 100MB to 512MB-1GB), and from XP to Vista the cpu really needed to be upgraded or it's a slideshow without all the visuals turned off.

    no OS past Win7 should really need to take up more resources; i can't see what they could possibly need with all that space and power. but in another decade we will laugh at anyone who can't run Win9/10 on less than 10GB and GPU acceleration.

    don't read too much into this; the point was that no matter how good the hardware gets, the OS seems to be able to abuse it happily.

  8. #33
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
    Quote Originally Posted by Manicdan View Post
    it's just some mean math to say that windows used to work fine on 32MB of ram just a decade ago and now needs 2GB. from Win98 to XP was a big jump (around 100MB to 512MB-1GB), and from XP to Vista the cpu really needed to be upgraded or it's a slideshow without all the visuals turned off.

    no OS past Win7 should really need to take up more resources; i can't see what they could possibly need with all that space and power. but in another decade we will laugh at anyone who can't run Win9/10 on less than 10GB and GPU acceleration.

    don't read too much into this; the point was that no matter how good the hardware gets, the OS seems to be able to abuse it happily.
    This is the point. I would rather have the OS consume all 6GB of my memory in order to cache a lot of stuff instead of sitting idle with 5.5GB free. What does free memory do for me at idle? Nothing at all. But with a lot of data cached it will speed up my daily usage of the OS. When I need memory for an application the OS frees up the cache and I only take note that "wow, that application loaded fast".

  9. #34
    Xtreme Member
    Join Date
    Mar 2005
    Posts
    447
    Quote Originally Posted by perkam View Post
    I'm pretty sure Hemlock will be a 5850x2 part not a 5870x2 part considering:

    2x5870 @ $399 = $798

    2x5850 @ $299 = $598

    Perkam
    I don't see Hemlock being a 2x5850 part. Since day one they have talked about a high-end dual gpu board, which would logically be the 70 part, not the 50 part. Sylvia from the INQ originally reported the codenames and had Hemlock at the bottom by mistake. She was later corrected. She was spot on with all the others, with Cypress being the top-end single gpu board. Hemlock is 2 Cypresses on one board, which has been stated already in this thread. This is all based on information that's currently out, so things can change. Hemlock is a poison; it makes perfect sense to call the highest-end x2 part this, because it's going to kill NVIDIA.

    Also based on my observations and personal opinion, dual gpu boards are becoming more standard; not in the sense that everyone has one, but it's no longer uncommon or extravagant. You can't base pricing logic on the past, when x2 parts were 2x the price of the normal part. Just like quadcores carried almost a 50% premium when they came out... they aren't anymore, because they are more common. This is ati's 3rd generation in a row with an x2 product. It's logical and pretty probable that a 5870x2 won't be 2x the price of a 5870. $399 and $549, a price difference of $150, is about right. Besides, how can they charge exactly double the price when it isn't exactly double the performance, linearly?
    Iron Lung 3.0 | Intel Core i7 6800k @ 4ghz | 32gb G.SKILL RIPJAW V DDR4-3200 @16-16-16-36 | ASUS ROG STRIX X99 GAMING + ASUS ROG GeForce GTX 1070 STRIX GAMING | Samsung 960 Pro 512GB + Samsung 840 EVO + 4TB HDD | 55" Samsung KS8000 + 30" Dell u3011 via Displayport - @ 6400x2160

  10. #35
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    i don't think they would sell an x2 as "buy one, get one 60% off". AMD has said themselves you get up to 1.8x the performance with crossfire, so they will probably charge 1.8x as much for a dual. so if it's $550 for an x2, the 5870 would be spot on at $300. or the x2 will be delayed long enough for the 5870 prices to come down to $300ish
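That "charge for the scaling you actually get" idea is easy to check; a minimal sketch, taking the $550 X2 price and AMD's "up to 1.8x" CrossFire figure from the posts above as assumptions:

```python
# If the X2 is priced at the ~1.8x CrossFire scaling AMD quotes,
# a $550 X2 implies roughly a $300 single-GPU card.
x2_price = 550
scaling = 1.8               # AMD's quoted "up to 1.8x" figure
implied_single = x2_price / scaling
print(f"implied 5870 price: ${implied_single:.0f}")  # ~$306
```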

  11. #36
    Registered User
    Join Date
    Feb 2009
    Posts
    22
    new years resolution
    Westmere Gulftown
    xfired 5870x2s
    i wonder how well they will scale. some stupidly high unnecessary framerate...i cant wait
    Intel QX9650@~working on the OC..
    GA EP45-UD3P
    Refrigerated WC CPU
    x-fired XFX 4890's
    8gb Corsair Dominator ~1066
    Dual Boot Vista Ultimate 64 & W7 Ultimate 64 RTM
    Samsung 26" ToC
    Corsair HX1000

  12. #37
    Xtreme Member
    Join Date
    Jul 2009
    Location
    NY
    Posts
    225
    Quote Originally Posted by Tenknics View Post
    I don't see Hemlock being a 2x5850 part. Since day one they have talked about a high-end dual gpu board, which would logically be the 70 part, not the 50 part. Sylvia from the INQ originally reported the codenames and had Hemlock at the bottom by mistake. She was later corrected. She was spot on with all the others, with Cypress being the top-end single gpu board. Hemlock is 2 Cypresses on one board, which has been stated already in this thread. This is all based on information that's currently out, so things can change. Hemlock is a poison; it makes perfect sense to call the highest-end x2 part this, because it's going to kill NVIDIA.

    Also based on my observations and personal opinion, dual gpu boards are becoming more standard; not in the sense that everyone has one, but it's no longer uncommon or extravagant. You can't base pricing logic on the past, when x2 parts were 2x the price of the normal part. Just like quadcores carried almost a 50% premium when they came out... they aren't anymore, because they are more common. This is ati's 3rd generation in a row with an x2 product. It's logical and pretty probable that a 5870x2 won't be 2x the price of a 5870. $399 and $549, a price difference of $150, is about right. Besides, how can they charge exactly double the price when it isn't exactly double the performance, linearly?


    Lol, i didn't know Hemlock was a poison, now it all makes sense

    You have some good points and i agree it shouldn't be priced at 2x the price, but that is far from the truth anywhere in the GPU market; single GPU cards that perform 20% better can cost 2x. it's all about bleeding edge, and perhaps being able to run quadfire with only 2 pci-e slots.

  13. #38
    Xtreme Member
    Join Date
    May 2005
    Posts
    193
    Calculating the Hemlock price based on 2x the 5870 price is wrong.

    The 4870x2 was cheaper to produce than two 4870s, hence the lower price.

  14. #39
    Banned
    Join Date
    Jun 2008
    Location
    Mi
    Posts
    1,063
    Quote Originally Posted by Manicdan View Post
    i don't think they would sell an x2 as "buy one, get one 60% off". AMD has said themselves you get up to 1.8x the performance with crossfire, so they will probably charge 1.8x as much for a dual. so if it's $550 for an x2, the 5870 would be spot on at $300. or the x2 will be delayed long enough for the 5870 prices to come down to $300ish


    It's because the up-sell is so easy for people to make... for only $150 more you get more value. So ATI gets more profit from people who wouldn't normally buy SLI, but see this as an opportunity to get into a multi-GPU solution...


    As tenknics said a moment ago.

  15. #40
    I am Xtreme
    Join Date
    Dec 2007
    Posts
    7,750
    we also have historical pricing to help out: what did the 3870x2, 4870x2 and 4850x2 sell for compared to their single-gpu counterparts?

  16. #41
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,356
    Quote Originally Posted by Manicdan View Post
    it's just some mean math to say that windows used to work fine on 32MB of ram just a decade ago and now needs 2GB. from Win98 to XP was a big jump (around 100MB to 512MB-1GB), and from XP to Vista the cpu really needed to be upgraded or it's a slideshow without all the visuals turned off.

    no OS past Win7 should really need to take up more resources; i can't see what they could possibly need with all that space and power. but in another decade we will laugh at anyone who can't run Win9/10 on less than 10GB and GPU acceleration.

    don't read too much into this; the point was that no matter how good the hardware gets, the OS seems to be able to abuse it happily.
    I agree totally that software developers use good quality hardware as an excuse to be lazy.

    We'll have to agree to disagree on whether or not an OS should raise requirements over the years.

  17. #42
    Xtreme Enthusiast
    Join Date
    Jan 2007
    Location
    QLD
    Posts
    942
    Quote Originally Posted by Sly Fox View Post
    I agree totally that software developers use good quality hardware as an excuse to be lazy.
    often they do not even need that; just a guaranteed market, and then they do not care how much of a steaming pile their work is.

  18. #43
    Xtreme Addict
    Join Date
    Jul 2007
    Location
    Alberta, Canada
    Posts
    1,264
    Quote Originally Posted by Xoulz View Post
    It's because the up-sell is so easy for people to make... for only $150 more you get more value. So ATI gets more profit from people who wouldn't normally buy SLI, but see this as an opportunity to get into a multi-GPU solution...
    ^ This.

    I paid $600cnd for my X2 at the tail end of August last year, and at the time there were only 4870 512s here, and they were still 290-320cnd. So given you have 2x the vmem, that is more value than 2 4870s (and fewer slots used, slightly less load power and a fair bit less idle power usage). The biggest issue I take with X2 cards is the noise, but there are ways around that, I suppose.

    I still expect both the 5870 and 5870x2 to use the same amount of GDDR5, however (unlike the 4870 and X2). I'd be shocked if they had 2x2GB on the X2, although it could be possible.

    A $549USD MSRP would put the new X2 at the same price as last gen, so I'd consider that realistic market-wise. Unless Nvidia does in fact manage to both release GT300 in November and "wow" us with it, I doubt the 5800s will drop in price much, if at all.
    Last edited by Chickenfeed; 09-02-2009 at 04:29 PM.
    Feedanator 7.0
    CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
    LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i

  19. #44
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by Tenknics View Post
    I don't see Hemlock being a 2x5850 part. Since day one they have talked about a high-end dual gpu board, which would logically be the 70 part, not the 50 part. Sylvia from the INQ originally reported the codenames and had Hemlock at the bottom by mistake. She was later corrected. She was spot on with all the others, with Cypress being the top-end single gpu board. Hemlock is 2 Cypresses on one board, which has been stated already in this thread. This is all based on information that's currently out, so things can change. Hemlock is a poison; it makes perfect sense to call the highest-end x2 part this, because it's going to kill NVIDIA.

    Also based on my observations and personal opinion, dual gpu boards are becoming more standard; not in the sense that everyone has one, but it's no longer uncommon or extravagant. You can't base pricing logic on the past, when x2 parts were 2x the price of the normal part. Just like quadcores carried almost a 50% premium when they came out... they aren't anymore, because they are more common. This is ati's 3rd generation in a row with an x2 product. It's logical and pretty probable that a 5870x2 won't be 2x the price of a 5870. $399 and $549, a price difference of $150, is about right. Besides, how can they charge exactly double the price when it isn't exactly double the performance, linearly?
    A. The HD 5850 and 5870 are both Cypress.

    B. The HD 4870 was $299 when the HD 4870X2 was priced at $549. That's a ~$50 saving on twice the GPUs. If the same logic is applied to the HD 5870 @ $399, the HD 5870X2 would end up at $749, or $699 at most. (You can't expect to save too much money on a new process.)

    Perkam
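Perkam's last-gen rule of thumb (X2 ≈ two singles minus roughly $50) is easy to project forward; a quick sketch, where the $399 figure for the HD 5870 is the rumored price from this thread:

```python
# Project the 5870X2 price from the 4870 / 4870X2 relationship.
hd4870, hd4870x2 = 299, 549
discount = 2 * hd4870 - hd4870x2      # ~$49 off "two cards' worth"
hd5870 = 399                          # rumored launch price
projected = 2 * hd5870 - discount
print(f"projected HD 5870X2: ${projected}")  # $749
```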

  20. #45
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,463
    Quote Originally Posted by Manicdan View Post
    CPUs and GPUs have been getting 2x as strong every 18 months
    Operating Systems have been consuming 20x the resources
    I agree. I'm still using Win XP Pro. Don't care too much for the newer jumbled up power hogs with all the eye-candy.

    Quote Originally Posted by Chickenfeed View Post
    I still expect both the 5870 and 5870x2 to use the same amount of GDDR5 however ( unlike the 4870 and X2 ) I'd be shocked if they had 2x2GB on the X2 although It could be possible.
    I think they will have 1GB and 2GB 5870s, and 2x1GB and 2x2GB 5870x2s. At launch, probably only the 1GB 5870 and 1GB 5850, though.
    Last edited by jaredpace; 09-02-2009 at 05:22 PM.
    Bring... bring the amber lamps.
    [SIGPIC][/SIGPIC]

  21. #46
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Tenknics View Post
    Also based on my observations and personal opinion, dual gpu boards are becoming more standard; not in the sense that everyone has one, but it's no longer uncommon or extravagant. You can't base pricing logic on the past, when x2 parts were 2x the price of the normal part. Just like quadcores carried almost a 50% premium when they came out... they aren't anymore, because they are more common. This is ati's 3rd generation in a row with an x2 product. It's logical and pretty probable that a 5870x2 won't be 2x the price of a 5870. $399 and $549, a price difference of $150, is about right. Besides, how can they charge exactly double the price when it isn't exactly double the performance, linearly?
    how can they? well, they've been doing it for years, and so has nvidia... what you're saying makes no sense whatsoever... if you buy a vga that costs twice as much, you NEVER get twice the performance... is a core i7 975 for $999 5x faster than a core i7 920?
    is a gtx295 5x faster than a 4770 or gts250?
    subsidizing the second gpu on an X2 card is really weird... why would they do that? that makes no sense whatsoever...
    it only makes sense if you're trying to push the market to adopt dual gpu cards faster... and i doubt that's worth spending that much money to ati...

    there are people buying a dual gtx285 card from asus for $999 or even more... a 5870x2 will def blow that away, so why would ati sell it for less than the price of 2 5870s and only $50 more than a regular 295? no, this doesn't make sense... i'm not saying the pricing is wrong, but if it's right, it's pretty weird and stupid of ati...

    the only way i can make sense of the pricing is that by the end of the year, when the 5870x2 actually gets released, prices of the 5870 will probably have dropped to around half that of the 5870x2, so that's why they project the latter to be $550 at the end of the year... i read that as: we will launch the 5870 at $399 and expect it to drop to $279 by the end of the year.
    Last edited by saaya; 09-03-2009 at 04:04 AM.

  22. #47
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by maxgull View Post
    new years resolution
    Westmere Gulftown
    xfired 5870x2s
    i wonder how well they will scale. some stupidly high unnecessary framerate...i cant wait
    oh yeah, that gulftown is going to get you a huge framerate boost... NOT :P

  23. #48
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    9,060
    Quote Originally Posted by saaya View Post
    oh yeah, that gulftown is going to get you a huge framerate boost... NOT :P
    It might clock higher due to 32nm. And bigger cache is also a plus.
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  24. #49
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by zalbard View Post
    It might clock higher due to 32nm. And bigger cache is also a plus.
    how many extra fps do you think you'll get from doubling the cache? seriously, i'm curious...

    and if it clocks higher, then what?
    the new gpus will def need more cpu power to shine, but you definitely won't see a difference above the 4ghz every i7 and i5 does easily now...

    choose any game you're playing and check your fps at the highest cpu clock, then reduce the cpu clock in 200mhz steps... you'll be surprised how little cpu clock speed we need... last time i checked it was around 2.4ghz... that doesn't mean more than 2.4ghz doesn't give you more fps, it just barely scales above that...

    why do you think some 3dmark records are done with the cpu way below the max it can run? cause there's absolutely no difference in pushing it higher, so they chose a low safe speed...
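The clock-stepping experiment saaya describes is essentially probing for the point where a game stops being CPU-bound. A toy saturation model of that effect (the 2.4 GHz knee and 120 fps GPU ceiling are illustrative made-up numbers, not measurements):

```python
# Toy model: fps rises with cpu clock until the GPU becomes the
# bottleneck, then flattens. Knee and ceiling are illustrative values.
def fps(cpu_ghz, knee=2.4, gpu_limit=120.0):
    cpu_bound = gpu_limit * (cpu_ghz / knee)  # CPU-limited framerate
    return min(cpu_bound, gpu_limit)          # capped by the GPU

for ghz in (1.6, 2.0, 2.4, 2.8, 3.2, 4.0):
    print(f"{ghz:.1f} GHz -> {fps(ghz):5.1f} fps")
```

Below the knee, each 200 MHz step buys real fps; above it the curve is flat, which is exactly why record runs don't bother pushing the CPU to its maximum.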

  25. #50
    Xtreme X.I.P. Particle's Avatar
    Join Date
    Apr 2008
    Location
    Kansas
    Posts
    3,219
    Quote Originally Posted by saaya View Post
    how many extra fps do you think you'll get from doubling the cache? seriously, i'm curious...

    and if it clocks higher, then what?
    the new gpus will def need more cpu power to shine, but you definitely won't see a difference above the 4ghz every i7 and i5 does easily now...

    choose any game you're playing and check your fps at the highest cpu clock, then reduce the cpu clock in 200mhz steps... you'll be surprised how little cpu clock speed we need... last time i checked it was around 2.4ghz... that doesn't mean more than 2.4ghz doesn't give you more fps, it just barely scales above that...

    why do you think some 3dmark records are done with the cpu way below the max it can run? cause there's absolutely no difference in pushing it higher, so they chose a low safe speed...
    That's way too generic. While that's certainly true for games like Crysis, for others, such as TF2, the CPU is king.
    Particle's First Rule of Online Technical Discussion:
    As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.

    Rule 1A:
    Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.

    Rule 2:
    When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.

    Rule 2A:
    When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.

    Rule 3:
    When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.

    Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!

    Random Tip o' the Whatever
    You just can't win. If your product offers feature A instead of B, people will moan how A is stupid and it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out A is what the market wants.

