
Thread: AMD Shanghai/Deneb Review Thread

  1. #1151
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    northern ireland
    Posts
    1,008
    Quote Originally Posted by demonkevy666 View Post
I read it lol, I must be a ninja. It said the i7 still rules the high end, all alone up there. Deneb beats the Q9300 and puts up a good fight, but with higher power consumption for the platform.

    wonder how they got higher power consumption...
That's not so good. Did they not also test against a Q8200 and a Q9400? If so, how did it fare?

  2. #1152
    Registered User
    Join Date
    Jun 2008
    Location
    Mesa, AZ
    Posts
    74
    Quote Originally Posted by gallag View Post
    did you read it all?
Yes, they compared it to an i7 920, an X4 9950, a Q9300 and a Q9400. They managed to run benchmarks at 3.6GHz for the Phenom 940. The new Phenom beat the Q9400 and Q9300 in just about every benchmark. It consumed about 50 more watts at 2D load compared to the Q9400 and Q9300, and consumed less power than an i7 920.

  3. #1153
    Xtreme Addict
    Join Date
    Dec 2007
    Location
    Hungary (EU)
    Posts
    1,376
    Quote Originally Posted by demonkevy666 View Post
I read it lol, I must be a ninja. It said the i7 still rules the high end, all alone up there. Deneb beats the Q9300 and puts up a good fight, but with higher power consumption for the platform.

    wonder how they got higher power consumption...
The Phenom II was at 3GHz but the Q9300 was at 2.5GHz (+500MHz), and they (stupid idiots) tested the Phenom with an IGP board (more wattage).

    They said that Dragon = Phenom II + 790GX + HD47xx.

  4. #1154
    Xtreme Member
    Join Date
    Dec 2004
    Location
    Italy
    Posts
    147

  5. #1155
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Lubbock, Texas
    Posts
    2,133
Nice, thanks for the link.

  6. #1156
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    288
    Quote Originally Posted by Uncle Jimbo View Post
    The Barton processors originally came out with a max 'rating' of 2600. A year later that was up to 3200. I don't know how much you understand about semiconductor manufacturing, but the process tuning to get yields of certain parts is not the same as the chip design. So if they want to shift yield balance, they might do minor tweaks to increase yields of higher speed parts. If, on the other hand, they see a bigger volume need for low power chips, and there is a market for more than the bin split gives, they may adjust the process to increase those yields.

    I think AMD is more interested in tuning the process for low power than for high speed.
DFM, process tweaks, etc. are all good, but my point is that there is no way a 4GHz Phenom II will only dissipate 90W at load. If you are implying that this first batch of PII is "tweaked for low power", then they must also be "tweaked for higher speed parts" as well, since they are running at 4GHz at 90W?? The proof is in the pudding: if this chip could run at that speed with that wattage, then we would already have seen it released as an FX chip, regardless of what they're tuning their process to do. $$$$$$

  7. #1157
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Lubbock, Texas
    Posts
    2,133
    Quote Originally Posted by qurious63ss View Post
DFM, process tweaks, etc. are all good, but my point is that there is no way a 4GHz Phenom II will only dissipate 90W at load. If you are implying that this first batch of PII is "tweaked for low power", then they must also be "tweaked for higher speed parts" as well, since they are running at 4GHz at 90W?? The proof is in the pudding: if this chip could run at that speed with that wattage, then we would already have seen it released as an FX chip, regardless of what they're tuning their process to do. $$$$$$
Who said it would only be 90W at 4GHz?

Edit: just read that review and it looks pretty good. I'm confused as to why they did the gaming tests at such a low resolution, because really, what's the point of 1024x768? Does anyone still run that? So that part of the review was just pointless IMO. The power consumption looks good too, completely different from hwbox's results. Interesting to see that the 920 and 940 use the same amount of power in standby and idle and are close at load; shows that maybe it won't heat up too badly when it overclocks. I also saw that they listed the 940 and 920 both as unlocked too, so IDK about that.
    Last edited by roofsniper; 01-07-2009 at 03:38 PM.

  8. #1158
    Xtreme Member
    Join Date
    Jul 2005
    Location
    Norway
    Posts
    319
    Quote Originally Posted by roofsniper View Post
Nice, thanks for the link.
    Yes, even though no OC was tried, this was a promising review.
    Deneb is a clear improvement over Agena.
    (The Italian conclusion was somewhat pessimistic according to Babelfish).

    Look at the good power consumption!
    Green CPU! (Idle).
    Last edited by TL1000S; 01-07-2009 at 03:40 PM.

    3DMarknn - 79506/96025/33499/25592

  9. #1159
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    288
    Quote Originally Posted by Uncle Jimbo View Post

More interesting to me is that we now have enough info to say that the Phenom II at around 4GHz, after a 10 min load (based on IoC's data), showed a rise of 17C (39-22). With a 0.18 C/W cooler, that says the chip is putting out about 95W - a really exceptional result, and about half of what the i7 965 puts out at 3.5GHz.
Unless there is a misunderstanding here, that's what I read.
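For anyone checking the arithmetic in the quoted post, here is a minimal sketch of that thermal-resistance estimate (power ≈ temperature rise divided by the cooler's C/W rating). The temperatures and the 0.18 C/W figure are the ones quoted above; the model ignores everything else, so treat the result as a rough estimate only.

```python
# Sanity check of the ~95 W figure quoted above, assuming the simple
# thermal-resistance model P = delta_T / theta. The temperatures and the
# 0.18 C/W rating are taken from the post; ambient drift, TIM/IHS resistance
# and idle power are ignored, so this is only a rough estimate.

t_load_c = 39.0       # reported load temperature (C)
t_idle_c = 22.0       # reported baseline temperature (C)
theta_c_per_w = 0.18  # quoted cooler thermal resistance (C/W)

delta_t = t_load_c - t_idle_c        # 17 C rise
power_w = delta_t / theta_c_per_w    # ~94.4 W

print(f"Estimated dissipation: {power_w:.1f} W")
```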

  10. #1160
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by TL1000S View Post
    Yes, even though no OC was tried, this was a promising review.
    Deneb is a clear improvement over Agena.

    Look at the good power consumption!
    Green CPU! (Idle).
Don't want to play party pooper, but I'm not that impressed by the power consumption; it's approximately the same as a similarly clocked Yorkfield.

But I think that's due to the fact that they used a 790FX chipset.

  11. #1161
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Lubbock, Texas
    Posts
    2,133
    Quote Originally Posted by Hornet331 View Post
Don't want to play party pooper, but I'm not that impressed by the power consumption; it's approximately the same as a similarly clocked Yorkfield.

But I think that's due to the fact that they used a 790FX chipset.
hwbox used a 79-T for power consumption as well. What I'm happy about is that, with my Phenom 9600 running at 2.3GHz, upgrading to a Phenom II at 3GHz uses less power, and from the power consumption difference between the 920 and 940 it looks like I might even be able to OC it a bit and still get lower numbers than now. It only went up 2.3W from 2.8GHz to 3GHz on the Phenom II, while on the Phenom I it goes up 3.45W from 2.5GHz to 2.6GHz.
    Last edited by roofsniper; 01-07-2009 at 03:45 PM.
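Normalising those two deltas to watts per 100MHz makes the comparison clearer; a small sketch, using only the figures quoted in the post above:

```python
# Normalising the two power deltas quoted above to watts per 100 MHz.
# The 2.3 W (2.8 -> 3.0 GHz, Phenom II) and 3.45 W (2.5 -> 2.6 GHz, Phenom)
# figures come from the post; the rest is just arithmetic.

phenom2_delta_w, phenom2_step_mhz = 2.3, 200   # 2.8 GHz -> 3.0 GHz
phenom1_delta_w, phenom1_step_mhz = 3.45, 100  # 2.5 GHz -> 2.6 GHz

print(f"Phenom II: {phenom2_delta_w / (phenom2_step_mhz / 100):.2f} W per 100 MHz")  # ~1.15 W
print(f"Phenom   : {phenom1_delta_w / (phenom1_step_mhz / 100):.2f} W per 100 MHz")  # 3.45 W
```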

  12. #1162
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    288
    Quote Originally Posted by roofsniper View Post

Edit: just read that review and it looks pretty good. I'm confused as to why they did the gaming tests at such a low resolution, because really, what's the point of 1024x768? Does anyone still run that? So that part of the review was just pointless IMO.
They test at low resolution to remove the GPU bottleneck, no?

  13. #1163
    Banned
    Join Date
    Oct 2006
    Location
    Haslett, MI
    Posts
    2,221
    Quote Originally Posted by qurious63ss View Post
Unless there is a misunderstanding here, that's what I read.
I know, it's as if he intentionally skips some posts. Anyway, did you see that 170W consumption @ 3.85GHz or so? Interestingly, the temp was around 50C at load, but it could not Prime at 4GHz, according to Coolaler.

Now if you listen to Informal, high voltage is no problem for PII, which begs the question: if it can take high voltage and runs cool at 1.6V, why won't it Prime at 4GHz? The devil in it is somewhere; don't ask me to find it.

  14. #1164
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Lubbock, Texas
    Posts
    2,133
    Quote Originally Posted by qurious63ss View Post
They test at low resolution to remove the GPU bottleneck, no?
Maybe, but still, what does that say? If you want to know gaming results, then you want to know what your CPU will get at normal resolutions. As we saw with the hwbox review at 1680x1050, Deneb got about the same performance as competing Intel CPUs and sometimes even better. Showing it at 1024x768 is a whole new story.

  15. #1165
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    288
    Quote Originally Posted by Zucker2k View Post
I know, it's as if he intentionally skips some posts. Anyway, did you see that 170W consumption @ 3.85GHz or so? Interestingly, the temp was around 50C at load, but it could not Prime at 4GHz, according to Coolaler.

Now if you listen to Informal, high voltage is no problem for PII, which begs the question: if it can take high voltage and runs cool at 1.6V, why won't it Prime at 4GHz? The devil in it is somewhere; don't ask me to find it.
Yes, it seems like the power numbers are not adding up. Reviews are coming out and, more importantly, a lot more people are going to start testing these chips, so we will know shortly what is really going on.

  16. #1166
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    288
    Quote Originally Posted by roofsniper View Post
Maybe, but still, what does that say? If you want to know gaming results, then you want to know what your CPU will get at normal resolutions. As we saw with the hwbox review at 1680x1050, Deneb got about the same performance as competing Intel CPUs and sometimes even better. Showing it at 1024x768 is a whole new story.
Yes, the hwbox numbers were the same because the GPU became the bottleneck, so those numbers actually say less about the CPU and more about the GPU. The low-resolution numbers, I think, are more informative if you want to future-proof your rig: if you upgrade to a faster card in a year or so, your CPU is less likely to become the bottleneck.
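A toy sketch of the bottleneck argument made in the post above: frame rate is roughly capped by whichever of CPU or GPU is slower, so low resolutions expose CPU differences that GPU-bound settings hide. All numbers here are invented for illustration, not benchmark results.

```python
# Toy model of the CPU-vs-GPU bottleneck argument: the frame rate is roughly
# capped by whichever component is slower. All numbers are made up purely
# for illustration; they are not benchmark results.

def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Simplified model: the slower of the two caps the frame rate."""
    return min(cpu_cap, gpu_cap)

cpu_a, cpu_b = 120.0, 150.0   # hypothetical CPU frame-rate caps
gpu_low_res = 300.0           # GPU has headroom at 1024x768
gpu_high_res = 90.0           # GPU-bound at 1920x1200

print(effective_fps(cpu_a, gpu_low_res), effective_fps(cpu_b, gpu_low_res))    # 120 vs 150 -> CPU gap visible
print(effective_fps(cpu_a, gpu_high_res), effective_fps(cpu_b, gpu_high_res))  # 90 vs 90   -> CPUs look identical
```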

  17. #1167
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Lubbock, Texas
    Posts
    2,133
    Quote Originally Posted by qurious63ss View Post
Yes, the hwbox numbers were the same because the GPU became the bottleneck, so those numbers actually say less about the CPU and more about the GPU. The low-resolution numbers, I think, are more informative if you want to future-proof your rig: if you upgrade to a faster card in a year or so, your CPU is less likely to become the bottleneck.
Not necessarily. The lower-resolution numbers don't test exactly the same things as the higher ones do. It just seems that if you are going to post gaming benchmarks, then you post what people game at. 1280x1024 was the highest I saw, and does that make me want to buy it when I'm running 1680x1050? I want to know how it performs at the resolutions that I run and the resolutions that most others run as well. If they were making the gaming benchmarks to show how future-proof it would be, you would think that they would show it at the high resolutions.

  18. #1168
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by roofsniper View Post
Not necessarily. The lower-resolution numbers don't test exactly the same things as the higher ones do. It just seems that if you are going to post gaming benchmarks, then you post what people game at. 1280x1024 was the highest I saw, and does that make me want to buy it when I'm running 1680x1050? I want to know how it performs at the resolutions that I run and the resolutions that most others run as well. If they were making the gaming benchmarks to show how future-proof it would be, you would think that they would show it at the high resolutions.
There are too many resolutions to test, and such stuff is usually part of GFX tests, not CPU tests.
E.g. I'm only interested in 1920x1200, but since at that resolution most cards are just reaching their limit, it won't tell you how good the CPU is or what you can expect if you later upgrade your GPU. (That's just my viewpoint on that topic.)

  19. #1169
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    288
    Quote Originally Posted by roofsniper View Post
    not necessarily. the lower resolution numbers don't test the same exact things as the higher ones do.
What do you mean by that? Again, I could be wrong, but at higher resolutions you're actually testing the GPU and not the CPU, since the GPU becomes the bottleneck.

  20. #1170
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by qurious63ss View Post
What do you mean by that? Again, I could be wrong, but at higher resolutions you're actually testing the GPU and not the CPU, since the GPU becomes the bottleneck.
That's unfortunately true, because most reviews only show average fps.
Minimum fps is a good indicator of a weak CPU too.
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB
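To illustrate the point about minimum fps, here is a small sketch showing how two runs with nearly the same average fps can have very different minimums. The frame times are invented for the example, not taken from any review.

```python
# Illustration of the point above: two runs can have nearly the same average
# fps while one has a far worse minimum. Frame times are invented for the
# example; they are not measurements from any review.

smooth  = [16.7] * 60                 # steady ~60 fps
stutter = [12.0] * 55 + [70.0] * 5    # fast frames plus occasional long stalls

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def min_fps(frame_times_ms):
    return 1000.0 / max(frame_times_ms)   # worst single frame

for name, run in (("smooth", smooth), ("stutter", stutter)):
    print(f"{name}: avg {avg_fps(run):.0f} fps, min {min_fps(run):.0f} fps")
```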

  21. #1171
    Xtreme Addict
    Join Date
    Jan 2008
    Location
    Lubbock, Texas
    Posts
    2,133
    Quote Originally Posted by Hornet331 View Post
There are too many resolutions to test, and such stuff is usually part of GFX tests, not CPU tests.
E.g. I'm only interested in 1920x1200, but since at that resolution most cards are just reaching their limit, it won't tell you how good the CPU is or what you can expect if you later upgrade your GPU. (That's just my viewpoint on that topic.)
Yeah, it just seems that with new GPUs coming about once a year and new CPUs about once every two years, if the CPUs were equal at the higher resolutions, then putting in a new video card wouldn't make that big of a difference, if any. And you can always overclock if your CPU becomes a bottleneck. I'm more interested in how it performs now than how it performs years from now, when most likely I'll have a new CPU anyway.
    Quote Originally Posted by qurious63ss View Post
What do you mean by that? Again, I could be wrong, but at higher resolutions you're actually testing the GPU and not the CPU, since the GPU becomes the bottleneck.
What I mean is that if a CPU wins in a benchmark at a lower resolution, that doesn't mean it will be better for gaming. I see testing at a lower resolution more as a different test than as a test of how future-proof it is: in one situation the CPU is dealing with a lot of small frames, while in the other it's dealing with far fewer, larger ones.

  22. #1172
    Banned
    Join Date
    Oct 2006
    Location
    Haslett, MI
    Posts
    2,221
    Quote Originally Posted by roofsniper View Post
Yeah, it just seems that with new GPUs coming about once a year and new CPUs about once every two years, if the CPUs were equal at the higher resolutions, then putting in a new video card wouldn't make that big of a difference, if any. And you can always overclock if your CPU becomes a bottleneck. I'm more interested in how it performs now than how it performs years from now, when most likely I'll have a new CPU anyway.

What I mean is that if a CPU wins in a benchmark at a lower resolution, that doesn't mean it will be better for gaming. I see testing at a lower resolution more as a different test than as a test of how future-proof it is: in one situation the CPU is dealing with a lot of small frames, while in the other it's dealing with far fewer, larger ones.
More proof that this guy knows nothing about what he's talking about. Sig-worthy, but I'll pass.

  23. #1173
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    288
    Quote Originally Posted by Zucker2k View Post
    Another proof this guy knows nothing about what he's talking about. Sig-worthy, but I'll pass.
That's uncalled for, man. We are all here to learn.

  24. #1174
    Xtreme Enthusiast
    Join Date
    Jun 2008
    Posts
    746
They shouldn't have used a GX board for power consumption.

They also could only overclock to 3.6... seriously? Everyone here has gotten that easily on stock cooling or worse.

They don't seem to get that you can overclock the NB, HyperTransport, etc. when overclocking Phenoms; I guess review sites are just used to Intel.

Minimum fps should be used in reviews as well, since CPU power influences it before and after overclocking from what I've seen, and fps stability is very important for gaming.

I also forgot to mention that, on power consumption, there is no way it'd be as bad as it was if they had turned on Cool'n'Quiet, which now apparently runs the processor at 1GHz and 1V and should dramatically reduce idle power consumption.
    Last edited by Caveman787; 01-07-2009 at 07:08 PM.
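A rough sketch of the Cool'n'Quiet point above, using the usual dynamic-power approximation P ∝ f·V². The 1GHz / 1V idle state is the figure quoted in the post; the full-speed clock and voltage are assumptions for illustration, and leakage is ignored.

```python
# Rough estimate of the idle savings Cool'n'Quiet could give, using the usual
# CMOS approximation P_dynamic ~ f * V^2. The 1 GHz / 1.0 V idle state is the
# figure quoted in the post; the 3.0 GHz / 1.35 V full-speed values are
# assumptions for illustration, and leakage/static power is ignored.

full_f_ghz, full_v = 3.0, 1.35   # assumed stock clock and voltage
idle_f_ghz, idle_v = 1.0, 1.00   # CnQ idle state quoted above

ratio = (idle_f_ghz * idle_v ** 2) / (full_f_ghz * full_v ** 2)
print(f"Idle dynamic power is roughly {ratio:.0%} of full-speed dynamic power")  # ~18%
```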

  25. #1175
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    Quote Originally Posted by qurious63ss View Post
    That's uncalled for man. We are all here to learn.
    If only that were true. This is the internet!

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white
