
Thread: Intel Q9450 vs Phenom 9850 - ATI HD3870 X2

  1. #476
    Xtreme Member
    Join Date
    Nov 2007
    Posts
    178
    Quote Originally Posted by JumpingJack View Post
    Nonetheless, the more interesting stuff for most people is the settings that make the game fun and full of eye candy... still working on that; I should have it in the next day or so.

    Jack

  2. #477
    Xtreme Mentor
    Join Date
    Mar 2006
    Posts
    2,978
    Quote Originally Posted by Boschwanza View Post
    Sorry, slipped by me... will work on it tomorrow.
    One hundred years from now it won't matter
    what kind of car I drove, what kind of house I lived in,
    how much money I had in the bank, nor what my clothes looked like... But the world may be a little better because I was important in the life of a child.
    -- from "Within My Power" by Forest Witcraft

  3. #478
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Jack: Yes, I see what you mean. This is just like two threads running in parallel, and all the stuff about FSB, memory and HyperTransport is on the CPU side (CPU work).
    But that also means the frame rate isn't a good measure of how smooth the game is. The frames may be a bit stale, because how up to date the picture is depends on when the CPU did its work for it.
    If the CPU decides the frame rate and it stays above 30 to 40 FPS, with the minimum FPS above 25, the game will feel smooth. But if the game averages 60 FPS and it is the GPU that limits it to that rate, the game can actually feel more unresponsive and delayed than it does at the lower, CPU-limited frame rate.
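
    To put a number on that (a minimal sketch with made-up frame times, not taken from any real benchmark): the same second of play can show a healthy average FPS while one slow frame is what you actually feel.
    Code:
    // Sketch only: invented frame times (ms) for one second of play.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<double> frameMs = {16, 16, 16, 16, 80, 16, 16, 16, 16, 16};

        double totalMs = 0.0;
        for (double ms : frameMs) totalMs += ms;

        double avgFps  = 1000.0 * frameMs.size() / totalMs;             // what the FPS counter reports
        double worstMs = *std::max_element(frameMs.begin(), frameMs.end());
        double minFps  = 1000.0 / worstMs;                              // the frame you notice

        std::printf("average %.1f FPS, worst frame %.0f ms (%.1f FPS)\n",
                    avgFps, worstMs, minFps);
        // ~44.6 FPS average looks fine, but the 80 ms frame is the stutter you feel.
        return 0;
    }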

  4. #479
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    Quote Originally Posted by gosh View Post
    But if the game averages 60 FPS and it is the GPU that limits it to that rate, the game can actually feel more unresponsive and delayed than it does at the lower, CPU-limited frame rate.

  5. #480
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    gOJDO: When the frame rate depends on the GPU, you aren't measuring the CPU. And the CPU does all of the other computer work as well; handling the mouse is one example.
    The GPU and the CPU don't know about each other; the GPU runs asynchronously. That means there is no way for the CPU to know exactly when a frame was produced, and if it doesn't know that, it can't calculate the exact time difference it needs to decide how much movement etc. should be applied.

    For this discussion, that means a GPU-limited test will mask bottlenecks in the CPU.

    w = work
    i = idle
    Code:
    Sample (only one frame is buffered):
    GPU |wwwwwwww| wwwwwwww | wwwwwwww |
    CPU |wwwiiiii| wwwiiiii | wwwiiiii |
    In this situation the picture that ends up on the screen is the one
    the CPU finished at the big W: wwWiiiii.
    
    Now something happens and the CPU needs to do extra work:
    GPU |wwwwwwww| wwwwwwwwiii | wwwwwwww |
    CPU |wwwiiiii| wwwwwwwwwww | wwwiiiii |
    Here the gap between the two displayed pictures, caused by the extra CPU
    work, is wwWiiiii wwwwwwwwwwW = 16 "units", while the GPU frame interval
    is at most 11 "units".
    
    Another example (the most extreme case) with no difference in GPU frame rate:
    GPU |wwwwwwww| wwwwwwww | wwwwwwww |
    CPU |wiiiiiii| wwwwwwww | wwwiiiii |
    Here the gap between the two displayed pictures is
    Wiiiiiii wwwwwwwW = 15 "units", while the GPU frame interval is 8 "units".
    If you buffer more than one picture this error increases.
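
    A numerical version of the second diagram, as a sketch (invented "unit" timestamps only, no real engine or API involved): each frame carries the point in game time at which the CPU finished it (the big W), and the GPU shows frames at its own pace.
    Code:
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Frame {
        double contentTime;  // game time the CPU finished this picture at (the big W)
        double presentTime;  // time the GPU actually put it on screen
    };

    int main() {
        // Rough numbers from the second diagram: the CPU needs 11 units of work
        // for the middle frame, while the GPU interval barely changes.
        std::vector<Frame> frames = {
            {3.0,  8.0},
            {19.0, 19.0},
            {22.0, 27.0},
        };

        for (std::size_t i = 1; i < frames.size(); ++i) {
            double displayInterval = frames[i].presentTime - frames[i - 1].presentTime;
            double contentStep     = frames[i].contentTime - frames[i - 1].contentTime;
            std::printf("frame %zu: display interval %.0f units, game-time step %.0f units\n",
                        i, displayInterval, contentStep);
        }
        // Prints intervals of 11 and 8 units (what an FPS counter sees), but
        // game-time steps of 16 and 3 units (the jerkiness the FPS number hides).
        return 0;
    }
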
    Last edited by gosh; 09-12-2008 at 04:10 AM.

  6. #481
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    Congratulations!!! 20 pages of BS and you are still failing to make a point.

    When I think you have posted the greatest crapload of BS ever, you come up with even more marvelous BS. I fail to see how you arrived at your theories, but if you have ever tried to understand anything about how the CPU and GPU work, you have understood it completely wrong.
    Last edited by gOJDO; 09-12-2008 at 10:05 AM.

  7. #482
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by gOJDO View Post
    Congratulations!!! 20 pages of BS and still you fail to make a point.
    The point is how playable the game is; that is what matters to the person who is going to play it.

  8. #483
    Xtreme Addict
    Join Date
    May 2005
    Posts
    1,656
    Quote Originally Posted by gosh View Post
    The point is how playable the game is; that is what matters to the person who is going to play it.
    Actually, the point is that you didn't have a clue what really happens between the CPU and GPU when you started this thread, and you still seem either very lost or very hard-headed. You should be posting a page-long thank-you to JumpingJack for the free education.
    Work Rig: Asus x58 P6T Deluxe, i7 950 24x166 1.275v, BIX2/GTZ/D5
    3x2048 GSkill pi Black DDR3 1600, Quadro 600
    PCPower & Cooling Silencer 750, CM Stacker 810

    Game Rig: Asus x58 P6T, i7 970 24x160 1.2v HT on, TRUE120
    3x4096 GSkill DDR3 1600, PNY 660ti
    PCPower & Cooling Silencer 750, CM Stacker 830

    AMD Rig: Biostar TA790GX A2+, x4 940 16x200, stock hsf
    2x2gb Patriot DDR2 800, PowerColor 4850
    Corsair VX450

  9. #484
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by gosh View Post
    The point is how playable the game is; that is what matters to the person who is going to play it.
    and that's why serious gamers go with Intel: it gives the best performance and the best experience.

  10. #485
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    Man, this thread is so awesome. Just one recommendation for Jack: stop writing that kind of post; it's not worth the effort with some guys. Save your time.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  11. #486
    Xtreme Addict
    Join Date
    Dec 2005
    Posts
    1,035
    Jack is like a mixture of Gandhi and Mother Teresa.

    After the first few gems, I have preserved my brain by not reading the walls of text written by gosh.

  12. #487
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by highoctane View Post
    Actually, the point is that you didn't have a clue what really happens between the CPU and GPU when you started this thread, and you still seem either very lost or very hard-headed. You should be posting a page-long thank-you to JumpingJack for the free education.
    Yes, I have read about how some of this is solved in hardware. But that isn't a rule the programmer works by when developing against a driver. How the video card or the driver works isn't something the programmer needs to know, and different video card vendors can implement the APIs however they like. The programmer just calls the APIs and measures time spans.

    And that isn't really what this thread is about. It's about how the CPU works and why there are so many Intel fanboys out there.
    What has been said in this thread about how the video card works boils down to this: if you buy an Intel processor, your computer will use a lot more power, because it will produce frame rates that aren't needed (max FPS and average FPS). Also, looking at how this works rules out the need for faster processors in current games (those that have been tested). Intel runs well when it finds data in the cache (almost 20 times faster than going to memory), and the frame rates in that case are sky high with a fast video card. If the processor really has to work for the game, using threads, it is a different scenario.
    The reason I am in this discussion is that I find this strange; it got me curious. Doing other types of development, it is easy to set up scenarios where AMD is better than Intel and scenarios where Intel is better than AMD. Why are people willing to spend more money on something that isn't noticeable?
    Also, most people know how Intel's processors behave: Intel has worked on their strong parts and skipped the weak parts.
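
    To illustrate "just calls the APIs and measures time spans" (a sketch only; renderFrame() and swapBuffers() here are hypothetical stand-ins, not any specific graphics API): the span the programmer can measure on the CPU side ends when the calls return, which is not necessarily when the picture reaches the screen.
    Code:
    #include <chrono>
    #include <cstdio>

    // Hypothetical placeholders for whatever the engine actually calls.
    void renderFrame() { /* issue the draw calls for one frame */ }
    void swapBuffers() { /* hand the finished frame to the driver */ }

    int main() {
        using clock = std::chrono::steady_clock;
        auto last = clock::now();

        for (int frame = 0; frame < 5; ++frame) {
            renderFrame();
            swapBuffers();   // may return long before the frame is shown

            auto now = clock::now();
            double spanMs = std::chrono::duration<double, std::milli>(now - last).count();
            last = now;

            // This CPU-side span is all the programmer gets. When the game is
            // GPU-limited the driver queues work asynchronously, so the number
            // says little about when the frame actually appeared on screen.
            std::printf("frame %d: CPU-side span %.3f ms\n", frame, spanMs);
        }
        return 0;
    }
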
    Last edited by gosh; 09-12-2008 at 01:27 PM.

  13. #488
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Oh my gosh.
    Are we riding a merry-go-round?

    Once again you're implying that AMD is faster in games, and specifically that it gives you better (higher) minimum framerates, which isn't true.

    I for one give up, nobody and nothing can change your mind.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  14. #489
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by highoctane View Post
    Actually, the point is that you didn't have a clue what really happens between the CPU and GPU when you started this thread, and you still seem either very lost or very hard-headed. You should be posting a page-long thank-you to JumpingJack for the free education.
    Yes, I have read about how some of this is solved in hardware. But that isn't a rule the programmer works by when developing against a driver. How the video card or the driver works isn't something the programmer needs to know, and different video card vendors can implement the APIs however they like. The programmer just calls the APIs and measures time spans.

    And that isn't really what this thread is about. It's about how the CPU works and why there are so many Intel fanboys out there.
    What has been said in this thread about how the video card works boils down to this: if you buy an Intel processor, your computer will use a lot more power, because it will produce frame rates that aren't needed (max FPS and average FPS). Also, looking at how this works rules out the need for faster processors in current games (those that have been tested). Intel runs well when it finds data in the cache (almost 20 times faster than going to memory), and the frame rates in that case are sky high with a fast video card. If the processor really has to work for the game, using threads, it is a different scenario.
    The reason I am in this discussion is that I find this strange; it got me curious. Doing other types of development, it is easy to set up scenarios where AMD is better than Intel and scenarios where Intel is better than AMD. Why are people willing to spend more money on something that isn't noticeable?
    Also, most people know how Intel's processors behave: Intel has worked on their strong parts and skipped the weak parts.
    Last edited by gosh; 09-12-2008 at 02:08 PM.

  15. #490
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by BenchZowner View Post
    Once again you're implying that AMD is faster in games, and specifically that it gives you better (higher) minimum framerates, which isn't true.
    If the minimum frame rate depends on the CPU and the game is threaded and advanced, the AMD will most likely produce the higher minimum FPS.

  16. #491
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by gosh View Post
    If the minimum frame rate depends on the CPU and the game is threaded and advanced, the AMD will most likely produce the higher minimum FPS.
    Doubtful, because:

    1) Clock for clock, the Core 2 CPUs are faster than the Phenoms.
    2) The operating frequencies of the Core 2 CPUs are way higher than those of AMD's Phenoms.
    3) AMD's "scaling" advantage with quad-cores isn't big enough to cover the clock-for-clock performance gap.
    4) Unfortunately, in most games (if not all), at real-life gaming settings (resolutions, game details and AA/AF), the minimum framerate depends on the graphics card, just like the average and maximum framerates.
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  17. #492
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by BenchZowner View Post
    1) Clock for clock, the Core 2 CPUs are faster than the Phenoms.
    But I can assure you that no game developer will build a game that needs a processor running at 3.0 GHz or more (too few buyers). What is very easy for the programmer to check is how the processor behaves in the game. If raw clock speed is what matters, that is noticed immediately. If the processor slows the game down, it is probably because something strange is happening: maybe the cache has to be refilled, or the latency of something else is high. That is harder for the developer to check.
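
    As a rough way to see the kind of latency difference being talked about (a sketch with arbitrary sizes; the numbers vary by machine and say nothing about any particular CPU): the same number of reads takes far longer when each one is likely to miss the cache.
    Code:
    #include <algorithm>
    #include <chrono>
    #include <cstddef>
    #include <cstdio>
    #include <numeric>
    #include <random>
    #include <vector>

    // Follow a chain of dependent loads and report the elapsed time.
    static void walk(const std::vector<std::size_t>& chain, const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        std::size_t pos = 0;
        for (std::size_t i = 0; i < chain.size(); ++i) pos = chain[pos];
        auto t1 = std::chrono::steady_clock::now();
        long long ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
        std::printf("%s: %lld ms (pos=%zu)\n", label, ms, pos);
    }

    int main() {
        const std::size_t n = std::size_t{1} << 24;   // ~16M entries, far larger than any cache
        std::vector<std::size_t> next(n);

        // Sequential chain: i -> i+1, friendly to caches and prefetchers.
        std::iota(next.begin(), next.end(), std::size_t{1});
        next[n - 1] = 0;
        walk(next, "sequential");

        // Random single-cycle chain: every load jumps somewhere unpredictable.
        std::vector<std::size_t> order(n);
        std::iota(order.begin(), order.end(), std::size_t{0});
        std::shuffle(order.begin(), order.end(), std::mt19937_64{42});
        for (std::size_t i = 0; i < n; ++i)
            next[order[i]] = order[(i + 1) % n];
        walk(next, "random");
        return 0;
    }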

  18. #493
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by gosh View Post
    Yes, I have read about how some of this is solved in hardware. But that isn't a rule the programmer works by when developing against a driver. How the video card or the driver works isn't something the programmer needs to know, and different video card vendors can implement the APIs however they like. The programmer just calls the APIs and measures time spans.

    And that isn't really what this thread is about. It's about how the CPU works and why there are so many Intel fanboys out there.
    What has been said in this thread about how the video card works boils down to this: if you buy an Intel processor, your computer will use a lot more power, because it will produce frame rates that aren't needed (max FPS and average FPS). Also, looking at how this works rules out the need for faster processors in current games (those that have been tested). Intel runs well when it finds data in the cache (almost 20 times faster than going to memory), and the frame rates in that case are sky high with a fast video card. If the processor really has to work for the game, using threads, it is a different scenario.
    The reason I am in this discussion is that I find this strange; it got me curious. Doing other types of development, it is easy to set up scenarios where AMD is better than Intel and scenarios where Intel is better than AMD. Why are people willing to spend more money on something that isn't noticeable?
    Also, most people know how Intel's processors behave: Intel has worked on their strong parts and skipped the weak parts.
    omfg... this post of yours leaves only two conclusions:
    a) you're a total AMD fanboy, or
    b) your perception doesn't even reach beyond your front door and you reject reality and substitute it with your own. (God, I finally got to use that quote.)

    So much BS in that bold highlighted part, it isn't even funny any more...

    Just in case it slipped your attention:
    Phenom consumes more power while delivering less performance than a C2Q.

    http://www.computerbase.de/artikel/h...stungsaufnahme

    This chart shows full load on both the CPU (Prime95) and the GPU ("Firefly Forest" & "Canyon Flight" looping endlessly in 3DMark06).

    Clock for clock, AMD's Phenom consumes 3% more power than a Kentsfield and 20% more than a Yorkfield... all with the same graphics card.

  19. #494
    Xtreme Mentor
    Join Date
    Nov 2006
    Location
    Spain, EU
    Posts
    2,949
    And the third option: c) He's just trolling us.

    I vote for a mixture of a) + b) + c). I can't believe it otherwise.
    Friends shouldn't let friends use Windows 7 until Microsoft fixes Windows Explorer (link)


    Quote Originally Posted by PerryR, on John Fruehe (JF-AMD) View Post
    Pretty much. Plus, he's here voluntarily.

  20. #495
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by Hornet331 View Post
    omfg... this post of yours leaves only two conclusions:
    a) you're a total AMD fanboy, or
    Well, for each AMD fanboy you will find at least 10 Intel fanboys, and they are very sensitive to hearing anything that suggests AMD could be better.

    Quote Originally Posted by Hornet331 View Post
    Just in case it slipped your attention:
    Phenom consumes more power while delivering less performance than a C2Q.
    My friend and I just tested computers at idle to see how much power they draw. He has an E6600 with a 7900 GTX; it used 140 W at idle. I had an Opteron 165 with a 7900 GTX, and it used 130 W at idle. We also tested other computers, and the odd thing is that the AMD systems often seem to draw less power at idle than the Intel ones. Now I have a Phenom 9750 with an ATI HD3850 that uses 100 W at idle, and a server with an X2 3600+ that uses 52 W. These are measurements at the wall.

    GPUs use much more power, and the CPUs (quads) aren't maxed out.
    Last edited by gosh; 09-12-2008 at 03:39 PM.

  21. #496
    Banned
    Join Date
    Aug 2008
    Posts
    1,052
    Quote Originally Posted by STaRGaZeR View Post
    And the third option: c) He's just trolling us.

    I vote for a mixture of a) + b) + c). I can't believe it otherwise.
    I have no doubt whatsoever he is trolling.

    If you are an AMDZone True Believer, as he is, then your warped perceptions can never be altered, which is why I posted the following to Jack earlier in this thread:
    Quote Originally Posted by Chad Boga View Post
    Don't you think his spamming of forums is all related to his AMD fanaticism?

    Everything he has been posting has been about trying to portray AMD in a better light.
    Jack is completely wrong to think gosh has taken in anything he has been told; all he has done is zig and zag to keep his trolling efforts alive.

    Also, I think he has taken pleasure in getting Jack to do so much legwork for nothing (at least as far as gosh is concerned), which is why he keeps writing nonsense replies, hoping Jack will keep wasting hours of his time.

  22. #497
    Banned
    Join Date
    Aug 2008
    Posts
    1,052
    Quote Originally Posted by gosh View Post
    and they are very sensitive to hearing anything that suggests AMD could be better
    Only when that something is complete and utter bullsh1t.

    If it is true, as when the K8 was better than the P4 for gaming, you will get no dispute. But when an AMDZone loopie wants to claim that, clock for clock, the K10 is better than Penryn, then of course those claims will be rubbished for the nonsense that they are.

  23. #498
    Xtreme Enthusiast
    Join Date
    May 2008
    Posts
    612
    Quote Originally Posted by Chad Boga View Post
    Also, I think he has taken pleasure in getting Jack to do so much legwork for nothing (at least as far as gosh is concerned), which is why he keeps writing nonsense replies, hoping Jack will keep wasting hours of his time.
    I think everyone has learned quite a bit in this thread. Look at what was said at the start and you will find that much of it has been cleared up later in the thread.

    The problem with talking about AMD's good sides is well known. Most forums have quite a few people who like Intel, and they will immediately jump on any AMD talk. It is even hard to ask about processors; they just can't handle it.

    I haven't seen anyone else point out that the CPU and GPU run asynchronously, which they in fact do.
    Last edited by gosh; 09-12-2008 at 04:06 PM.

  24. #499
    Xtreme X.I.P.
    Join Date
    Apr 2005
    Posts
    4,475
    Quote Originally Posted by BenchZowner View Post
    Doubtful, because:

    1) Clock for clock, the Core 2 CPUs are faster than the Phenoms.
    2) The operating frequencies of the Core 2 CPUs are way higher than those of AMD's Phenoms.
    3) AMD's "scaling" advantage with quad-cores isn't big enough to cover the clock-for-clock performance gap.
    4) Unfortunately, in most games (if not all), at real-life gaming settings (resolutions, game details and AA/AF), the minimum framerate depends on the graphics card, just like the average and maximum framerates.
    From the market's perspective:

    1) Insignificantly so in games.
    2) At a cost.
    3) Core scaling is the same as for Intel's CPUs; the platform's scaling is higher.
    4) True, though I'm not sure about the "unfortunately" part. There should be a reason to buy high-end GPUs.


    From a mod's point of view:

    I think some of you guys need to edit your posts, as per forum policy, to keep to the no-flaming, no-name-calling rule.
    Apparently these people cannot disagree in a polite manner.
    Last edited by Cooper; 09-12-2008 at 04:13 PM.

  25. #500
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by gosh View Post
    My friend and I just tested computers at idle to see how much power they draw. He has an E6600 with a 7900 GTX; it used 140 W at idle. I had an Opteron 165 with a 7900 GTX, and it used 130 W at idle. We also tested other computers, and the odd thing is that the AMD systems often seem to draw less power at idle than the Intel ones. Now I have a Phenom 9750 with an ATI HD3850 that uses 100 W at idle, and a server with an X2 3600+ that uses 52 W. These are measurements at the wall.

    GPUs use much more power, and the CPUs (quads) aren't maxed out.
    That view is so flawed it's not even funny...

    1) You compare a 2.4 GHz processor to a 1.8 GHz processor.
    2) You compare different systems with different PSUs -> different efficiency -> different results (and the differences can be quite large depending on which PSUs are used).
    3) I gave you a review that shows both CPU and GPU maxed out, with the same PSU and the same graphics card.

    It's time to paraphrase some Willy Wonka:
    It's all there, black and white, clear as crystal! You lose! Good day, sir!
    Last edited by Hornet331; 09-12-2008 at 04:18 PM.
