
Thread: Why are gpu temps better than cpu temps?

  1. #1
    Xtreme Member
    Join Date
    Jan 2009
    Posts
    100

    Why are gpu temps better than cpu temps?

So some systems have GPUs that put out more heat than the CPU.

Why are load temps better for GPUs than for CPUs?
    Last edited by homefry; 05-19-2011 at 09:32 AM.

  2. #2
    Xtreme Member
    Join Date
    Jan 2009
    Posts
    126
That's a pretty ridiculous generalization; if anything, in a standard air-cooled system the GPUs get considerably hotter under heavy load.

  3. #3
    Xtreme Member
    Join Date
    Oct 2010
    Location
    Sydney Australia
    Posts
    393
As you're posting in the water cooling forum, I presume you mean water-cooled CPU vs GPU.

    The load temp is better because the GPU chip is bigger, as is the waterblock cooling it, so you have more surface contact area for the GPU, i.e. more heat is removed over a given time because the water is in contact with more block area.

    My CPU runs at 70°C compared with 50°C for 4 GPUs running in series.
    Last edited by Phatboy69; 05-18-2011 at 10:03 PM.

    I am Intel of Borg. Resistance is futile. You will be assimilated. Borg Homeworld - Blog
    i7 3930k @ 5Ghz AC Kryos Silver - Asus Rampage IV Extreme - 4 way SLI 3GB GTX 580-UD @ 1000/2200 - 16GB Corsair GT DDR3-2000 RAM - 4 x GTX 360 rads w/ 24 x CM push-pull fans & shrouds - Aquaero 5 XT controller, 6 PA2 Ultras - 2 Flow meters - 2 x Enermax 1500W PSU - MM Extended Ascension Horizon XL-ATX Case - 2 x Koolance 452x2 Res and 4 x D5 pumps - 3 x 24" Acer H243H Surround LCD
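The contact-area argument above can be put into a quick back-of-envelope calculation. The heat transfer coefficient and block contact areas below are assumed round numbers for illustration, not measurements of any real block.

```python
# Rough sketch: temperature rise across a waterblock interface.
# Delta-T ~= Q / (h * A) for a fixed effective heat transfer
# coefficient h, so more contact area means a smaller rise.

def delta_t(power_w, h_w_per_m2k, area_m2):
    """Approximate temperature rise across the interface, in kelvin."""
    return power_w / (h_w_per_m2k * area_m2)

H = 20000.0                       # W/(m^2*K), assumed effective coefficient
cpu = delta_t(150.0, H, 0.0012)   # ~12 cm^2 assumed CPU block contact area
gpu = delta_t(250.0, H, 0.0030)   # ~30 cm^2 assumed full-cover GPU block area

print(f"CPU rise: {cpu:.2f} K, GPU rise: {gpu:.2f} K")
```

Even though the GPU dumps more watts into the loop in this toy model, the larger contact area gives it the smaller temperature rise.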

  4. #4
    Xtreme Member
    Join Date
    Mar 2011
    Location
    SoCal
    Posts
    268
    1/ Greater heat load (sometimes)
    2/ Larger die surface (i.e. Smaller Heat Flux - i.e. easier to cool)
    3/ Probe location (high gradients of temperature throughout the die surface)
    4/ IHS vs no IHS (on AMD at least..)

#2 + #3 make CPU and GPU not really comparable temperature-wise
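Point #2 can be put in numbers. The wattages and die areas below are rough assumed figures for a quad-core CPU and a large GPU, just to show the W/cm² comparison.

```python
# Heat flux: a comparable amount of power spread over a larger die
# gives a smaller W/cm^2 figure, which is easier for a block to cool.

def heat_flux(power_w, die_area_mm2):
    """Heat flux in W/cm^2 (100 mm^2 = 1 cm^2)."""
    return power_w / (die_area_mm2 / 100.0)

cpu_flux = heat_flux(130.0, 263.0)   # assumed ~263 mm^2 CPU die at 130 W
gpu_flux = heat_flux(250.0, 520.0)   # assumed ~520 mm^2 GPU die at 250 W

print(f"CPU: {cpu_flux:.1f} W/cm^2, GPU: {gpu_flux:.1f} W/cm^2")
```

With these assumed numbers the GPU draws nearly twice the power yet still has the lower heat flux, because its die is about twice the size.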

  5. #5
    Registered User
    Join Date
    Nov 2010
    Posts
    92
    Is it just me or is the question in the title the inverse of the one in the post?

  6. #6
    -100c Club
    Join Date
    Jun 2005
    Location
    Slovenia, Europe
    Posts
    2,283
    subtec: True

  7. #7
    Xtreme Member
    Join Date
    Sep 2008
    Location
    Toronto, Ontario
    Posts
    166
    Quote Originally Posted by subtec View Post
    Is it just me or is the question in the title the inverse of the one in the post?
    +1

    interesting post....
    i7 920 D0 @ 4.01ghz 1.25v
    Rampage III Extreme
    24gb Corsair Vengeance @ 1600mhz 8-8-8-24-1T
    eVGA GTX 1070 Founders Edition
    Silverstone 1000w
    Corsair 800D
    Liquid Cooled - CPU/MB - EKWB Supreme HF/Full Cover MBWB, Iandh 225 res, D5 w/XTOP rev. 2, Feser x-changer 360, Bitspower fittings



  8. #8
    Xtreme Member
    Join Date
    Jan 2009
    Posts
    100
    yeah, that was a ridiculous mistake. i've been studying too long.

  9. #9
    Xtreme Member
    Join Date
    Jan 2009
    Posts
    100
    Quote Originally Posted by Phatboy69 View Post
As you're posting in the water cooling forum, I presume you mean water-cooled CPU vs GPU.

    The load temp is better because the GPU chip is bigger, as is the waterblock cooling it, so you have more surface contact area for the GPU, i.e. more heat is removed over a given time because the water is in contact with more block area.

    My CPU runs at 70°C compared with 50°C for 4 GPUs running in series.
You would think that over time all your components would stabilize to within about 10°C of each other, not a crazy 20°C delta.

Even GPU-only blocks the same size as CPU-only blocks get better temps, so it can't just come down to the size of full-cover GPU blocks.
    Last edited by homefry; 05-19-2011 at 12:14 PM.

  10. #10
    Xtreme Member
    Join Date
    Jan 2009
    Posts
    100
    Quote Originally Posted by stephenswiftech View Post
    1/ Greater heat load (sometimes)
    2/ Larger die surface (i.e. Smaller Heat Flux - i.e. easier to cool)
    3/ Probe location (high gradients of temperature throughout the die surface)
    4/ IHS vs no IHS (on AMD at least..)

    #2 + #3 make CPU and GPU not really comparable temperature wise
1. Some systems with a 5970 and a 920 have lower temps on the GPU even though the GPU puts out more heat.
    3. This is a non-issue because most probes are within the hottest areas.
    4. The IHS should have a minimal impact, surely not the 20°C delta we see in systems.

  11. #11
    -100c Club
    Join Date
    Jun 2005
    Location
    Slovenia, Europe
    Posts
    2,283
You cannot compare two different sensors; it's like comparing apples to oranges.

  12. #12
    Xtreme Addict
    Join Date
    Jun 2007
    Posts
    1,442
    Quote Originally Posted by homefry View Post
1. Some systems with a 5970 and a 920 have lower temps on the GPU even though the GPU puts out more heat.
    3. This is a non-issue because most probes are within the hottest areas.
    4. The IHS should have a minimal impact, surely not the 20°C delta we see in systems.
It comes down to die size: the power density of CPUs is higher. 150W consumed in an area the size of one fingerprint is going to have a higher temperature gradient than 200W consumed in the size of two fingerprints. Every time the die shrinks, power density rises, so architectural improvements and lower voltages must counterbalance the rise.
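The power-density point above can be sketched numerically. The die areas, wattages, and voltages are assumed round numbers, and dynamic power is taken to scale roughly with V², per the usual C·V²·f approximation.

```python
# A die shrink at constant power doubles power density when the area
# halves; a voltage drop claws some of that back, because dynamic
# power scales roughly with V^2 (at fixed capacitance and clock).

def power_density(power_w, area_mm2):
    """Power density in W/mm^2."""
    return power_w / area_mm2

old    = power_density(150.0, 260.0)   # assumed older, larger die
shrunk = power_density(150.0, 130.0)   # same power, half the area

scaled_power = 150.0 * (1.10 / 1.30) ** 2   # assumed V drop: 1.30 -> 1.10
rescued = power_density(scaled_power, 130.0)

print(f"{old:.3f} -> {shrunk:.3f} W/mm^2, {rescued:.3f} after the V drop")
```

Halving the area doubles the W/mm² figure; the assumed voltage drop recovers part of that, but the shrunk die still runs at a higher power density than the old one.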
