
Thread: Real Temp - New temp program for Intel Core processors

  1. #4426
    Registered User
    Join Date
    Nov 2007
    Posts
    34
Thank you for giving us RT GT 3.70. Does RT now have command line switches to enable/disable C-States and Turbo?
Intel w3520(G0)@4.2Ghz@1.35V@eVGA X58 3x SLI classified, Swiftech Apogee XT ultra, 6 GB (3 x 2GB) Mushkin XP3-120800 1600MHz 6-7-6-24@1.65V, GeForce 7900GTX, 3x 74er Raptor WD740ADFD, 2x 250er WD2500YS RE, 1x 160er Samsung SV1604, Corsair HX Pro 750W, 2x HP LP2475w

  2. #4427
    Xtreme Member
    Join Date
    Sep 2011
    Location
    Canada
    Posts
    147
What's the easiest way to make the program start automatically when Windows starts?


    Carnage

    [Silverstone TJ11] [Silverstone 1200w]
    [EVGA Z77 FTW] [I5 2500k] [8gb Samsung 30nm]
    [Sapphire HD 7970]
    [Sound Blaster Titanium HD] [Sennheiser PC360]
    [Patriot Wildfire 120gb] [Seagate 1.5tb 7200pm]
    [Steelseries Sensei w/ I-2 Mousepad] [24" 1920x1200 Samsung]

  3. #4428
    Xtreme Member
    Join Date
    Dec 2004
    Location
    Austria
    Posts
    319
Windows autostart?
    1st: i7 2600K@4,6Ghz@1,28 Vcore || Asus MIVE Rev.3 || Bios 3208 || 2 x 4096 MB Corsair Vengeance || Asus GTX 580 CU II || WC with Koolance-370 || EK-VGA Supreme HF || Mora3 Pro

    2nd: i7 2600K@4,6Ghz@1,32 Vcore || Asus MIVE-Gene Z68 || EVGA GTX 280 || 2 x 4096 MB ADATA XPG G Series v2.0 || WC with - EK-Supreme HF Full Nickel || VGA HK || Mora2 Pro

    5,0 Ghz LinX_Stable_Club

  4. #4429
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Cochrane, Canada
    Posts
    2,042
    Cumulonimbus: I don't ever use the command line so I don't plan to add any command line switches to RealTemp.

    nksharp: For Windows autostart, I use the Task Scheduler.

    http://forum.notebookreview.com/hard...ml#post6865107

    If you are an Administrator on your account and you are not using UAC then try dragging a link to RealTemp into your Windows Startup folder.
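If you would rather script the Task Scheduler route, a minimal sketch in Python is below. It just shells out to the standard schtasks.exe tool; the task name and the RealTemp path are placeholders you would adjust, and the script has to be run from an elevated prompt.

Code:
import subprocess

realtemp_path = r"C:\Tools\RealTemp\RealTemp.exe"  # hypothetical install path

subprocess.run([
    "schtasks", "/Create",
    "/TN", "RealTemp",       # task name
    "/TR", realtemp_path,    # program to run at logon
    "/SC", "ONLOGON",        # trigger: user logon
    "/RL", "HIGHEST",        # run elevated so the monitoring driver can load
    "/F",                    # overwrite the task if it already exists
], check=True)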

  5. #4430
    Registered User
    Join Date
    Jan 2007
    Posts
    14
    Hi, need some help here. I just bought a used Q9450 (rest of the system is in sig) which I installed yesterday and very much intend to overclock. Upon installing the CPU I first cleaned it and also my HSF with Arcticlean, or whatever it's called, and put some AS5 on the CPU. I set all voltages in BIOS to minimum value, except the vcore and multi which I left on auto. I also set speedstep and the other CPU features to enabled, and set the FSB to 333. When I got into Windows I started Real Temp and noticed my temps were really weird, which they never were with my old 65nm E6600. I ran IBT and it seemed stable, so I started reading a bit about the C2Q sensor problems and got here. So I ran the sensor test in RT next, with the settings described above:



At this point I thought maybe something was wrong with my installation so I turned the case on its side; no difference. I then removed the HSF, cleaned it and the CPU, and then put some NT-H1 on it. At this point I ran the sensor test again, using the same settings as before, and here are the results:



    And nothing. Rebooted into BIOS, set vcore to 1.3875V (stock seems to be 1.2375V according to RT) and booted into Windows. Ran the sensor test a third time and guess what? This:



What the F is going on? What readings am I supposed to trust here, none of them?
    Asus P5E Deluxe
    WEIRD Q9450 C1 @ stock (at the moment) w/ NH-D14
    2x2GB Corsair @ 800 MHz 5-5-5-18
    Palit GTX260 SP216 Sonic 896MB @ 652/1187/1451
    Silverstone Fortress FT02B
    Corsair 620W
    36GB Raptor + 3TB Caviar Green
    64-bit Windows 7
    ViewSonic VX2025WM

  6. #4431
    Xtreme Member
    Join Date
    Feb 2004
    Location
    Home of the Sun Devils
    Posts
    360
Will Sandy Bridge-E support be coming soon to RealTemp?
    i7-3930K
    GA-x79-UD3
    Corsair H100
    4x2GB F3-12800CL6
    X-Fi Xtreme Music
    EVGA GTX 470
    Pioneer BDR-207DBK
    2X128GB Samsung 830's
    3XWD2002FYPS
    Corsair TX850
    Dell U2412M
    ___________________

  7. #4432
    Xtreme Enthusiast
    Join Date
    Nov 2008
    Posts
    877
    Quote Originally Posted by Peakr View Post
Will Sandy Bridge-E support be coming soon to RealTemp?
    SB-E IS supported. Go one page back and read.
    Maximus 5 Gene | i7-3770K @ 5GHz | ADATA 2x2GB @ 2.6GHz 9-12-10-28-1T | HD7970 @ 1200/6400
    Rampage 4 Extreme | i7-3930K @ 5GHz ||| X58-A OC Orange | i7-980X @ 4.6GHz

  8. #4433
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Cochrane, Canada
    Posts
    2,042
    Quote Originally Posted by SS_The_Demon View Post
    What readings am I supposed to trust here, none of them?
The temperature sensors that Intel used on their 45nm Core 2 processors were junk. They get stuck at lower temperatures and have a pile of error when trying to report full load temperatures. Intel was always a little sheepish about just how bad they were. I have never seen any official specs for how much error they have, but I know from experience that it is significant.

    You are correct. On some of these 45nm CPUs, you can't trust any of these sensors. Intel only intended these sensors to be used for thermal throttling and thermal shutdown control and for those two purposes, these crappy sensors are more than good enough. Intel spent a few more pennies on the Core i sensors but they are still only intended to be used for thermal control. None of them were ever intended to be used for 100% accurate temperature reporting.

    As far as I know, RealTemp should work on the Ivy CPUs too.

  9. #4434
    Registered User
    Join Date
    Jan 2007
    Posts
    14
    Quote Originally Posted by unclewebb View Post
The temperature sensors that Intel used on their 45nm Core 2 processors were junk. They get stuck at lower temperatures and have a pile of error when trying to report full load temperatures. Intel was always a little sheepish about just how bad they were. I have never seen any official specs for how much error they have, but I know from experience that it is significant.

    You are correct. On some of these 45nm CPUs, you can't trust any of these sensors. Intel only intended these sensors to be used for thermal throttling and thermal shutdown control and for those two purposes, these crappy sensors are more than good enough. Intel spent a few more pennies on the Core i sensors but they are still only intended to be used for thermal control. None of them were ever intended to be used for 100% accurate temperature reporting.

    As far as I know, RealTemp should work on the Ivy CPUs too.
So in my case, you don't think I can trust any of the readings, right? I mean, considering that I raised vcore from 1.2375 to 1.3875 and the temp hardly rose on any of the cores, with the exception of core 1 which rose 8 degrees. I would think that the temp should rise a lot more than that considering the huge increase in voltage? Not that it seems to matter, because I can't seem to get the damn thing stable at even 3.3 GHz...
    Asus P5E Deluxe
    WEIRD Q9450 C1 @ stock (at the moment) w/ NH-D14
    2x2GB Corsair @ 800 MHz 5-5-5-18
    Palit GTX260 SP216 Sonic 896MB @ 652/1187/1451
    Silverstone Fortress FT02B
    Corsair 620W
    36GB Raptor + 3TB Caviar Green
    64-bit Windows 7
    ViewSonic VX2025WM

  10. #4435
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Cochrane, Canada
    Posts
    2,042
Core 0, the core on the far left in RealTemp, is probably the best of a bad bunch but there is no way to tell for sure. The 2 cores on the left appear to have some significant slope error, where they change at a different rate than the temperature changes, and core 1 has some issues with getting stuck at lower temperatures.

    If you can't get one of these CPUs stable at 3.3 GHz then you probably have other problems besides core temperature. Run CPU-Z while Prime95 is running and see what is reported for actual voltage. Some boards are not great when it comes to overclocking Core 2 Quads and crap out when the bus speed goes over 400 MHz. I have an old Asus board that can run a Core 2 Duo reliably at a bus speed of over 500 MHz but falls on its face when trying to overclock a similar Core 2 Quad. I ended up buying a QX9650 for this board so I could overclock by adjusting the multiplier higher which allowed me to keep the bus speed at 333 MHz for better stability.
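To picture what slope error means for the numbers, here is a toy model (purely illustrative, not RealTemp's calibration math): a sensor assumed to be correct at a 35C reference point but moving only 0.8C of readout per real degree looks fine near idle and under-reports more and more as the load temperature climbs.

Code:
# Toy model of DTS slope error (illustrative only, not RealTemp's math).
# The sensor is assumed correct at reference_c but compressed elsewhere.

def reported_temp(actual_c, slope=0.8, reference_c=35.0):
    return reference_c + slope * (actual_c - reference_c)

for actual in (35, 50, 65, 80):
    shown = reported_temp(actual)
    print(f"actual {actual:3d}C -> sensor shows {shown:5.1f}C "
          f"(error {shown - actual:+.1f}C)")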

  11. #4436
    Registered User
    Join Date
    Jan 2007
    Posts
    14
    Quote Originally Posted by unclewebb View Post
Core 0, the core on the far left in RealTemp, is probably the best of a bad bunch but there is no way to tell for sure. The 2 cores on the left appear to have some significant slope error, where they change at a different rate than the temperature changes, and core 1 has some issues with getting stuck at lower temperatures.

    If you can't get one of these CPUs stable at 3.3 GHz then you probably have other problems besides core temperature. Run CPU-Z while Prime95 is running and see what is reported for actual voltage. Some boards are not great when it comes to overclocking Core 2 Quads and crap out when the bus speed goes over 400 MHz. I have an old Asus board that can run a Core 2 Duo reliably at a bus speed of over 500 MHz but falls on its face when trying to overclock a similar Core 2 Quad. I ended up buying a QX9650 for this board so I could overclock by adjusting the multiplier higher which allowed me to keep the bus speed at 333 MHz for better stability.
I just lapped the CPU, which seemed to lower the temps a bit, but again it's hard to tell since the sensors don't seem to work. Testing 3.3 GHz at the moment, using 413 x 8. Several days ago I had it running at 458 x 7, so it doesn't appear to be the board that's holding me back. Using a P5E Deluxe btw. Vdroop seems to be the biggest problem atm. Setting a voltage of 1.3875 in BIOS, which I'm currently using for 3.3 GHz, lets me idle at 1.368V according to CPU-Z. While running IBT it drops down to 1.304V, however, with LLC disabled that is. And I would prefer not to use LLC since it causes my CPU to degrade even more. And since Intel's spec says max voltage shouldn't be more than 1.3625V, I'm hesitant to raise it further, since I'm already slightly above that at idle. So sure, I probably could go higher if I wanted to, unless temps become a problem, which I have no way of knowing.
    Last edited by SS_The_Demon; 04-08-2012 at 08:57 AM.
    Asus P5E Deluxe
    WEIRD Q9450 C1 @ stock (at the moment) w/ NH-D14
    2x2GB Corsair @ 800 MHz 5-5-5-18
    Palit GTX260 SP216 Sonic 896MB @ 652/1187/1451
    Silverstone Fortress FT02B
    Corsair 620W
    36GB Raptor + 3TB Caviar Green
    64-bit Windows 7
    ViewSonic VX2025WM

  12. #4437
    Xtreme Addict
    Join Date
    Feb 2008
    Location
    Russia
    Posts
    1,910
Dear unclewebb, could you write a Mac OS version of Real Temp? Is that feasible?

    Intel Q9650 @500x9MHz/1,3V
    Asus Maximus II Formula @Performance Level=7
    OCZ OCZ2B1200LV4GK 4x2GB @1200MHz/5-5-5-15/1,8V
    OCZ SSD Vertex 3 120Gb
    Seagate RAID0 2x ST1000DM003
    XFX HD7970 3GB @1111MHz
    Thermaltake Xaser VI BWS
    Seasonic Platinum SS-1000XP
    M-Audio Audiophile 192
    LG W2486L
    Liquid Cooling System :
    ThermoChill PA120.3 + Coolgate 4x120
    Swiftech Apogee XT, Swiftech MCW-NBMAX Northbridge
    Watercool HeatKiller GPU-X3 79X0 Ni-Bl + HeatKiller GPU Backplate 79X0
    Laing 12V DDC-1Plus with XSPC Laing DDC Reservoir Top
    3x Scythe S-FLEX "F", 4x Scythe Gentle Typhoon "15", Scythe Kaze Master Ace 5,25''

Apple MacBook Pro 17" Early 2011:
    CPU: Sandy Bridge Intel Core i7 2720QM
    RAM: Crucial 2x4GB DDR3 1333
    SSD: Samsung 840 Pro 256 GB SSD
    HDD: ADATA Nobility NH13 1GB White
    OS: Mac OS X Mavericks

  13. #4438
    Registered User
    Join Date
    Nov 2006
    Posts
    9
How is the 22nm support in the latest Real Temp? I know that it reads the temperature and everything appears to be OK, but I and many others are quite surprised by the temperatures reported. I've tested it with two 3770Ks and a 3570K. On the 3770Ks I'm seeing close to 80C with just 1.35V. I'm using chilled water, so the water temp is 10C under load; in other words I have a 70C delta T between the reported temps and the water block at just 1.35V. On the 3570K I got a load temp that peaked at 68C (62.5C on average across the 4 cores) with 1.4V; with a 2600K with HT off I got an average of 37.75C at the same voltage. Is it really possible that the 22nm chips are 25C hotter on average across all 4 cores compared to the 32nm chips? See my test results below:

    Load with Prime95

[attachment: ff1dd25b_Thermal.jpeg]

@ 1.1V IB runs 4.25C hotter than SB with HT off. (16.03%)
@ 1.2V IB runs 10.75C hotter than SB with HT off. (36.44%)
@ 1.3V IB runs 15.00C hotter than SB with HT off. (44.11%)
@ 1.4V IB runs 24.75C hotter than SB with HT off. (65.56%)
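The percentages in that list appear to be the IB minus SB delta divided by the SB (HT off) average at the same voltage; the 1.4V line works out exactly against the 37.75C figure quoted above. The other SB averages would be back-calculated, so treat those as assumptions. A quick check:

Code:
# Check how the bracketed percentages above were likely computed:
# IB-minus-SB delta divided by the SB (HT off) average at that voltage.
sb_avg_1v4   = 37.75   # SB average at 1.4V, from the post
ib_delta_1v4 = 24.75   # IB minus SB delta at 1.4V, from the post

pct = ib_delta_1v4 / sb_avg_1v4 * 100
print(f"{pct:.2f}%")   # -> 65.56%, matching the figure in brackets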

Also, TJ Max is reported as 110C. Where is this taken from, and could it be that the programs are reading the sensors wrong, considering that these are all new chips that are not yet released? (Getting the same temps in Core Temp, btw.)

    Thanks.

  14. #4439
    Xtreme Mentor stasio's Avatar
    Join Date
    Jan 2008
    Location
    Malaysia
    Posts
    3,036
    TJ Max for Sandy is 98C.
    TJ Max for Ivy is reported as 105C.
Need the latest Gigabyte BIOS?
    Z370 AORUS Gaming 7,
    GA-Z97X-SOC Force ,Core i7-4790K @ 4.9 GHz
    GA-Z87X-UD3H ,Core i7-4770K @ 4.65 GHz
    G.Skill F3-2933C12D-8GTXDG @ 3100 (12-15-14-35-CR1) @1.66V
    2xSSD Corsair Force GS 128 (RAID 0), WD Caviar Black SATA3 1TB HDD,
    Evga GTS 450 SC, Gigabyte Superb 720W
    XSPC RayStorm D5 EX240 (Liquid Ultra)
    NZXT Phantom 630 Ultra Tower
    Win 7 SP1 x64;Win 10 x64

  15. #4440
    Xtreme Cruncher
    Join Date
    May 2007
    Location
    CA
    Posts
    1,885
    IB runs much hotter than SB. It's been reported everywhere. Even here.

    "Ivy Bridge exhibits much higher temperatures during full load due to its 22nm process, which will probably only get better though cooling optimizations and better contact between the HIS and the CPU Die."

    http://www.xtremesystems.org/forums/...s+Overclocking
    Cooler Master HAF 942
    Sabertooth X79
    Win7 64
    3960X @ 4805 1.376 v-core
    32GB DDR3 1866 G.SKILL Ripjaws Z
    OCZ RevoDrive 3 series RVD3-FHPX4-120G PCI-E 120GB
    3 X 6T Raid 0 Hitachi Storage
Thermaltake Tough Power 1200
    1 HD 7970

    F@H badge by xoqolat



  16. #4441
    Xtreme Mentor
    Join Date
    Oct 2004
    Location
    SLOVENIJA
    Posts
    2,594
What is the latest version?
    ASUS P5K-E // E8400 Q746A519
    G.Skill F2-8000CL5D-4GBPQ
    LC 550W GP// XPERTVISION 9600GT

  17. #4442
    Xtreme Cruncher Russ_64's Avatar
    Join Date
    Aug 2005
    Location
    London, UK
    Posts
    850
    Quote Originally Posted by SimpleTECH View Post
    I think it is this one.
    Asus Maximus VIII Ranger Z170 : Core i5-6600K : EVGA RTX2080 XC : 16Gb Corsair Vengeance DDR4-3200 : 256Gb Crucial MX500 : Corsair H100i : PCP&C 750w 60A : CM Cosmos S : Windows 10 x64
    Asus Z8NA-D6 : Dual Xeon E5645 : 24Gb DDR3-1333 ECC : MSI GTX470 : 120Gb Samsung EVO 840 : 1TB HDD : PCP&C 750w 60A : CM Stacker : DD MC-TDX, EK-FC470, RX240+RX120, D5 X-Top, BayRes : VMware ESXi 6.7.0 - VM's - WCG crunchers x 5 (Ubuntu 18.04 LTS), Mint 19, Windows 10 Insider Preview
    Sophos XG 17.5.3 running on GA-Z97-Wifi : Core i3 : 8Gb DDR3-1600 : 120Gb SSD : Corsair H80
    BenQ GW2765, Aten 4-port KVM, Asustor AS5002 4Tb NAS, Belkin 1500va UPS, Sky Fibre Max 80/20Mbps


  18. #4443
    Xtreme Member
    Join Date
    Dec 2005
    Location
    -X-
    Posts
    165
It would be nice to have a Min/Max CPU temp when hovering over the systray icon, like the GPU Min/Max. Will we see it in the next update? Thanks.
    Last edited by -X-hellfire; 06-12-2012 at 07:24 AM.
    Gigabyte P35-DQ6 - rev 1.0, F7 bios | Kentsfield Q6600 G0 - 2.4 @ 3.200 Ghz, 400x8, Vcore 1.300V | Corsair HX-620W PSU | Realtek HD audio 7.1 mb | SATA: 0-3:4x1TB Samsung Spinpoint F3 in RAID 10, 64k stripe on Intel Matrix Storage Manager with volume c:128GB, d:1.7TB, 4:250 GB Samsung SSD 840 EVO, nonraid: SATA: 5:1TB Samsung Spinpoint F3, 1TB Samsung Spinpoint F1 on Gigabyte SATA2/Jmicron | usb3:Silverstone EC04P- (1x-pcie) | SATA:Rocket 620 (4x-pcie) | XFX 8800GTS FATAL1TY 320MB RAM | Corsair XMS DDR2 PC6400 5-5-5-18 2 x 2x2048 8GB kit @ 800MHz +( default )V in bios | ThermalRight Ultra EXTREME 120 + Noctua NF-P12 120mm fan | 27" QNIX 2710LED, IBM P97 19" gone bad | Samsung SH-203N DVD; firmware SB01 | Logitech MX1000 + MX600 Laser Mouse, Comfort Cordless Keyboard | Dlink DIR-855 Firewall wireless 100/10, DWA-556 (300N) | 2 x T-Balancer XL fancontroller with 8 fans on Attenuators| 3 x Noctua NF-P12 120mm, NF-R8 80mm, CT80 80mm, 2xPanaflo 80mm | case1: CM Stacker T01 | OS: 1:Windows XP Pro, 2:64-bit 3:Win 8.1 64-bit 4:Win 7 64-bit | case2: CM HAF 932 | Corsair HX-520W PSU

  19. #4444
    Xtreme Addict
    Join Date
    Dec 2007
    Location
    Earth
    Posts
    1,787
    Hi Uncle

I would like to ask whether you have anything documenting RTCore.dll. I am writing a plugin for LCD Smartie to get CPU core temps, and I would love to use your library if that is OK with you.

    Thanks
    Sandy Bridge 2500k @ 4.5ghz 1.28v | MSI p67a-gd65 B3 Mobo | Samsung ddr3 8gb |
    Swiftech apogee drive II | Coolgate 120| GTX660ti w/heat killer gpu x| Seasonic x650 PSU

    QX9650 @ 4ghz | P5K-E/WIFI-AP Mobo | Hyperx ddr2 1066 4gb | EVGA GTX560ti 448 core FTW @ 900mhz | OCZ 700w Modular PSU |
    DD MC-TDX CPU block | DD Maze5 GPU block | Black Ice Xtreme II 240 Rad | Laing D5 Pump

  20. #4445
    Registered User
    Join Date
    Dec 2005
    Location
    Ireland
    Posts
    47
    Hi

If I remember correctly Real Temp gave an approximation of watts usage on Sandy Bridge?

Can we has this for Ivy?

    Thankies
    ________________________________________________
    LIVE LONG AND PROSPER





    AsRock P67 Fatal1ty B3
    Intel Core i7 3770k ~ 3.0 Ghz (UC) WTF @ 0.8 volts
    8GB Gskill F3 DDR3 2200 Mhz CAS9
    Ati Radeon 5870 1gb @ 1ghz ~ in water loop
    Plextor M2S 256GB
    OCZ Vertex 3 60GB
    DVD+RW Rom Drive
    Corsair AX850 Semi-Passive
    ThermalTake Kandalf Black
    Custom Water Setup 2 Passive XSPC Towers + Quad Black Ice Extreme 480 ~ DangerDen CPU Block
    HannsG 28" Widescreen TFT
    Windows 8 x64

    Some Pics https://picasaweb.google.com/115978793057748435380/CPU





    ________________________________________________

  21. #4446
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Cochrane, Canada
    Posts
    2,042
    Quote Originally Posted by Jonny5isalivetm View Post
    If I remember correctly Real Temp gave an approximation of watts usage on Sandy Bridge?
    Intel came up with a great idea so monitoring software could easily read power consumption data directly from the CPU. This data is used internally by Sandy Bridge and Ivy Bridge CPUs to control the amount of Turbo Boost. Unfortunately, this data is based on the VID voltage and not the actual core voltage. If you go into the bios and change the CPU core voltage manually, this estimated power consumption number will not be accurate. I will try to add this data back to RealTemp in the near future for Ivy Bridge CPUs.
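For anyone curious what reading that power data directly from the CPU looks like, here is a minimal sketch against the Linux /dev/cpu/*/msr interface, using the documented MSR_RAPL_POWER_UNIT (0x606) and MSR_PKG_ENERGY_STATUS (0x611) registers. RealTemp itself goes through its own Windows driver, so this is only an illustration of the register layout, and as described above the number these CPUs report tracks VID rather than the actual core voltage.

Code:
# Minimal RAPL package-energy read via the Linux msr driver
# (needs root and "modprobe msr"). Illustrative only; a Windows tool
# has to do the equivalent reads through a kernel driver.
import struct, time

MSR_RAPL_POWER_UNIT   = 0x606   # energy units in bits 12:8
MSR_PKG_ENERGY_STATUS = 0x611   # 32-bit wrapping energy counter

def read_msr(reg, cpu=0):
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        f.seek(reg)
        return struct.unpack("<Q", f.read(8))[0]

esu = (read_msr(MSR_RAPL_POWER_UNIT) >> 8) & 0x1F
joules_per_tick = 0.5 ** esu            # typically ~15.3 microjoules

e1 = read_msr(MSR_PKG_ENERGY_STATUS) & 0xFFFFFFFF
time.sleep(1.0)
e2 = read_msr(MSR_PKG_ENERGY_STATUS) & 0xFFFFFFFF

ticks = (e2 - e1) & 0xFFFFFFFF          # handle 32-bit wraparound
print(f"package power ~ {ticks * joules_per_tick:.1f} W")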

    RTCore.dll was designed to work with RivaTuner. When RivaTuner was replaced by MSI Afterburner, I stopped working on the RTCore.dll plugin. Reading the core temperature of a Core i CPU is trivial. You would be better off just writing your own code but you can try and use the RTCore.dll however you like.
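For the LCD Smartie plugin, the "trivial" read mentioned above boils down to two MSRs: IA32_TEMPERATURE_TARGET (0x1A2) holds TjMax and IA32_THERM_STATUS (0x19C) holds the digital readout below TjMax. A sketch using the Linux msr interface is below; on Windows the same two reads have to go through a kernel driver, which is what monitoring tools ship for.

Code:
# Core-i temperature read: TjMax minus the DTS digital readout.
# Uses the Linux msr driver (root + "modprobe msr"); illustrative only.
import struct

IA32_THERM_STATUS       = 0x19C   # bits 22:16 = degrees below TjMax
IA32_TEMPERATURE_TARGET = 0x1A2   # bits 23:16 = TjMax

def read_msr(reg, cpu=0):
    with open(f"/dev/cpu/{cpu}/msr", "rb") as f:
        f.seek(reg)
        return struct.unpack("<Q", f.read(8))[0]

def core_temp(cpu=0):
    tj_max  = (read_msr(IA32_TEMPERATURE_TARGET, cpu) >> 16) & 0xFF
    readout = (read_msr(IA32_THERM_STATUS, cpu) >> 16) & 0x7F
    return tj_max - readout

print(f"core 0: {core_temp(0)} C")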

  22. #4447
    Xtreme Enthusiast
    Join Date
    Nov 2008
    Posts
    877
    Quote Originally Posted by unclewebb View Post
    Intel came up with a great idea so monitoring software could easily read power consumption data directly from the CPU. This data is used internally by Sandy Bridge and Ivy Bridge CPUs to control the amount of Turbo Boost. Unfortunately, this data is based on the VID voltage and not the actual core voltage. If you go into the bios and change the CPU core voltage manually, this estimated power consumption number will not be accurate. I will try to add this data back to RealTemp in the near future for Ivy Bridge CPUs.
What if you take the power consumption, divide it by the CPU VID, and multiply it by vCore? Wouldn't that give approximately the real power?

[thinking out loud] If you read the first one, and then everything drops before you read the second one, you could get an abnormal result: very high apparent power consumption. [/thinking out loud]
    Maximus 5 Gene | i7-3770K @ 5GHz | ADATA 2x2GB @ 2.6GHz 9-12-10-28-1T | HD7970 @ 1200/6400
    Rampage 4 Extreme | i7-3930K @ 5GHz ||| X58-A OC Orange | i7-980X @ 4.6GHz

  23. #4448
    Xtreme Addict
    Join Date
    Dec 2006
    Location
    Cochrane, Canada
    Posts
    2,042
    If you were doing some Prime95 or similar testing where the VID and actual voltage were fairly consistent, you probably could come up with some sort of correction factor.

    Power consumption in a CPU is proportional to voltage squared. If VID voltage was 1.20 volts and actual voltage was 10% higher at 1.32 volts then the correction factor would be:

    1.10 x 1.10 = 1.21

    In that case, actual power consumption would be about 21% higher than what RealTemp is showing.
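As a sketch, that correction is just the reported figure scaled by the square of the voltage ratio. The 100 W input here is only an example value; in practice you would feed in the VID, the CPU-Z voltage and the reported package power.

Code:
# Scale the VID-based power estimate by the square of the voltage ratio,
# as described above. Example numbers are the ones from this post.
def corrected_power(reported_watts, vid, vcore):
    return reported_watts * (vcore / vid) ** 2

vid, vcore = 1.20, 1.32                      # actual voltage 10% above VID
print(corrected_power(100.0, vid, vcore))    # ~121 W, i.e. about 21% higher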

    Does any software show actual power consumption based on actual voltage? Any monitoring software that uses the Intel method and the Intel recommended power consumption register is not accurate.

  24. #4449
    Registered User
    Join Date
    Dec 2005
    Location
    Ireland
    Posts
    47
    Okay thanks for your efforts, excellent program :-)
    ________________________________________________
    LIVE LONG AND PROSPER





    AsRock P67 Fatal1ty B3
    Intel Core i7 3770k ~ 3.0 Ghz (UC) WTF @ 0.8 volts
    8GB Gskill F3 DDR3 2200 Mhz CAS9
    Ati Radeon 5870 1gb @ 1ghz ~ in water loop
    Plextor M2S 256GB
    OCZ Vertex 3 60GB
    DVD+RW Rom Drive
    Corsair AX850 Semi-Passive
    ThermalTake Kandalf Black
    Custom Water Setup 2 Passive XSPC Towers + Quad Black Ice Extreme 480 ~ DangerDen CPU Block
    HannsG 28" Widescreen TFT
    Windows 8 x64

    Some Pics https://picasaweb.google.com/115978793057748435380/CPU





    ________________________________________________

  25. #4450
    Registered User
    Join Date
    Jul 2008
    Posts
    9
    Hi Unclewebb, does RT support Rainmeter? If not, may I request it?
