Disclaimer

This information HAS errors and is made available WITHOUT ANY WARRANTY OF ANY KIND and without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. It is not permissible to be read by anyone who has ever met a lawyer or attorney. Use is confined to Engineers with more than 370 course hours of engineering.
If you see an error contact:
+1(785) 841 3089
inform@xtronics.com

Computer Component Standard


Components


Thermal Pads vs. Thermal Grease - a critical failure point!

Thermal pads, AKA the devil's bubblegum, are a serious failure point. Isopropyl alcohol is usually not effective at dissolving the devil's bubblegum.

After reading claims that thermal pads are as good as grease, I used the heat-sink as it came, pad and all. (I was starting to believe.) Even if the 'new' way works as well at first, after a year of cooking under an Athlon it had clearly failed. The key issue seems to be poor lifespan. Yet the film sellers claim that grease works its way out over time - something I've never seen in my 35 years of working in electronics, and they don't use pads in mil-qualified equipment that I've seen.

The dead computer had such a pad coating - but when I took it apart, the pad was no longer flexible at all, and over the CPU area it had clearly gotten VERY hard and brown. I cleaned it off with paint thinner (some recommend acetone) and had to scrape the top of the Athlon die with a screwdriver (the pad had turned very hard and brown, like burned-on food!). I put on some Arctic Silver and now it works fine. (Update: still working fine a year later.) I have never seen thermal grease fail, even at extreme temperatures that burned the circuit boards!

There are two advantages the pads or film coatings offer:

  1. They are user friendly - and provide some cushion for those who can't seem to put the heat sink on gently (end users). With a little practice you can put heat sinks on processors without stressing them. Find an old (dead) motherboard and practice! Microprocessor manufacturers specify the pads so they don't have to tell customers they are clumsy.
  2. Pads are less messy. Working with grease takes a bit of practice. On the other hand, a smudge or two of grease isn't going to hurt a thing.


Bottom line - Grease works better

The - "works it way out over time" line is just advertising FUD that has no research to back it that I know of. I don't think there is much difference between the brands of the films, but my experience designing circuit boards that used heat sinks and thermal grease with power transistors over the years gives me a some expertise that the average computer user lacks. Grease will be a thinner film - all things equal, a thin layer will have less thermal resistance than a thick layer. Next, I have seen nothing in any engineering publication that shows any problem with thermal grease other than it is messy. If you have contrary information please post it here.

The few degrees of difference for the CPU is not a big deal, BUT once a pad starts to fail, it will fail completely. This runaway failure happens because as the pad fails to transfer heat, its own temperature gets higher, which makes the pad fail even faster.
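
A toy model shows the shape of the runaway. Purely as an assumption for illustration, let the pad's resistance creep up 1% per month for every degree it runs above some 'safe' temperature - the degradation compounds:

  # Toy runaway model - every number here is an assumption chosen to
  # show the feedback shape, not a measured degradation rate.
  POWER = 60.0        # W through the interface (assumed)
  BASE = 50.0         # heat-sink side temperature, C (assumed)
  SAFE = 55.0         # pad degrades above this temperature, C (assumed)
  r = 0.12            # starting pad resistance, K/W (assumed)

  for month in range(1, 25):
      temp = BASE + POWER * r                      # interface temperature
      r *= 1.0 + 0.01 * max(0.0, temp - SAFE)      # 1%/month per excess degree
      if month % 6 == 0:
          print(f"month {month:2d}: {temp:5.1f} C, R = {r:.3f} K/W")

The first year barely moves; after that each degree of excess feeds the next, which is exactly the failure mode described above.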

The temperature difference measured in the above link was taken when both the pad and the grease were new. I will stick my neck out a bit here and guess that the 3-4 degrees C difference will increase with time, and then it very well could make a difference - random crashes, etc.

Motorola once had a design note on how to measure transistor temperature - they had us drill a tiny hole in the metal part of the transistor tab so we could mount the thermistor as close as possible to the transistor junction. A computer heat sink lies right on top of the chip, and there is no place to put a thermistor on the chip itself - but by drilling a tiny hole all the way through the heat sink, you can mount a tiny thermistor that touches the top of the chip. Packing the hole with grease and keeping it VERY small (1/16") will let you measure the temperature of the grease interface between the heat sink and the chip top. The extremely small thermistors for this are a specialty item; you would probably have to contact thermistor manufacturers to find them. I also would not use such a drilled heat sink on a production system.
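
To turn the thermistor reading into a temperature, the simple beta-parameter model is usually close enough over a narrow range. A minimal sketch in Python - the 10k / 25 deg C / beta = 3950 figures are typical NTC values, not your part's; check the datasheet:

  import math

  # Beta-parameter model for an NTC thermistor.  R0/T0/beta below are
  # typical 10k-NTC assumptions - use the values from your part's datasheet.
  def ntc_temperature_c(r_ohms, r0=10_000.0, t0_c=25.0, beta=3950.0):
      t0_k = t0_c + 273.15
      inv_t = 1.0 / t0_k + math.log(r_ohms / r0) / beta
      return 1.0 / inv_t - 273.15

  # Example: 3.3k ohms measured in the drilled hole -> about 52 C
  print(f"{ntc_temperature_c(3300):.1f} C")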

Reading chip temperature from the processor or motherboard won't have much absolute accuracy to speak of, BUT as long as you don't change the motherboard or processor, the resolution of the on-chip temperature sensor should be quite good and will let you make valid comparisons between different heat-sinks and thermal interface products. Just don't expect to be able to compare against your friend's system, even with the same hardware.
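
On Linux, a short script reading the kernel's thermal zone is enough to log the on-chip sensor before and after a heat-sink or grease change on the same machine. A minimal sketch - thermal_zone0 being the CPU is an assumption; check that zone's 'type' file on your board:

  import time

  ZONE = "/sys/class/thermal/thermal_zone0/temp"   # assumed CPU zone

  # Log one reading per second for a minute; values are millidegrees C.
  with open("temps.log", "w") as log:
      for _ in range(60):
          with open(ZONE) as f:
              milli_c = int(f.read())
          log.write(f"{time.time():.0f} {milli_c / 1000:.1f}\n")
          time.sleep(1)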

The best heat-sink compound appears to be Arctic Silver 5.


Testing

I've come up with a burn-in procedure: You can set most BIOSes to a slight overclock as a burn-in setting. I put tissue paper over the air filters to raise the power supply inlet temperature by 30 deg F. I then boot any tiny CD version of Linux and run cpuburn (https://packages.debian.org/search?keywords=cpuburn), memtest, bonnie++, and stress over the next few nights.
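
Here's a sketch of how one might script the overnight load with a thermal abort. That the stress utility is installed and that thermal_zone0 is the CPU are both assumptions to adapt:

  import subprocess
  import time

  MAX_C = 85.0                                     # assumed abort threshold
  ZONE = "/sys/class/thermal/thermal_zone0/temp"   # assumed CPU zone

  load = subprocess.Popen(["stress", "--cpu", "4"])  # four CPU workers
  try:
      for _ in range(8 * 3600):                    # one overnight run (8 h)
          with open(ZONE) as f:
              temp_c = int(f.read()) / 1000
          if temp_c > MAX_C:
              print(f"abort: {temp_c:.1f} C")
              break
          time.sleep(1)
  finally:
      load.terminate()                             # always stop the workers
      load.wait()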

Also see Breakin, Memtest, vmt (Video Memory Test), etc.


Management Engine notes

Modern computers have a management engine. Originally it was a way to keep people from burning up the processor with bad heat sinks (to save money for Intel and the like). The management engine is a separate processor that can take over and throttle the CPU and the memory-management chips. This can show up as CPU usage spiking, or as real-time applications having problems (audio, video, VoIP, machine control).
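
One rough way to catch this on Linux is to sample the cpufreq scaling frequency and flag sudden drops. A minimal sketch - the 20% threshold is an arbitrary assumption, and normal governor activity also moves the frequency, so treat hits as hints, not proof:

  import time

  FREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

  prev = None
  for _ in range(300):                       # watch for five minutes
      with open(FREQ) as f:
          khz = int(f.read())
      if prev and khz < 0.8 * prev:          # flag a >20% drop (assumed)
          print(f"possible throttle: {prev} -> {khz} kHz")
      prev = khz
      time.sleep(1)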

These management engines are on everything the public buys now - sold as improving security - but if that were the truth, this secondary processor would not have access to the hard drives, the Internet, etc. Old computers are used for machine control to get around this problem.





Email lrak@lrak.net

(C) Copyright 1994-2019, all rights reserved.
All trademarks are the property of their respective owners.