circuit breaker temperature differential

Hey Everyone,

I was trawling YouTube and came across this video. https://youtu.be/_Kl7OeTuvAY

At about 3:30 in the video he talks about the temperature of the circuit breakers. He states that if there is more than a 5-degree difference, there may be an issue.

My question is, since he doesn’t cite anything for this: does anyone have any kind of reference for why you should check for a temperature difference, or what the basis for this is?

Out of curiosity, I’ve been checking the temperature as I go on service panels and came across one worth posting here. Attached are pictures of a temperature difference.

It’s far beyond the SOP, but I’m curious: what does this mean, and how could one know without knowing the variances of every type and manufacturer?

I found this old thread on this topic going back a ways also. http://www.nachi.org/forum/f19/checking-temperature-breakers-6154/

Sincerely,

Stephen Rager :mrgreen:

Seems like a rather imprecise way of going about this. I would prefer to take a thermal image of the entire panel. As to temperature, that would depend entirely on whether there is a load on the individual circuits. A FLIR One would give you a better temperature reading, as it measures the entire range of temperatures within the panel. Acceptable ranges for circuits are readily available online.

Oops, sorry, I forgot I am on notice. :roll:

Yes you can get a thermal imager… on eBay for a hundred and fifty bucks.

http://pages.ebay.com/link/?nav=item.view&alt=web&id=371424027532&globalID=EBAY-US

^ Blind leading the blind.

I scan the breakers with the IR thermometer, but I have no hard rule (5 degrees) as you saw in the video.

Certainly I’m looking for variation from the ambient breaker temps.
If I’ve been running a lot of hot water during the inspection I’d expect the related breaker might be warmer.

I see 5-degree variations commonly, so I’m looking for a greater variance than that.

Is this “scientific”? No, of course not, and there is no hard-and-fast rule to go by.
But yeah, I’ll certainly call it out if one is 20 degrees higher than the average.

He must be getting some sort of kickback. :wink:

Differences of 10, 20, 30, even 50 degrees in breaker temperature are generally nothing to be concerned about. It has much to do with the type of breaker (standard, AFCI, GFCI) and the load being imposed on it.

I had a double pole, 50 amp breaker yesterday where one pole was 318 degrees under a 13 amp load. I think that could be a problem :smiley:

This idea comes from the guy that invented the HVAC temperature split BS!

Reality check:

Yes, there may be a significant issue at 5° temperature differential.

But!… You must know what you are taking a temperature of.

Is it a direct or indirect temperature measurement?

Did you correctly adjust for emissivity of the material?

Did you take any amperage readings and calculate the apparent temperature rise based upon circuit capacity?

To what standard are you inspecting the panel?

What is your reference temperature?

Did you comply with the spot size ratio of your measurement tool?

If you cannot ensure the accuracy of your emissivity adjustments (or even make them at all), the error in the temperature differential itself can be greater than 5 to 8°F.
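On the spot-size point: a spot thermometer’s measurement area grows with distance according to its distance-to-spot (D:S) ratio, so from a normal working distance the “spot” can easily be wider than a single breaker face. A minimal sketch of that arithmetic (the 12:1 ratio and the distances are assumed for illustration, not taken from any particular instrument):

```python
def spot_diameter(distance_in, ds_ratio):
    """Diameter (inches) of the measured spot at a given distance,
    for an IR spot thermometer with the given distance-to-spot ratio."""
    return distance_in / ds_ratio

# An assumed 12:1 thermometer held 24 inches from the panel:
print(spot_diameter(24, 12))   # 2.0 inches -- wider than a single breaker face
```

In other words, unless you get close enough that the spot fits inside one breaker face, the reading averages the breaker together with its neighbors and the panel behind it.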

So my recommendation to anyone who wants to use a thermal measuring device is this: when you find a temperature rise (of any proportion), visually inspect the component for damage. If there is none, move on.

You may be compelled to recommend further evaluation by an electrician, but in my opinion you should also be compelled to pay for that service call if there is nothing wrong! After all, if you cannot verify what you are talking about in the report, you should bring someone in that can (on your dime). When you do things like this, that’s what your client pays the big money for instead of Mr. Lowball down the street.

Interesting, only rarely do I ever find the breakers varying much at all.

So, you don’t apply 40% load on the circuits you are testing?!

At 40% load, HVAC, ovens, stoves, and water heaters all show 5°F or more.

If you use the right test equipment, the AFCI/GFCI temp differential is obvious.

Typically I apply 42.5% load. I’ll try to achieve exactly 40% load from now on.:roll:

Here is a link from Fluke about this. And, by the way, I disagree with what it says.

But hey this is from an international conglomerate that specializes in measurement devices so what do I know?

Anyway, it would be nice to determine if using an IR thermometer is of any utility at all for this purpose and what the protocols would be for checking ambient temps of CBs.

This one had a really poor connection. The first clue was the sound of sizzling bacon when I removed the dead front cover…

Just curious, where is the dividing line between reporting and not reporting a higher temperature reading? Isn’t it possible that a semi-loose connection with a 1 amp load would have a temperature profile similar to that of a fully loaded circuit breaker?

That’s why you take amperage and do the math to come up with an apparent temperature rise.

You learned that in school, right?
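For anyone who didn’t: the “math” here is usually the rough I²R scaling, where the expected rise above ambient goes with the square of the load fraction. A minimal sketch, where the 50-degree full-load rise is an assumed nominal figure for illustration, not a spec for any particular breaker:

```python
def expected_rise(measured_amps, rated_amps, full_load_rise=50.0):
    """Rough expected temperature rise above ambient, scaling an
    assumed full-load rise by the square of the load fraction
    (resistive I^2 * R heating)."""
    return full_load_rise * (measured_amps / rated_amps) ** 2

# The 13 A load on a 50 A pole mentioned earlier in the thread:
print(round(expected_rise(13, 50), 1))   # 3.4 -- so a 318-degree reading is wildly out of line
```

The point is the comparison, not the absolute number: a lightly loaded breaker should show only a small rise, so a large measured rise at a small load fraction is the red flag.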

I’m well aware of temperature rise and the calculations, but is that part of the SOP of a home inspector? I’ve seen quite a few photos on here from IR devices that simply mention the temperature. I don’t recall ever seeing someone mention that they also checked the conductor ampacity and compared it to the reading provided by the instrument.

No, it is not. Hence the huge debate about IR, and guys like David and Chuck helping some understand to do themselves a favor and get trained, as well as buy a good IR camera, before they get in trouble.

When would anything outside of home inspection fall under the home inspection standard of practice?

We seem to be reluctant as a group to share how we assess and document thermal exceptions - or any exception, for that matter (it’s apparently super-secret stuff). Here’s an example of how I document a low-level thermal exception based on apparent temperature. There is nothing dramatic or particularly exciting about this one; it just happens to be the most recent example from this week. Keep in mind that I charge a separate additional fee for thermography - I do not consider this in-scope for a “home inspection”.

IMO: if you endeavor to document thermal exceptions in electrical equipment, you need to be prepared to provide the relevant information and apply some standard to the exception. I’d like to see some other folks share their method of reporting as well.

http://www.nachi.org/forum/attachment.php?attachmentid=101183&d=1441407148

BTW: I think that the video is a disaster waiting to happen. This guy is all hands, touching things all over the interior of the panel in a most cavalier manner. How many in his YouTube audience (this is targeted at homeowners) understand which components in that panel will electrocute them and which won’t? He tells these people to remove the cover annually and do their own inspection without one mention of the hazards inside.

Excellent post, thanks Chuck. :cool:

I was merely asking for my own edification; these instruments in the hands of untrained individuals seem to do more harm than good. I haven’t watched the video, but I will say that some of the electrical videos on YouTube are the worst examples of electrical system explanation that I’ve ever seen.