08:59 Jan-02-1999
Malcolm Maclean
Temperature effects on UT Thickness readings research

I have been conducting various tests to determine exactly how temperature variation between 0 and 100 degrees Celsius affects thickness readings on steel pipe. Tests have shown errors of over 100%, or up to 2.7 mm, in most instances. This appears to be the result of the velocity in the probe shoe medium varying with temperature, because the mechanical properties of the shoe vary with temperature.

Question: Has anyone conducted similar tests or have knowledge of this problem? If anyone could shed some light on this for me I would be very grateful; I have limited knowledge in this field of work.

Thanks
Malcolm Maclean


 
02:10 Jan-02-1999

Ed Ginzel

R & D, -
Materials Research Institute,
Canada,
Joined Nov 1998
1197
Re: Temperature effects on UT Thickness readings research

Malcolm:
Your assessment that the source of error is in the wedge material is correct. Results of dV/dT studies on plastics are often published, but for metals they are not too common. There is a paper by Chandrasekaran and Salama (University of Texas) in Nondestructive Methods for Material Property Determination, edited by Ruud and Green, Plenum Press, 1983.

They found that for ASTM A533B steel the typical compression-mode dV/dT was -0.64 m/s per degree C.

Compared to plastics, where a dV/dT on the order of -3 m/s per degree C is common, it is obvious that apparent thickness variations of the size you describe cannot be due simply to metal temperature differences in the 100 C degree range you are working in.

To assess thickness you are probably calibrating your instrument on a room-temperature calibration block. By the time you have made several measurements on the warm test piece, the plastic temperature has changed.

If you went back to the calibration block (step wedge) while the wedge was still warm, you would probably find that the zero point had changed with respect to the initial calibration.

Instead of using just a digital meter, try using a scope and gate the time interval between the interface and the first backwall. The plastic delay line is probably necessary to protect the piezo element; if it is a standard-style material, temperatures over 40 C may degrade the probe. You could also try a special high-temperature contact probe (non-delay-line style) and again use the scope-type display, gating the time from the backwall to the test surface (i.e. the 2nd and 3rd signals), but these probes often suffer from poor resolution due to low damping.

Ed
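For a rough sense of scale, a minimal Python check of the steel term alone (the dV/dT value is the one quoted above; the nominal steel velocity and the 25 mm path are assumptions for illustration):

```python
# How much the steel velocity change alone shifts an apparent thickness reading
# if the gauge keeps using the cold (calibration) velocity.
V_STEEL_COLD = 5920.0   # m/s, nominal longitudinal velocity in steel (assumed)
DVDT_STEEL = -0.64      # m/s per degC (Chandrasekaran & Salama, quoted above)

def steel_only_error(thickness_mm, delta_t_degc):
    """Apparent thickness error (mm) from the metal dV/dT alone."""
    v_hot = V_STEEL_COLD + DVDT_STEEL * delta_t_degc
    tof_hot = 2.0 * thickness_mm / v_hot                  # actual round-trip time on hot metal
    return V_STEEL_COLD * tof_hot / 2.0 - thickness_mm    # gauge still converts with the cold velocity

print(round(steel_only_error(25.0, 100.0), 2))  # ~0.27 mm over 25 mm of steel and 100 degC
```

With these assumed numbers the metal term accounts for only a few tenths of a millimetre, nowhere near the 2.7 mm reported, which is the point made above.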





 
05:36 Jan-03-1999
Malcolm Maclean
Re: Temperature effects on UT Thickness readings research

Ed,
Thanks for the reference guidance.
I am conducting these tests for my BEng (Hons) project. The results may seem quite alarming to some who use standard twin 5 MHz compression probes for corrosion monitoring up to the temperature limits specified by probe manufacturers. I knew errors could exist due to temperature variance, but I didn't realise the errors would be of the degree found in the tests (a 50 degree C temperature increase had increased readings by 1.2 mm). I have previously been unaware of any method to reduce the errors apart from attempting to recalibrate at the same temperature as the steel, or using a hot-shoe probe as you suggested.

Typical conditions giving rise to such a variance in temperature would be calibrating outdoors in near-zero conditions (common on North Sea installations), testing a cold pipe at say 5 degrees C, then moving the probe on to one near 60 degrees C, e.g. crude oil lines, heat exchangers etc. I know that some techs recalibrate by attempting to bring the temperature of the calibration block up to that of the pipe, but this is a crude technique that is time consuming and doesn't really work that well due to the variance in heat conduction. Most NDT techs don't carry surface thermometers to check that calibration and test temperatures are the same.

I have found that the variance between calibration and test temperature is a very important factor for some commonly used twin compression probes and has resulted in major errors, as mentioned previously.
Various common standard 2.5 and 5 MHz, 10 and 15 mm dia. twin-crystal compression probes I have had available to test (all with perspex shoes) have shown a 0.2 - 0.3 mm increase in thickness reading for every 10 degree C increase in temperature. The average increase in thickness reading after increasing the temperature from 0 to 90 degrees C was 2.5 mm, the worst case being a 2.7 mm variance.
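As a quick sanity check, the per-step drift and the 0-90 degree C totals reported here are mutually consistent (pure arithmetic on the figures in this post):

```python
# Accumulate the reported 0.2-0.3 mm drift per 10 degC step over nine steps (0 to 90 degC).
per_10c_low, per_10c_high = 0.2, 0.3   # mm per 10 degC, reported range
steps = 9                              # 0 -> 90 degC in 10 degC intervals
print(round(per_10c_low * steps, 1), round(per_10c_high * steps, 1))  # 1.8 to 2.7 mm,
# which brackets the reported 2.5 mm average and 2.7 mm worst case.
```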

A brief outline of the test procedure is as follows:

I used a 2.5 - 25mm step wedge as a test piece, and calibrated the probe at 0 degrees C (reducing the temp of the probe and test block by partially submerging them in an ice bath).
Using an immersion heater, the bath was heated and thickness readings taken in 10 degree C intervals up to 90 degrees C, with the probe and test block only partially submerged in the water bath. Recalibration between the measurements at each temperature was not conducted so a thickness/temperature graph could be constructed to show the error with reference to readings calibrated at 0 degrees C.
Water was the primary couplant but Duckhams LB10 lithium based grease was used to prevent corrosion of the test block and to help "stick" the probe in place for hands free assistance (slight pressure applied at times to ensure close contact was maintained).
Tests were repeated using an Epoch III and a USN52, calibrating to 25 mm and 50 mm ranges and reading from the 1st BWE (the 25 mm step being out of range at certain temperatures); the results were consistent, and the thickness variances were the same for each step. I will next try taking readings between the 2nd and 3rd BWE as you suggest.

I have conducted a test using a hot-shoe 5 MHz twin-crystal compression probe, but it wasn't all that encouraging either. The probe gave a 2 mm increase in thickness reading after raising the temperature from 0 to 90 degrees C (0.2 - 0.3 mm per 10 degrees).
However, tests using various 2.5 and 5 MHz single-crystal compression probes saw only a 0.2 - 0.6 mm increase in thickness readings between 0 and 90 degrees C. I can only guess that this is because perspex was used as a shoe for the twins but not for the singles, hence the difference in the delay lines; is this a correct assumption? Interestingly, the Cygnus meter performed best: only a 0.1 mm increase in reading occurred between 0 and 90 degrees C. Would this be due to it using a multiple-echo technique, thus reducing the delay-line error?

Further tests were conducted by calibrating a common 5 MHz twin 10 mm dia. compression probe at 0 degrees C, then placing it on a 10 mm sample at 80 degrees C and timing the thickness change. The rate of change of the reading was approximately linear for the first couple of minutes (11.4 seconds per 0.1 mm increase), then decayed exponentially to a steady-state reading of 12.1 mm after 10 minutes, giving a 2.1 mm error.
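That warm-up behaviour is roughly what a first-order lag would give. A small sketch of such a model (the first-order form is an assumption; only the 2.1 mm steady-state error and the 11.4 s per 0.1 mm initial rate come from the post above):

```python
import math

# Assumed first-order warm-up model for the probe shoe:
# reading(t) = true + err_ss * (1 - exp(-t/tau)), with tau set from the initial slope.
TRUE_MM = 10.0          # sample thickness (reported)
ERR_SS = 2.1            # mm, steady-state error reported after ~10 minutes
SLOPE0 = 0.1 / 11.4     # mm/s, reported initial rate (0.1 mm every 11.4 s)
TAU = ERR_SS / SLOPE0   # s; the initial slope of a first-order response is err_ss/tau (~240 s)

def reading(t_s):
    """Modelled thickness reading t_s seconds after placing the cold-calibrated probe."""
    return TRUE_MM + ERR_SS * (1.0 - math.exp(-t_s / TAU))

for minute in (0, 1, 2, 5, 10):
    print(minute, round(reading(60 * minute), 2))
# Roughly linear over the first couple of minutes, then levelling off toward the
# reported ~12.1 mm steady state.
```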

If you can think of any other way this temperature problem could be rectified, please let me know. It's not something that I, or maybe other NDT techs, have been trained to take into consideration. It seems that if the correct probes are not on site, then scanning pipework/vessels may turn out to be a time-consuming procedure using A-scan displays and reading between the 2nd and 3rd BWEs.
I haven't seen any procedures or standards stating how to take temperature into account when conducting thickness readings; do you know of any?
Are there actually twin-crystal probes on the market that have been tested in similar conditions and do not show a variance of thickness with an increase in temperature, bearing in mind that the hot-shoe probe I tested still had a 2 mm error?
If not, will the alternative be to use equipment that measures using multiple-echo principles, and would this do the job? Could you advise me of any that have an A-scan display, as most people I know feel uncomfortable relying on numerical displays alone.
I know these findings may raise a few eyebrows and may even have implications for defect sizing with angle work at temperatures between 0 and 90 degrees C, which are common temperature ranges encountered in plant pipework and vessels. However, I haven't tested angle probes for temperature effects; has anyone conducted tests on these?

I look forward to your feedback.

Regards Malcolm



 
 
03:59 Jan-04-1999

Ed Ginzel

R & D, -
Materials Research Institute,
Canada,
Joined Nov 1998
1197
Re: Temperature effects on UT Thickness readings research

Malcolm:
The standard digital meter uses a mechanism that assesses a zero point through the perspex delay line. This function is to be carried out at the start of a calibration. It is in fact this zero point that is changing as the perspex warms. Since the dV/dT for perspex is a negative value, the time it takes to traverse the delay line INCREASES as the perspex warms. This appears as an increase in thickness because the start time is still unchanged (if you do not keep re-zeroing) while the total time to the next signal (the backwall) also increases... mostly by the increase in the plastic delay-line time. Carried out as you describe, your plots would show only the effect of dV/dT on the plastic.
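A minimal sketch of that fixed-zero mechanism (the 10 mm shoe path, the nominal velocities and the -3 m/s per degree C perspex figure are illustrative assumptions, not values measured from Malcolm's probes):

```python
# Gauge model: zero is set on the cold shoe, then left alone while the shoe warms.
V_STEEL = 5920.0        # m/s, steel velocity (assumed nominal, treated as constant here)
V_SHOE_COLD = 2730.0    # m/s, perspex velocity at calibration temperature (assumed)
DVDT_SHOE = -3.0        # m/s per degC, typical plastic value from the earlier post
SHOE_MM = 10.0          # assumed one-way path through the perspex shoe

def shoe_tof(delta_t_degc):
    """Round-trip transit time through the shoe at delta_t above calibration temperature."""
    return 2.0 * SHOE_MM / (V_SHOE_COLD + DVDT_SHOE * delta_t_degc)

def gauge_reading(true_mm, delta_t_degc):
    """Reading with the zero point frozen at the cold-shoe transit time."""
    t_zero_cal = shoe_tof(0.0)
    t_total = shoe_tof(delta_t_degc) + 2.0 * true_mm / V_STEEL
    return V_STEEL * (t_total - t_zero_cal) / 2.0

for dt in (0, 10, 50, 90):
    print(dt, round(gauge_reading(10.0, dt), 2))
# About 0.24 mm of extra indicated thickness per 10 degC near calibration temperature,
# rising toward ~0.29 mm per 10 degC near 90 degC with these assumed numbers -
# in the range Malcolm reports.
```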

You stated:
"...However, tests using various 2.5 & 5 MHz single crystal compression probes saw only a 0.2 - 0.6mm increase in thickness readings between 0 and 90 degrees C. I can only guess that this is because perspex was used as a shoe for the twins but not for the singles hence the difference in the delay lines, is this a correct assumption?..."
Correct! Most single-element probes without a delay line use a borosilicate protective layer (about 0.25 mm thick). Its dV/dT is less than that of perspex, and because the layer is so thin the total time change is reduced.

The Cygnus meter error would be reduced as it would average several GOOD metal multiples with the poor initial signal. (I assume it used a perspex delay line.)

The "Hot-Shoe" probe you used was probably just another form of delayline (usually a material like Vespel). Although better able to take higher temperatures without transferring heat to the piezo element it too would suffer from a change in zero point as temperature changes.

The "hot-shoe" probes you mentioned were not exactly what I meant by High Temperature probes. Etlon (a probe manufacturer in the USA) makes a version that is a piezo element spring loaded against the probe mounting. This has no backing so HOT surfaces will not melt the solder nor distort the epoxy protective face or metal filing loaded backing. However, because there is no back loading, the ringing can be excessive.


Actually, the problem of temperature change has been considered in codes, e.g. British Standard 4331 Part 3, Appendix C. There the concern is more with the effects on the refraction of angle beams, but it is still a factor that is considered. The refracted angle is a function of the relative velocity ratio, so since the velocity in the steel changes little, the significant velocity change in the perspex can cause LARGE refracted angle changes.
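A quick Snell's-law check of how sensitive the refracted angle is to the wedge velocity (the shear velocity in steel and the perspex velocity and dV/dT are assumed nominal values, chosen only to illustrate the point above):

```python
import math

# Sensitivity of the refracted shear angle in steel to the wedge (perspex) velocity.
V_STEEL_SHEAR = 3240.0   # m/s, assumed nominal shear velocity in steel
V_WEDGE_COLD = 2730.0    # m/s, assumed perspex longitudinal velocity at calibration
DVDT_WEDGE = -3.0        # m/s per degC, typical plastic value

def refracted_angle(wedge_angle_deg, v_wedge):
    """Snell's law: refracted shear angle in steel for a given wedge velocity."""
    s = math.sin(math.radians(wedge_angle_deg)) * V_STEEL_SHEAR / v_wedge
    return math.degrees(math.asin(s))

# Wedge angle that gives a 60 degree beam with the cold wedge velocity
wedge_angle = math.degrees(math.asin(math.sin(math.radians(60.0)) * V_WEDGE_COLD / V_STEEL_SHEAR))

for dt in (0, 20, 40, 60):
    v = V_WEDGE_COLD + DVDT_WEDGE * dt
    print(dt, round(refracted_angle(wedge_angle, v), 1))
# With these assumed numbers a nominal 60 degree probe drifts to roughly 62, 65 and
# 68 degrees as the wedge runs 20, 40 and 60 degC above its calibration temperature.
```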

Panametrics made a thickness meter (26DL) that incorporated an A-scan display (and I am sure others do also). This is one option to ensure you use the correct signals. If memory serves, Panametrics also made a version of the meter for checking thicknesses under paint. This would have used the 2nd and 3rd signals (backwall and return to entry). I think it was the 25DL, and it had 3 modes of triggering/gating.

I have heard of a company that had a novel way of controlling thermal effects in delay lines: they made probes with small heating elements to fix the temperature of the plastic.

The simplest fix to your problem seems to be a single-element probe with no delay line. The next fix would be to use 2nd-and-3rd-signal intervals, but even with that method a single element is probably better, as the duals tend to introduce some error as a result of the roof angle. This is exacerbated on multiple skips.
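A sketch of why the 2nd-and-3rd-signal (echo-to-echo) interval sidesteps the shoe problem, using the same kind of assumed numbers as the earlier sketches (steel dV/dT ignored for simplicity):

```python
# Echo-to-echo timing: the interval between successive backwall echoes contains
# only steel path, so delay-line drift drops out (illustrative, assumed numbers).
V_STEEL = 5920.0        # m/s, assumed nominal
V_SHOE_COLD = 2730.0    # m/s, assumed perspex velocity when cold
DVDT_SHOE = -3.0        # m/s per degC, assumed
SHOE_MM = 10.0          # assumed shoe path

def first_echo_time(true_mm, dt):
    """Shoe transit plus one round trip in the steel (single-echo timing)."""
    return 2 * SHOE_MM / (V_SHOE_COLD + DVDT_SHOE * dt) + 2 * true_mm / V_STEEL

def echo_to_echo_time(true_mm):
    """Interval between the 1st and 2nd backwall echoes: one extra steel round trip only."""
    return 2 * true_mm / V_STEEL

t_zero = first_echo_time(0.0, 0)   # zero offset set on the cold shoe
for dt in (0, 90):
    single = V_STEEL * (first_echo_time(10.0, dt) - t_zero) / 2
    multi = V_STEEL * echo_to_echo_time(10.0) / 2
    print(dt, round(single, 2), round(multi, 2))
# The single-echo reading drifts by a couple of millimetres over 90 degC,
# while the echo-to-echo figure stays at 10.0 mm.
```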

Ed



 
07:04 Jan-04-1999

Tom Nelligan

Engineering,
retired,
USA,
Joined Nov 1998
390
Re: Temperature effects on UT Thickness readings research

: Panametrics made a thickness meter (26DL) that incorporated an A-scan display (and I am sure others do also). This is one option to ensure you use the correct signals. If memory serves, Panametrics also made a version of the meter for checking thicknesses under paint. This would have used the 2nd and 3rd signals (backwall and return to entry). I think it was the 25DL, and it had 3 modes of triggering/gating.

To add to Ed's comments: we are indeed among the companies that offer hand-held corrosion gages with A-scan display (the latest model is the 36DL PLUS), as well as instruments that offer an echo-to-echo measurement mode (the 25DL for precision work, the 36DL PLUS for corrosion survey applications).

Another trick that we (and others) employ to get around the delay line zero drift problem is to have the gage periodically measure pulse transit time through the delay line as the probe heats up, and then use any detected variation in this transit time to automatically adjust the zero offset. This usually solves the drift problem when using a full-featured corrosion gage.
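A minimal sketch of that periodic zero-offset update (the class, function names and timing values are illustrative assumptions for this thread, not Panametrics' actual implementation):

```python
# Auto zero compensation: periodically re-time the delay-line (interface) echo and
# fold any change into the zero offset before converting time-of-flight to thickness.
V_STEEL = 5920.0   # m/s, assumed steel velocity from calibration

class CorrosionGage:
    def __init__(self, delayline_tof_at_cal):
        self.zero_s = delayline_tof_at_cal       # round-trip shoe time measured at calibration

    def update_zero(self, measured_delayline_tof):
        """Called periodically as the probe warms: track the interface echo time."""
        self.zero_s = measured_delayline_tof

    def thickness_mm(self, total_tof):
        """Convert total time-of-flight (shoe + steel), in seconds, to steel thickness in mm."""
        return V_STEEL * (total_tof - self.zero_s) / 2.0 * 1000.0

# Usage: without update_zero() the reading drifts as the shoe warms;
# re-timing the interface echo and updating the zero removes the drift.
gage = CorrosionGage(delayline_tof_at_cal=7.33e-6)   # ~10 mm of cold perspex (assumed)
print(round(gage.thickness_mm(7.33e-6 + 3.38e-6), 1))   # ~10.0 mm when cold
gage.update_zero(8.13e-6)                                # warm shoe: transit time has grown
print(round(gage.thickness_mm(8.13e-6 + 3.38e-6), 1))   # still ~10.0 mm after compensation
```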

Flaw detectors typically don't do this automatically, so the trick there is to either re-zero on a test block, as has been already suggested, or manually do a pulse/echo measurement of delay line length and adjust zero accordingly. Also, I'd add that the amount of delay line zero drift with temperature varies quite a bit depending on delay line materials. A number of manufacturers (including us) offer dual element transducers using proprietary high temperature delay line materials that minimize drift.

--Tom Nelligan
Senior Applications Engineer, Panametrics, Inc.
http://www.panametrics.com


 
07:54 Jan-04-1999
Norm Woodward
Re: Temperature effects on UT Thickness readings research
Another way to examine hot pipes for corrosion and other thickness changes, without compensating for delay-line velocity changes or risking probe damage, is to eliminate the delay line and probe altogether.

Karta Technologies, of San Antonio, Texas, has developed a hand-held laser ultrasound system which addresses this application. More information can be found on their website, in the "products" section.

BTW, the best to all in this new year.

Norm



 
06:30 Jan-05-1999

Linas Svilainis

R & D,
Kaunas University of Technology,
Lithuania,
Joined Nov 1998
66
Re: Temperature effects on UT Thickness readings research

Happy New Year to everyone!

It seems the small holiday break has increased the number and quality of the discussions. I would like to suggest going even further; perhaps equipment manufacturers can make use of this idea. Instead of throwing away the zero shift introduced by the delay line, I would suggest using it to estimate the temperature T. Then, by using a rough dV/dT for the material under test, one can also reduce the temperature effects on the thickness evaluation of the material itself. This can be done by subtracting the delay-line TOF from the measured TOF and applying the dV/dT*T correction to the material velocity.
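A small sketch of this idea, under the assumption that the steel behind the probe is at roughly the same temperature as the shoe (all material constants here are assumed illustrative values; the thread gives only the idea, not an implementation):

```python
# Use the delay-line zero shift as a thermometer, then temperature-correct the steel velocity too.
V_SHOE_COLD = 2730.0   # m/s, assumed perspex velocity at calibration
DVDT_SHOE = -3.0       # m/s per degC, assumed
SHOE_MM = 10.0         # assumed shoe path
V_STEEL_COLD = 5920.0  # m/s, assumed steel velocity at calibration
DVDT_STEEL = -0.64     # m/s per degC, from the paper cited earlier in the thread

def temperature_rise_from_zero_shift(shoe_tof_now_s):
    """Invert the measured shoe round-trip time for the temperature rise above calibration."""
    v_now = 2.0 * (SHOE_MM / 1000.0) / shoe_tof_now_s
    return (v_now - V_SHOE_COLD) / DVDT_SHOE

def corrected_thickness_mm(total_tof_s, shoe_tof_now_s):
    """Subtract the measured shoe time, then use a temperature-corrected steel velocity."""
    dt = temperature_rise_from_zero_shift(shoe_tof_now_s)
    v_steel = V_STEEL_COLD + DVDT_STEEL * dt
    return v_steel * (total_tof_s - shoe_tof_now_s) / 2.0 * 1000.0

# Example: shoe has warmed by ~90 degC, with 10 mm of steel behind it (both assumed).
shoe_now = 2.0 * (SHOE_MM / 1000.0) / (V_SHOE_COLD + DVDT_SHOE * 90.0)
steel_now = 2.0 * (10.0 / 1000.0) / (V_STEEL_COLD + DVDT_STEEL * 90.0)
print(round(corrected_thickness_mm(shoe_now + steel_now, shoe_now), 2))  # ~10.0 mm
```

In practice the shoe lags the test surface in temperature, so this would only be an approximate correction.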

Linas



 
02:44 Jan-17-1999

Rolf Diederichs

Director, Editor, Publisher, Internet, PHP MySQL
NDT.net,
Germany,
Joined Nov 1998
602
Re: Temperature effects on UT Thickness readings research


First of all, excuse my absence during the last weeks. I am very grateful to see that some experts have already joined this topic - thank you very much! Since most of the answers had already been given in depth, I could focus fully on the NDT.net editing work.

The use of an A-scan display has already been mentioned. However, I would like to add a little more on why, in my opinion, it is so important. The trigger point at the edges of the echoes needs to be watched carefully, especially under hot conditions. The attenuation of sound rises a lot during the warm-up time of the delay line. That can cause the trigger point to jump to another half wave of the echo, which may cause a much larger thickness reading error than the one produced by the dV/dT error of the delay line. Using as low a probe frequency as possible can probably prevent this error, since the change of attenuation is lower at low frequencies. In any case, watching the screen for a correct trigger threshold is recommended. Another, more or less small, error can appear through frequency changes of the backwall echo.
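For scale, the steel-path equivalent of a one-half-cycle trigger jump at a few probe frequencies (the nominal steel velocity is an assumption):

```python
# Steel-path equivalent of the trigger point hopping by one half cycle of the echo.
V_STEEL = 5920.0   # m/s, assumed nominal

def half_cycle_error_mm(freq_mhz):
    half_period_s = 0.5 / (freq_mhz * 1e6)
    return V_STEEL * half_period_s / 2.0 * 1000.0   # time error -> one-way steel path, mm

for f in (2.5, 5.0, 10.0):
    print(f, round(half_cycle_error_mm(f), 2))
# ~0.59 mm at 2.5 MHz, ~0.30 mm at 5 MHz, ~0.15 mm at 10 MHz: a single half-cycle
# hop is of the same order as the delay-line drift discussed earlier in the thread.
```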

Thickness gauges and flaw detectors are available which offer automatic gain control; that may help to overcome any echo amplitude drop. However, I have not heard whether that is common practice.

Rolf


BTW:
The following issue focused on Ultrasonic Thickness Measurement.
You can find articles dealing with the temperature effect.
NDTnet - October 1997, Vol.2 No.10
http://www.ndt.net/ut1097.htm



 

