The prime reason for performing calibration tests of an ultrasonic flaw detector is to ensure the integrity of the vertical and horizontal linearity on all channels and over all available range and gain settings.
Analogue flaw detectors needed to be tested at regular intervals because the trace on a cathode ray oscilloscope could drift.
Digital flaw detectors do not drift, nor do their components gradually fail over time.
Also, any problem with a digital flaw detector should be readily detected by a technician each time an ultrasonic probe is calibrated.
Therefore, I do not believe that annual calibration tests of digital flaw detectors are essential.
You raise an interesting question. I work for a major manufacturer of digital flaw detectors, and it comes up from our customers now and then. Our calibration certificates have a one-year expiration date, per standard industry practice. However, as you note, digital flaw detectors do not drift, and when they fail, they tend to fail in an obvious manner (failing self-test on startup, producing error messages, a blank display, etc.). So routine calibration certification is nowhere near as important as it was in the analog era.
At the same time, it is always up to the user to determine the appropriate interval for calibration certification based on the requirements of his or her specific test. Some of our customers are locked into frequent recertification based on the procedures they're working to, or the dictates of their customers. Some tests are far more critical than others and people want documentation rather than taking a chance. In other cases, people will rely on the self-test function incorporated in all of our digital flaw detectors as sufficient verification that the instrument is working correctly.
That probably doesn't answer your question, but that's my observation.
Tom Nelligan's comments are correct. However, the issue is what the codes and standards require. If the code is ASME, then a digital instrument requires an annual calibration check for linearity; an analog instrument requires one every 3 months.
With that said, I would propose a thought.
It takes all of about 5 minutes to conduct these linearity tests per ASTM E317 (I think that is the correct number). These instruments can take a beating from technician to technician, and from job to job. What if the unit is out of linearity 9 months after the last linearity check? Does that void all previous examinations? Who would tell?
I normally suggest to my students, whether on a digital or analog instrument, that they perform the linearity test at 1 week to 1 month intervals. I don't want to lose any work that has been done since the last linearity verification. This is purely personal preference.
So, as a minimum, you have to meet the requirements of your codes and standards.
Tom is right. The requirement is a legacy of the analog era, and it is time the national standards (preferably ISO) were revised. The built-in self-diagnostics are sufficient.
A slightly different problem comes to mind. In the practical examinations for Level I and Level II, we have to keep the digital instruments out and try to pool the good old analog instruments for the problems involving instrument calibration.
Sadly, the calibration of instruments at higher-than-needed frequency is well ingrained into our "system." The final nail in the coffin which perpetuates this is NADCAP's interpretation of what should be done (opinion alert!). Our manufacturer calibration stickers have to be removed, and we have to re-cal at six-month intervals to meet spec. Rather silly.
We still find ourselves locked into recalibration of calibration standards! It seems silly to me to "recalibrate" test blocks that don't change with time, unless we are afraid someone snuck out and enlarged holes, annealed the block, or substituted a different material.
To effect change, there needs to be involvement at ASTM and SAE/AMS, etc. We need folks who can bring the energy and do the work to correct these anachronisms.
This is a fascinating post that, as others have said, keeps coming up. It is interesting that a manufacturer posts about how reliable digital instruments are but is happy to issue only a one-year calibration certificate with a new unit. If there is so much confidence, why not issue a 10 or 20 year certificate? The reason is that in the real world they do not know the circumstances in which the instrument will be used, and they are not 'so confident' that digital circuits cannot drift in extremes of, say, heat, cold, or humidity. We have taken a digital instrument, set it up, left it static for several days, and found that there is some degree of drift. Returning the instrument to the manufacturer identified a component issue they had not picked up.

The other issue is abuse of the instrument: we see relatively new instruments with the most horrendous degrees of damage and dirt. Who knows what the instrument's performance is? We make multi-million-dollar decisions on data arising from these instruments; maintaining the annual calibration and intermediate checks currently embedded in our codes is a small price to pay for maintaining confidence in our examination process.
1 year, 6 months, 3 months are all called up by various regulatory bodies for calibration frequency.
ASME has recently made a distinction between digital and analogue instruments. Finally they see the light.
HOWEVER, there are BIG utility customers out there (who will remain nameless) that require very complicated daily or weekly linearity checks when testing their forging products.
These checks take a significant amount of time to perform. (I can do one in about an hour and I am experienced. A newbie takes 2-4 hours to get it right.)
In over 2 years of performing these checks I have never found a problem with a digital instrument. The record keeping involved is, by itself, a big waste of time and trees.
I just thought folks might be interested in the extremes.
AMEN! After doing linearity and amplitude checks on both Panametrics and Krautkramer digital instruments at ASME (1 year) and AWS (2 month) intervals, NOTHING ever changes.
We should be allowed by Code to increase the calibration interval, as specified in the 'motherhood' of all calibration programs, MIL-STD-45662. This standard requires shortening calibration intervals when a gage or instrument has been found 'out' during re-cal, and allows extending intervals based on successive 'good' re-cals.
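The shorten-on-failure, extend-on-success rule described above can be sketched in a few lines. This is only an illustration of the idea: the adjustment factors, the length of the "good" run, and the cap are hypothetical values I have chosen, not numbers taken from MIL-STD-45662.

```python
# Illustrative sketch of a calibration-interval adjustment rule in the
# spirit of MIL-STD-45662: shorten the interval after an out-of-tolerance
# re-cal, extend it after a run of successive good re-cals.
# All numeric parameters below are assumptions, not values from the standard.

def next_interval_months(current_months, history,
                         shorten_factor=0.5, extend_factor=1.5,
                         good_run=3, max_months=24):
    """history: list of re-cal results, most recent last; True = in tolerance."""
    if history and not history[-1]:
        # Last re-cal found the instrument out of tolerance: shorten.
        return max(1, round(current_months * shorten_factor))
    if len(history) >= good_run and all(history[-good_run:]):
        # Several successive good re-cals: allow a longer interval.
        return min(max_months, round(current_months * extend_factor))
    return current_months

print(next_interval_months(12, [True, True, True]))  # 18
print(next_interval_months(12, [True, False]))       # 6
```

With a 12-month interval, three successive good re-cals would justify extending to 18 months, while a single failed re-cal would drop it to 6.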
"Interesting that a manufacturer makes a post about how reliable digital instruments are but is happy that as new they only issue a one year calibration certificate. If there is so much confidence why not issue a 10 or 20 year certificate."
Because we are responsive to the wishes of our customers, and most of our customers want the customary one year cal cert.
I have followed this thread with some interest as we use both analogue and digital ultrasonic flaw detectors. Neither has ever required adjustment on calibration, certainly not in the 10 years I have been dealing with them. Similar to Dent, we have customers that require a daily calibration of the gain and timebase, and some of our procedures actually require it prior to every batch. My attitude tends to be 'If required, we will test standing on one leg, as long as the customer is paying for it'.
My understanding of the functioning of digital flaw detectors is that the signal amplification takes place in the analogue part of the circuit. It is only this amplified signal that is digitised and therefore the gain linearity of all flaw detectors is 'analogue'. It is only the time base that is truly digital and accurate within the tolerances of the timing oscillator.
I have also noticed that some digital flaw detectors use more than one amplifier and there can be a step non-linearity when they switch between or add amplifiers.
The point I am trying to make is that I believe that it is not whether the flaw detector is analogue or digital but the quality of design and components that gives us the reliability and stability we see in modern equipment.
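The amplifier switch-over behaviour described above is exactly what an amplitude (vertical) linearity check is meant to catch: each 6 dB of added gain should double the echo height, so a step non-linearity shows up as a deviation from the expected dB ratio. The following is a minimal sketch of that arithmetic; the readings are hypothetical, not from any code or standard.

```python
# Sketch of the vertical-linearity arithmetic: compare the measured
# amplitude ratio (in dB) against the gain actually applied.
# A step non-linearity at an amplifier switch-over appears as a
# non-zero error. Readings are hypothetical illustration data.
import math

def db_error(amp_ref, amp_test, gain_added_db):
    """Deviation (dB) of a measured amplitude ratio from the applied gain."""
    measured_db = 20.0 * math.log10(amp_test / amp_ref)
    return measured_db - gain_added_db

# Reference echo at 40% FSH; adding 6 dB should give roughly 80% FSH:
print(round(db_error(40.0, 80.0, 6.0), 2))  # 0.02
```

A perfectly linear amplifier would give an error of 0 dB at every gain step; the small residual here reflects the fact that a true doubling corresponds to 6.02 dB rather than exactly 6 dB.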
We use digital ultrasonic flaw detectors.
For a horizontal linearity check, ASME standards suggest the method of adjusting the 3rd and 9th multiple echoes to 20% and 80% of screen width, and then, with your keen eye, trying to judge whether the pips sit at 10% (or 11%, or 12%), likewise at 20%, 30%, 40%, 50%, etc.
I think it would make life easier to just calibrate on 10 mm on a step wedge, set the range to 100, and then move the gate over each pip and record the thickness at each. That seems a more accurate check than guessing whether a pip is at 29, 30, 31, or 32% of screen width.
What do you think?
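The gated-reading check suggested above reduces to simple arithmetic: with the range at 100 mm on a 10 mm block, each multiple echo should read at an exact multiple of 10 mm, and the worst deviation can be expressed as a percentage of full screen width. Here is a minimal sketch of that calculation; the tolerance interpretation and the sample readings are my own illustrative assumptions, not values from ASTM E317 or ASME.

```python
# Minimal sketch of the gated-reading horizontal linearity check:
# each back-wall echo from a 10 mm block should read at 10, 20, 30, ... mm.
# The readings below are hypothetical example data.

def horizontal_linearity(readings_mm, step_mm=10.0, range_mm=100.0):
    """Return the worst deviation as a percentage of full screen width."""
    deviations = [abs(r - (i + 1) * step_mm) for i, r in enumerate(readings_mm)]
    return max(deviations) / range_mm * 100.0

# Gated thickness readings for the first five back-wall echoes:
readings = [10.1, 20.0, 29.8, 40.2, 50.1]
worst = horizontal_linearity(readings)
print(f"worst deviation: {worst:.1f}% of screen width")  # 0.2%
```

The result can then be compared directly against whatever screen-width tolerance the governing code specifies, with no squinting at pips.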
I believe even ASTM E 317-06a refers to the same thing. But as Mark said, I do not believe it is a very easy task to carry out the calibrations, as there are many things involved, like horizontal and vertical linearity, the sensitivity test, near-surface and far-surface resolution tests, signal-to-noise ratio, etc., which take a lot of time.
But in my experience, digital flaw detectors are calibrated to the manufacturer's standard, in which the acceptance criteria may be even more stringent than the codes.
Re: Calibration of Digital UFDs (in reply to Sudheer Jai Krishnan at 19:17 Jun-18-2012)
Was wondering if the industry has made any strides in addressing this issue. ASME does say equipment calibration is not required for digital equipment. But can we really say there are no analog components in digital NDT instruments? I would like to hear from the experts.