18:26 Mar-03-2010
emil shavakis
UT Software Validation

Learned Colleagues,
I find nothing in the ASME Code, so I am seeking opinions from this forum on two points related to advanced UT methods in ASME:

There are several software packages out there being used to plan scans (e.g. BeamTool), create delay laws, and analyze data (TomoView/Ultravision). Since the license agreements I accept release their creators from all liability, what must I do, and what should I do, in concert with their use?

1 - What do I need to do to qualify the software as accurate and true? How am I to know that the coverages shown are actually being achieved and maintained? Am I supposed to be measuring exit points and beam angles to ensure I get what I planned, or is it enough to simply punch in the numbers and go? What should I do to ensure that my Scan Plan(ned) is my scan achieved? Is there something in the Code I missed?

2 - If I use a software tool to analyze data and make calls, is it subject to any type of linearity (dB, time) calibration or verification, as the acquisition instrument was?

By this point there must be many Level II and III UT types out there providing the review and concurrence the Code Case requires. What is it that you (we) are supposed (read as either shall or should) to do?

Emil

 
20:22 Mar-03-2010

Ed Ginzel

R & D, Materials Research Institute, Canada
Joined Nov 1998
Re: UT Software Validation In Reply to emil shavakis at 18:26 Mar-03-2010 (Opening).

Emil: You have raised several issues of concern.
Regarding software for scan plans...the available software for designing scan plans is, I think, a "good start". But the Code Case (2235-9?) requires that the procedure developed be "demonstrated". Part (c) of the CC states: "The procedure shall have been demonstrated to perform acceptably on a qualification block(s)." Raytrace programmes are handy tools but do not account for all factors. Demonstration blocks should provide a means of demonstrating adequate volume coverage, not just of verifying detection of 3 embedded flaws. Effectively, the demonstration validates the procedure (which used the software tools to calculate the expected coverage...or you could have plotted the scan-plan beam coverage by hand using drafting tools).
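
As a rough illustration of what these raytrace tools compute under the hood (a sketch only, not any vendor's actual algorithm; the velocities and wedge angle below are nominal assumptions), the geometric core of a scan plan is little more than Snell's law applied per beam. Everything the demonstration block has to catch (anisotropy, curvature, mode conversion, real probe behaviour) sits outside this idealized picture:

import math

# Minimal scan-plan geometry sketch (Python): refracted angle and surface
# projection of a shear-wave beam entering steel from a Rexolite wedge.
# Nominal velocities; real tools also model wedge geometry, part curvature,
# beam spread and mode conversion, none of which appear here.
V_WEDGE = 2330.0   # m/s, longitudinal velocity in Rexolite (nominal)
V_STEEL = 3240.0   # m/s, shear velocity in carbon steel (nominal)

def refracted_angle(incident_deg):
    # Snell's law: refracted shear angle in steel, in degrees
    s = math.sin(math.radians(incident_deg)) * V_STEEL / V_WEDGE
    if s >= 1.0:
        raise ValueError("beyond critical angle - no refracted shear wave")
    return math.degrees(math.asin(s))

def surface_distance(refracted_deg, depth_mm):
    # horizontal offset from the exit point to a reflector at depth_mm
    return depth_mm * math.tan(math.radians(refracted_deg))

inc = 30.6                  # wedge incident angle for a nominal 45 deg shear beam
ref = refracted_angle(inc)  # about 45.1 deg with these velocities
print(f"incident {inc:.1f} deg -> refracted {ref:.1f} deg shear")
print(f"surface distance to a 25 mm deep reflector: {surface_distance(ref, 25.0):.1f} mm")

Comparing the computed exit point and refracted angle against values measured on an IIW or calibration block is one practical answer to Emil's first question: it checks that the planned beam is the physical beam.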

The acquisition and analysis software used is part of the qualification, but its linearity requirements are actually spelled out in the CC (I assume you mean 2235-9). Again, part (c) states: "The ultrasonic examination shall be performed in accordance with a written procedure conforming to the requirements of Section V, Article 4." This Article describes the system requirements for linearity.

Not many (if any) instruments now use analogue displays, so the wording in Article 4 for linearity applies to the entire system. From the earliest days of digital instruments we have been using firmware to indicate amplitudes and times along the timebase, so the linearity assessments have been made on the readouts provided by the entire "system". There are no longer "CRTs"; all A-scan displays are simply software-generated images. The software used becomes part of the system being assessed.
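
As for what verifying linearity on the readouts of the entire system might look like in practice, here is a minimal sketch in the spirit of the Section V, Article 4 display-height check: two echoes held at a 2:1 amplitude ratio while the larger is stepped down, with the smaller expected to read half. The set points, readings and 5% FSH tolerance below are placeholders, not the Code's values, so take the real ones from the edition you work to:

def check_height_linearity(readings, tol_fsh=5.0):
    # readings: list of (large, small) screen heights in % FSH,
    # captured from the software's own A-scan display.
    results = []
    for large, small in readings:
        expected = large / 2.0          # smaller echo should read half
        dev = small - expected
        results.append(((large, small), dev, abs(dev) <= tol_fsh))
    return results

# Hypothetical readings logged off a software-generated A-scan:
log = [(100, 51), (90, 46), (80, 40), (70, 34), (60, 36),
       (50, 26), (40, 19), (30, 15), (20, 10)]
for (large, small), dev, ok in check_height_linearity(log):
    print(f"large {large:3d}% FSH, small {small:3d}% FSH, "
          f"deviation {dev:+.1f}% -> {'pass' if ok else 'FAIL'}")

Because the readings themselves come from the software's display, the check exercises the acquisition hardware, the firmware and the display software together, which is exactly the "system" point above.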

Even after you demonstrate the "detection" capabilities of your procedure, error can still occur in "measurement". E.g. if, using a phased-array instrument, you measure the indicated depth to an SDH (side-drilled hole) target in your demonstration block and find that it is off by 10mm from the true depth, I would think you need to determine why. It MIGHT be the instrument...but more likely it is a parameter such as the velocity entered by the operator. Perhaps the wedge delay was not correctly calculated for all angles used. Perhaps the material is anisotropic and the velocity varies with refracted angle. You might need to change the procedure so that each focal law uses the correct velocity...this can be a problem for some systems that use a single velocity for a given material.
The system (including the software) can only be validated when all pertinent parameters are known and correctly entered.
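
To put rough numbers on the velocity point (all values hypothetical): the indicated depth scales linearly with the entered velocity, so a small velocity error produces a proportionally small depth error, while a gross parameter mistake, such as a longitudinal velocity entered for a shear-wave focal law, produces an offset of the 10 mm kind:

import math

T_US = 18.0    # round-trip time of flight to the SDH, microseconds
THETA = 45.0   # refracted angle, degrees

def indicated_depth_mm(v_mps):
    # straight-beam approximation: depth = v * (t/2) * cos(theta)
    path_mm = v_mps * (T_US / 2.0) * 1e-3   # (m/s * us) -> mm
    return path_mm * math.cos(math.radians(THETA))

ref = indicated_depth_mm(3240.0)            # correct shear velocity
for label, v in [("correct shear velocity (3240 m/s)", 3240.0),
                 ("shear velocity entered 3% low",      3143.0),
                 ("longitudinal entered by mistake",    5920.0)]:
    d = indicated_depth_mm(v)
    print(f"{label}: {d:.1f} mm (error {d - ref:+.1f} mm)")

A 10 mm class offset on a demonstration block is therefore far more likely a parameter-entry or procedure problem than an instrument fault.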

 
21:34 Mar-03-2010

emil shavakis

University of Massachusetts, USA
Joined Mar 2010
Re: UT Software Validation In Reply to Ed Ginzel at 20:22 Mar-03-2010 .

Interesting observations on "system". I certainly can't disagree. I wonder if that is really being done, though. Testing the software presents an interesting challenge, and there is no real guidance on how to do it.

As to the plotting programs - couldn't agree more. But I do think they may well be used without accounting for those "factors", despite the warnings in the license.

2235 is pretty non-specific as to what constitutes success in the demo. Not much is required as to the accuracy of the call other than length. As you stated, there are many error sources and not much Code language addressing what needs to be done.

Thanks for the input

Emil

 
15:29 Mar-04-2010

Michael Moles (1948-2014)

Re: UT Software Validation In Reply to emil shavakis at 18:26 Mar-03-2010 (Opening).

Emil:
There is no specific code for software, and the pace at which software evolves does not make it easy. Basically, ASME qualifies the whole package, which is the sensible solution.

However, many large companies have qualified OmniScan and TomoView for their specific needs. It is also impossible to qualify software for all purposes (outside our own internal qualification program, of course).

Michael

 

