Synergism of NDE & IT: A Generic Knowledge-base System for Effective and Reliable NDE
C. Rajagopalan, Baldev Raj and P. Kalyanasundaram
Metallurgy and Materials Group,
Indira Gandhi Centre for Atomic Research,
Kalpakkam 603 102, Tamil Nadu, India.
To some, the term "information technology" immediately brings to mind e-mail; to others, the World Wide Web. There is more to information technology, both in terms of what it is and what it can do, than meets the eye. Information technology represents a host of new technologies in which knowledge, and how knowledge is used, distributed, communicated and archived, plays a great role. Knowledge itself can be defined as refined, useful information. Today, effective application of information technology (IT) is particularly suited and required for Nondestructive Evaluation. This paper discusses some of these aspects in detail with reference to a generic knowledge-based decision support framework developed in the authors' laboratory, and presents the highlights which emerge from the development of the framework.
Keywords: Nondestructive Evaluation, Knowledge-based System, Decision Support System, Information Technology
To some, the term "information technology" immediately brings to mind e-mail; to others, the World Wide Web. There is more to information technology, both in terms of what it is and what it can do, than meets the eye. Information technology represents a host of new technologies in which knowledge, and how knowledge is used, distributed, communicated and archived, plays a great role. Knowledge itself can be defined as refined, useful information. The sudden interest in knowledge-based economies is fuelled by the availability of fast computers and of high-speed communication links between them for sharing information. Knowledge management has become pervasive today for this very reason. Knowing where information is available, and how to access it, has become as important as the knowledge itself. Large organisations are spending enormous amounts of their financial and human resources to re-discover their hidden strengths, knowledge and abilities through information technology.
Nondestructive Evaluation (NDE) is a domain of measurement science and technology which ensures the quality and fitness-for-purpose of a product before and during its service life, without in any way impairing the usefulness of the material, component or plant. NDE is by its nature an inter-disciplinary science, involving a number of distinct methods that apply a variety of physics and chemistry principles. Effective application of information technology (IT) is particularly suited and required in NDE. It is therefore essential to fix the objectives of using IT in NDE. Some of the possible guideposts could be:
Diversity in the number of NDE techniques also results in a spectrum of different types of decision-making processes. Three important factors, namely the diversity of the techniques, the number of components that need or constantly undergo testing, and the time-window during which these tests take place, generate not only a huge amount of raw data but also a large measure of human expertise and knowledge. For a number of NDE techniques, irrespective of the underlying physical or chemical basis of operation, the resultant output is either a signal (a one-dimensional waveform) or an image (a two-dimensional map). The study of these two types of information pervades the whole of NDE, particularly when there is a need to know 'more' than what is directly presented by the data. Among the many methodologies used to analyse and extract information from these two types of data, the most important are cluster generation and analysis, classification and pattern recognition.
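This signal-to-decision chain can be illustrated with a small sketch (in Python, for illustration only; DESKPACK itself is not implemented this way, and the feature set and centroid values below are hypothetical): a one-dimensional waveform is reduced to a feature vector, which is then assigned to the nearest cluster centroid.

```python
import math

def extract_features(signal):
    """Reduce a 1-D waveform to a small feature vector:
    peak amplitude, total energy, and zero-crossing count."""
    peak = max(abs(x) for x in signal)
    energy = sum(x * x for x in signal)
    crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    return (peak, energy, crossings)

def nearest_centroid(features, centroids):
    """Assign a feature vector to the class whose centroid is closest
    (Euclidean distance) -- the simplest clustering-based classifier."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Hypothetical centroids, as might be learned from reference defect signals
centroids = {"crack": (1.0, 12.0, 8.0), "porosity": (0.3, 2.0, 20.0)}
signal = [0.0, 0.9, -1.0, 0.8, -0.7, 0.6, -0.5, 0.4]
print(nearest_centroid(extract_features(signal), centroids))  # -> crack
```

Real systems replace the centroid step with neural or statistical classifiers, but the pipeline shape is the same.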
From these considerations, it is clear that NDE provides information technology with a fertile field to grow with and contribute to. NDE invokes the various aspects of information technology:
In addition, scientific data visualisation (through computer aided visualisation), computer vision, pattern recognition and data mining have become important IT components that aid NDE.
A large number of intelligent systems for testing, diagnosis, analysis and advising have been developed and studied in the recent past (Dean et al., 1995; Durkin, 1994; Ginsberg, 1993; Luger and Stubblefield, 1993; Puppe, 1993; Rich and Knight, 1991; Russell and Norvig, 1995; Stefik, 1995; Winston, 1992). Knowledge-based and decision support systems created for a specific diagnostic problem, or by applying a single model of diagnostic problem-solving, are available (Robey et al., 1994; Winston et al., 1995; Stensmo et al., 1995; Ulieru, 1994). Among these systems, variations occur in the type of knowledge representation used, the uncertainty management paradigm applied and the scope of the embedded inference procedures. Decision trees and rule-based systems are favoured for knowledge representation and reasoning, while probabilistic approaches are popular for handling uncertain and incomplete inputs. Depending upon the expected outcome (the diagnostic result) and, to a great extent, on the type of knowledge and data processing involved, NDE procedures can be quantitative and/or qualitative. Computer hardware fault diagnosis could be categorised as qualitative, whereas the machine interpretation of magnetic resonance imaging (MRI) medical images could be termed quantitative.
Limitations of Current Knowledge-Based Systems
Most current expert, knowledge-based and decision support systems for NDE are specific to a certain case and apply specific methods to offer decision support. Scaling them to improve their decision support efficiency, or extending them to areas other than those for which they were originally designed, becomes very difficult if not impossible. Where scaling becomes necessary, it involves re-writing major software components and logic of the decision-support system. If a software architecture generic to a wide spectrum of NDE tasks could be created, the dividends would be rich.
Qualitative NDE studies involve pure rule-based approaches to arriving at a decision. A framework that attempts to automate quantitative NDE tasks may also have to include methods to handle qualitative approaches, as most acceptance and evaluation methodologies are based on heuristic experience. In quantitative NDE, different decision-making paradigms are required to arrive at an acceptable solution, often from incomplete or uncertain inputs in the first place. In other words, the automating framework must be able to address the issues of hybrid computing and use the appropriate method for a given NDE problem, often combining different computing paradigms (Sun and Bookman, 1994). A hybrid framework should be flexible enough to successfully address the decision support concerns of a wide variety of diagnostic tasks.
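A minimal sketch of such hybrid computing, with entirely hypothetical rules and thresholds, might combine a numeric classifier score with symbolic acceptance heuristics, letting either paradigm dominate where appropriate:

```python
def numeric_confidence(amplitude, threshold=0.5):
    """Numeric side: a (hypothetical) classifier score in [0, 1]
    derived from a measured signal amplitude."""
    return min(amplitude / (2 * threshold), 1.0)

def symbolic_rules(facts):
    """Symbolic side: heuristic acceptance rules of the kind found in
    inspection practice, expressed as predicates over known facts.
    Missing facts simply fail to trigger a rule."""
    if facts.get("weld_type") == "austenitic" and facts.get("geometry_ok") is False:
        return "reject"
    if facts.get("prior_repairs", 0) > 2:
        return "review"
    return "accept"

def hybrid_decision(amplitude, facts):
    """Combine both paradigms: symbolic rules may override outright;
    otherwise the numeric confidence decides the verdict."""
    verdict = symbolic_rules(facts)
    if verdict != "accept":
        return verdict
    return "accept" if numeric_confidence(amplitude) < 0.8 else "review"

print(hybrid_decision(0.3, {"weld_type": "austenitic", "geometry_ok": True}))
# -> accept
```

Note that incomplete inputs (absent facts) degrade gracefully rather than halting the inference, which is the behaviour a hybrid framework needs.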
A critical review of the nature and variety of NDE knowledge in a given domain reveals that a substantial portion of this knowledge is symbolic, and that the methods that use this knowledge in order to arrive at a decision are heuristic. In addition, the tools that aid quantitative diagnosis, such as clustering and pattern recognition, employ procedures that are numeric in nature.
This can be understood by focussing our attention on three issues. The primary issue is the design and validation of a method to combine both the symbolic and numeric approaches. The second issue relates to the ability of the designed approach to be equally applicable across a wide spectrum of NDE techniques, without changes to any of the framework's architectural components. The third issue is to evaluate the architecture's capability to reason with incomplete information and possibly learn over a period of time (Rajagopalan et al., 1996). The generic framework developed by the authors to address these issues is called DESKPACK, which stands for Decision Support Knowledge Package.
The Components Perspective
The three major components of the DESKPACK architecture can be identified as
A salient feature is that these components are categorised as performing either symbolically or numerically intensive operations. Segments that perform symbolic computation are implemented in Visual Prolog, while those performing numerical computation are implemented in 'C'. Hence, the architecture is inherently hybrid, consisting of a number of inhomogeneous components.
The Functional Perspective
The DESKPACK architecture consists of three layers of functionality, namely the toolbox layer, the advisory layer and the interpretation layer. These layers are well integrated with one another and with the blackboard mosaic. Both intra-layer and inter-layer communications are possible, and this communication among the various modules and layers gives the architecture enhanced flexibility. The architecture is scalable: existing modules can be improved to any desired extent, and additional modules can be added as required, without major changes. In most cases, a re-compilation of the modules achieves the required performance levels.
The Blackboard Mosaic
NDE tasks often involve multi-stage decision-making. Results obtained in one stage influence the course of action in the subsequent stage or, in some cases, even select the subsequent stage(s) to be followed. Intermediate results such as these, and the steps taken to arrive at them, must be available both to the inference mechanism and to the various modules that require them.
Among the architectures that are candidates for this type of decision-making process, the most effective has been the blackboard architecture (Engelmore et al., 1988). The blackboard architecture allows the various inhomogeneous components of such a framework to co-exist and communicate with each other. It also allows the addition, deletion and editing of modules as the framework itself evolves over a period of time.
The concept of a blackboard also weaves in well with the primary coding language of the framework, namely Prolog. Prolog, which handles the symbolic part of the framework, allows the use of its internal database for effective assertion, modification and deletion of facts at run time. This approach is also not in conflict with the secondary coding language, 'C', which is responsible for the numerical computation in the framework. In our implementation of the framework, the internal database of Prolog has been fully utilised to design and realise the blackboard mosaic. Embedded onto this mosaic are the knowledge-engineering segment, the inference mechanism and a host of other computational and security components. The various symbolic, numeric, administrative, graphical user interface and security segments are together called the DESKPACK Software System, or DSS.
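The role played by Prolog's internal database can be sketched, purely for illustration and in Python rather than Prolog, as a shared fact store on which inhomogeneous knowledge sources assert and query intermediate results:

```python
class Blackboard:
    """Shared fact store playing the role of Prolog's internal database:
    modules assert, retract and query facts at run time."""
    def __init__(self):
        self.facts = {}

    def assert_fact(self, key, value):
        self.facts[key] = value

    def retract_fact(self, key):
        self.facts.pop(key, None)

    def query(self, key):
        return self.facts.get(key)

def signal_module(bb):
    # Numeric knowledge source: posts a measured intermediate result.
    bb.assert_fact("peak_amplitude", 0.92)

def rule_module(bb):
    # Symbolic knowledge source: reads results posted by earlier
    # stages and posts its own conclusion.
    amp = bb.query("peak_amplitude")
    if amp is not None and amp > 0.8:
        bb.assert_fact("indication", "significant")

bb = Blackboard()
for knowledge_source in (signal_module, rule_module):
    knowledge_source(bb)
print(bb.query("indication"))  # -> significant
```

The key property shown is that the modules never call each other directly; all coordination flows through the blackboard, which is what lets modules be added or removed freely.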
Properties of the Knowledge Engineering Segment
In order to maximise modularity and maintainability, the domain knowledge in any field and the knowledge about using this domain knowledge (meta-knowledge) are stored in the DESKPACK Software System (DSS) framework in three distinct files. These are the DESKPACK Knowledge Pool file (DKP file, with the extension *.DKP), the DESKPACK Configuration File (DCF file, with the extension *.DCF) and the DESKPACK Bridge File (DBF file, with the extension *.DBF). The DKP file stores the core knowledge of a given domain, as articulated by the domain expert. The DCF file stores the prerequisites and sequencing information required to use the corresponding DKP file. The DBF file acts as a bridge between a given DKP file and a given DCF file, and contains the meta-knowledge about them.
For example, if computer hardware diagnosis is the domain in question, the relevant knowledge will reside in the knowledge-pool (DKP) file; the preliminary information available or required for diagnosing the hardware problem (symptoms, behaviour, reachable goals, etc.) will reside in the DESKPACK configuration (DCF) file; and the corresponding bridge (DBF) file, automatically generated by the framework, will contain the information connecting the DKP and DCF files. Both the DKP and DCF files can be edited or modified by the domain expert any number of times to include any amount of knowledge, prerequisites, constraints and goals.
For a given domain, there must be at least one set of three files (trio-files), viz. the corresponding DKP, DCF and DBF files. The framework allows a single domain to have more than one set of these trio-files. Also, more than one knowledge domain can reside in the DSS memory at a given time. This facility allows the User to consult (or even cross-verify) the knowledge of two related domains simultaneously.
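The trio-file idea can be sketched as follows (illustrative only: the actual DKP/DCF/DBF file formats are internal to DESKPACK, and the JSON encoding and field names used here are assumptions), with the bridge file generated automatically from the other two:

```python
import json
import hashlib

def load_trio(dkp_text, dcf_text):
    """Parse a knowledge pool (DKP) and a configuration (DCF), and
    generate the bridge (DBF) meta-knowledge that ties them together,
    mirroring how the framework auto-generates the *.DBF file."""
    dkp = json.loads(dkp_text)   # domain rules, as articulated by the expert
    dcf = json.loads(dcf_text)   # prerequisites and sequencing for that pool
    dbf = {
        "domain": dcf["domain"],
        "rule_count": len(dkp["rules"]),
        "entry_goals": dcf["goals"],
        # Checksums let the framework detect that an edited DKP/DCF
        # pair no longer matches its bridge file.
        "dkp_checksum": hashlib.md5(dkp_text.encode()).hexdigest(),
        "dcf_checksum": hashlib.md5(dcf_text.encode()).hexdigest(),
    }
    return dkp, dcf, dbf

dkp_text = '{"rules": [{"if": "symptom_a", "then": "fault_x"}]}'
dcf_text = '{"domain": "hardware_diagnosis", "goals": ["fault_x"]}'
_, _, dbf = load_trio(dkp_text, dcf_text)
print(dbf["domain"], dbf["rule_count"])  # -> hardware_diagnosis 1
```

Because the bridge is derived data, the expert edits only the DKP and DCF files; regeneration keeps the three files consistent.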
The DESKPACK Inference Engine - Flexibility and Performance
The Inference Mechanism (IM) of the DESKPACK architecture is one of the most crucial components affecting the performance of the entire decision support system. The IM has been designed and developed so as to:
While determining the sequence of rules to be fired by the IM, in addition to its default strategy, the architecture offers the User an array of possible rule-firing sequences. These include firing rules based on their truth and relevance values, on their firing rates, or on the contextual information present in the DSS knowledge-pool rules and facts. The User can also customise the way the DSS handles dead-end resolutions. This utility offers the User a great amount of flexibility in the operation of the DSS. It should be mentioned that the developed framework carefully insulates the domain knowledge from the underlying inference mechanism. This enables the framework to be used for entirely different domains without altering the basic structure of either the logic or the knowledge base.
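A forward-chaining loop with a pluggable rule-ordering strategy, of the kind described above, might look like this sketch (the rule contents, relevance values and firing rates are hypothetical):

```python
def forward_chain(rules, facts, order_key):
    """Repeatedly fire rules whose conditions are satisfied, choosing
    the next rule via a pluggable ordering strategy (order_key)."""
    facts = set(facts)
    fired = []
    while True:
        # Rules whose premises all hold and whose conclusion is new.
        ready = [r for r in rules
                 if r["if"] <= facts and r["then"] not in facts]
        if not ready:
            return facts, fired
        rule = max(ready, key=order_key)   # the strategy decides precedence
        facts.add(rule["then"])
        fired.append(rule["name"])

rules = [
    {"name": "r1", "if": {"noisy_signal"}, "then": "filter_needed",
     "relevance": 0.9, "fire_rate": 12},
    {"name": "r2", "if": {"filter_needed"}, "then": "apply_bandpass",
     "relevance": 0.7, "fire_rate": 30},
]

# Two of the selectable strategies: by relevance value, or by how
# often a rule has fired in past consultations.
by_relevance = lambda r: r["relevance"]
by_fire_rate = lambda r: r["fire_rate"]

facts, fired = forward_chain(rules, {"noisy_signal"}, by_relevance)
print(fired)  # -> ['r1', 'r2']
```

Swapping `by_relevance` for `by_fire_rate` changes only which ready rule fires first, not the engine itself, which is the insulation of inference mechanism from domain knowledge that the text describes.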
Among the different domain case studies employed to assess the various strengths of the DESKPACK framework, three are described briefly below. In the first case, the DESKPACK Software System (DSS) framework is used as a toolbox (first level of operation) to analyse eddy current signals from defects. In the second case study, the advisory nature (second level of operation) of the DSS framework is demonstrated in choosing the right parameters for ultrasonic testing of austenitic welds. The third and highest level of operation, namely interpretation, is used in the final case study, where Synthetic Aperture Focussing Technique (SAFT) images are interpreted to identify the location and size of defects.
Analysis of eddy current signals from natural and artificial defects (First Level of Operation - ToolBox)
Artificial and natural defects in stainless steel plates were scanned using an automated mechanism to obtain their eddy current signals. A variety of features were extracted and classified, initially by a Kohonen self-organising neural network in DESKPACK and subsequently by DESKPACK's multi-layered perceptron. One of the main conclusions of this study was that artificial defects are acceptable for providing features to train a neural network which can subsequently correctly classify natural defects (Shyamsunder et al., 2000). In this case, the classifying ability of the DESKPACK framework is used extensively, without invoking the knowledge-engineering segments of the architecture. The inputs to the system were one-dimensional signals and the desired output was their classification, through feature extraction.
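A minimal Kohonen self-organising map of the kind used in this study can be sketched as follows (the training data, lattice size and training schedule are hypothetical, not those of the actual eddy current study):

```python
import random

def train_som(samples, n_units=2, dim=2, epochs=20, lr=0.5):
    """Minimal Kohonen self-organising map (1-D unit lattice): the
    best-matching unit (BMU) and its lattice neighbours are pulled
    toward each sample; the neighbourhood radius shrinks over time."""
    random.seed(0)
    weights = [[random.random() for _ in range(dim)] for _ in range(n_units)]
    for epoch in range(epochs):
        radius = 1 if epoch < epochs // 2 else 0
        for x in samples:
            bmu = min(range(n_units),
                      key=lambda u: sum((w - xi) ** 2
                                        for w, xi in zip(weights[u], x)))
            for u in range(max(0, bmu - radius), min(n_units, bmu + radius + 1)):
                for d in range(dim):
                    weights[u][d] += lr * (x[d] - weights[u][d])
    return weights

def classify(weights, x):
    """Cluster label of a feature vector = index of its best-matching unit."""
    return min(range(len(weights)),
               key=lambda u: sum((w - xi) ** 2 for w, xi in zip(weights[u], x)))

# Hypothetical two-feature vectors from artificial defects (two clusters)
samples = [(0.1, 0.2), (0.15, 0.1), (0.9, 0.8), (0.85, 0.95)]
weights = train_som(samples)
print(classify(weights, (0.1, 0.15)) != classify(weights, (0.9, 0.9)))  # -> True
```

After unsupervised training on artificial-defect features, the map's unit labels can be used to cluster features from natural defects, the transfer the study validated.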
Advice on the best parameters for Ultrasonic Testing of Austenitic Stainless Steel Welds (Second Level of Operation - Advisor)
The concept of 'casting' the available NDT knowledge and expertise into a 'machine-readable' form was attempted in this study. The approach is to consolidate and use the widely available knowledge and expertise in the field of ultrasonic testing of austenitic stainless steel welds.
This study obtained circumstantial knowledge (such as information on the weld geometry, fabrication, metallurgy, service conditions, cooling method, microstructure, and the available ultrasonic testing facility) from the user and used it, along with domain-specific knowledge (such as the established rules of ultrasonic testing (UT) of austenitic stainless steel welds), to advise the user on the right type of UT method to follow. The model envisages the use of both algorithmic and logical operations to achieve its goal. The study used five modules (Weld, Metallurgy, UT, Signal Analysis and Codes), which serve as the entry modules for the system.
The user can start the consultation process by selecting one or more of these entry modules, as desired. In each of these modules a few pertinent questions are asked by the system, with the necessary prompts and help provided through the use of hot keys.
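The consultation flow, from entry modules to advice, can be sketched as follows (the module contents and advisory rules below are hypothetical stand-ins for the knowledge encoded in the actual system):

```python
# Each entry module contributes its answers to a shared consultation record.
consultation = {}

def weld_module(answers):
    """Entry module: weld geometry and fabrication answers."""
    consultation.update(answers)

def metallurgy_module(answers):
    """Entry module: microstructure and cooling-method answers."""
    consultation.update(answers)

def advise(record):
    """Hypothetical advisory rules of the kind held in the knowledge
    pool: map the consulted facts to a recommended UT approach."""
    if record.get("microstructure") == "coarse_columnar":
        return "longitudinal waves, low frequency (1-2 MHz)"
    if record.get("thickness_mm", 0) > 40:
        return "tandem technique"
    return "standard shear-wave examination"

weld_module({"thickness_mm": 25, "geometry": "V-groove"})
metallurgy_module({"thickness_mm": 25, "microstructure": "coarse_columnar"})
print(advise(consultation))
```

The user may enter through any subset of modules; the advisory rules simply act on whatever facts have accumulated in the record.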
Apart from this, specific reasons for each of these queries, and supporting literature, are also provided. When the user has finished answering and requests advice, the answers are stored in an internal database and processed by the scheduler. The scheduler can be defined as an interface program that examines the user input and alters it (if necessary) before passing it on to the advice generator. The scheduler is responsible for:
The inference engine then uses the user input and the core knowledge base to generate the required advice on twelve issues (Rajagopalan, 2000). They are:
Interpretation of a stack of SAFT images through localised image feature extraction and analysis (Third Level of Operation - Interpretation)
Synthetic Aperture Focussing Technique (SAFT) is a reconstructive imaging technique widely used in advanced ultrasonic imaging. In our case study, artificial notches in a weld were scanned using an appropriate ultrasonic probe and pseudo-colour images were reconstructed. For each defect, a stack of 20-25 images was obtained, each representing a transverse cross-section of the weld, separated by a few millimetres, the separation representing the probe overlap. The presence of mode-converted waves and the effect of experimental parameters make these images difficult to interpret. The DESKPACK framework was used to extract local features from the images and to identify the location of the defect (by identifying the relevant image(s) in a given stack). Local features extracted and used by the framework include the area, perimeter, major axis, minor axis, major axis angle and minor axis angle of each object in a given SAFT image. The continuity and discontinuity of the values of these local features through a given stack, for all the objects in an image, was used as the deciding criterion by the DESKPACK framework to locate the defective region (Rajagopalan et al., 2000). In this study, the framework was able to locate the images pertaining to large defects. Here the reasoning ability of the DESKPACK framework is exploited, with inputs from local features of SAFT pseudo-colour images. No classification was involved, yet numerically intensive computations (extraction and analysis of the image objects' local features) were required, combined with symbolic reasoning (identifying the defect location using continuity, discontinuity and their ratios).
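The continuity criterion can be sketched as follows (the feature values, tolerance and use of object area alone are illustrative assumptions; the actual framework tracks several local features per object):

```python
def locate_defect(stack_features, tolerance=0.25):
    """Flag the slice indices where an object's local feature (here,
    its area) persists across consecutive slices: continuity of the
    feature through the stack marks the defective region."""
    defective = []
    for i in range(1, len(stack_features)):
        prev, curr = stack_features[i - 1], stack_features[i]
        if prev and curr:
            change = abs(curr["area"] - prev["area"]) / max(prev["area"], 1e-9)
            if change <= tolerance:      # feature is continuous here
                defective.extend([i - 1, i])
    return sorted(set(defective))

# Hypothetical per-slice object features (None = no object in that slice)
stack = [None, {"area": 120}, {"area": 130}, {"area": 125}, None, {"area": 40}]
print(locate_defect(stack))  # -> [1, 2, 3]
```

Isolated objects (such as the slice with area 40 above) show no continuity with their neighbours and are rejected as artefacts, which is how mode-converted indications are filtered out.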
The DESKPACK architecture performs reasonably well under a variety of computational paradigms, including both symbolic and numeric. The architecture has proved its ability to facilitate integration of additional modules and has fully validated the communication between its various layers.
The current framework is being extended in order to
The DESKPACK software architecture offers a fresh approach to identifying the basic components of an NDE decision support task, and to building an architecture that supports these components for better, more efficient and automated decision support. The developed architecture removes a current deficiency, namely the lack of an integrated mechanism to process and analyse different types of data within a decision support architecture. In addition, the DESKPACK architecture effectively combines the advantages and power of data clustering, classification and pattern recognition methods for decision support. Managing NDE knowledge (storage, effective utilisation and the creation of usable derivative knowledge) has been accomplished in a unifying manner.
From a software perspective, it provides valuable methods of handling inhomogeneous decision-support problems, such as fault diagnosis, which require combining different programming paradigms. From a systems point of view, the architecture opens up new areas of hybrid computing, integrating both symbolic and numerical approaches as applied to materials evaluation. From a purely automation point of view, the architecture provides a means to efficiently capture, store, use and update vast amounts of NDE knowledge with little or no human intervention. From a distributed-computation perspective, the architecture highlights both the problems involved in consulting knowledge bases that are not localised but are spread across different geographical locations, and the methods to overcome them. Finally, from a functional perspective, all the modules have performed seamlessly and efficiently.