The best way to reduce death rates due to this disease is to treat it at an early stage. Early diagnosis of lung cancer requires an effective procedure that allows doctors to distinguish benign tumours from malignant ones. Computer-Aided Diagnosis (CAD) systems can be used to assist with this task. CAD is a non-trivial problem, and present approaches face the difficulty of increasing both sensitivity to tumoral growths and specificity in identifying their nature.
CAD is an approach designed to reduce observational oversights and the false negative rates of doctors interpreting medical images. Prospective clinical studies have shown an increase in cancer detection with CAD assistance. Computer programs have been widely used in clinical practice to support radiologists in detecting potential abnormalities on diagnostic radiology examinations. The most common application is computer-aided (or assisted) detection, usually referred to as CAD. The term CAD refers to a pattern recognition technique that identifies suspicious features in the image and reports them to the radiologist, in order to minimise false negative readings. CAD is presently FDA and CE approved for use with both film and digital mammography, for both screening and diagnostic examinations; for chest CT; and for chest radiographs. The main aim of CAD is to enhance the detection of disease by minimising the false negative rate due to observational oversights. Using CAD places no additional demands on the radiologist; the main aspect of the approach is to increase the quality of disease detection. CAD approaches are developed to look for the same features that a radiologist examines during case review. Thus, CAD algorithms for breast cancer on mammograms look for microcalcifications and masses, while on chest radiographs and CT scans, present CAD approaches look for pulmonary densities that have particular physical characteristics.
CAD systems are developed mainly to support the radiologist, not to replace the radiologist. For instance, a CAD system could scan a mammogram and draw red circles around suspicious areas. A radiologist can later inspect these areas and determine their true nature.
A number of CAD schemes have been investigated in the literature. These include:
Subtraction approaches that detect abnormality by comparison with normal tissue
Topographic approaches that perform feature extraction and analysis to detect abnormalities
Filtering approaches that use digital signal processing filters to augment abnormalities for easier detection
Staged expert systems that perform rule-based analysis of image data in an attempt to provide a correct diagnosis
Most CAD approaches follow the subtraction technique [157], in which abnormalities are detected by searching for image differences based on comparison with known normal tissue. In topographic techniques, the detection of anomalies is based on image feature identification and the extraction of features associated with pathological anomalies, such as in texture analysis [158]. Most approaches follow these stages:
analysing the image data
extracting pre-determined features
identifying regions of interest (ROIs) that can be examined for possible abnormalities.
Several of these approaches achieve high degrees of sensitivity, but many suffer from high false-positive rates and therefore low specificity. The problem of false positives is aggravated by the fact that false positive rates are reported per image, not per case. As many radiological examinations include more than one image, the actual number of false positives may be a multiple of those reported.
A number of different approaches have been employed in an effort to reduce false positive rates, many of them focusing on the use of Artificial Neural Networks (ANNs), machine learning approaches and similar techniques. The Receiver Operating Characteristic (ROC) curve is a general metric for measuring the performance of CAD systems and is normally used to evaluate the degree of trade-off between sensitivity and specificity of a CAD approach.
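As an illustration of this evaluation metric, the following is a minimal sketch of computing an ROC curve and its area under the curve (AUC) for a hypothetical CAD-style classifier, assuming scikit-learn is available; the labels and scores below are made up for illustration and are not results from the proposed system.

```python
# Hypothetical example: ROC curve and AUC for a binary nodule classifier.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

y_true = np.array([0, 0, 1, 1, 0, 1, 0, 1])                      # 1 = nodule present (made-up labels)
y_score = np.array([0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.55, 0.9])   # made-up classifier scores

fpr, tpr, thresholds = roc_curve(y_true, y_score)   # trade-off: TPR (sensitivity) vs FPR (1 - specificity)
auc = roc_auc_score(y_true, y_score)
print(f"AUC = {auc:.3f}")
```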
CAD fundamentally depends on highly complex pattern recognition. X-ray images are scanned for suspicious structures, and generally a few thousand images are needed to optimise the algorithm. Digital image data are copied to a CAD server in DICOM format and are prepared and analysed in several steps.
The act of taking in raw data and taking an action based on the classification of the pattern is generally defined as pattern recognition. Most research in pattern recognition concerns methods for supervised and unsupervised learning. The main purpose of pattern recognition is to classify data (patterns) based either on a priori knowledge or on statistical information extracted from the patterns. The patterns to be classified are usually groups of measurements or observations, defining points in a suitable multidimensional space. A complete pattern recognition system consists of a sensor that gathers the observations to be classified or described, a feature extraction mechanism that computes numeric or symbolic information from the observations, and a classification or description scheme that does the actual job of classifying or describing observations, relying on the extracted features. The classification scheme is generally based on the availability of a set of patterns that have already been classified.
In order to overcome these problems, this chapter introduces a proposed Computer-Aided Diagnosis (CAD) system for the detection of lung nodules using the Extreme Learning Machine. The lung cancer detection system is shown in Figure 6.1. The proposed approach initially uses different image processing techniques, such as Bit-Plane Slicing, Erosion, Median Filter, Dilation, Outlining, Lung Border Extraction and Flood-Fill algorithms, for extraction of the lung region. The Modified Fuzzy Possibilistic C-Means algorithm [155] is then used for segmentation, and the Extreme Learning Machine is used for learning and classification. A code-level sketch of the overall pipeline is given after Figure 6.1.
[Figure 6.1 flow diagram: Lung regions extraction → Segmentation of the lung region using MFPCM → Analysis of the segmented lung region → Formation of diagnosis rules → Classification of occurrence and non-occurrence of cancer in the lung using ELM]
Figure 6.1: The Lung Cancer Detection System
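To complement Figure 6.1, the following is a minimal Python sketch of how the five phases chain together. All function bodies here are hypothetical stand-ins rather than the actual implementations, which are described in the remainder of this chapter; only the control flow is meaningful.

```python
import numpy as np

# Hypothetical stand-in implementations; each phase is detailed later in this chapter.
def extract_lung_region(ct_image):    return ct_image * (ct_image > ct_image.mean())
def segment_mfpcm(lung_region):       return (lung_region > 0).astype(int)
def extract_features(segments):       return [{"area": int(segments.sum()), "mdc": 3.0,
                                               "mean_intensity": 120.0}]
def apply_diagnosis_rules(features):  return [f for f in features if f["area"] > 10]
def classify_with_elm(candidates):    return len(candidates) > 0

def detect_lung_cancer(ct_image):
    lung_region = extract_lung_region(ct_image)    # Phase 1: lung region extraction
    segments = segment_mfpcm(lung_region)          # Phase 2: MFPCM segmentation
    features = extract_features(segments)          # Phase 3: feature extraction
    candidates = apply_diagnosis_rules(features)   # Phase 4: diagnosis rules
    return classify_with_elm(candidates)           # Phase 5: ELM classification

print(detect_lung_cancer(np.random.rand(64, 64) * 255))   # dummy CT slice for illustration
```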
MACHINE LEARNING TECHNIQUES
Machine learning is a scientific discipline concerned with the design and development of techniques that allow computers to evolve behaviours based on empirical data. Previous experience is of great help to the learner in capturing the unknown underlying probability distribution of the data. Data are seen as examples that illustrate relations between observed variables. A main aim of machine learning research is to make machines automatically learn to recognise complex patterns and make intelligent decisions based on the data. This is difficult because the set of all possible inputs is generally too large to be covered by the training data.
The main aim of a learner is to generalise from its past experience. The training data from its experience come from an unknown probability distribution, and the learner has to extract something more general about that distribution, something that provides meaningful responses for future cases. Figure 6.2 illustrates this general machine learning approach.
Figure 6.2: Machine Learning Approach
Importance of Machine Learning
There are several reasons why machine learning remains an important technique. The important engineering reasons are:
Certain tasks can be defined only by examples: it may be possible to specify input/output pairs but not a concise relationship between inputs and desired outputs. Machines are expected to adjust their internal structure so as to produce correct outputs for a large number of sample inputs and thus suitably constrain their input/output function to approximate the relationship implicit in the examples.
Machine learning techniques are frequently used in the extraction of relationships and correlations among data (data mining).
Many machines designed by humans do not perform as well as desired in the environments in which they are used. Furthermore, certain characteristics of the working environment are not completely known at design time. Machine learning approaches can be used for on-the-job improvement of existing machine designs.
Certain tasks have large amounts of knowledge available that is difficult for humans to encode explicitly. Machines may be able to learn this knowledge and perform better.
Environments keep changing. Machines that can adapt to a changing environment are very valuable, as they reduce the need for constant redesign.
Humans constantly discover new knowledge about tasks, and vocabularies change constantly. Continually redesigning Artificial Intelligence systems to conform to new knowledge is impractical, but machine learning approaches can track these changes and update the system accordingly.
Types of Machine Learning Algorithms
Machine learning algorithms are organised into a taxonomy based on the desired outcome of the algorithm.
Supervised learning: generates a function that maps inputs to desired outputs.
Unsupervised learning: models a set of inputs, as in clustering.
Semi-supervised learning: combines both labelled and unlabelled samples to produce a suitable classifier.
Reinforcement learning: learns how to act given an observation of the world.
Transduction: predicts new outputs based on training inputs, training outputs and test inputs.
Learning to learn: learns its own inductive bias based on previous experience.
Extreme Learning Machines
Extreme Learning Machines are highly capable of solving data regression and classification problems. ELM can overcome certain challenging constraints on the use of feed-forward neural networks and other computational intelligence approaches. Owing to the growth and improvement of ELM techniques, ELM integrates the advantages of both neural networks and support vector machines, offering faster learning speed, less human intervention and greater robustness. An example of an ELM is depicted in Figure 6.3.
Figure 6.3: An Example of ELM
The parameters of an ELM can be determined analytically rather than being tuned. The algorithm provides good generalisation performance at very fast learning speed. From the function approximation point of view, ELM is very different from traditional methods: ELM shows that the hidden node parameters can be completely independent of the training data.
In conventional learning theory, the hidden node parameters cannot be generated without seeing the training data.
In ELM, the hidden node parameters can be generated before seeing the training data.
Salient Features of ELM
Compared to the popular Back-Propagation (BP) algorithm and the Support Vector Machine (SVM), ELM has several salient features:
Ease of use: apart from the predefined network architecture, no other parameters need to be manually tuned. Users need not spend much time tuning and training the learning machine.
Faster learning speed: most training can be completed in milliseconds, seconds or minutes. Other conventional methods cannot provide such fast learning speed.
Higher generalisation performance: the generalisation performance of ELM is better than that of SVM and back-propagation in most cases.
Applicable to all nonlinear activation functions: discontinuous, differentiable and non-differentiable functions can be used as activation functions in ELM.
Applicable to fully complex activation functions: complex functions can also be used as activation functions in ELM.
In conventional feed-forward networks, all the parameters need to be tuned, and therefore dependencies exist between the parameters of different layers. Gradient descent-based methods have been used in various learning algorithms for feed-forward neural networks. These approaches are usually very slow due to improper learning steps, or may easily converge to local minima. Many iterative learning steps are required by such learning algorithms in order to achieve satisfactory learning performance.
PROPOSED METHODOLOGY
The initial phase of the proposed technique is lung region extraction using several image processing techniques. The second phase is segmentation [156] of the extracted lung region using the Modified Fuzzy Possibilistic C-Means (MFPCM) algorithm. Then the diagnosis rules for detecting false positive regions are formulated. Finally, the Extreme Learning Machine (ELM) technique is applied in order to classify the cancer nodules.
The five phases included in the proposed computer-aided diagnosis system for lung cancer detection are as follows:
• Extraction of the lung region from chest computed tomography images
• Segmentation of the lung region using Modified Fuzzy Possibilistic C-Means
• Feature extraction from the segmented region
• Formation of diagnosis rules from the extracted features
• Classification of occurrence and non-occurrence of cancer in the lung
Phase 1: Extraction of Lung Region from Chest Computed Tomography Images
The first phase of the proposed Computer-Aided Diagnosis system is the extraction of the lung region from the chest computed tomography (CT) scan image. This phase uses basic image processing methods; the procedure is shown in Figure 6.4. The image processing methods used in this phase are Bit-Plane Slicing, Erosion, Median Filter, Dilation, Outlining, Lung Border Extraction and Flood-Fill algorithms.
Normally, the chest CT image contains not only the lung region but also the background, heart, liver and other organ areas. The main purpose of this lung region extraction process is to detect the lung region and regions of interest (ROIs) in the CT scan image.
The first step in lung region extraction is the application of the bit-plane slicing algorithm to the CT scan image. This algorithm produces a set of binary slices. The most suitable slice, with better accuracy and sharpness, is chosen for further enhancement of the lung region.
Next, the erosion algorithm is applied, which enhances the sliced image by reducing noise. Dilation and median filters are then applied to the enhanced image to remove further distortion. The outlining algorithm is then applied to determine the outlines of the regions obtained from the noise-reduced image. The lung region border is then obtained by applying the lung border extraction technique. Finally, the flood-fill algorithm is applied to fill the obtained lung border with the lung region. After applying these algorithms, the lung region is extracted from the CT scan image. This extracted lung region is then used for segmentation in order to detect the cancer nodule. A code-level sketch of this sequence of operations is given below.
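Under the assumption that a single 8-bit CT slice is available as a 2-D NumPy array, the chain of operations described above could be sketched as follows using SciPy and scikit-image. The bit-plane index, structuring-element sizes and the way the border is filled are illustrative choices, not the exact parameters of the proposed system.

```python
import numpy as np
from scipy import ndimage
from skimage import morphology, segmentation

def extract_lung_region(ct_slice, bit_plane=5):
    # Bit-plane slicing: keep a single binary plane of the 8-bit image
    plane = ((ct_slice.astype(np.uint8) >> bit_plane) & 1).astype(bool)

    # Erosion to suppress small bright noise, then median filtering and dilation
    eroded = morphology.binary_erosion(plane, morphology.disk(2))
    smoothed = ndimage.median_filter(eroded.astype(np.uint8), size=3).astype(bool)
    dilated = morphology.binary_dilation(smoothed, morphology.disk(2))

    # Outlining / lung border extraction: boundaries of the binary regions
    border = segmentation.find_boundaries(dilated, mode="outer")

    # Flood fill: close the interior enclosed by the lung border
    lung_mask = ndimage.binary_fill_holes(dilated | border)
    return ct_slice * lung_mask        # keep CT intensities inside the lung mask only

# toy usage on a dummy slice
lung = extract_lung_region((np.random.rand(128, 128) * 255).astype(np.uint8))
```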
[Figure 6.4 flow diagram: Original Image → Bit-Plane Slicing → Erosion → Median Filter → Dilation → Outlining → Lung Border Extraction → Flood-Fill Algorithm → Extracted Lung]
Figure 6.4: The proposed lung region extraction method.
Figure 6.5 shows the application of the different image processing techniques for the extraction of the lung region from the CT scan image. The finally obtained lung region is shown in Figure 6.5(h).
Figure 6.5: Lung Region Extraction Algorithm: (a) Original CT Image, (b) Bit-Plane Slicing, (c) Erosion, (d) Median Filter, (e) Dilation, (f) Outlining, (g) Lung Region Borders, and (h) Extracted Lung.
Phase 2: Segmentation of the Lung Region Using Modified Fuzzy Possibilistic C-Means
The second phase of the proposed CAD system is the segmentation of the lung region. Segmentation is performed to determine the cancer nodules in the lung; this phase identifies the regions of interest (ROIs) that help in locating the cancer region. Modified Fuzzy Possibilistic C-Means (MFPCM) is used for segmentation in the proposed technique, rather than Fuzzy Possibilistic C-Means, because of the better accuracy of MFPCM.
The FPCM algorithm merges the advantages of both the fuzzy and possibilistic c-means techniques. Memberships and typicalities are essential for a correct characterisation of the data substructure in the clustering problem. Thus, an objective function in FPCM based on memberships and typicalities can be written as:
\[ J_{m,\eta}(U,T,V)=\sum_{i=1}^{c}\sum_{j=1}^{n}\left(\mu_{ij}^{m}+t_{ij}^{\eta}\right)d^{2}(x_{j},v_{i}) \qquad (1) \]
subject to the following constraints:
\[ \sum_{i=1}^{c}\mu_{ij}=1\;\;\forall j\in\{1,\ldots,n\},\qquad \sum_{j=1}^{n}t_{ij}=1\;\;\forall i\in\{1,\ldots,c\} \qquad (2) \]
Minimisation of the objective function is carried out through an iterative procedure in which the degrees of membership, the typicality values and the cluster centres are updated as:
\[ \mu_{ij}=\left[\sum_{k=1}^{c}\left(\frac{d(x_{j},v_{i})}{d(x_{j},v_{k})}\right)^{2/(m-1)}\right]^{-1} \qquad (3) \]
\[ t_{ij}=\left[\sum_{k=1}^{n}\left(\frac{d(x_{j},v_{i})}{d(x_{k},v_{i})}\right)^{2/(\eta-1)}\right]^{-1} \qquad (4) \]
\[ v_{i}=\frac{\sum_{j=1}^{n}\left(\mu_{ij}^{m}+t_{ij}^{\eta}\right)x_{j}}{\sum_{j=1}^{n}\left(\mu_{ij}^{m}+t_{ij}^{\eta}\right)} \qquad (5) \]
FPCM produces memberships and possibilities simultaneously, together with the usual point prototypes or cluster centres for every cluster.
The choice of the objective function is a very important aspect of the performance of the clustering method and of achieving better clustering; clustering performance therefore depends on the objective function used. To produce a suitable objective function, the following requirements are considered:
The distance between clusters and the data points assigned to them must be minimised
The distance between clusters must be maximised
The objective function models the desirability between the data and the clusters. In addition, Wen-Liang Hung proposed a technique called Modified Suppressed Fuzzy C-Means, which considerably improves the performance of FPCM thanks to a prototype-driven learning of the parameter α. The learning process of α is based on an exponential separation strength between clusters and is updated at every iteration. The parameter α is defined as:
\[ \alpha=\exp\!\left(-\,\min_{i\neq k}\frac{\left\|v_{i}-v_{k}\right\|^{2}}{\beta}\right) \qquad (6) \]
where β is a normalising term, chosen here as the sample variance; that is, β is defined as:
\[ \beta=\frac{\sum_{j=1}^{n}\left\|x_{j}-\bar{x}\right\|^{2}}{n} \qquad (7) \]
\[ \bar{x}=\frac{1}{n}\sum_{j=1}^{n}x_{j} \qquad (8) \]
However, one issue must be noted here: the same value of this parameter is used for every data point at each iteration, which may induce errors. Thus a weight parameter is introduced in place of a common value of α: each point of the data set has a weight in relation to every cluster. The use of this weight allows good classification, particularly in the case of noisy data. The weight is computed as follows:
( 9 )
where w_ji denotes the weight of the point j with respect to the class i. This weight is used to modify the fuzzy and typical partitions. FPCM is iterative in nature, since it is not possible to optimise the objective function directly. To classify a data point, the cluster centroid has to be closest to the data point, and this is handled by the membership; for estimating the centroids, the typicality is used to reduce the undesirable effect of outliers. The objective function is composed of two expressions:
a fuzzy function using a fuzziness weighting exponent
a possibilistic function using a typicality weighting exponent
However, these two coefficients are used in the objective function only as exponents of membership and typicality. A new, slightly unusual relation gives a very rapid decrease of the objective function and increases the membership and typicality when they tend towards 1, and decreases them when they tend towards 0. This relation consists in applying the weighting exponents also as exponents of the distance in the two terms of the objective function. The objective function of MFPCM can be written as:
\[ J_{m,\eta}(U,T,V,W)=\sum_{i=1}^{c}\sum_{j=1}^{n}\left[\mu_{ij}^{m}w_{ji}^{m}\,d^{2m}(x_{j},v_{i})+t_{ij}^{\eta}w_{ji}^{\eta}\,d^{2\eta}(x_{j},v_{i})\right] \qquad (10) \]
where U = {μ_ij} denotes the fuzzy partition matrix, defined as:
( 11 )
T = {t_ij} denotes the typicality (possibilistic) partition matrix, defined as:
( 12 )
and V = {v_i} denotes the set of c cluster centres, defined as:
( 13 )
Since MFPCM modifies its membership function according to the weight, segmentation can be performed with better accuracy. When the boundary regions of the cancer nodules are considered, FPCM may misjudge the edge of a nodule because of its fixed membership function; this problem is overcome by MFPCM. As the membership function varies according to the weight of a particular region, the misclassification of the borders of the cancer nodule is reduced. Sometimes the cancer nodules appear with almost the same intensity as the surrounding lung region. In this case, FPCM will not detect the cancer nodule but will misclassify it as lung region; with MFPCM, those cancer nodules can be identified exactly because of its ability to adapt the membership function.
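For concreteness, the following is a minimal NumPy sketch of the basic FPCM iteration of equations (1)-(5) that MFPCM builds on; the weight modification of MFPCM is omitted, and all parameter values shown (number of clusters, exponents, iteration count) are illustrative rather than the settings used in the thesis.

```python
import numpy as np

def fpcm(X, c=2, m=2.0, eta=2.0, n_iter=50, seed=0):
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    V = X[rng.choice(n, size=c, replace=False)]                      # initial cluster centres
    for _ in range(n_iter):
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(-1) + 1e-12  # d^2(x_j, v_i), shape (c, n)
        u = d2 ** (-1.0 / (m - 1))                                   # membership, eq. (3):
        u /= u.sum(axis=0, keepdims=True)                            #   normalised over clusters
        t = d2 ** (-1.0 / (eta - 1))                                 # typicality, eq. (4):
        t /= t.sum(axis=1, keepdims=True)                            #   normalised over points
        w = u ** m + t ** eta
        V = (w @ X) / w.sum(axis=1, keepdims=True)                   # centre update, eq. (5)
    return u, t, V

# toy usage on synthetic 1-D "pixel intensities" drawn from two groups
pixels = np.random.default_rng(1).normal([50, 200], 10, size=(100, 2)).reshape(-1, 1)
u, t, centres = fpcm(pixels, c=2)
print(centres.ravel())
```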
After segmentation of the lung region has been performed, feature extraction and cancer diagnosis can be carried out on the segmented image.
Phase 3: Feature Extraction from the Segmented Region
After segmentation of the lung region, features can be extracted from it in order to form the diagnosis rules for detecting cancer nodules in the lung region accurately.
The features used in this approach to generate the diagnosis rules are:
Area of the candidate region
The maximum drawable circle (MDC) inside the candidate region
Mean intensity value of the candidate region
Area of the candidate region
This feature is used here in order to:
eliminate isolated pixels;
eliminate very small candidate objects.
With the help of this feature, detected regions that have no chance of forming a cancer nodule are identified and can be eliminated. This reduces the processing required in the following steps and also the time they take.
The maximum drawable circle (MDC)
This feature characterises a candidate region by its maximum drawable circle (MDC). Every pixel within the candidate region is taken as a centre point for drawing circles, and only circles lying entirely within the region are considered. Initially the radius of the circle is set to one pixel, and the radius is then incremented by one pixel at a time until no circle can be drawn with that radius. The maximum drawable circle helps the diagnostic procedure to remove more false positive cancer candidates.
Mean intensity value of the candidate region
For this feature, the mean intensity value of the candidate region is calculated, which helps in rejecting further regions that do not indicate a cancer nodule. The mean intensity value is the average intensity of all the pixels belonging to the same region and is calculated using the formula:
\[ \text{MeanIntensity}(j)=\frac{1}{n}\sum_{i=1}^{n}\text{Intensity}(i) \qquad (14) \]
where j denotes the region index and ranges from 1 to the total number of candidate regions in the whole image, Intensity(i) denotes the CT intensity value of pixel i, and i ranges from 1 to n, where n is the total number of pixels belonging to region j. A code-level sketch of these three features is given below.
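A simple sketch of how these three features could be computed from a boolean candidate-region mask and the corresponding CT slice is given below. The use of the Euclidean distance transform to obtain the MDC radius is an assumption; it is equivalent to growing circles pixel by pixel as described above.

```python
import numpy as np
from scipy import ndimage

def region_features(region_mask, ct_slice):
    area = int(region_mask.sum())                               # number of pixels in the region
    # distance of each interior pixel to the region border; its maximum is the MDC radius
    mdc_radius = float(ndimage.distance_transform_edt(region_mask).max())
    mean_intensity = float(ct_slice[region_mask].mean())        # equation (14)
    return {"area": area, "mdc": mdc_radius, "mean_intensity": mean_intensity}

# toy usage: a synthetic circular region inside a dummy slice
yy, xx = np.mgrid[:64, :64]
mask = (yy - 32) ** 2 + (xx - 32) ** 2 <= 8 ** 2
slice_ = np.full((64, 64), 100.0)
slice_[mask] = 180.0
print(region_features(mask, slice_))
```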
Phase 4: Formation of Diagnosis Rules from the Extracted Features
After the necessary features are extracted, the following diagnosis rules can be applied to detect the occurrence of cancer nodules. The three rules involved in the diagnosis are as follows:
Rule 1: Initially, a threshold value T1 is set for the area of a region. If the area of a candidate region exceeds this threshold, the region is eliminated from further consideration. This rule helps in reducing the work and time required by the following stages.
Rule 2: This rule considers the maximum drawable circle (MDC). A threshold T2 is defined for the MDC value. If the radius of the maximum drawable circle of a candidate region is less than the threshold T2, that region is regarded as a non-cancerous nodule and is eliminated from further consideration. Applying this rule has the effect of rejecting a large number of vessels, which in general have a thin oblong, or line, shape.
Rule 3: Here a range of values, T3 and T4, is set as thresholds for the mean intensity value of a candidate region. The mean intensity values of the candidate regions are then calculated. If the mean intensity value of a candidate region falls below the minimum threshold or exceeds the maximum threshold, that region is regarded as non-cancerous.
By applying all of the above rules, most of the regions that cannot be cancerous nodules are eliminated, and the remaining candidate regions are considered as cancerous regions. This CAD system helps in discarding the false positive cancer regions and in detecting the cancer regions more accurately. These rules are then passed to the classifier in order to detect the cancer nodules in the supplied lung image. A sketch of the rule check is given below.
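The three rules can be expressed as a simple filter over the feature dictionaries computed in Phase 3; the threshold values T1 to T4 below are hypothetical placeholders, not the values used in the experiments.

```python
T1, T2 = 500.0, 2.0          # max area (pixels) and min MDC radius (pixels) - hypothetical
T3, T4 = 110.0, 200.0        # allowed mean-intensity range - hypothetical

def is_candidate_nodule(feat):
    if feat["area"] > T1:                          # Rule 1: overly large regions rejected
        return False
    if feat["mdc"] < T2:                           # Rule 2: thin, vessel-like regions rejected
        return False
    if not (T3 <= feat["mean_intensity"] <= T4):   # Rule 3: intensity outside range rejected
        return False
    return True

# toy usage on made-up feature dictionaries
all_region_features = [
    {"area": 120, "mdc": 4.0, "mean_intensity": 150.0},   # plausible nodule
    {"area": 950, "mdc": 9.0, "mean_intensity": 140.0},   # too large (Rule 1)
    {"area": 60,  "mdc": 1.0, "mean_intensity": 130.0},   # vessel-like (Rule 2)
]
candidates = [f for f in all_region_features if is_candidate_nodule(f)]
print(candidates)
```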
Phase 5: Classification of Occurrence and Non-Occurrence of Cancer in the Lung
The final phase of the proposed CAD system is the classification of occurrence and non-occurrence of a cancer nodule in the supplied lung image. The classifier used in this proposed approach is the Extreme Learning Machine (ELM).
Extreme Learning Machine
The Extreme Learning Machine (ELM), designed for Single Hidden Layer Feed-forward Neural Networks (SLFNs), randomly chooses the input weights and analytically determines the output weights of the SLFN. The algorithm tends to provide the best generalisation performance at extremely fast learning speed.
The structure of the ELM network is shown in Figure 6.6. An ELM contains an input layer, a hidden layer and an output layer.
The ELM has several interesting and significant features that distinguish it from traditional popular learning algorithms for feed-forward neural networks. These include the following:
[Figure 6.6 diagram: Input Layer, First Hidden Layer, Output Layer]
Figure 6.6: Structure of the ELM network
The learning speed of ELM is very fast compared to other classifiers. The learning process of ELM can be completed in seconds, or less, for many applications. With previously existing learning techniques, the learning performed by a feed-forward network may take a huge amount of time even for simple applications.
The ELM has better generalisation performance than gradient-based learning such as back-propagation. Existing gradient-based learning techniques, and a few other learning techniques, may face several problems such as local minima, improper learning rates and over-fitting. To overcome these problems, methods like weight decay and early stopping must be used with these existing learning techniques.
The ELM reaches its solutions directly without such difficulties. The ELM learning algorithm is much simpler than other learning techniques for feed-forward neural networks. Existing learning techniques can be applied only to differentiable activation functions, whereas the ELM learning algorithm can also be used to train SLFNs with many non-differentiable activation functions.
Extreme Learning Machine Training Algorithm
Given N samples (x_i, t_i), where x_i = [x_i1, x_i2, ..., x_in]^T ∈ R^n and t_i = [t_i1, t_i2, ..., t_im]^T ∈ R^m, the standard SLFN with Ñ hidden neurons and activation function g(x) is defined as:
\[ \sum_{i=1}^{\tilde{N}}\beta_{i}\,g\!\left(w_{i}\cdot x_{j}+b_{i}\right)=o_{j},\qquad j=1,\ldots,N \qquad (15) \]
where w_i = [w_i1, w_i2, ..., w_in]^T is the weight vector that connects the ith hidden neuron and the input neurons, β_i = [β_i1, β_i2, ..., β_im]^T is the weight vector that connects the ith hidden neuron and the output neurons, and b_i is the threshold of the ith hidden neuron. The term w_i · x_j denotes the inner product of w_i and x_j. The SLFN attempts to minimise the difference between the outputs o_j and the targets t_j, which can be expressed as:
\[ \sum_{j=1}^{N}\left\|o_{j}-t_{j}\right\|=0 \qquad (16) \]
or, more compactly in matrix form, as Hβ = T, where
\[ H\left(w_{1},\ldots,w_{\tilde{N}},b_{1},\ldots,b_{\tilde{N}},x_{1},\ldots,x_{N}\right)=
\begin{bmatrix}
g(w_{1}\cdot x_{1}+b_{1}) & \cdots & g(w_{\tilde{N}}\cdot x_{1}+b_{\tilde{N}})\\
\vdots & \ddots & \vdots\\
g(w_{1}\cdot x_{N}+b_{1}) & \cdots & g(w_{\tilde{N}}\cdot x_{N}+b_{\tilde{N}})
\end{bmatrix}_{N\times\tilde{N}} \qquad (17) \]
and
\[ \beta=\begin{bmatrix}\beta_{1}^{T}\\ \vdots\\ \beta_{\tilde{N}}^{T}\end{bmatrix}_{\tilde{N}\times m}
\qquad\text{and}\qquad
T=\begin{bmatrix}t_{1}^{T}\\ \vdots\\ t_{N}^{T}\end{bmatrix}_{N\times m} \qquad (18) \]
The matrix H is called the hidden layer output matrix of the neural network. If the number of neurons in the hidden layer is equal to the number of samples, then H is square and invertible. Otherwise, the system of equations needs to be solved by numerical methods, specifically by solving
\[ \min_{\beta}\left\|H\beta-T\right\| \qquad (19) \]
The solution that minimises the norm in this least-squares problem is
\[ \hat{\beta}=H^{\dagger}T \qquad (20) \]
where H† is the Moore-Penrose generalised inverse of H. The most important properties of this solution are:
Minimum training error.
Smallest norm of weights and best generalisation performance.
The minimum norm least-squares solution of Hβ = T is unique and is given by
\[ \hat{\beta}=H^{\dagger}T \qquad (21) \]
The ELM algorithm works as follows (a code sketch follows these steps).
Given a training set, an activation function g(x) and the number of hidden neurons Ñ, do the following:
Assign random values to the input weights w_i and the biases b_i.
Compute the hidden layer output matrix H.
Compute the output weights β using β = H†T, where H and T are defined as in the SLFN specification above.
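The three steps above translate almost directly into code. The following is a minimal NumPy sketch of ELM training and prediction following equations (15)-(21), with a sigmoid activation; the synthetic data and the hidden layer size are illustrative, whereas in the proposed system the inputs would be the candidate-region feature vectors and the targets the cancer labels.

```python
import numpy as np

def elm_train(X, T, n_hidden=20, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights w_i
    b = rng.standard_normal(n_hidden)                 # random hidden biases b_i
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))            # hidden layer output matrix (sigmoid g)
    beta = np.linalg.pinv(H) @ T                      # output weights via Moore-Penrose inverse
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# toy usage: two Gaussian blobs standing in for "cancer" / "non-cancer" feature vectors
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(3, 1, (50, 3))])
T = np.concatenate([np.zeros(50), np.ones(50)])[:, None]
W, b, beta = elm_train(X, T)
accuracy = ((elm_predict(X, W, b, beta) > 0.5) == (T > 0.5)).mean()
print(f"training accuracy: {accuracy:.2f}")
```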
Summary
This chapter has presented a computer-aided diagnosis system for the early detection of lung cancer, based on chest computed tomography images. The chapter also discussed machine learning approaches and their importance. In the first phase of the proposed technique, the lung region is extracted from the chest CT image using several basic image processing techniques. In the second phase, the extracted lung region is segmented with the help of the modified fuzzy possibilistic c-means algorithm. The next phase is the extraction of features for diagnosis from the segmented image, after which the diagnosis rules are generated from the extracted features. Finally, using the obtained diagnosis rules, classification is performed with ELM in order to detect the occurrence of cancer nodules. ELM has several salient features and characteristics that make it very suitable for the classification task in this proposed approach.
The next chapter deals with the experimental observations for the proposed approach.