2. Measurement Process Characterization
2.3. Calibration
2.3.5. Control of artifact calibration

2.3.5.1. Control of precision

Control parameters from historical data

A modified control chart procedure is used for controlling instrument precision. The procedure is designed to be implemented in real time after a baseline and control limit for the instrument of interest have been established from the database of short-term standard deviations. A separate control chart is required for each instrument -- except where instruments are of the same type with the same basic precision, in which case they can be treated as one.

The baseline is the process standard deviation that is pooled from \( k = 1, \, \ldots, \, K \) individual repeatability standard deviations, \( {\large s}_k \), in the database, each having \( \nu_k \) degrees of freedom. The pooled repeatability standard deviation is $$ {\large s}_1 = \sqrt{ \frac{1}{\nu} \sum_{k=1}^K \nu_k \, {\large s}_k^2 } $$ with degrees of freedom $$ \nu = \sum_{k=1}^K \nu_k \,\, .$$
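For concreteness, the pooling step can be sketched in Python with NumPy; the function and variable names below are illustrative and not part of the handbook.

import numpy as np

def pooled_repeatability_sd(s, nu):
    """Pool K repeatability standard deviations s_k, each with nu_k
    degrees of freedom, into the baseline standard deviation s_1.

    Returns (s_1, nu_total) where nu_total = nu_1 + ... + nu_K.
    """
    s = np.asarray(s, dtype=float)
    nu = np.asarray(nu, dtype=float)
    nu_total = nu.sum()
    s1 = np.sqrt(np.sum(nu * s**2) / nu_total)
    return s1, nu_total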

Control procedure is invoked in real time for each calibration run

The control procedure compares each new repeatability standard deviation that is recorded for the instrument with an upper control limit, UCL. Usually, only the upper control limit is of interest because the primary concern is detecting degradation in the instrument's precision. A possible complication is that the control limit depends on the degrees of freedom in the new standard deviation and is computed as follows: $$ UCL = s_1 \sqrt{F_{\alpha, \, \nu_{new}, \, \nu}} $$ The quantity under the radical is the upper α percentage point from the F table, where α is chosen to be small, say 0.05, and the two subscripts are the degrees of freedom of the new standard deviation and of the process standard deviation, respectively.
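The control limit can be obtained directly from an F quantile function; a minimal sketch, assuming SciPy is available (the function name and the default α = 0.05 are illustrative):

import math
from scipy.stats import f

def upper_control_limit(s1, nu_new, nu, alpha=0.05):
    # f.ppf(1 - alpha, nu_new, nu) is the upper alpha percentage point
    # of the F distribution with nu_new and nu degrees of freedom,
    # i.e., F_{alpha, nu_new, nu} in the formula above.
    return s1 * math.sqrt(f.ppf(1.0 - alpha, nu_new, nu))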
Limitation of graphical method

The graphical method of plotting every new estimate of repeatability on a control chart does not work well when the UCL can change with each calibration design, depending on the degrees of freedom. The algebraic equivalent is to test if the new standard deviation exceeds its control limit, in which case the short-term precision is judged to be out of control and the current calibration run is rejected. For more guidance, see Remedies and strategies for dealing with out-of-control signals.
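The algebraic test is a single comparison; a sketch under the same assumptions as above (the UCL is recomputed here so the snippet stands on its own):

import math
from scipy.stats import f

def run_is_in_control(s_new, nu_new, s1, nu, alpha=0.05):
    # Accept the calibration run only if the new repeatability standard
    # deviation does not exceed its upper control limit.
    ucl = s1 * math.sqrt(f.ppf(1.0 - alpha, nu_new, nu))
    return s_new <= ucl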

As long as the repeatability standard deviations are in control, there is reason for confidence that the precision of the instrument has not degraded.

Case study: Mass balance precision

It is recommended that the repeatability standard deviations be plotted against time on a regular basis to check for gradual degradation in the instrument. Individual failures may not trigger a suspicion that the instrument is in need of adjustment or tuning.
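A minimal sketch of such a time plot, assuming Matplotlib (names are illustrative; per-run control limits could also be overlaid if the degrees of freedom are recorded):

import matplotlib.pyplot as plt

def plot_repeatability_history(dates, s_values, s1):
    # Repeatability standard deviations over time, with the pooled
    # baseline s_1 as a reference line so that gradual drift is visible
    # even when no individual value exceeds its control limit.
    plt.plot(dates, s_values, "o-")
    plt.axhline(s1, linestyle="--", label="pooled baseline s1")
    plt.xlabel("Time")
    plt.ylabel("Repeatability standard deviation")
    plt.legend()
    plt.show()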