Journal article

Optimization of calibration intervals for automatic test equipment

Abstract

This paper proposes an approach for evaluating the optimal calibration interval of automatic test equipment (ATE). The on-station calibration strategy of ATE is analysed on the basis of the metrology chain. The calibration interval of each individual instrument is determined by a grey prediction model based on fresh information. An out-of-tolerance-calibration matrix, used to calculate the metrology contribution rate of each instrument, is established according to the multi-signal flow model. The calibration interval of the ATE is then obtained as a weighted average of the individual intervals, overcoming the disadvantages of the traditional experiential method. The results demonstrate that, compared with the cannikin principle and the highest-rank-of-the-metrology-chain criterion, the proposed approach makes better use of test resources and saves more time.
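The pipeline the abstract describes can be sketched in a few lines: a GM(1,1) grey model (the standard grey prediction model, refit on only the most recent observations to reflect "fresh information") forecasts each instrument's next calibration interval, and the ATE-level interval is a weighted average of those forecasts using metrology contribution rates as weights. This is a minimal illustration under those assumptions; the function names, the window size, and all sample numbers are illustrative and not taken from the paper, and the paper's actual matrix construction from the multi-signal flow model is not reproduced here.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Fit a GM(1,1) grey model to a short positive series x0 and
    forecast the next `steps` values of the original series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                     # accumulated generating sequence
    z1 = 0.5 * (x1[1:] + x1[:-1])          # mean sequence of consecutive x1 terms
    # Least-squares estimate of development coefficient a and grey input b
    # from the grey differential form x0[k] + a * z1[k] = b
    B = np.column_stack((-z1, np.ones_like(z1)))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    # Time-response function of the whitened differential equation
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    # Recover the original-series forecasts by inverse accumulation
    x0_hat = np.concatenate(([x1_hat[0]], np.diff(x1_hat)))
    return x0_hat[n:]

def fresh_info_forecast(history, window=5):
    """'Fresh information' (metabolic) variant: refit on only the most
    recent `window` observations so that new data dominates the model."""
    return gm11_forecast(history[-window:], steps=1)[0]

def ate_interval(instrument_intervals, contribution_rates):
    """Weighted-average ATE calibration interval: each instrument's
    predicted interval is weighted by its metrology contribution rate,
    rather than taking the shortest interval (cannikin principle)."""
    w = np.asarray(contribution_rates, dtype=float)
    t = np.asarray(instrument_intervals, dtype=float)
    return float(np.dot(t, w) / w.sum())
```

For example, with predicted per-instrument intervals of 180, 365, and 270 days and contribution rates 0.5, 0.2, 0.3, the cannikin principle would force recalibration every 180 days, while the weighted average yields a longer ATE interval, which is the source of the claimed savings in test resources and time.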

Authors

Wang J; Zhang Q; Jiang W

Journal

Measurement, Vol. 103, pp. 87–92

Publisher

Elsevier

Publication Date

June 1, 2017

DOI

10.1016/j.measurement.2017.01.062

ISSN

0263-2241
