Part 3. Calibration of Geotechnical Instruments (Including Case Study)

INTRODUCTION: As we have seen, the accuracy of a transducer is conferred on it through calibration against (that is, comparison with) a standard, and that standard will itself have a specified accuracy.

Doebelin (2003) points out that the standard must have a higher accuracy than the instrument being calibrated, and that the standard itself must have been established by acceptable means. Sydenham et al. (1989) make the important point that "… Associated with the calibration are two costs – that of making it and that of not making it".

Because calibration is the means of transferring accuracy, it must encompass the whole measurement system involved; it will be affected by resolution, stability and repeatability, and above all it must be part of the laboratory culture. It is desirable for a transducer to have a near-linear relationship between the standard value and the measured quantity, so that the relationship can be expressed as a single factor in engineering units per unit of transducer output (usually mV), e.g. kPa/mV. The variability in this relationship can then be quoted as an accuracy or linearity figure, say 0.1% of full scale. Linearity is not always achievable, however, and with modern software it is not always necessary.
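To make the linear case concrete, the short sketch below fits a straight line to some hypothetical calibration readings (pressures applied by a reference standard versus transducer output in mV) and reports the resulting calibration factor in kPa/mV together with the worst-case non-linearity as a percentage of full scale. The readings and variable names are illustrative assumptions for this note, not measured GDS data.

import numpy as np

# Hypothetical calibration points: pressure applied by the reference
# standard (kPa) and the corresponding transducer output (mV).
applied_kpa = np.array([0.0, 200.0, 400.0, 600.0, 800.0, 1000.0])
output_mv = np.array([0.02, 40.1, 80.0, 119.8, 160.1, 199.9])

# Least-squares straight line: pressure = factor * output + offset.
# The slope is the calibration factor in kPa/mV.
factor, offset = np.polyfit(output_mv, applied_kpa, 1)

# Non-linearity: the largest residual from the fitted line,
# expressed as a percentage of the full-scale pressure.
residuals = applied_kpa - (factor * output_mv + offset)
nonlinearity_pct = 100.0 * np.max(np.abs(residuals)) / applied_kpa.max()

print(f"Calibration factor: {factor:.4f} kPa/mV")
print(f"Zero offset:        {offset:.3f} kPa")
print(f"Non-linearity:      {nonlinearity_pct:.3f} % of full scale")

Where a transducer is not near-linear, the same data can instead be fitted with a higher-order polynomial (e.g. np.polyfit(output_mv, applied_kpa, 3)) and the polynomial evaluated in software, which is the point made above about linearity no longer being essential.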

