2 Metrology Calibration 7 Feb 2020

This document discusses metrology and calibration. It defines metrology as the science of measurement and calibration as relating the output of a measurement to an input standard. Accuracy refers to how close a measurement is to the true value, while precision refers to the repeatability of measurements. Standards must be certified for accuracy, precision, traceability, and measurement uncertainty. Calibration frequency ensures instruments maintain accuracy over time. Examples are given to illustrate the differences between accuracy and precision in measurements. Traceability establishes the calibration chain from instruments to primary standards.

Uploaded by Mr-Mk Mughal

Metrology and Calibration
Official Code: QLTY06005
Lecture 2
Presentation Outline
 What is Metrology & Calibration — Recap
 Standards
 Accuracy
 Precision
 Examples of Accuracy & Precision
 Traceability
 Uncertainty of Measurement
 Calibration Frequency
 Questions
Recap: Metrology & Calibration
• Metrology is the science of measurement.
• Calibration is the relationship between the output (your own measurement) and the input (the standard).
• If the output (your own measurement) is the same as the input (gauge block) or within the stated accuracy of the instrument, then the instrument is said to have passed its calibration.
• For this approach to work there must be a high degree of confidence in the inputs that we measure.
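The pass/fail rule above can be sketched in a few lines of Python. This is a minimal illustration only; the gauge-block value and the accuracy figure are hypothetical, not taken from the slides.

```python
def calibration_passes(reading: float, standard: float, stated_accuracy: float) -> bool:
    """An instrument passes calibration if its reading of the standard
    (e.g. a gauge block) agrees with the standard's certified value to
    within the instrument's stated accuracy."""
    return abs(reading - standard) <= stated_accuracy

# Hypothetical example: a micrometer reads a certified 25.000 mm gauge
# block as 25.002 mm; its stated accuracy is +/-0.004 mm.
print(calibration_passes(25.002, 25.000, 0.004))  # within stated accuracy -> pass
print(calibration_passes(25.006, 25.000, 0.004))  # outside stated accuracy -> fail
```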
Calibration Example
Micrometers can be calibrated using a special set of gauge blocks (standards) and a set of optical parallels.

[Figure: metric gauge blocks used for micrometer calibration]
Recap: Standards
Standards must be certified by a calibration laboratory with regard to:
1. Accuracy
2. Precision
3. Traceability
4. Known uncertainty of measurement
Variation in Manufactured Parts
1. Within-piece variation: e.g., surface finish varying across a single part.
2. Piece-to-piece variation: among pieces produced at the same time (such as the intensity of light bulbs from one batch).
3. Time-to-time variation: differences in product produced at different times of the day (e.g., service quality in the morning versus later in the day, or cutting characteristics as a tool wears).
Sources of Measurement Errors

[Diagram: inputs → process → outputs. Error sources on the input side include operators, methods, materials, instruments, tools, and machines; the measurement and inspection process is also influenced by the environment and by human performance.]
Systematic Errors
• Systematic errors are the result of an experimental mistake.
• Systematic errors in experimental observations usually come from the measuring instruments:
– there is something wrong with the data-handling system, or
– the instrument is wrongly used by the experimenter.
• Example of a systematic error: errors in measurements of temperature due to poor thermal contact between the thermometer and the substance being measured.
Precision and Accuracy
• Accuracy: how close a measurement is to the TRUE value.
• Precision: how repeatable a measurement is — obtaining the same value on repeated readings.

[Figure: "Accurate but not precise" — readings scattered widely, but with their average value close to the master value (reference standard). The bias is the difference between the true value and the observed average value.]
Accuracy
• Accuracy is the extent to which a reading might be wrong, and is often quoted as a percentage of the full-scale reading of an instrument.
• Example: a pressure gauge has a range of 0 to 10 bar and a quoted inaccuracy of ±1% of full-scale reading (f.s.). The maximum error to be expected in any reading is therefore 0.1 bar (1% of 10 bar). This means that when the instrument is reading 1.0 bar, the possible error is 10% of that value.
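The pressure-gauge arithmetic above can be checked with a short sketch (the gauge range and accuracy figures are the ones from the slide):

```python
def max_error_from_fs(full_scale: float, fs_accuracy_pct: float) -> float:
    """Maximum expected error for an instrument whose inaccuracy is
    specified as a percentage of full-scale reading (f.s.)."""
    return full_scale * fs_accuracy_pct / 100.0

# The 0-10 bar pressure gauge from the slide, quoted at +/-1% f.s.
max_err = max_error_from_fs(10.0, 1.0)       # 0.1 bar, anywhere on the scale
reading = 1.0                                # bar
pct_of_reading = 100.0 * max_err / reading   # 10% of this particular reading
print(f"max error = {max_err} bar -> {pct_of_reading}% of a {reading} bar reading")
```

Note how the same fixed 0.1 bar error grows, as a proportion, the further the reading is below full scale — which is exactly why the next slide recommends matching the range to the expected values.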
Accuracy
• For this reason it is important that instruments are chosen such that their range is appropriate to the spread of values being measured, in order for the best possible accuracy to be maintained in instrument readings.
• Example: if a company were measuring pressures with expected values between 0 and 1 bar, it would not use an instrument with a range of 0 to 10 bar to carry out the readings, but more likely one with a range of 0 to 1 or 2 bar.
Precision
This is also called repeatability and is a measure of the spread of results of repeated measurements carried out by the same person using the same instrument on the same component.

[Figure: "Precise but not accurate" — tightly clustered readings from Operator 'A' and Operator 'B' over a 1st and 2nd trial, offset from the true value.]
Precision
Examples of Precision and Accuracy:
• Example 1: a sensor with BAD accuracy and BAD precision.
• Suppose a lab refrigerator holds a constant temperature of 4 °C. A temperature sensor is tested 10 times in the refrigerator. The test shows the temperatures to be 4.0, 4.3, 4.9, 4.7, 3.8, 3.9, 4.8, 5.1, 3.7, 3.5.
• This distribution shows no tendency towards a particular value, which indicates a lack of precision, and it does not acceptably match the actual temperature of 4 °C, i.e. a lack of accuracy.
Precision
Example 2: a sensor with GOOD accuracy and BAD precision.
• Suppose a lab refrigerator holds a constant temperature of 4 °C. A temperature sensor is tested 10 times in the refrigerator. The test shows values of 4.1, 4.2, 4.3, 4.1, 4.4, 3.9, 3.8, 3.7, 3.6, 4.0.
• This distribution shows no particular tendency towards a single value (lack of precision), but the values measured are close to the actual value (high accuracy).
Precision
Example 3: a sensor with BAD accuracy and GOOD precision.
• Suppose a lab refrigerator holds a constant temperature of 4 °C. A temperature sensor is tested 10 times in the refrigerator. The temperatures from the test are 4.4, 4.4, 4.5, 4.6, 4.4, 4.5, 4.4, 4.5, 4.5, 4.6.
• This distribution does show a tendency towards a particular value (high precision), but each measurement is well off the actual temperature (low accuracy).
Precision
Example 4: a sensor with GOOD accuracy and GOOD precision.
• Suppose a lab refrigerator holds a constant temperature of 4 °C. A temperature sensor is tested 10 times in the refrigerator. The temperatures from the test are 4.0, 4.1, 4.0, 4.1, 4.1, 4.0, 4.2, 4.0, 4.1, 4.1.
• This distribution does show a tendency towards a particular value (high precision), and each value is close to the actual temperature (high accuracy).
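The four refrigerator examples can be summarised numerically: bias (average minus true value) quantifies accuracy, and the sample standard deviation quantifies precision. A minimal sketch using the datasets from the slides:

```python
import statistics

TRUE_TEMP = 4.0  # °C, the refrigerator's constant temperature

readings = {
    "bad accuracy, bad precision":   [4.0, 4.3, 4.9, 4.7, 3.8, 3.9, 4.8, 5.1, 3.7, 3.5],
    "good accuracy, bad precision":  [4.1, 4.2, 4.3, 4.1, 4.4, 3.9, 3.8, 3.7, 3.6, 4.0],
    "bad accuracy, good precision":  [4.4, 4.4, 4.5, 4.6, 4.4, 4.5, 4.4, 4.5, 4.5, 4.6],
    "good accuracy, good precision": [4.0, 4.1, 4.0, 4.1, 4.1, 4.0, 4.2, 4.0, 4.1, 4.1],
}

for label, data in readings.items():
    bias = statistics.mean(data) - TRUE_TEMP  # accuracy: offset from the true value
    spread = statistics.stdev(data)           # precision: sample standard deviation
    print(f"{label}: bias = {bias:+.2f} °C, spread = {spread:.2f} °C")
```

A small bias together with a small spread corresponds to Example 4: good accuracy and good precision.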
Precision & Accuracy
Suppose we have a reference part with a ‘true’ diameter of 5.0 mm.

1. Method A gives the following readings: 5.1, 6.6, 4.1, 3.3, 6.4
2. Method B gives the following readings: 4.1, 3.8, 4.4, 4.2, 4.0

Questions:
1. Which method is more accurate?
2. Which method is more precise?
3. Which method do you prefer? Why?
Precision & Accuracy
Method B readings — 4.1, 3.8, 4.4, 4.2, 4.0: Precise, but not Accurate.
Method A readings — 5.1, 6.6, 4.1, 3.3, 6.4: Accurate, but not Precise.

It would be better to choose Method B, the one that is more Precise but not Accurate, because it is easier to “shift the mean” than to “reduce the variability”.
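The preference for Method B can be made concrete: once the bias of a precise method is known (from a reference part), it can simply be subtracted out, whereas the scatter of an imprecise method cannot be corrected reading-by-reading. A minimal sketch of "shifting the mean", using the Method B readings from the slide:

```python
import statistics

TRUE_DIAMETER = 5.0  # mm, the reference part
method_b = [4.1, 3.8, 4.4, 4.2, 4.0]  # precise, but biased low

bias = statistics.mean(method_b) - TRUE_DIAMETER  # about -0.9 mm
corrected = [x - bias for x in method_b]          # "shift the mean"

print(round(statistics.mean(corrected), 6))   # now centred on the true value
print(round(statistics.stdev(corrected), 6))  # spread is unchanged by the shift
```

After the offset correction the method is both precise and accurate; no such reading-by-reading fix exists for Method A's variability.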
Standards and Calibration

https://masy.com/index.php/2019/11/22/metrological-traceability/
Traceability
• Calibration has a chain-like structure in which each instrument in the chain is calibrated against a more accurate instrument above it in the chain.
• The traceability of the calibration system must be defined by each company and supported by calibration certificates.
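The calibration chain can be pictured as a simple data structure, with each level calibrated against a more accurate level above it. The instrument names and uncertainty values below are illustrative, and the 4:1 test-uncertainty-ratio check is a common rule of thumb, not something stated on the slide:

```python
# Hypothetical traceability chain, from primary standard down to the shop floor.
# Uncertainties are illustrative values in micrometres.
chain = [
    ("National primary standard", 0.02),
    ("Calibration-lab reference standard", 0.1),
    ("Company master gauge blocks", 0.5),
    ("Shop-floor micrometer", 4.0),
]

# Sanity-check the chain: each instrument should be calibrated against a
# link that is meaningfully more accurate (here, a 4:1 ratio rule of thumb).
for (upper, u_up), (lower, u_low) in zip(chain, chain[1:]):
    ratio = u_low / u_up
    status = "ok" if ratio >= 4 else "ratio too low"
    print(f"{lower} vs {upper}: ratio {ratio:.1f}:1 -> {status}")
```

Each link would be supported by a calibration certificate, which is what makes the chain auditable.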
Uncertainty Estimation
• When we measure some physical quantity with an instrument and obtain a numerical value, we want to know how close this value is to the true value. The difference between the true value and the measured value is the error.
• The accuracy of a result can be quantified by calculating the percent error, which can only be found if the true value is known. Although the percent error is usually written as an absolute value, it can be expressed with a negative or positive sign to indicate the direction of the error from the true value.
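The signed percent error described above can be sketched directly (the 4.2 and 3.8 readings against a true value of 4.0 are illustrative numbers, echoing the refrigerator examples):

```python
def percent_error(measured: float, true_value: float) -> float:
    """Signed percent error: positive when the measurement reads high,
    negative when it reads low."""
    return 100.0 * (measured - true_value) / true_value

print(round(percent_error(4.2, 4.0), 2))       # positive: reads high
print(round(percent_error(3.8, 4.0), 2))       # negative: reads low
print(round(abs(percent_error(3.8, 4.0)), 2))  # the usual unsigned form
```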
Uncertainty of Measurement
How do we know when we can truly believe a measurement?
• No measurement is ever correct. There is always an unknown, finite, non-zero difference between a measured value and the corresponding true value.
• Most instruments have specified or implied tolerance limits within which the true value of the measurement should lie if the instrument is functioning correctly.
• One can never be 100% sure that an instrument is operating within its specified tolerance limits.
Uncertainty of Measurement
BUT…
• There are steps that can be taken to minimize the probability of a measurement falling outside specified tolerance or uncertainty bands.
• Regular traceable calibration is a method for gaining quantifiable confidence in a measurement system.
Calibration Frequency
• How frequently calibrations should be carried out is an important but sometimes difficult question to answer.
• All measuring instruments/devices, whether simple or sophisticated, will change over time; the issue is how much they change.
• Calibration of instruments used to obtain measurements is mandatory, and to achieve the required accuracy this calibration must be carried out at a pre-determined frequency.
Calibration Frequency
• Changes in an instrument's characteristics are brought about by dirt, dust particles, and mechanical wear; other environmental factors may also influence or cause changes.
• To a great extent, the amount of change will be determined by how much use the instrument receives and the resulting wear and tear.
• However, some drift may also occur even while the instrument is in storage.
Calibration Frequency
Calibration frequency will therefore depend on a number of factors:
• What is the measuring instrument used for?
• The type of measuring instrument
• Trends from previous calibrations
• Tendency to wear and drift
• The accuracy of the measurement sought
• The manufacturer's recommendations
• What would the implications be of an inaccurate reading?
• How often is the instrument used?
• Environmental conditions
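One simple way to act on several of these factors — trends from previous calibrations, tendency to drift, implications of an inaccurate reading — is to adjust the interval based on calibration history: shorten it sharply after an out-of-tolerance result, lengthen it cautiously after a pass. The sketch below is an illustration of that idea only, not a method prescribed in the lecture; the adjustment factors and day limits are hypothetical:

```python
def next_interval_days(current: int, in_tolerance: bool,
                       min_days: int = 30, max_days: int = 365) -> int:
    """Crude calibration-interval adjustment: shorten sharply after a
    failure, lengthen gently after a pass (all values are illustrative)."""
    if in_tolerance:
        proposed = int(current * 1.2)   # gentle extension after a pass
    else:
        proposed = int(current * 0.5)   # sharp reduction after a failure
    return max(min_days, min(max_days, proposed))

interval = 180
interval = next_interval_days(interval, in_tolerance=True)   # extended
interval = next_interval_days(interval, in_tolerance=False)  # cut back
print(interval)
```

The asymmetry (slow to extend, quick to shorten) reflects the point made earlier: an out-of-tolerance instrument may have produced bad measurements for its whole previous interval.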
Questions or Feedback
