I - QUALITY AND RELIABILITY OF MEASUREMENT
What is, and how can we determine the "quality" of a measurement?
What are "accuracy" and "precision" of measurement?
Accuracy and precision
The quality of a measurement is normally expressed in terms of its accuracy, that is, how fine, sharp, and exact it is, and its precision, that is, how trustworthy, reliable, and repeatable its outcomes are.
The accuracy of measurements is described by the identifiable distance between markers in "scales" of reference: continuous sequences of markers presented in order of increasing value. These markers typically appear visually as "tick marks" on measuring rulers and yardsticks, on measuring tapes, and in the dial markings swept by the swinging needles of weighing scales and a variety of other measurement devices and controls. Depending on their degree of accuracy, scales themselves are identified within broad degrees of coarse or fine capability, as described in the previous chapter.
The degree of precision normally indicates the degree of trust placed in a measurement, dictated by the instruments and procedures used and by the consequences of possible measurement errors.
In general terms, degrees of accuracy range from "estimated", loose, or rough for most casual situations, through "medium", to "fine" or even "very fine" (ultra-high, tight) for industrial or scientific applications. The precision of measurement tools can likewise range from "very loose" (unreliable, untrustworthy), through "reasonably helpful" (trustworthy for practical uses), to "very tight" (highly reliable and trustworthy) for critical and delicate situations.
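The difference between accuracy (readings centered on the intended value) and precision (readings that repeat consistently) can be pictured with a small simulation. The true length, bias, and spread values below are illustrative assumptions, not data from any real instrument:

```python
import random
import statistics

random.seed(42)

TRUE_LENGTH_CM = 10.0  # the actual size of the object (assumed for illustration)

def simulate(bias, spread, n=1000):
    """Simulate n readings from an instrument with a fixed bias
    (inaccuracy) and a random spread (imprecision)."""
    return [TRUE_LENGTH_CM + bias + random.gauss(0, spread) for _ in range(n)]

# A precise but inaccurate instrument: readings cluster tightly,
# but around the wrong value (systematic bias of 0.5 cm).
inaccurate = simulate(bias=0.5, spread=0.01)

# An accurate but imprecise instrument: readings center on the true
# value, but scatter widely around it.
imprecise = simulate(bias=0.0, spread=0.5)

print(round(statistics.mean(inaccurate), 2), round(statistics.stdev(inaccurate), 3))
print(round(statistics.mean(imprecise), 2), round(statistics.stdev(imprecise), 3))
```

The mean of each set of readings reveals the instrument's accuracy, while the standard deviation reveals its precision: the two qualities vary independently.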
Measurement instruments are devices created to facilitate measurement activities by easing the comparison of the objects being measured against the size of "official", widely recognized, unchanging objects (or events). To maintain consistency around the world, selected highly stable objects or events of reference, called "standard units of measurement", are defined and maintained to remain uniquely reliable through time, and are made available for eventual comparison and validation (calibration, described ahead) of other measuring devices or processes.
Call the animated illustration of accumulating and counting standard units of weight (grams, in the metric system), where the units are counted and grouped in multiples of 5 and 10 to build an increasing scale. Practical instruments, like scales for weighing, are artifacts that make it easy to evaluate the weight of an object by comparing it against a count of these standard units on a scale. Call the scale and balance and turn ON the power of the TOTAL WEIGHT display to observe the weighing of standard reference items and non-standard common items using the two instruments, the scale and the balance.
To support trustworthy accuracy and precision through time and space, standard units of measure are carefully made and maintained in recognized, controlled, and safe places and environments, guaranteeing their availability and unchanged nature under the care of recognized standards measurement organizations. To preserve the validity of the readouts of measuring instruments everywhere they are used, instruments normally must be "calibrated": the steps required to ensure that each time an instrument is used, it will recognize and remain true to the actual, officially recognized physical magnitudes being measured. Calibration provides periodic verification, adjustment, and validation as necessary, and is typically required in human activities such as industrial, commercial, and scientific operations, to verify that the instrument properly recognizes official standard units of measurement (or certified copies of them) at all times.
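The idea of calibration against certified standard units can be sketched as a two-point linear correction: the instrument's raw readouts for two known reference weights determine a gain and offset, which then correct later readings. All the reference and readout values below are hypothetical:

```python
# Two-point linear calibration sketch. An uncalibrated scale is read
# against two certified standard masses (hypothetical values), a gain
# and offset are derived, and the correction is applied to later readings.

standards = [100.0, 500.0]    # certified reference masses, in grams (assumed)
raw_readouts = [98.0, 488.0]  # what the uncalibrated scale reported for them

# Model the instrument as: readout = gain * true_mass + offset,
# and solve for gain and offset from the two reference points.
gain = (raw_readouts[1] - raw_readouts[0]) / (standards[1] - standards[0])
offset = raw_readouts[0] - gain * standards[0]

def calibrated(raw):
    """Invert the instrument's response to recover the true mass."""
    return (raw - offset) / gain

# A later raw reading of 293.0 g is corrected back toward the true value.
print(round(calibrated(293.0), 1))  # 300.0
```

Real calibration procedures involve more reference points, environmental controls, and documented traceability to the standards organizations mentioned above; this sketch only shows the underlying arithmetic of the correction step.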
In the activities below, different levels of "loose" and "tight" scales of measurement (sharpness) can be chosen. The precision (trustworthiness) of each measurement will depend on the specific procedures and instruments selected.
HANDS ON Examples of gradual increase of accuracy offered by different scales:
CATEGORICAL/nominal/classification scales (1), (2)
INTERVAL AND RATIO scales (1), (2)
Through time, two predominant groups of standard units of measurement have been developed and used around the world: a) the Metric (base-10) or Decimal, MKS (meters, kilograms, seconds) system, and b) the English (base-2: feet, pounds, seconds) system.
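Converting between the two systems is a matter of fixed multiplication, since the relations between the units are defined exactly (one inch is defined as 2.54 cm, and one pound as 0.45359237 kg). A minimal sketch:

```python
# Exact, internationally defined conversion factors between the
# English and metric systems.
CM_PER_INCH = 2.54          # 1 inch = 2.54 cm, by definition
KG_PER_POUND = 0.45359237   # 1 pound = 0.45359237 kg, by definition

def inches_to_cm(inches):
    """Convert a length from inches (English) to centimeters (metric)."""
    return inches * CM_PER_INCH

def pounds_to_kg(pounds):
    """Convert a mass from pounds (English) to kilograms (metric)."""
    return pounds * KG_PER_POUND

print(round(inches_to_cm(12), 2))   # one foot expressed in centimeters
print(round(pounds_to_kg(10), 4))   # ten pounds expressed in kilograms
```

Because the factors are exact by definition, any loss of accuracy in a converted measurement comes from the original reading, not from the conversion itself.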
Call the Measuring Tool Size display to observe the sizing, or open gap width, of common mechanics' wrenches in the metric and English systems, by using (clicking and dragging) the (yellow) length strips provided to the left of each wrench set, focusing and zooming in as needed to view image details more closely and clearly.
The displays Using a Ruler (1) and Using a Ruler (2) below allow the measurement of the physical (or spatial) size of various small objects. The ruler instrument in each display offers two different interval/ratio scales, tick-marked one on each of its two edges: one in centimeters (metric system) and the other in inches (English system). Call the display to measure, and record on a sheet of paper, the height and width of each object presented, in both centimeters and inches.
The caliper (or vernier caliper, displayed in the Accuracy and Precision activity, image below) can also be used to measure small but critical straight-line distances of objects and evaluate their sizes accurately. The caliper relies on the sliding movement of tightly fitting solid feeler and indicator parts. Through this mechanism, the caliper produces clearer, less ambiguous (accurate), and more consistent (precise) measurements of distance.
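How the vernier mechanism sharpens a reading can be sketched numerically: the main scale supplies whole units, and the vernier scale resolves the fraction in between by identifying which of its marks lines up with a main-scale mark. The 1 mm main-scale spacing and 10-division vernier below are a common configuration, assumed here for illustration:

```python
# Sketch of reading a vernier caliper. The main scale gives whole
# millimeters; a 10-division vernier scale resolves tenths of a
# millimeter via the mark that aligns with a main-scale mark.

MAIN_SCALE_MM = 1.0       # spacing between main-scale ticks (assumed)
VERNIER_DIVISIONS = 10    # a 10-division vernier resolves 0.1 mm

def vernier_reading(main_ticks_passed, aligned_vernier_mark):
    """Combine the last main-scale tick passed by the vernier's zero
    with the vernier mark that aligns with a main-scale mark."""
    resolution = MAIN_SCALE_MM / VERNIER_DIVISIONS
    return main_ticks_passed * MAIN_SCALE_MM + aligned_vernier_mark * resolution

# The vernier's zero sits just past the 23 mm mark, and vernier
# mark number 7 is the one that aligns:
print(round(vernier_reading(23, 7), 1))  # 23.7
```

The reading is thus unambiguous to a tenth of a millimeter, finer than the eye could reliably interpolate between the millimeter ticks of an ordinary ruler.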
Observe and use the caliper in the activity to measure and record, as accurately as possible, the sizes of the various small objects shown. Measure each object once with the ruler and then with the caliper, to appreciate the visual and/or mechanical advantages affecting the precision of the caliper's measurements.
QUESTIONS:
1.- Which instrument (ruler or caliper) is more likely to be accurate (exact)? Why?
2.- Which instrument (ruler or caliper) is more likely to be more precise (consistent, same reading each time)? Why?
3.- What would it take for a ruler to become a more accurate and precise instrument?
DOING AND RECORDING MEASUREMENTS
Gross and fine measurement
In practice and in general, the degree of quality of measurements can be simplified into one of two major groups: gross or fine.
Gross measurement is looser, and most likely to be less accurate, less precise, and of relatively lower quality. It includes rough measurement, guessing, and estimation, and can be done with simple or no instruments and/or basic procedures.
Fine measurement is tighter: more accurate, more precise, and of higher quality. It requires better-quality instruments and/or procedures.
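One way to picture the gross/fine divide is through the resolution of the scale being read: an instrument cannot report finer than its tick spacing, so the worst-case reading error is bounded by half a tick. The tick sizes and the "true" length below are illustrative assumptions:

```python
# An idealized reader rounds the true value to the nearest tick of the
# scale in use. Coarse ticks give gross measurements; fine ticks give
# fine ones. Tick sizes here are illustrative assumptions.

def read_with_resolution(true_value, tick):
    """Round the true value to the nearest scale tick, as an ideal
    reader of that scale would."""
    return round(true_value / tick) * tick

TRUE_CM = 12.34  # the object's actual length (assumed)

gross = read_with_resolution(TRUE_CM, tick=1.0)   # whole-centimeter ruler
fine = read_with_resolution(TRUE_CM, tick=0.01)   # hundredth-cm, caliper-like

print(gross)                          # 12.0
print(round(fine, 2))                 # 12.34
print(abs(TRUE_CM - gross) <= 0.5)    # gross error bounded by half a tick
```

Even the ideal reader of the coarse scale loses a third of a centimeter here; the fine scale recovers the value to its last displayed digit.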
REPRESENTATION/EXPRESSION OF QUANTITY
As stated in the INTRODUCTION section, in addition to its meaning as a verb describing the activity of determining how big or small things are, the word measurement can also be a noun (a "thing"), meaning the surrogate (borrowed) representations, such as images, sounds, physical objects, and symbols, used to express the magnitude, size, or quantity of the things being measured.
The important role that these representations play in mathematics was also highlighted in the introductory section: they can allow us to perceive, understand, manipulate, and communicate about our environment more effectively and efficiently.
With the widening range of choices of tools and methods available for the collection, representation, storage, processing, and communication of information, it is becoming increasingly important that we learn about, and become proficient in, how to take advantage of these choices to view, analyze, and manipulate quantities in the environment. The mounting pressure to be effective and competitive in the workplace makes it imperative for us to understand and use, and when they are not available, create, better tools, instruments, and methods of measurement.
In this section, some of the most common forms of measurement representation and tools are described. Simple practical situations are also included, simulated through illustrative interactive displays.
The most common choices of form to represent quantity are:
Analog (concrete, iconographic, visually or perceptually explicit, mimicked forms), or symbolic/digital/numeric (abstract, orthographic, implicit, symbol-coded objects) encoding
Positive (forward) or negative (opposing) for directional or locational meaning
Integer (whole units) or fractional (using smaller parts of a unit)
Physical (tangible) or virtual (intangible, most likely, computer-generated)
Analog and digital representations
As described in more detail in Chapter 5 ahead, analog representations use look-alike, mimicking images, pictures, objects, sounds, or movements that instinctively imitate, resemble, suggest, or insinuate the form, size, or magnitude of the objects or actions represented. Analog representations tend to be more natural, intuitive, and, in general, easier for the eyes, ears, and other human (or animal) senses to perceive. Digital representations, on the other hand, consist primarily of numbers and symbols that encode or symbolize the magnitudes being described and must somehow be "decoded" to reveal them.
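The contrast can be sketched by expressing the same quantity both ways: an analog form whose visible extent mimics the size being represented, and a digital form that encodes it in symbols. The one-mark-per-unit bar convention here is an assumed illustration, not a standard notation:

```python
# The same magnitude shown two ways: an analog form whose visible
# length mimics the quantity directly, and a digital form that encodes
# it in symbols which must be decoded (read as digits) to be understood.

def analog_form(quantity):
    """A look-alike representation: the bar's length mimics the size."""
    return "#" * quantity

def digital_form(quantity):
    """A symbol-coded representation: digits stand for the size."""
    return str(quantity)

print(analog_form(7))   # #######
print(digital_form(7))  # 7
```

A glance at the bar conveys "about this much" without any decoding, while the digit conveys the exact count but only to a reader who knows the symbol code: the trade-off described above.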