Introduction to Mechanical Measurements:
The science of measurement is known as metrology. Measurement is done to know whether a manufactured component meets its requirements or not. Measurements are mainly of length, mass, time, angle, temperature, squareness, roundness, roughness, parallelism, etc. For measuring any quantity there must be a unit in which to measure and express it.
Measurement is defined as the act or process of obtaining a quantitative comparison between a predefined standard and a measurand (an unknown magnitude).
Definitions and Terms Applicable to Mechanical Measurements:
- Readability
- Least Count
- Range
- Sensitivity
- Repeatability
- Hysteresis
- Accuracy
- Precision
- Resolution
- Threshold
- Reproducibility
- Calibration
- Traceability
- Response Time
- Bias
- Inaccuracy
- Uncertainty of Measurement
Readability:
This term indicates the closeness with which the scale of the instrument may be read. For example, an instrument with a 30 cm scale will have a higher readability than an instrument with a 15 cm scale.
Least Count:
It is the smallest difference between two indications that can be detected on the instrument scale; in other words, it is the least value that can be measured with that particular device.
Range:
It represents the least value and the largest value that can be measured using that instrument.
Sensitivity:
It is defined as the ratio of the linear movement of the pointer on the instrument to the change in the measured variable causing this motion.
The sensitivity of an instrument should be high, and the instrument should not have a range greatly exceeding the value to be measured. However, some margin should be kept for accidental overloads.
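The ratio above can be sketched in a few lines of code. The pressure-gauge figures here are hypothetical, chosen only to illustrate the calculation:

```python
# Sensitivity = linear movement of the pointer / change in the measured
# variable that caused it. The numbers below are hypothetical.

def sensitivity(scale_movement_mm, input_change):
    """Return sensitivity as scale movement per unit of input change."""
    return scale_movement_mm / input_change

# A pressure gauge whose pointer moves 25 mm when pressure rises by 5 bar:
print(sensitivity(25.0, 5.0))  # 5.0 mm of scale travel per bar
```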
Repeatability:
Repeatability is the ability of a measuring instrument to repeat the same results during the act of measurement for the same quantity.
In other words, if an instrument is used to measure the same or an identical input many times at different time intervals, the output is not the same but shows a scatter. This scatter, or deviation from the ideal static characteristics, expressed in absolute units or as a fraction of the full scale, is called the repeatability error, as shown in the figure below.
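As a rough sketch of how such a scatter is turned into a repeatability error, the spread of repeated readings can be expressed as a fraction of the full-scale value. The readings and full-scale value below are hypothetical:

```python
# Repeatability error: the scatter of repeated readings of the same input,
# expressed here as a percentage of the full-scale value.
# All values are hypothetical.

readings = [50.2, 49.9, 50.4, 50.1, 49.8]  # repeated readings, same input
full_scale = 100.0

scatter = max(readings) - min(readings)     # total spread of the readings
repeatability_error_pct = scatter / full_scale * 100

print(round(repeatability_error_pct, 3))  # scatter as % of full scale
```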
Hysteresis:
It is the difference between the indications of a measuring instrument when the same value of the measured quantity is reached by increasing or by decreasing that quantity.
The phenomenon of hysteresis is due to the presence of dry friction as well as to the properties of elastic elements. It results in the loading and unloading curves of the instrument being separated by a difference called the hysteresis error. It also results in the pointer not returning completely to zero when the load is removed.
Hysteresis is particularly noticeable in instruments having elastic elements. The phenomenon of hysteresis in materials is due mainly to the presence of internal stresses. It can be reduced considerably by proper heat treatment.
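The separation between the loading and unloading curves can be quantified as the largest difference between the two indications at the same input. The curves below are hypothetical; note that the unloading reading at zero input does not return to zero, as described above:

```python
# Hysteresis error: largest gap between loading and unloading indications
# at the same input value. All readings are hypothetical.

inputs    = [0, 25, 50, 75, 100]
loading   = [0.0, 24.2, 48.9, 73.8, 99.5]  # readings, input increasing
unloading = [1.2, 26.0, 50.3, 74.9, 99.5]  # readings, input decreasing

hysteresis_error = max(abs(u - l) for u, l in zip(unloading, loading))
print(round(hysteresis_error, 3))  # worst-case loading/unloading gap
```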
Accuracy:
The accuracy of an instrument indicates the deviation of the reading from a known input; in other words, accuracy is the closeness with which the readings of an instrument approach the true values of the quantities measured.
It is expressed as a percentage, based on either the actual scale reading or the full-scale reading, as below:
Accuracy (%) = (measured value - true value) / true value x 100
or
Accuracy (% of full scale) = (measured value - true value) / full-scale reading x 100
Precision:
The precision of an instrument indicates its ability to reproduce a certain reading with a given accuracy; in other words, it is the degree of agreement between repeated results.
Precision refers to the repeatability of the measuring process, i.e., the closeness with which repeated measurements of the same physical quantity agree with one another.
Consider an example to differentiate between precision and accuracy.
Let a known voltage of 101 volts be measured with a certain meter, and let five readings be taken, which indicate 105, 104, 106, 104 and 106 volts. From these values the accuracy of the instrument is about ±5%, since the reading farthest from the true value (106 volts) differs from it by 5 volts.
The precision, however, is ±1%, because the maximum deviation of a reading (106 volts) from the mean reading (105 volts) is only 1 volt. So the instrument could be calibrated so that it could be used to dependably measure voltages within ±1 volt. From this it is clear that, by calibration, accuracy can be improved up to, but not beyond, the precision of the instrument.
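The arithmetic of this worked example can be checked with a short script, using the same true voltage and five readings:

```python
# Worked example: true voltage 101 V, five meter readings.

true_value = 101.0
readings = [105, 104, 106, 104, 106]

mean = sum(readings) / len(readings)                    # 105.0 V
max_error = max(abs(r - true_value) for r in readings)  # 5.0 V from truth
max_scatter = max(abs(r - mean) for r in readings)      # 1.0 V from mean

accuracy_pct = max_error / true_value * 100   # accuracy about +/-5 %
precision_pct = max_scatter / mean * 100      # precision about +/-1 %
print(round(accuracy_pct, 1), round(precision_pct, 1))  # -> 5.0 1.0
```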
Resolution:
Resolution is also called discrimination, and is defined as the smallest increment of the input signal that a measuring system is capable of displaying.
Threshold:
If the instrument input is increased very gradually from zero, there will be some minimum value below which no output change can be detected. This minimum value defines the threshold of the instrument. The main differences between threshold and resolution are:
- Resolution defines the smallest measurable input change, while threshold defines the smallest measurable input.
- The threshold is measured when the input is varied from zero, while the resolution is measured when the input is varied from an arbitrary non-zero value.
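The distinction can be made concrete with a toy model of an instrument. The threshold and resolution values below are hypothetical, chosen only to separate the two ideas:

```python
# Toy instrument model: nothing is indicated below 0.3 units (threshold),
# and the display can only change in steps of 0.5 units (resolution).
# Both numbers are hypothetical.

THRESHOLD = 0.3   # smallest input that produces any indication at all
RESOLUTION = 0.5  # smallest input change the display can show

def indicated(x):
    """Reading of the hypothetical instrument for input x."""
    if x < THRESHOLD:
        return 0.0  # below the threshold: no indication
    return round(x / RESOLUTION) * RESOLUTION  # quantised to display steps

print(indicated(0.2))  # 0.0 -> input below the threshold
print(indicated(2.6))  # 2.5
print(indicated(2.9))  # 3.0 -> a 0.3 change crossed a resolution step
```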
Reproducibility:
Wherever exact control of quality is required, this is a major requirement of the instrument. It is defined as the degree of closeness with which the same value of a variable may be measured at different times.
Reproducibility is affected by several factors, such as drift in the calibration of a thermocouple at high temperature due to contamination. Periodic checking and maintenance of instruments are generally done to maintain reproducibility. Perfect reproducibility means that an instrument has no drift.
Calibration:
Any measuring system must be provable, i.e., it must prove its ability to measure reliably. The procedure adopted for this is called 'calibration'.
When the system is prepared to measure quantities, known values of the input quantities are fed to the system and the corresponding outputs are measured. A graph relating the output to the input is plotted and is known as the 'Calibration Graph'.
The procedure involves comparison of a particular instrument with either:
- a primary standard,
- a secondary standard with a higher accuracy than the instrument to be calibrated, or
- a known input source.
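The calibration-graph idea above can be sketched numerically: feed known inputs to the system, record the outputs, and fit a straight line (output = a x input + b) by least squares. All data values below are hypothetical:

```python
# Building a simple calibration line by least squares.
# known_inputs are the standard values fed in; measured_out are the
# corresponding system outputs. All values are hypothetical.

known_inputs = [0.0, 10.0, 20.0, 30.0, 40.0]
measured_out = [0.4, 10.1, 20.3, 29.8, 40.2]

n = len(known_inputs)
mean_x = sum(known_inputs) / n
mean_y = sum(measured_out) / n

# Slope a and intercept b of the fitted calibration line.
a = sum((x - mean_x) * (y - mean_y)
        for x, y in zip(known_inputs, measured_out)) \
    / sum((x - mean_x) ** 2 for x in known_inputs)
b = mean_y - a * mean_x

print(round(a, 3), round(b, 3))  # slope and zero offset of the line
```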
Traceability:
This is the concept of establishing a valid calibration of a measuring instrument or measurement standard by step-by-step comparison with better standards, up to an accepted or specified standard.
Response Time:
It is the time which elapses after a sudden change in the measured quantity until the instrument gives an indication differing from the true value by an amount less than a given permissible error.
Bias:
It is the characteristic of a measuring instrument whose indications of the value of a measured quantity have an average that differs from the true value of that quantity.
Bias errors are due to the algebraic summation of all the systematic errors affecting the indication of the instrument. The main sources of bias are maladjustment of the instrument, permanent set, non-linearity errors, and errors of material measures such as capacity measures, gauge blocks, etc.
Inaccuracy:
It is the total error of a measure or measuring instrument under specified conditions of use, including both bias and repeatability errors.
Inaccuracy is specified by two limiting values, obtained by adding to and subtracting from the bias error the limiting value of the repeatability error.
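A small numerical sketch of those two limiting values, with hypothetical error magnitudes:

```python
# Inaccuracy limits: bias error combined with the limiting value of the
# repeatability error gives two limiting values. Numbers are hypothetical.

bias = 0.20                 # systematic (bias) error, measurement units
repeatability_limit = 0.05  # limiting value of the repeatability error

lower = bias - repeatability_limit
upper = bias + repeatability_limit
print(round(lower, 3), round(upper, 3))  # the two limiting values
```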
Uncertainty of Measurement:
If the known systematic errors are corrected, the remaining inaccuracy is due to the random errors and to residual systematic errors that also have a random character. This inaccuracy is called the uncertainty of measurement.
All of the above are important terms used in mechanical measurements.
Hey, if you like this article on mechanical measurements, share it with your friends. If you have any questions, leave a comment, and subscribe to our newsletter to get notified about new posts.
And finally, Join our community by liking our Facebook Page.
- Micrometre Screw Gauge and Types OF Micrometers [The Complete Guide]
- Vernier Caliper [Complete Guide] Vernier Caliper Types, Parts, Errors, Advantages and more.
Read about machines:
- What is Shaper Machine and The 4 Major Types of Shaper Machines
- 22 Different Types of Lathe Machine Operations