Measurement, Uncertainty, and Error

What is the difference between accuracy and precision?

Accuracy: refers to how close a measured value is to an accepted value.
Precision: refers to how close together a series of measurements are to each other.
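
A minimal Python sketch (made-up readings, with an assumed true value of 10.0) contrasting the two ideas:

```python
# Hypothetical data: the "true" value is assumed to be 10.0.
measurements_a = [10.1, 9.9, 10.0, 10.2]  # close to 10.0 and tightly grouped
measurements_b = [12.4, 12.5, 12.5, 12.6]  # tightly grouped but far from 10.0

def mean(values):
    return sum(values) / len(values)

def spread(values):
    return max(values) - min(values)

true_value = 10.0
for name, data in [("A", measurements_a), ("B", measurements_b)]:
    print(f"Set {name}: off by {abs(mean(data) - true_value):.2f} (accuracy), "
          f"spread {spread(data):.2f} (precision)")
# Set A is both accurate and precise; Set B is precise but not accurate.
```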

How do you calculate a fractional or percent uncertainty from an absolute uncertainty, and how do uncertainties propagate through calculations?

Addition and subtraction: perform the operation on the central values and add the absolute uncertainties.
Ex: (196 +/- 1) - (152 +/- 1) = 44 +/- 2
Multiplication and division: perform the operation on the central values, add the percent uncertainties, and multiply that total by the resulting value.
Ex: (10.0 +/- 0.1)(5.0 +/- 0.1)(6.0 +/- 0.1) = 300 +/- 14 (+/- 10 with sig figs)
Percent uncertainties: 0.1/10.0 = 1%, 0.1/5.0 = 2%, 0.1/6.0 = 1.7%; total = 4.7% ---> 0.047 x 300 = 14
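
As a sketch, the two rules can be written as short Python helpers (add_or_subtract and multiply are illustrative names, not library functions):

```python
def add_or_subtract(x, dx, y, dy, subtract=False):
    """Add or subtract central values; absolute uncertainties add."""
    value = x - y if subtract else x + y
    return value, dx + dy

def multiply(values_with_unc):
    """Multiply central values; fractional (percent) uncertainties add."""
    value = 1.0
    frac = 0.0
    for v, dv in values_with_unc:
        value *= v
        frac += dv / v
    return value, frac * value

# (196 +/- 1) - (152 +/- 1)
print(add_or_subtract(196, 1, 152, 1, subtract=True))  # (44, 2)

# (10.0 +/- 0.1)(5.0 +/- 0.1)(6.0 +/- 0.1)
v, dv = multiply([(10.0, 0.1), (5.0, 0.1), (6.0, 0.1)])
print(f"{v:.0f} +/- {dv:.0f}")                         # 300 +/- 14
```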

How do you assign uncertainties for digital and analog devices?

Analog: +/- half the smallest scale division (e.g., 4.20 +/- 0.05)
Digital: +/- the smallest scale division (e.g., 19.16 +/- 0.01)
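
A minimal sketch of these conventions (the function names are illustrative, and the convention itself varies between textbooks):

```python
def analog_uncertainty(smallest_division):
    """Analog scale: half the smallest marked division."""
    return smallest_division / 2

def digital_uncertainty(smallest_division):
    """Digital readout: the smallest displayed division."""
    return smallest_division

print(analog_uncertainty(0.1))    # ruler marked every 0.1 -> +/- 0.05
print(digital_uncertainty(0.01))  # display reading to 0.01 -> +/- 0.01
```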

Why is uncertainty inherent in all measurements, and why can it not be completely eliminated?

Uncertainty is caused by the limited precision of the measuring device and by the human reading it, so it can be reduced but never completely eliminated.

What is the difference between uncertainty and error?

Uncertainty: inherent in all measurements and cannot be removed no matter how precise the measuring instrument is. It is the range of possible values within which the true value of the measurement lies, and it shows how error (which may or may not be present) may affect the measured value.
Error: the difference between a measured value and the true value; it can be either random or systematic. It can be mitigated by averaging trials (random error) or by reevaluating the setup (systematic error).

How do you determine the number of significant figures in any number?

Non-zero digits are always significant. Any zeros between two significant digits are significant. A final zero or trailing zeros are significant ONLY when the number contains a decimal portion.
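
A simple Python sketch of these rules (it works on decimal strings and does not handle scientific notation; treating bare trailing zeros as not significant is an assumption, since they are ambiguous):

```python
def count_sig_figs(s):
    """Count significant figures in a decimal string."""
    s = s.lstrip('+-')
    digits = s.replace('.', '')
    digits = digits.lstrip('0')          # leading zeros are never significant
    if '.' in s:
        return len(digits)               # trailing zeros after a decimal point count
    return len(digits.rstrip('0')) or 1  # bare trailing zeros: treated as not significant

for example in ["0.0025", "1002", "3.200", "4500"]:
    print(example, "->", count_sig_figs(example))
# 0.0025 -> 2, 1002 -> 4, 3.200 -> 4, 4500 -> 2
```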

Why can a value's uncertainty not be more precise than its measured value?

Precision is indicated by how many significant figures a number has. Every digit of a measured value except the last is known with certainty; the last digit is estimated. The uncertainty therefore lies in that last digit, so it must be quoted to the same decimal place as the measurement, since no further digits can be known.
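
A sketch of this convention in Python: round the uncertainty to one significant figure, then quote the value to the same decimal place (the one-sig-fig rounding rule is an assumed textbook convention):

```python
import math

def format_result(value, uncertainty):
    # Round the uncertainty to one significant figure...
    exponent = math.floor(math.log10(abs(uncertainty)))
    unc_rounded = round(uncertainty, -exponent)
    # ...then quote the value to that same decimal place.
    decimals = max(-exponent, 0)
    return f"{value:.{decimals}f} +/- {unc_rounded:.{decimals}f}"

print(format_result(4.237, 0.05))   # 4.24 +/- 0.05
print(format_result(19.163, 0.01))  # 19.16 +/- 0.01
print(format_result(300, 14))       # 300 +/- 10
```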

What is the difference between random error and systematic error?

Random error: uncontrollable variation in measurements, varying by different amounts in both directions; it is minimized by averaging multiple trials.
Systematic error: all measurements are off by the same amount, perhaps because the instrument was not zeroed (a calibration problem); averaging does not help.
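
A small simulation (invented numbers) illustrating why averaging reduces random error but leaves a systematic offset untouched:

```python
import random

random.seed(0)
true_value = 50.0
systematic_offset = 2.0  # e.g., a scale that was not zeroed before use

# Each reading carries the same fixed offset plus random scatter.
readings = [true_value + systematic_offset + random.gauss(0, 0.5)
            for _ in range(100)]

average = sum(readings) / len(readings)
print(f"average = {average:.2f}")
# The random scatter mostly cancels in the average,
# but the residual error stays near the +2.0 offset:
print(f"residual error = {average - true_value:.2f}")
```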

