> I equate accuracy with precision. I can get both speakers within the tolerance limit of my laser measure to .125 of an inch.
Actually, accuracy and precision are completely different things, though they are often confused and used interchangeably. Because they are different things, they are also measured differently statistically.
Accuracy is a measure of how close to the "truth" something, e.g., a measurement, actually is. And there is always "a truth." Accuracy is assessed by comparison against a standard, or set of standards, that represents the truth. The difference between the observed (measured) value and the truth is referred to as "bias," and accuracy is reported as that bias relative to the standard. This is why NIST is so important: it provides those standards.
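To make "bias" concrete, here is a minimal sketch with made-up numbers (plain Python, assuming a 96.000-inch certified reference length as "the truth"):

```python
# Hypothetical example: a 96.000-inch reference standard measured
# five times with a laser measure.
true_length = 96.000
readings = [96.11, 96.13, 96.12, 96.14, 96.12]

mean_reading = sum(readings) / len(readings)
bias = mean_reading - true_length   # accuracy is reported via this bias

print(f"mean reading: {mean_reading:.3f} in")
print(f"bias: {bias:+.3f} in")      # about +0.124 in: consistently reads long
```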
I'd predict that the 0.125 inch you cite above as the "tolerance" is actually a measure of your laser measure's bias, not a measure of its precision.
Precision is the "repeatability" of a set of measurements: a measure of the variation across a set of repeated measurements. It is usually expressed as a standard deviation, a variance, or a CV (coefficient of variation, which is unitless).
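Using the same assumed readings as above, precision would be summarized like this (a sketch using Python's standard statistics module):

```python
import statistics

readings = [96.11, 96.13, 96.12, 96.14, 96.12]

sd = statistics.stdev(readings)        # sample standard deviation
var = statistics.variance(readings)    # sample variance
cv = sd / statistics.mean(readings)    # coefficient of variation (unitless)

print(f"standard deviation: {sd:.4f} in")
print(f"variance: {var:.6f} in^2")
print(f"CV: {cv:.4%}")
```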
So it is entirely possible to have a set of measurements that is precise but inaccurate, or accurate but imprecise. The worst case is a set that is both inaccurate AND imprecise.
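A hypothetical pair of measurement sets (made-up numbers, against the same assumed 96.000-inch standard) illustrates the two mixed cases:

```python
import statistics

true_length = 96.000

# Precise but inaccurate: tightly clustered, but consistently ~0.12 in long.
set_a = [96.12, 96.13, 96.12, 96.11, 96.12]

# Accurate but imprecise: centered on the truth, but widely scattered.
set_b = [95.80, 96.25, 95.95, 96.20, 95.80]

for name, readings in (("A (precise, inaccurate)", set_a),
                       ("B (accurate, imprecise)", set_b)):
    bias = statistics.mean(readings) - true_length
    sd = statistics.stdev(readings)
    print(f"{name}: bias {bias:+.3f} in, sd {sd:.3f} in")
```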
Any measurement device, e.g., a laser measure, should be both ACCURATE (reflective of the "truth" of the measurement, i.e., minimal bias) and PRECISE, so that one can take repeated measurements and have statistical confidence that the variance in the repeated set of measurements is acceptably small for the task at hand, or fit for purpose.