Originally Posted by Harbinger[TG]
You can't really have one without the other. And it would be precision not accuracy.
Accuracy vs. precision is semantics here. Either way, what we care about is how close the paint hits to what you are aiming at.
Not really sure what you mean by "can't have one without the other." Yes, every data set has both a range and a variance. But if you use only the range (i.e., measure how far apart the two farthest shots are and draw a circle of that diameter), you get a less faithful representation of the accuracy/precision of the marker/paint/barrel. Measuring the variance lets you draw a truer picture, because accuracy/precision is about how close the data falls to the center (the target).
Again, look at the original poster's example. If you use the range to draw a circle, you have to conclude that both groupings are equally accurate/precise, since the same-sized circle captures all the data points in each. If you measure the variance instead, one grouping clearly has less variance, because many of its data points sit close together in the center. Realistically, you probably couldn't produce two setups that shoot those two spreads (the pattern with everything on the fringe isn't a bell curve, so something else would have to be at work). But ignoring that, if you could get a setup to shoot each pattern, you would pick the second one, the one with the lower variance, if you wanted to hit your target (the center) more often.
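To make the point concrete, here's a minimal sketch in Python using two made-up shot patterns (not the OP's actual data): a "ring" pattern where every shot lands on the fringe, and a tight cluster with two flyers at the same extremes. Both have the same range, but very different variance about the target.

```python
import math
from itertools import combinations

def range_diameter(shots):
    # "Range" method: diameter of the circle through the two
    # farthest-apart shots (the max pairwise distance).
    return max(math.dist(a, b) for a, b in combinations(shots, 2))

def radial_variance(shots, target=(0.0, 0.0)):
    # "Variance" method: mean squared distance of each shot
    # from the target (the aim point).
    return sum(math.dist(s, target) ** 2 for s in shots) / len(shots)

# Hypothetical pattern 1: all shots on the fringe (a ring of radius 5).
ring = [(5 * math.cos(k * math.pi / 4), 5 * math.sin(k * math.pi / 4))
        for k in range(8)]

# Hypothetical pattern 2: most shots near the center,
# plus two flyers at the same extremes as the ring.
cluster = [(0.3, 0.1), (-0.2, 0.4), (0.1, -0.3), (-0.4, -0.1),
           (0.2, 0.2), (-0.1, 0.3), (5.0, 0.0), (-5.0, 0.0)]

print(range_diameter(ring), range_diameter(cluster))    # same range: 10 and 10
print(radial_variance(ring), radial_variance(cluster))  # very different variance
```

The range method scores both patterns identically (a 10-unit circle covers both), while the variance method shows the clustered pattern lands far closer to the target on average, which matches the intuition that it's the setup you'd actually want.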
Again, with samples that differ greatly in shot spread (like Carter's .50 vs .68 video), the range method they used will almost always agree with the variance method: the smaller pattern is more accurate/precise. When the patterns are close in size, as in the OP's example, the range method is not adequate.