Day 64 – The Scale of Confusion

How accurate are your gadgets?

Most of us endurance athletes record a lot of data. We monitor heart rates, power, pace, speed, cadence, time, weight, lactate, VO2max, calories expended, calories consumed, water intake, sodium intake… the list goes on.

To make many of these measurements possible, we have to use gadgets: some high tech like GPS sport watches and some low tech like your doctor’s beam scale. On the other hand, some measurements are subjective and only require your keen judgement like RPE and quality of sleep, but today I want to focus on the gadgets.

Usually, you step on a scale and take the displayed weight as the absolute truth. However, the reason I am writing this post is that my sister was quite upset: she stood on her scale, recorded a number, stepped off, then stepped on again and read a different number. “How can that be?” she asked. “How do I know which one is right?”

In case you are wondering the same thing, this blog post is a primer on precision, accuracy and confidence intervals. Don’t worry. I’ll hide the math in a black box, but you’re going to have to trust me.

All measurement devices are subject to measurement error. The next time you put gasoline in your car at a gas station in the U.S., look for the government sticker on the pump. It’s from your state’s Office of Weights and Measures and it certifies that the pump is not ripping you off, but it doesn’t guarantee that you get 10.000 gallons like the pump readout says. It just means that the pump delivers an amount of gas that is within a specified tolerance.

Your scale, your GPS watch, and every other gadget are the same way. They all measure something within a tolerance. Sometimes the product literature will explain it in plain English. Actually, it’s not plain English. It’s plain obscure. Down in the technical specifications there might be a number with a little ± sign, and it tells you the standard error, which is itself a measure of tolerance. The standard error simply means that your device – let’s use a scale as an example – is not perfect. The scale doesn’t measure your real weight. It approximates your real weight.
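To make that concrete, here is a minimal sketch in Python of what a nonzero standard error looks like in practice: every reading is the true value plus a little random error. The true weight and the ±0.5 lb tolerance are made-up numbers for illustration, not the specs of any real scale.

```python
import random

TRUE_WEIGHT_LB = 170.0  # the quantity the scale is trying to measure
STD_ERROR_LB = 0.5      # hypothetical tolerance, made up for illustration

# Each reading is the true value plus a random error within the device's tolerance.
readings = [random.gauss(TRUE_WEIGHT_LB, STD_ERROR_LB) for _ in range(5)]
for r in readings:
    print(f"{r:.2f} lb")
```

Run it a few times and you’ll see the same “true” weight come back as a slightly different number on every reading – exactly what my sister saw on her scale.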

The Example

How accurate is it?

A couple of years ago we bought a scale and I’ve trusted it all this time. However, today I want to find its standard error because I can’t find it in the product literature. Any good scientific investigation will start with a hypothesis, a list of assumptions, a method to test the hypothesis and beer… lots of beer and obscure drawings on napkins from the local pub. I don’t have any beer so we’ll skip that for now.

My hypothesis

The scale measures my weight with a standard error different from zero. This is what I want to test.

My Assumptions

  1. My weight on the scale and the variability of the measurement (standard error) are not correlated

  2. The measurement errors occur randomly – no trends

  3. The measurements are unbiased – the scale measures mass consistently and without other influences like a five pound cat on the platform.

  4. Effects of temperature, humidity, air density, air movement and type of music I listen to are negligible.

[Image: measurement bias]

My Method

If there is a nonzero standard error associated with this scale, then I expect that it will show different weights each time I step on it, even if my real weight doesn’t change. Therefore, I will step on and off the scale to get 30 measurements. The measurements will be summarized to determine how much variability there is across the 30 readings (see the sketch below).
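If you want to run the same experiment on your own scale, the summary boils down to a mean and a standard error of the mean (the spread of the readings divided by the square root of n). Here’s a rough sketch in Python; the readings are simulated stand-ins, so substitute the 30 weights you actually record.

```python
import random
import statistics

# Simulated stand-ins for the 30 step-on/step-off readings.
# Replace this list with the weights you actually record.
random.seed(1)
readings = [round(random.gauss(170.25, 0.1), 2) for _ in range(30)]

n = len(readings)
mean = statistics.mean(readings)
# Standard error of the mean = sample standard deviation / sqrt(n)
se = statistics.stdev(readings) / n ** 0.5

print(f"n    = {n}")
print(f"mean = {mean:.2f} lb")
print(f"SE   = {se:.2f} lb")
```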

Here We Go!

I will spare you the tedium of this process and get straight to the results.

  1. n (number of measurements): 30

  2. Mean (average): 170.25 lb

  3. SE (standard error): 0.02 lb

I’m surprised at the accuracy. What these results say is that there is roughly a 68% probability that my actual weight is between 170.23 (170.25 – 0.02) pounds and 170.27 (170.25 + 0.02) pounds. That is what statisticians call a confidence interval: a 68% confidence interval. The 99% confidence interval runs from 170.20 pounds to 170.30 pounds. That means I can hold my head high and declare with confidence that I weigh 170 pounds (to the nearest whole pound). For all practical purposes, the standard error is zero, because the variability does not affect the final estimate of my weight, which I’ve reported in whole pounds rather than hundredths of a pound. This scale is very accurate to the whole pound.
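For anyone who wants to check the arithmetic, the intervals above just use the usual normal-distribution rules of thumb: about 68% of the probability falls within ±1 standard error of the mean, and about 99% within ±2.58 standard errors.

```python
# The numbers reported above.
mean_lb = 170.25
se_lb = 0.02

# ~68% of the probability falls within 1 SE of the mean,
# ~99% within 2.58 SEs (normal-distribution rules of thumb).
ci_68 = (mean_lb - 1.00 * se_lb, mean_lb + 1.00 * se_lb)
ci_99 = (mean_lb - 2.58 * se_lb, mean_lb + 2.58 * se_lb)

print(f"68% CI: {ci_68[0]:.2f} to {ci_68[1]:.2f} lb")
print(f"99% CI: {ci_99[0]:.2f} to {ci_99[1]:.2f} lb")
```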

In conclusion, my sister needs to get a different scale.

Until tomorrow…
