Percent Error = |measured − actual| / |actual| × 100

How It Works

Percent error tells you how far a measured value strays from a known or accepted value. It divides the absolute difference by the accepted value and multiplies by 100. A lower percent error means your measurement was more accurate.
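The formula above can be sketched as a small Python function (the function name and the zero check are my own additions, not part of any standard API):

```python
def percent_error(measured: float, actual: float) -> float:
    """Percent error: |measured - actual| / |actual| * 100."""
    if actual == 0:
        raise ValueError("accepted value must be nonzero")
    return abs(measured - actual) / abs(actual) * 100

# A lower result means a more accurate measurement.
print(round(percent_error(99.1, 100), 1))  # 0.9
```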

Example Problem

A student measures the boiling point of water at 99.1°C. The accepted value is 100°C.

  1. Difference: |99.1 − 100| = 0.9
  2. Percent error: (0.9 / 100) × 100 = 0.9%
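The two steps above can be checked directly in Python (variable names are illustrative):

```python
measured, accepted = 99.1, 100.0
difference = abs(measured - accepted)       # step 1: 0.9
percent_error = difference / accepted * 100 # step 2: 0.9%
print(f"{percent_error:.1f}%")              # 0.9%
```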

Frequently Asked Questions

What is the difference between percent error and percent difference?

Percent error compares a measurement to a known standard. Percent difference compares two equal-status measurements and divides by their average. Use percent error when one value is the accepted truth.
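The distinction can be made concrete with a short sketch (both function names are illustrative, not a standard API):

```python
def percent_error(measured: float, accepted: float) -> float:
    # compares a measurement to a known standard
    return abs(measured - accepted) / abs(accepted) * 100

def percent_difference(a: float, b: float) -> float:
    # compares two equal-status measurements against their average
    return abs(a - b) / ((a + b) / 2) * 100

print(percent_error(9.5, 10.0))       # 5.0  -- 10.0 is the accepted truth
print(percent_difference(9.5, 10.0))  # ~5.13 -- neither value is privileged
```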

Is a percent error of 5% acceptable?

It depends on the field. In a chemistry lab, 5% is often acceptable for student experiments. In precision manufacturing or pharmaceutical work, tolerances of less than 1% are common.

Can percent error be over 100%?

Yes. Percent error exceeds 100% whenever the absolute difference is larger than the accepted value itself, for example when the measured value is more than double the accepted value, or negative while the accepted value is positive. This usually indicates a major experimental problem.
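A quick sketch of both over-100% cases, using hypothetical readings against an accepted value of 50:

```python
def percent_error(measured: float, accepted: float) -> float:
    return abs(measured - accepted) / abs(accepted) * 100

# A reading of 120 is more than double the accepted value of 50.
print(percent_error(120, 50))  # 140.0
# A negative reading against a positive accepted value also exceeds 100%.
print(percent_error(-10, 50))  # 120.0
```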
