A micrometer is a precision instrument most often used by engineers and machinists to take fine measurements of machine parts and other industrial components. Although the device originated as an analog tool whose markings you read by eye, many standard micrometers are now digital, allowing you to calibrate the device with the aid of a computer.
Instructions
1. Open the micrometer program on your computer. Many different programs are in use, so the exact name varies; the program is typically either on the desktop or accessed through "Start," then "All Programs."
2. Select the "Adjust" option, followed by "Micrometer" and finally "Calibrate." This opens a small calibration window in the middle of the screen with an image of the micrometer scale beside it.
3. Click the left-most measurement line on the micrometer scale, then click the right-most measurement line. A set measurement of the distance between the two clicks appears in the micrometer calibration window.
4. Click "OK" and the line measurement is calibrated. This calibration becomes the default measurement standard every time you use the micrometer, until you recalibrate the device. The sketch after these steps illustrates the idea behind the two-click calibration.
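The two clicks in steps 3 and 4 amount to a simple two-point calibration: the software measures the on-screen distance between the clicks and maps it to the known span of the micrometer scale, giving a scale factor it reuses for later readings. The following is a minimal sketch of that idea, assuming hypothetical click coordinates, a hypothetical reference span in millimetres, and made-up function names; the actual calibration software performs an equivalent step internally.

```python
def compute_scale(left_click_x: float, right_click_x: float,
                  known_span_mm: float) -> float:
    """Return millimetres per pixel from two reference clicks on the scale image.

    Hypothetical helper: the real program derives this factor when you click
    the left-most and right-most measurement lines.
    """
    pixel_span = abs(right_click_x - left_click_x)
    if pixel_span == 0:
        raise ValueError("Reference clicks must be at different positions")
    return known_span_mm / pixel_span


def pixels_to_mm(pixel_reading: float, scale_mm_per_pixel: float) -> float:
    """Convert an on-screen pixel reading to millimetres using the stored scale."""
    return pixel_reading * scale_mm_per_pixel


if __name__ == "__main__":
    # Hypothetical values: clicks at x = 40 px and x = 1240 px spanning a 25 mm scale.
    scale = compute_scale(40, 1240, 25.0)
    print(f"Scale factor: {scale:.5f} mm/px")        # about 0.02083 mm/px
    print(f"Reading: {pixels_to_mm(600, scale):.3f} mm")  # 12.500 mm
```

Once the scale factor is stored, every subsequent on-screen measurement is converted with it, which is why the calibration remains the default until you repeat the procedure.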