Iron saturation (also called transferrin saturation) is a clinical test that measures how much of the iron-binding capacity of a patient's serum is actually occupied by iron. It is used to diagnose iron deficiency anemia and other iron-related disorders, such as iron overload. The test is performed on a blood sample: the laboratory measures the serum iron concentration and the total iron-binding capacity (TIBC), which reflects the amount of iron-binding protein (chiefly transferrin) in the serum. Iron saturation is then calculated by dividing the serum iron by the TIBC, and is expressed as a percentage.
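The calculation above can be sketched as a small function. This is an illustrative sketch, not clinical software; the function name, parameter names, and the example values are assumptions for demonstration only.

```python
def transferrin_saturation(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
    """Percentage of the serum's iron-binding capacity occupied by iron.

    serum_iron_ug_dl: measured serum iron, in µg/dL.
    tibc_ug_dl: total iron-binding capacity, in µg/dL (same units as serum iron).
    """
    if tibc_ug_dl <= 0:
        raise ValueError("TIBC must be a positive value")
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Hypothetical example: serum iron of 100 µg/dL with a TIBC of 400 µg/dL
print(transferrin_saturation(100, 400))  # 25.0 (percent)
```

Because the result is a ratio, both measurements must be in the same units for the percentage to be meaningful.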
The most common method for obtaining a serum sample is venipuncture: a needle is inserted into a vein in the arm and a sample of blood is drawn, which is then sent to a laboratory for analysis. Serum can also be obtained by capillary puncture, such as a finger prick or, in infants, a heel prick.
Many additional factors must be considered when interpreting clinical findings; they are too extensive to be covered fully on this site.