Intravenous unfractionated heparin (UFH) remains an important anticoagulant, particularly in the inpatient setting. Historically, the activated partial thromboplastin time (aPTT) has been the primary laboratory test used to monitor and adjust UFH therapy. The aPTT test has evolved since the 1950s, and the historical goal range of 1.5-2.5 times the control aPTT, which gained acceptance in the 1970s, has since fallen out of favor because of the high degree of variability in aPTT results from one laboratory to another, and even from one reagent to another. As a result, it is now recommended that the aPTT goal range be based on a corresponding heparin concentration of 0.2-0.4 unit/ml by protamine titration or 0.3-0.7 unit/ml by antifactor Xa assay. Because several biologic factors can influence the aPTT independent of the effects of UFH, many institutions have transitioned to monitoring heparin with antifactor Xa levels rather than the aPTT. Clinical data from the last 10-20 years have begun to show that converting from aPTT to antifactor Xa monitoring may offer a smoother dose-response curve, with more stable levels and, consequently, fewer blood samples and dosage adjustments. Given the minimal increase in acquisition cost of antifactor Xa reagents, it can be argued that the antifactor Xa assay is a cost-effective method for monitoring UFH. In this review, we discuss the relative advantages and disadvantages of the aPTT, antifactor Xa, and protamine titration tests, and provide a clinical framework to guide practitioners seeking to optimize UFH monitoring within their own institutions.
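
As a minimal illustration of the monitoring targets cited above (0.2-0.4 unit/ml by protamine titration; 0.3-0.7 unit/ml by antifactor Xa assay), the following Python sketch classifies a measured heparin level against those ranges. The function name, labels, and overall structure are illustrative assumptions for clarity only; this is not a validated dosing protocol or nomogram.

```python
# Illustrative sketch only: classifies a measured heparin level against the
# therapeutic ranges cited in the text. Names and labels are hypothetical;
# this is not a validated clinical dosing protocol.

# Assay name -> (lower, upper) therapeutic bounds in unit/ml, per the text above
THERAPEUTIC_RANGES = {
    "antifactor_xa": (0.3, 0.7),
    "protamine_titration": (0.2, 0.4),
}

def classify_heparin_level(level: float, assay: str = "antifactor_xa") -> str:
    """Return whether a measured level is below, within, or above the cited range."""
    low, high = THERAPEUTIC_RANGES[assay]
    if level < low:
        return "subtherapeutic"
    if level > high:
        return "supratherapeutic"
    return "therapeutic"

# Example: 0.5 unit/ml by antifactor Xa assay falls within the 0.3-0.7 range
print(classify_heparin_level(0.5, "antifactor_xa"))  # -> "therapeutic"
```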