The rising incidence of neutralizing antibodies (inhibitors) against therapeutic factor VIII has prompted studies to determine whether this rise is related to the introduction of recombinant factor VIII products. The present article summarizes current opinions and the results of non-clinical and clinical studies on the immunogenic potential of recombinant compared with plasma-derived factor VIII concentrates. Numerous studies have provided circumstantial evidence that von Willebrand factor, the natural chaperone protein present in plasma-derived factor VIII products, plays an important role in protecting exogenous factor VIII from uptake by antigen-presenting cells and from recognition by immune effectors. However, the definitive contribution of von Willebrand factor to reducing the inhibitor risk and to achieving immune tolerance is still under debate.
The incidence of inhibitors in stable PTPs after switching treatment is consistently lower than in previously untreated patients (PUPs), 31,32 probably because many of them have developed some form of cross-tolerance. Thus, the immunogenicity of a given FVIII concentrate in PTPs is relevant for PTPs only. However, in daily clinical practice, most situations that demand a decision between treatment options involve PUPs, i.e. patients in their early years of life who are newly diagnosed with hemophilia A or who sustain their first bleeding episode. 27,33

Are inhibitor testing methods a critical confounder?

One of the most frequently cited arguments used to refute a higher incidence of inhibitors associated with rFVIII treatment is that the testing frequencies and methods used for the detection of inhibitors have increased and improved over time. These improvements coincided with the introduction of rFVIII, which might have favored the detection of borderline and transient inhibitors in rFVIII-treated patient groups. 23,31,34 In fact, in 1995 the so-called "Nijmegen" method, a modification of the until then most widely used "Bethesda assay", was introduced. 35 However, despite the improvements made over recent years, the methods used to detect inhibitors have not yet been standardized, 28,36 and the classical Bethesda assay, originally published in 1975, is still the most frequently used assay. 34,36 Furthermore, the major advantage of the Nijmegen modification of the Bethesda assay lies in the improved specificity of the test near the cut-off value. 28,35,37 Hence, the advent and use of the Nijmegen assay should have caused a decrease in the rate of false positive test results and, consequently, a decrease rather than an increase in the inhibitor incidence reported in studies dating back to the 1990s. On the other hand, it is quite possible that the increased testing frequency over the last two decades has had an impact on the reported incidence of inhibitors.
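As a point of reference for the assay discussion above, the Bethesda titer is, by definition, a simple logarithmic conversion of the residual FVIII activity measured after a two-hour incubation of patient plasma with normal plasma at 37 °C: 50% residual activity corresponds to 1 BU/mL. The minimal sketch below illustrates that conversion only; the function name, the hard 25-75% validity window, and the dilution handling are illustrative simplifications, not a substitute for a standardized laboratory protocol (which, as noted above, does not yet exist).

```python
import math

def bethesda_titer(residual_activity_pct: float, dilution: float = 1.0) -> float:
    """Convert residual FVIII activity (%) to Bethesda units per mL.

    By definition, 1 BU/mL is the inhibitor level that leaves 50% residual
    FVIII activity after a 2-hour incubation at 37 degrees C. Titers are
    conventionally read only from dilutions giving 25-75% residual activity.
    """
    if not 25.0 <= residual_activity_pct <= 75.0:
        # Outside this window the log-linear relationship is unreliable;
        # laboratories re-test the sample at another dilution instead.
        raise ValueError("residual activity outside the reliable 25-75% window")
    # 50% residual -> (2 - log10(50)) / log10(2) = 1 BU/mL.
    bu_per_ml = (2.0 - math.log10(residual_activity_pct)) / math.log10(2.0)
    return bu_per_ml * dilution

# 50% residual activity, undiluted: exactly 1 BU/mL by definition.
print(round(bethesda_titer(50.0), 2))              # 1.0
# 25% residual activity in a 1:4 dilution: 2 BU/mL x 4 = 8 BU/mL,
# i.e. above the 5 BU/mL threshold for a high-responding inhibitor.
print(round(bethesda_titer(25.0, dilution=4), 1))  # 8.0
```

The 5 BU/mL cut-off used in the next section to define high-responding inhibitors sits well above the borderline region near the assay cut-off where the Nijmegen modification chiefly improves specificity, which is why high-titer results are less sensitive to the choice of assay.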
Studies reporting on the incidence of high-responding inhibitors [> 5 Bethesda units (BU) per milliliter] (Figure 1) avoided potential bias from assay performance and testing frequencies, because high-res...