Mitigating Risk of Immunosuppression by Immune Monitoring: Are We There?


Excerpt

The advent of the calcineurin inhibitors (CNIs), cyclosporine and tacrolimus, heralded a new era in transplantation. The introduction of cyclosporine in the late 1980s reduced rejection rates in kidney allograft recipients from approximately 80% to approximately 40%, with a concomitant improvement in first-year graft survival to more than 80%.1 This allowed kidney transplantation to become a mainstream option for patients with end-stage renal failure and enabled the development and expansion of other solid organ transplants, including liver, heart, lung, and intestine. CNIs also allowed, for the first time, successful transplantation of skin-containing vascularized composite allografts such as upper extremities and the face.
The benefits of reduced rejection rates and improved early allograft survival, however, have not been matched by comparable gains in long-term graft survival.2 Calcineurin inhibitor toxicity is considered an impediment to long-term survival for both kidney and other organ transplant recipients.3
In our own program, in response to a high prevalence of polyoma virus nephropathy, a modest decrease in tacrolimus target levels of 2 ng/mL resulted in a significant decrease in polyoma virus nephropathy, with beneficial effects on metabolic profiles and renal allograft histology (Figure 1).4 This benefit was not without consequences: there was a numerical increase in rejection rates. This highlighted the importance of finding a method to determine which recipients would benefit from a decrease in immunosuppression without an increased risk of rejection. We, like other programs, have used pretransplant characteristics in an attempt to reap the benefit of reduced immunosuppression without increasing the risk of acute rejection or the rate of de novo donor-specific antibodies. Currently, this has resulted in the application of 4 different immunosuppressive regimens, deployed at the time of transplantation based on age, the presence of preformed donor-specific antibodies, and HLA matching, with annual monitoring for donor-specific alloantibody using a solid-phase assay. The long-term effectiveness of this strategy is being assessed.
It is thus evident that there is a need for an assay that can monitor the degree of immunosuppression, assessed by the reactivity of T cells, B cells, or subsets thereof, on a regular basis, before the development of cellular rejection, donor-specific alloantibody, or the adverse effects of the immunosuppressive agents or of immunosuppression itself.
Assays using donor-specific antigen to monitor immune responsiveness to the allograft are limited in their applicability in wider clinical contexts5; however, the use of nonpolymorphic HLA-derived peptides may allow assessment of lymphocyte responses independent of donor HLA peptides.6 Many investigators use interferon gamma production by T cells (either the whole population or specific subsets) as the readout.
Monitoring of microRNA has been used to describe changes in allograft histology and acute rejection events and to predict malignancy; however, significant overlap between microRNA detection and clinical events limits its applicability.7
Another method of monitoring the degree of immunosuppression is to measure the transcriptional activity of nuclear factor of activated T cells (NFAT)-regulated genes in peripheral blood.8 This method uses real-time polymerase chain reaction to quantify the expression of the NFAT-regulated genes encoding interleukin 2, interferon gamma, and granulocyte-macrophage colony-stimulating factor. Residual gene expression is calculated as the expression at the postdose peak drug concentration, expressed as a percentage of the predose baseline. Target values for NFAT residual expression (NFAT-RE) are believed to be between 15% and 30%: lower values indicate overimmunosuppression, with a greater risk of opportunistic infections; higher values indicate underimmunosuppression and an increased risk of rejection.
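The NFAT-RE calculation and its target window can be sketched as follows. This is a minimal illustration, not the published assay protocol: the function names are ours, a single expression value is assumed per time point (in practice the three NFAT-regulated genes are quantified and combined), and the 15%-30% thresholds are simply the target range quoted above.

```python
def nfat_residual_expression(predose_expr: float, postdose_expr: float) -> float:
    """Residual expression (%): NFAT-regulated gene expression measured at the
    postdose peak drug concentration, as a percentage of the predose baseline."""
    if predose_expr <= 0:
        raise ValueError("predose (baseline) expression must be positive")
    return 100.0 * postdose_expr / predose_expr

def interpret_nfat_re(nfat_re: float, low: float = 15.0, high: float = 30.0) -> str:
    """Map an NFAT-RE value onto the target window described in the text
    (illustrative thresholds: 15%-30%)."""
    if nfat_re < low:
        return "overimmunosuppression (greater risk of opportunistic infection)"
    if nfat_re > high:
        return "underimmunosuppression (increased risk of rejection)"
    return "within target range"

# Example: baseline expression 200 units, postdose expression 40 units -> 20%.
re_pct = nfat_residual_expression(200.0, 40.0)
print(re_pct, interpret_nfat_re(re_pct))
```

A value of 20% in this example falls within the quoted target window; the same arithmetic applied to a postdose value of 20 units (10% residual expression) would flag overimmunosuppression.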
In this issue, Sommerer et al9 present the results of a randomized trial comparing standard pharmacokinetic dosing of cyclosporine to NFAT-RE–guided dosing in stable prevalent renal transplant recipients.