INTRODUCTION — Despite significant improvements in one-year kidney allograft survival [1], the rate of chronic graft loss after the first year remains substantial, although it appears to be improving over time (figure 1). As an example, a 2004 study of first kidney transplants performed between 1995 and 2000 found that, despite a reduction in acute rejection rates, long-term allograft survival had not improved over the preceding decade [2]. However, the rate of decline in kidney allograft function appears to have slowed [3], suggesting that improved long-term allograft survival is possible. The 2016 annual data report of the Organ Procurement and Transplantation Network (OPTN)/Scientific Registry of Transplant Recipients (SRTR) highlights ongoing, albeit incremental, improvements in graft and patient survival for both deceased-donor and living-donor kidney transplants [4]. Similarly, subsequent reports from the United States and Europe have shown improvements in long-term graft survival since the mid-1990s (figure 2) [5-7].
Allograft survival rates also vary among different ethnic groups [8]. This may be explained in part by the presence of a disproportionately higher number of risk factors among certain patient populations and differences in access to health care [9-14].
The following discussion will review the determinants of short-term and long-term graft survival. However, this distinction is to some degree arbitrary since most factors can affect both. As an example, any short-term event that predisposes to episodes of acute rejection will then lead to a greater likelihood of chronic graft loss. In addition, many of these factors influence each other, such as human leukocyte antigen (HLA) mismatching, which may increase the risk of acute kidney rejection and subsequent premature allograft failure. A discussion of patient survival after kidney transplantation is presented separately. (See "Kidney transplantation in adults: Patient survival after kidney transplantation".)
SHORT-TERM SURVIVAL — The risk of graft loss has traditionally been divided into an early, high-risk period and a later period of constant low risk [15,16]. A major improvement in kidney allograft survival in the past 20 years has been the relative elimination of the early risk period.
A number of factors have been shown to influence short-term graft survival. These include delayed allograft function, human leukocyte antigen (HLA) antibodies, type of donor kidney, donor illness, and medical center factors, among others.
Delayed allograft function — The presence of delayed graft function has a major adverse impact upon both short- and long-term allograft survival. In one single-center study of 518 patients, multivariate analysis found that delayed graft function was the principal determinant of allograft survival at one year [17]; by comparison, acute rejection, HLA matching, degree of sensitization, and retransplantation did not significantly affect short-term survival.
Issues surrounding delayed allograft function and long-term allograft survival are discussed below. (See 'Tissue injury' below.)
Human leukocyte antigen antibodies — Some data suggest that the presence of HLA antibodies is associated with an increased risk of early graft loss. In a report based upon data from nearly 5000 patients, the frequency of HLA antibodies among kidney transplant recipients was 21 percent [18]. More than 2000 patients were followed prospectively; 91 grafts failed, and there were 34 deaths. The risk of allograft failure at one year was significantly higher among those with HLA antibodies (6.6 versus 3.3 percent), as well as among those who developed such antibodies de novo (8.6 versus 3 percent).
In addition, such antibodies place patients awaiting transplantation at a significant disadvantage as their waiting time for an allograft is markedly prolonged and they are at increased risk of both delayed graft function and rejection in the perioperative period. Some of these issues are discussed in detail separately. (See "Kidney transplantation in adults: HLA-incompatible transplantation" and "Kidney transplantation in adults: HLA matching and outcomes".)
The presence of HLA antibodies also has an adverse effect upon long-term allograft survival. (See 'Human leukocyte antigen matching' below.)
Type of kidney — It is generally acknowledged that not all kidneys for transplantation are created equal. Such organs have long been characterized by donor source, whether living or deceased. Living-donor transplants have a greater short-term survival rate than deceased-donor kidneys [19,20]. Allograft survival rates for living-donor transplants versus deceased-donor kidneys from non-expanded-criteria donors (non-ECD; ECD kidneys are now classified as Kidney Donor Profile Index [KDPI] >85 percent) are 98 versus 96 percent at three months and 96 versus 92 percent at one year, respectively [20]. A graft survival benefit with living transplants is also observed with second allografts [21]. This difference reflects the optimal circumstances surrounding living, related donation compared with the potentially injurious effect of deceased-donor donation. (See 'Tissue injury' below.)
Prior to 2014, deceased donors were categorized as standard-criteria donors (SCDs) or ECDs. ECD kidneys are from brain-dead donors and are defined by a relative risk of premature graft failure within the first posttransplant year of greater than 1.7. Such kidneys derive from donors >60 years of age, as well as from younger donors with a history of hypertension or with stroke as the cause of death. ECD kidneys are associated with decreased short-term allograft survival, particularly among those undergoing retransplantation [22-24].
However, the utility of this rudimentary classification was limited by substantial overlap in outcomes and kidney quality, since the ECD category is broad and nonuniform and includes kidneys that are likely to have different outcomes. As an example, the ECD category would include both a perfectly healthy 60-year-old donor with a normal serum creatinine and a 55-year-old donor with diabetes, hypertension, and a creatinine of 1.6 mg/dL.
A central feature of the revised kidney allocation system, which was implemented in the United States in 2014, is the Kidney Donor Profile Index (KDPI), which is a continuous, rather than a dichotomous, index of the quality of donor kidneys, from the highest quality, with the longest projected functional lifetime after transplant, to the lowest quality, with the shortest projected functional lifetime (figure 3) [25]. The more granular KDPI, based solely on donor characteristics, combines a variety of donor risk factors for graft failure (table 1) into a single number. In one study that used the KDPI (or Kidney Donor Risk Index [KDRI]) to classify deceased-donor kidneys, the highest quintile (with highest risk for graft failure or lowest projected survival) had a calculated five-year survival of 63 percent compared with those in the lowest two risk quintiles, which had projected five-year survivals of 82 and 79 percent, respectively. The KDPI system overlapped with the ECD/SCD system in that the percentage of ECD kidneys rose with increasing risk quintile. However, 32 percent of ECD kidneys studied had KDRI less than the 85th percentile [26]. (See "Kidney transplantation in adults: Organ sharing".)
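Although a full description of the published coefficients is beyond the scope of this review, the KDRI has the structure of a proportional hazards risk score: each donor characteristic contributes a weighted term in an exponent, and the KDPI is the percentile rank of the resulting KDRI within a reference donor population. A simplified sketch of this structure (the coefficients and covariates below stand in for the published values):

\[ \text{KDRI} = \exp\Big(\sum_i \beta_i x_i\Big), \qquad \text{KDPI} = \text{percentile rank of the KDRI among a reference population of donors} \]

where the x_i are the donor factors listed in table 1 (eg, age, serum creatinine, history of hypertension or diabetes, cause of death) and the beta_i are their fitted weights.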
Center effect — In the United States, Europe, and Canada, a marked center effect with respect to short-term graft survival has been reported that cannot be explained by different clinical features of enrolled patients. This is a consistent finding that has been observed over the last few decades, a period of increasingly effective immunosuppressive therapy [27-30]. Possible underlying reasons include center volume and differences in long-term patient management experience. Center-specific data must be reported to the United States Renal Data System (USRDS) and are publicly available.
Donor age — Increasing donor age is associated with inferior survival of allografts from deceased donors [31,32]. In an analysis of 6490 deceased-donor kidney transplants identified from the UK transplant registry, compared with donor age <40 years, donor age >60 years was associated with increased risk of graft failure (hazard ratio [HR] 2.35, 95% CI 1.85-3.00) [31].
Donor illness — Deceased-donor allograft survival may also vary by the cause of donor death and/or history of specific comorbid illness [19,33]. As an example, kidneys transplanted from stroke victims had lower one-year allograft survival rates than seen overall with deceased-donor kidney transplantation (79 versus 84 percent) [19].
Dialysis and preemptive transplantation — Early allograft survival may vary with the use of maintenance dialysis and perhaps the type of dialysis prior to transplantation. This is discussed in detail separately. (See "Kidney transplantation in adults: Timing of transplantation and issues related to dialysis".)
LONG-TERM SURVIVAL — Several studies performed in the current immunosuppressive era have identified clinical risk factors for long-term allograft failure [34,35]. In the Assessment of Lescol in Renal Transplantation (ALERT) trial, for example, 2000 patients were randomly assigned to fluvastatin or placebo and followed for five to six years [34]. In the placebo group, allograft loss occurred in 137 patients and was principally caused by chronic allograft nephropathy (see "Kidney transplantation in adults: Chronic allograft nephropathy"). On multivariate analysis, independent baseline risk factors for allograft loss included an increased serum creatinine concentration (relative risk [RR] 3.12 per 100 micromol/L increase), proteinuria (RR 1.6 per 1 g/day), and pulse pressure (RR 1.12 per 10 mmHg increase).
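Such per-increment relative risks scale multiplicatively under the log-linear model typically used in these analyses. As a worked illustration (the increment below is hypothetical, not a result from the trial):

\[ \text{RR}(\Delta\text{Cr}) = 3.12^{\Delta\text{Cr}/100\ \mu\text{mol/L}}, \qquad \text{eg, } \Delta\text{Cr} = 200\ \mu\text{mol/L} \Rightarrow \text{RR} = 3.12^{2} \approx 9.7 \]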
An additional possible marker of poor long-term survival is decreased kidney function at one year [36,37]. In one study, allograft half-life progressively declined with each incremental increase in the serum creatinine concentration at six months and at one year [36].
The exact mechanisms responsible for the pathogenesis of chronic allograft dysfunction leading to long-term graft loss are unknown. Both alloantigen-dependent and independent factors are thought to play a role [38-40]. Long-term graft survival is generally measured in terms of the half-life, which is defined as the time beyond the first year posttransplant at which 50 percent of grafts are no longer functioning.
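If the risk of graft loss beyond the first year is approximately constant, as suggested by the traditional two-period model described above, graft attrition approximates exponential decay, which links the half-life to a constant annual hazard (a simplified sketch, not a formal survival analysis):

\[ S(t) = e^{-\lambda (t-1)} \text{ for } t > 1 \text{ year}, \qquad t_{1/2} = \frac{\ln 2}{\lambda} \]

For example, a constant hazard of approximately 0.058 per year corresponds to a half-life of ln 2/0.058 ≈ 12 years.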
Alloantigen-dependent factors — Chronic rejection and allograft loss are more likely to develop in patients with a history of acute rejection, a greater degree of human leukocyte antigen (HLA) mismatching, infection, and/or inadequate immunosuppressive therapy [41,42]. These observations are consistent with an important role for immunologic or alloantigen-dependent injury in the subsequent development of chronic allograft dysfunction. (See "Kidney transplantation in adults: Chronic allograft nephropathy".)
Episodes of acute rejection — Patients with a history of acute rejection episodes are more likely to have late allograft failure [43-45]. This was shown in a study of 63,045 primary kidney recipients in which an acute rejection episode in 1996 to 1997 was associated with a 5.2-fold increased risk for chronic allograft nephropathy compared with no rejection in a reference group from 1988 to 1989 [44].
Human leukocyte antigen matching — An increased degree of HLA antigen mismatching is associated with a greater risk of chronic graft loss, presumably due to ongoing specific immunologic injury. (See "Kidney transplantation in adults: HLA matching and outcomes".)
When organ allocation is based upon HLA typing, one concern is the effect of cold ischemia time upon long-term survival. The beneficial effect of HLA matching appears to generally outweigh the detrimental effect of prolonging the cold ischemia time in transported kidneys [46]. The current registry data indicate that the five-year graft survival of six-antigen-matched cadaver kidneys is the same regardless of whether the kidneys undergo 3 or 36 hours of cold ischemia.
However, a deleterious effect of prolonged cold ischemia time becomes evident in recipients of mismatched grafts. A stepwise reduction of 1 to 2 percent in survival has been observed with each incremental 12-hour increase in cold ischemia time. The adverse effect of cold ischemia can also be observed by comparing allograft survival between the recipient of the first kidney transplanted from a deceased donor and the recipient of the second kidney from the same donor (mean cold ischemia time of 19.9 versus 25.6 hours, respectively) [47]. The 5- and 10-year survival rates were 72 and 55 percent, respectively, for the first kidney transplanted, compared with 65 and 40 percent for the second.
Insight into the relative contributions to allograft survival of HLA matching and cold ischemia time is also provided by examining the graft survival rates of living, unrelated kidneys. Graft survival rates of kidneys donated by a spouse are very similar to those of kidneys donated by family members with a one-haplotype mismatch (half-life of 12 years) [48-50]. If HLA mismatching were the only factor affecting graft survival, the living, unrelated grafts should have poorer long-term survival, possibly in the range of randomly matched cadaver grafts (which generally have approximately three HLA mismatches and an average half-life of 8.4 years).
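These half-lives can also be translated into approximate constant annual rates of graft loss (a back-of-the-envelope conversion under the constant-hazard assumption above):

\[ \text{annual loss rate} = 1 - 0.5^{1/t_{1/2}}, \qquad t_{1/2} = 12 \text{ years} \Rightarrow \approx 5.6 \text{ percent/year}; \quad t_{1/2} = 8.4 \text{ years} \Rightarrow \approx 7.9 \text{ percent/year} \]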
Thus, both cold ischemia time and HLA matching exert important effects upon long-term graft survival. However, six-antigen matching completely overcomes the effect of ischemia, and lesser degrees of matching (even in the presence of ischemia) yield superior results to zero-antigen-matched kidneys. Changes in allograft immunogenicity induced by tissue injury prior to death in cadaveric transplants also may contribute to the lower long-term survival with cadaveric as opposed to living-donor kidneys. (See 'Tissue injury' below.)
Sensitization — Antibodies against HLA class I (A, B, C) or class II (DR, DQ) are found in subjects who have been immunized to these glycoproteins by pregnancy, blood transfusion, or a prior HLA-mismatched allograft. Highly sensitized patients have a decreased statistical chance of being transplanted because of the high likelihood of a positive pretransplant crossmatch [51].
Increases in sensitization to lymphocyte antigens, as measured by the panel reactive antibody (PRA) status, incrementally augment the risk of graft loss. Five-year allograft survival rates for deceased-donor, non-expanded-criteria donor (non-ECD) kidneys in the 2007 Scientific Registry of Transplant Recipients (SRTR) annual report were [20]:
●Seventy-one percent for patients with a PRA of 0 to 9 percent
●Sixty-nine percent with a PRA of 10 to 79 percent
●Sixty-nine percent with a PRA of ≥80 percent
In a large retrospective analysis of the United Network for Organ Sharing (UNOS) registry, the 10-year allograft survival rate was 44 percent among highly sensitized patients (PRA ≥98 percent), compared with 52 percent among nonsensitized patients (PRA = 0) [52].
It is increasingly appreciated that the presence of preformed HLA donor-specific antibodies (DSAs) has an adverse effect upon allograft outcomes. In one study using very sensitive HLA-specific assays, eight-year allograft survival was evaluated among 43 patients with and 194 patients without preformed DSAs [53]. Allograft survival was 68 versus 77 percent for those with and without DSAs, respectively, and the incidence of antibody-mediated rejection was ninefold higher among those with preformed DSAs.
Complement-binding HLA DSAs, in particular, are associated with decreased graft survival. This was shown in a study of 1016 patients who received a kidney transplant at two centers between 2005 and 2011 [54]. All patients were tested for circulating HLA DSAs using stored serum samples, which were obtained at the time of transplantation and at the time of allograft biopsy (which was performed either at one year after transplantation or during an episode of acute rejection during the first year after transplantation). Serum samples of patients who were found to have HLA DSAs were further tested for the presence of C1q-binding HLA DSAs using a single-antigen flow bead assay. Overall, patients with HLA DSAs had worse five-year graft survival compared with those who did not have such antibodies (83 versus 94 percent, respectively). Patients with C1q-binding DSA HLA antibodies (n = 77) had worse graft survival compared with patients with non-C1q-binding DSA HLA antibodies (n = 239) and compared with patients without DSA HLA antibodies (n = 770; 54 versus 93 and 94 percent, respectively). After adjusting for multiple clinical, functional, histologic, and other immunologic factors, the presence of C1q-binding DSA HLA was associated with a greater than fourfold increased risk of graft loss (hazard ratio [HR] 4.78, 95% CI 2.69-8.49). C1q-binding antibodies were also associated with an increased rate of antibody-mediated rejection and increased C4d deposition in the allograft.
However, the C1q-binding assay is not widely used and has not been validated in other institutions. Furthermore, it is likely that there are some noncomplement-binding DSAs that are clinically relevant in graft failure and will not be detected by this assay.
Limited data suggest that sensitization may, in some cases, represent non-HLA immunity, with increased levels associated with decreased survival. As an example, a retrospective study of over 4000 recipients of HLA-identical sibling transplants evaluated long-term survival in association with PRA levels [55]. Given that these transplants were between HLA-identical siblings, the presence of PRA was thought to represent antibodies against HLA antigens not found in the transplanted kidney. Nevertheless, 10-year survival was highest among those with no PRA (72 percent) compared with those with either 1 to 50 percent PRA (63 percent) or >50 percent PRA (56 percent) [55]. This suggests that non-HLA transplantation immunity, as detected by PRA, may have a role in long-term allograft survival.
The importance of non-HLA immunity was further supported by a two-year prospective study of 2231 patients [56]. At two years, allograft survival was significantly higher among the 1781 patients without HLA antibodies (93 versus 85 percent). Outcome correlated better with the results of cytotoxicity testing than with enzyme-linked immunosorbent assay (ELISA) or HLA bead assays, suggesting that the presence of non-HLA antibodies correlates with decreased survival.
Although mismatching at the Rh blood group antigen is not considered a risk factor for allograft rejection, a multivariate analysis of United Network for Organ Sharing (UNOS) data found that Rh incompatibility may confer worse long-term survival [57]. In this study, Rh identity between recipient and donor was significantly associated with better graft outcome (RR 0.43).
Alloantigen-independent factors — Several clues from clinical transplant databases support the hypothesis that inadequate kidney mass, prior and ongoing tissue injury [58], posttransplant hypertension, hyperlipidemia, a more marginal kidney, and recurrent or de novo glomerular disease contribute to chronic allograft dysfunction and poor long-term kidney survival [59].
Other substantial risk factors for allograft failure, particularly over the long term, are calcineurin inhibitor nephrotoxicity and death of the transplant recipient [60]. These are discussed in detail separately. (See "Kidney transplantation in adults: Chronic allograft nephropathy" and "Cyclosporine and tacrolimus nephrotoxicity" and "Kidney transplantation in adults: Patient survival after kidney transplantation".)
Tissue injury — Allograft injury plays a major role in both short- and long-term graft function, as well as in the induction of kidney allograft rejection [61]. Such injury may be induced by different events, including brain death, cold ischemia time, ischemia and/or reperfusion, and infection.
●Brain death – Brain death resulting from trauma or catastrophic intracranial hemorrhage is associated with a variety of adverse effects upon donor organs prior to transplantation. Under such circumstances, allograft tissue becomes "primed" and recipient T-cell activation by donor alloantigen is more likely to occur [62].
Among patients with brain death, intracranial pressure rises due to cerebral edema, which results in compression of brain tissue and subsequent venous congestion and increasing brain turgor. Massive release of catecholamines then ensues, which causes profound vasoconstriction and endothelial injury [63]. Injury-induced inflammation also causes upregulation of adhesion molecules and class-II major histocompatibility complex (MHC) on kidney allograft endothelium [64,65]. In addition, a procoagulant state results from endothelial activation coupled with release of cytokines, complement activation, and depletion of tissue plasminogen activator [66].
Other endocrine aberrations resulting from brain death include an initial release of anterior pituitary hormones, which is followed by a reduction in the circulating levels of thyroid hormone, cortisol, insulin, and antidiuretic hormone (vasopressin). Arginine vasopressin deficiency (previously called central diabetes insipidus) rapidly occurs, and cardiac arrhythmias and rapid fluctuations in blood pressure are common. Such factors may adversely affect the function and integrity of the kidney.
●Ischemia and/or reperfusion injury – Ischemia and/or reperfusion injury is thought to be a critical risk factor for both early delayed graft function and late allograft dysfunction. In one report that included data from 3829 adult recipients of a first deceased-donor kidney, there was a proportional increase in the risk for allograft failure and death over 12 years for each hour of cold ischemia time (HRs of 1.013 and 1.018, respectively) [67].
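These per-hour hazard ratios compound multiplicatively under the proportional hazards model used in such analyses. As an illustrative calculation for a 24-hour cold ischemia time:

\[ \text{HR}_{\text{graft failure}} = 1.013^{24} \approx 1.36, \qquad \text{HR}_{\text{death}} = 1.018^{24} \approx 1.53 \]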
The dominant cause of delayed graft function is postischemic acute tubular necrosis (ATN) [68]. The incidence of ATN increases when the cold ischemia time exceeds 18 hours [69], particularly with older donor kidneys [24]. Cold ischemia time alone is also a principal factor underlying long-term allograft survival [17]. (See "Kidney transplantation in adults: Evaluation and diagnosis of acute kidney allograft dysfunction".)
Preservation of the allograft with pulsatile preservation may improve long-term survival compared with that observed with simple cold storage. (See "Deceased- and living-donor kidney allograft recovery".)
Injury alone may not influence graft survival in the absence of rejection, as shown in some [70-72] but not all studies [73]. As an example, among over 9000 kidneys retrieved and analyzed from the United Kingdom national transplant database, damage recognized at organ retrieval or placement was not significantly associated with survival at three years [72].
The circumstances surrounding organ removal, storage, and engraftment may increase graft immunogenicity [64,66,68,74-76]. Such factors include the upregulation of MHC antigens and triggering of the cytokine-adhesion molecule cascade. These factors may ultimately influence the development of chronic allograft dysfunction. As an example, blockade of costimulatory pathways can ameliorate organ dysfunction in a rat model of ischemia/reperfusion [77].
There are limited data concerning the relative degree of injury in grafts recovered from deceased donors with or without a heartbeat. Kidneys from donors without a heartbeat (ie, donation after cardiac death [DCD] donors) are thought to be associated with some degree of irreversible damage, thereby leading to relatively poor long-term allograft survival. However, some data suggest that outcomes may be satisfactory. (See 'Type of kidney donor' below.)
●Cytomegalovirus (CMV) – Seropositivity to CMV predisposes patients to both acute and chronic graft loss. Seronegative recipients of seronegative grafts appear to have a 10 percent higher graft survival rate than recipients of seropositive grafts. This difference may be due to infection-induced cytokine activation, which in turn may promote graft injury [78].
Graft survival rates are also affected by pretransplant recipient/donor CMV serostatus and whether prophylaxis was administered. (See "Clinical manifestations, diagnosis, and management of cytomegalovirus disease in kidney transplant patients".)
Inadequate kidney mass — Transplantation of inadequate kidney mass may be associated with a higher risk of kidney graft failure [79]. Data supporting this hypothesis in humans are based upon the observation of lower graft survival rates with older and very young donor kidneys, with recipients in whom the donor kidney has disproportionately fewer nephrons, and with a low ratio of donor kidney weight to recipient weight:
●Old and very young donor kidneys – Old and very young kidneys have relatively decreased numbers of functioning nephrons and survive less well when transplanted. In addition to fewer nephrons, other factors inherent to an older kidney may also influence overall allograft survival [80].
●Disproportionately fewer nephrons – Large-sized recipients place a large physiologic demand on the transplanted kidney. This increased demand upon relatively "insufficient" numbers of transplanted nephrons may be a factor in the lower long-term graft survival rate observed among some patients who weigh >100 kg [81].
●Donor kidney weight to recipient weight ratio – Several studies have evaluated the relationship between allograft function/survival and the ratio of donor kidney weight to recipient weight [82-88]. In the largest such study of 1189 kidney transplant recipients, the ratio of the weight of the kidney to the weight of the recipient (KwRw ratio) was evaluated with respect to the risk of proteinuria, glomerulosclerosis, and allograft failure [88]. A low KwRw ratio (<2.3 g/kg) was associated with an increased risk of glomerulosclerosis (17 versus 4.7 percent), proteinuria, and long-term allograft loss (1.55-fold increased risk [95% CI 1.01-2.12] beginning two years postsurgery). The magnitude of risk for decreased allograft survival with a low KwRw ratio is approximately the same as that conferred by an acute rejection episode or delayed allograft function.
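As a worked example of the KwRw ratio (with illustrative values, not data from the cited study), a 150 g donor kidney transplanted into a 75 kg recipient falls below the 2.3 g/kg threshold:

\[ \text{KwRw} = \frac{\text{kidney weight (g)}}{\text{recipient weight (kg)}} = \frac{150\ \text{g}}{75\ \text{kg}} = 2.0\ \text{g/kg} < 2.3\ \text{g/kg} \]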
Drug noncompliance — Noncompliance is one of the more important risk factors for kidney graft loss over the long term [89-92]. An accurate assessment of the frequency of noncompliance and its contribution to allograft loss is difficult because of the wide variability in study design and results. (See "Psychiatric aspects of organ transplantation".)
Posttransplant hypertension — Long-term allograft and patient survival may be negatively influenced by posttransplant hypertension [93-96]. A multivariate analysis found that, after adjustment for baseline kidney function, the relative risk of allograft failure was 1.3 for each 10 mmHg increase in the mean arterial pressure measured at one year after transplantation [97]. (See "Hypertension after kidney transplantation".)
Hyperlipidemia — Hyperlipidemia is an established risk factor for arteriosclerosis and coronary artery disease in all patients, including cardiac and kidney transplant recipients; it may also result in increased graft loss. (See "Kidney transplantation in adults: Lipid abnormalities after kidney transplantation".)
Recurrent or de novo glomerular disease — Recurrent or de novo glomerular disease results in significantly lower long-term allograft survival:
●A retrospective study evaluated the outcomes of nearly 5000 kidney transplants followed in the Renal Allograft Disease Registry, 167 of which had clinical and biopsy evidence of recurrent or de novo glomerular disease [98]. The relative risk for allograft failure was 1.9 for those with glomerular disease. At five years, patients with glomerular disease had a much higher rate of allograft failure (60 versus 32 percent in those without glomerular disease).
●Among 1505 kidney transplant recipients with end-stage kidney disease (ESKD) due to glomerulonephritis, allograft loss as a result of biopsy-proven recurrent glomerulonephritis was observed in 52 patients (3.5 percent) [99]. The 10-year incidence of graft loss because of recurrent disease was estimated to be 8.4 percent (95% CI 5.9-12.0); recurrent glomerulonephritis was the third most common cause of allograft loss among these patients.
The improved long-term survival observed in well-matched HLA grafts may also be less apparent in transplant recipients with a primary glomerular disease. In one report of 60 patients who received an HLA-identical, living related donor transplant, allograft survival correlated with the original kidney disease [100]. Among the 33 patients with an underlying glomerulonephritis, the 5-, 10-, and 20-year graft survival rates were 88, 70, and 63 percent, respectively; by comparison, no graft loss was observed among the remaining patients with nonglomerular kidney diseases. Recurrent disease appeared to be the principal cause of graft loss.
Type of kidney donor — Living-donor kidney transplants have a greater long-term graft survival rate than deceased-donor kidneys (primarily from brain-dead donors). In the 2009 SRTR report, for example, the five-year allograft survival rates for living-donor, non-ECD, and ECD kidneys were 81, 72, and 57 percent, respectively [101].
A graft survival benefit with living transplants is also observed with second allografts [21]. This difference reflects the optimal circumstances surrounding living, related donation compared with the potentially injurious effect of cadaveric donation. In addition, decreasing allograft survival correlates with decreasing quality of the cadaveric kidney.
An increasing body of data suggests that long-term survival with kidneys donated after cardiac death (DCD kidneys) is adequate, particularly as reported from Japan. In one study from Japan, for example, overall graft survival rates at 5 and 10 years were 72 and 53 percent, respectively [31,102-104].
It is not clear whether DCD kidneys from older donors have worse outcomes compared with kidneys from brain-death donors of a comparable age. In one small study, allografts from DCD donors >65 years of age had a lower glomerular filtration rate (GFR) at one year and a trend toward a lower five-year graft survival compared with younger DCD donors [105]. However, there was no difference in graft survival when kidneys with extensive vascular pathology were excluded from the analysis. Additionally, in the study of UK registry data cited above, there was no difference in graft loss for DCD donors >60 years of age compared with brain-death donors in the same age group [31].
Over the last several years, there has also been a steady improvement in outcome with DCD kidneys from the United States [102,106-110]. In the UNOS database, the outcome of DCD kidney transplantation at five years posttransplant, both in terms of patient (81 percent) and graft (67 percent) survival, is not significantly different from that of kidneys from donors deceased after brain death [106].
Gene polymorphisms — Variations in the ability to mount an effective immune response against the allograft, as well as differences in underlying factors affecting allograft fibrosis, represent alloantigen-independent factors that may affect graft survival:
●Caveolin-1 (CAV1) – CAV1 appears to have an important role in the development of organ fibrosis. A retrospective study found that a CAV1 polymorphism is associated with an increased risk of allograft failure, which is closely related to allograft fibrosis [111].
●Chemokine receptor 5 (CCR5) – One study evaluated allograft survival among 1227 kidney transplant recipients who were screened for a 32-base-pair deletion in the gene encoding CCR5 [112]. The intact molecule is a cell surface receptor for several chemokines as well as for human immunodeficiency virus (HIV). Patients who were homozygous for this deletion (which results in a nonfunctional receptor and occurs in approximately 1 percent of White populations in Europe and North America) had significantly longer allograft survival than those who were not (90 versus 25 percent survival at 20 years, as predicted by Kaplan-Meier survival plots).
●Complement 3 (C3) allotypes – The relationship between clinical outcome and C3 allotypes is unclear. There are two C3 allotypes, F for fast and S for slow, which appear to have an influence on inflammatory disorders. One study found that allograft survival was significantly higher with C3F/F and C3F/S donor kidneys, with the benefit unique to those recipients without the C3F allele [113]. By comparison, a much larger study found no association between improved survival and the different forms of C3 [114].
Ultrasonographic resistive index — An increased resistive index, as determined by Doppler ultrasonography (the index is calculated as shown following this discussion), has been associated with markedly decreased allograft survival in some [115-118] but not all [119] studies:
●A single-center German study evaluated 601 stable patients at least three months after successful kidney transplant surgery; all patients were then followed for at least three years [115]. A resistive index of >80 percent (ie, >0.80), which nonspecifically reflects increased vascular resistance, was seen in 20 percent of patients. The composite endpoint of a 50 percent decrease in graft function, need for dialysis, or recipient death occurred in 88 percent of patients with a resistive index >80 percent, compared with 17 percent of those with a lower index. Patients with a resistive index of >80 percent had a multivariate-determined relative risk of graft loss of 9.1 (95% CI 6.6-12.7).
●A more recent study correlated resistive index with patient survival, graft survival, and allograft histology [119]. This prospective study of 321 kidney transplant recipients confirmed the previous study’s findings that a resistive index >0.8 was associated with an increased risk for a composite endpoint of a 50 percent decrease in graft function, need for dialysis, or recipient death [119].
The more recent study showed that a resistive index ≥0.8 at 3, 12, and 24 months was associated with a higher mortality compared with recipients with index <0.8 (with HRs of 5.2, 3.5, and 4.1, respectively) but not with graft failure or with kidney histology obtained by protocol biopsy [119]. At protocol-specified biopsy time points, a high resistive index correlated most closely with older age of the recipient; other factors included higher pulse pressure, lower mean blood pressure, beta blocker use, and diuretic use. These data suggest that the resistive index measured at predefined times after transplantation reflects recipient but not graft characteristics.
Among biopsies performed for graft dysfunction, a high resistive index correlated with antibody-mediated acute rejection and acute tubular necrosis but could not distinguish between these diagnoses.
Differences in study design, in particular the timing of measurement of resistive index, the relatively short follow-up of this study, changing immunosuppression, and antiviral prophylaxis as well as differences in the technical performances of the resistive indices may explain the discordant results.
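For reference, the resistive index reported in these studies is the standard Doppler-derived (Pourcelot) index, and a value of 0.80 corresponds to the "80 percent" threshold quoted above. With illustrative peak systolic and end-diastolic velocities of 40 and 6 cm/s:

\[ \text{RI} = \frac{v_{\text{peak systolic}} - v_{\text{end diastolic}}}{v_{\text{peak systolic}}} = \frac{40 - 6}{40} = 0.85 > 0.80 \]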
Hyperhomocysteinemia — Elevated homocysteine levels are associated with decreased allograft survival. A prospective study of 733 kidney transplant recipients compared baseline fasting plasma total homocysteine levels with kidney allograft survival during a median follow-up of 6.1 years [120]. After statistical adjustment, elevated levels of homocysteine (≥12 micromol/L) were associated with an increased risk of allograft loss (HR 1.63, 95% CI 1.09-2.44) and mortality (HR 2.44, 95% CI 1.45-4.12). However, a randomized controlled trial (Folic Acid for Vascular Outcomes Reduction in Transplant Recipients [FAVORIT]) did not show a benefit of high-dose B complex vitamin supplementation [121].
Proteinuria — The presence of proteinuria (including levels of <1 gram/day) is a predictor of long-term allograft loss [34,122,123]. This correlation likely reflects the presence of ischemia-reperfusion lesions, ongoing immunologic/nonimmunologic allograft injuries, and other pathogenic processes.
Geography — Geography may be a predictor of long-term graft survival. As an example, one international analysis indicated that the long-term survival of transplant recipients in the United States is significantly lower compared with that of transplant recipients in Australia/New Zealand, Europe, and Canada [124]. The reason for this observation is unknown, although variability in long-term access to immunosuppressive therapy is a likely contributor.
OPTIMIZING OUTCOMES — Pre-, peri-, and posttransplantation factors affect graft survival after transplantation.
●Pretransplantation – The goal of the transplant team is to engraft as many recipients as possible while minimizing complications that impact graft and patient survival. Unfortunately, these goals may be at variance. As an example, increased use of KDPI >85 percent kidneys can increase the number of transplants performed while negatively impacting graft survival. A key part of the organ selection process is therefore to balance such variables, for example by providing secondary benefits such as reduced waiting time for candidates who agree to accept such grafts. (See "Kidney transplantation in adults: Organ sharing".)
Transplant programs should also consider patient selection criteria as important predictors of outcome. For example, transplanting a higher proportion of "high-risk" candidates will adversely impact survival rates. Such case mix-adjusted information is tracked by both the United Network for Organ Sharing (UNOS) and Centers for Medicare and Medicaid Services (CMS).
●Peritransplantation – Minimizing cold and warm ischemia time is an attractive target for reducing the risk of delayed graft function and decreased graft survival. In practice, however, delays in organ allocation and transportation are logistic realities over which programs often have little control.
●Posttransplantation – Careful titration of immunosuppression is essential following transplantation. Minimizing rejection rates must be balanced against the risk of infection. Addressing comorbidities such as hypertension and dyslipidemia may improve long-term outcomes. (See "Hypertension after kidney transplantation" and "Kidney transplantation in adults: Lipid abnormalities after kidney transplantation".)
SUMMARY AND RECOMMENDATIONS
●Overview – There are multiple determinants of short-term and long-term graft survival; most factors can affect both.
●Factors affecting short-term graft survival – Major factors that decrease short-term graft survival include delayed allograft function and the presence of human leukocyte antigen (HLA) antibodies. In addition, living-donor transplants have a greater short-term survival rate than deceased-donor kidneys. Kidneys with the highest Kidney Donor Profile Index (KDPI; ie, the lowest quality) have the shortest projected functional lifetime (figure 3). Donor illness, cytomegalovirus (CMV) seropositivity, and factors related to the transplant medical center may also play a role. Early allograft survival may vary with the use of maintenance dialysis and perhaps the type of dialysis prior to transplantation. (See 'Short-term survival' above.)
●Factors affecting long-term graft survival
•Alloantigen-dependent factors – Late allograft failure is increased in patients with delayed graft function, HLA mismatching, prolonged cold ischemia time, development of anti-HLA antibodies, and inadequate immunosuppressive therapy. The beneficial effect of HLA matching generally outweighs the detrimental effect of prolonged cold ischemia time in transported kidneys. Preformed HLA donor-specific antibodies (DSAs) are associated with decreased allograft survival, and sensitization may represent non-HLA immunity in some cases. (See 'Alloantigen-dependent factors' above.)
•Alloantigen-independent factors – Alloantigen-independent factors contribute to decreased graft survival. Such factors include inadequate kidney mass, prior and ongoing tissue injury, noncompliance, posttransplant hypertension, hyperlipidemia, a more marginal kidney, calcineurin toxicity, CMV seropositivity, and recurrent or de novo glomerular disease. (See 'Alloantigen-independent factors' above.)