Core body temperature and the effect of cooled dialysate
Humans keep their core body temperature (CBT) within a narrow range. When CBT rises, the body increases peripheral blood flow or initiates sweating, removing heat by convection and radiation or by evaporation, respectively. Shivering is an involuntary thermoregulatory mechanism employed by the body to generate heat when CBT falls. During dialysis, the dialysate temperature is set to an arbitrary standard of 37 °C (98.6 °F) in an effort to achieve a “normal” CBT and maintain isothermia. However, significant variability in CBT exists among individuals. CBT follows a circadian pattern that peaks between 4 and 9 pm and nadirs between 2 and 8 am[11,12]. It tends to be lower in elderly individuals[12], higher in women than in men, and is highest in black women[13]. In hemodialysis-dependent individuals, CBT is usually lower than in the non-dialysis population[11,14], with nearly 40% having a CBT less than 36.5 °C[15] compared to the mean CBT of 37 °C (range, 36.2 °C to 37.5 °C) in non-dialysis-dependent individuals[12]. The importance of this becomes evident when one considers that even a slight change in CBT during dialysis initiates thermoregulatory mechanisms which may themselves be detrimental. For example, a supraphysiologic dialysate temperature, such as 37 °C, could raise the CBT in a given individual, resulting in vasodilation and consequent cardiovascular instability. This vasodilation may directly compete with the vasoconstriction expected in the setting of ultrafiltration and could further compromise hemodynamics.
Since supraphysiologic dialysate temperature was viewed as suboptimal and potentially detrimental, the idea arose that subphysiologic dialysate temperature might be beneficial, specifically in individuals who suffer from IDH. Cooled dialysate was postulated to be beneficial for two reasons: First, it avoided heat accumulation and hence counterproductive thermoregulatory vasodilation; second, it likely led to a catecholamine surge that induced both peripheral vasoconstriction and cardiac inotropy[16]. However, at the time, various potential consequences of cooled dialysate remained unclear. Would dialysis adequacy be inferior? Would it cause prolonged vasoconstriction, potentially placing vulnerable vascular beds at risk for ischemia? Would patients tolerate the lowered CBT on dialysis? Finally, would it be effective at minimizing IDH?
Various studies have since been performed to address these issues. Kaufman et al[17] aimed to evaluate the efficacy of cooled dialysate during short, high-Kt/V dialysis treatments. They postulated that cooled dialysate might increase urea compartmentalization during treatment, leading to increased post-dialysis urea rebound and hence decreased dialysis efficacy. The study was performed in 15 patients who underwent a total of 56 dialysis sessions, with each participant serving as their own control. Dialysate temperatures were adjusted either to lower CBT (cooled dialysis) or to keep CBT at a thermoneutral temperature. Dialysate cooling resulted in -266 ± 15 kJ of heat-energy exchange per treatment, whereas thermoneutral dialysis averaged 5 ± 31 kJ per treatment; dialysate temperature averaged 35.7 °C ± 0.02 °C and 37.1 °C ± 0.02 °C, respectively. Cooled dialysis resulted in statistically greater increases in the peripheral vascular resistance index and MAP. It also reduced the maximum intradialytic fall in MAP and the number of staff interventions needed to address hypotensive symptoms. There were no statistical changes in blood volume, cardiac index, urea rebound, or effective Kt/V. The authors concluded that cooling dialysate stabilized hemodynamics during dialysis and reduced the number of staff interventions required to address IDH symptoms, without affecting the efficacy of high-efficiency dialysis. A systematic review evaluating 22 studies comprising 408 patients has since concluded that cooled dialysate does not reduce dialysis adequacy[18].
Ayoub et al[19] aimed to gauge patient perception of cooled dialysate. Five patients known to have IDH were dialyzed for three sessions using cooled dialysate (35 °C) followed by another three sessions with the dialysate temperature set at 36.5 °C. The same was done in a second group of five patients known to have stable blood pressures during and after their dialysis sessions. The results demonstrated that cooled dialysate produced a statistically significant increase in ultrafiltration in the IDH-prone group. This group also experienced significantly higher intra- and post-dialysis MAPs with cooled dialysate. While the IDH-prone group had no episodes of hypotension with cooled dialysate, they had a total of seven episodes of hypotension with neutral-temperature dialysate, all requiring nursing intervention (P < 0.001). There was no statistical difference in intra-dialytic pulse rates between the two groups, nor did cooled dialysis affect urea removal in either group. Patients’ perception of cooled dialysis was assessed by a questionnaire designed specifically for this study, comprising the following questions: “How did you feel while being dialysed on cool temperature? Compared with normal temperature dialysis of 36.5 °C, did you feel any differences while being dialysed on cool temperature? If yes, what were the differences? Would you like to continue cool temperature dialysis?” The results were as follows: 80% of patients felt more energetic after being dialyzed with cooled dialysate; 80% felt a dramatic improvement in their general health with cooled dialysate; 80% requested to always be dialyzed with cooled dialysate; and 20% reported feeling cold during dialysis.
The authors concluded that for patients prone to IDH, cooled dialysate improved hemodynamic stability during and after dialysis, improved tolerance of dialysis, reduced the number of nursing interventions required to address IDH, and had an overall positive impact on patients’ energy and activities of daily living. This is the only study to date that has specifically assessed patient perception of cooled dialysate temperature. However, a systematic review by Selby et al[18] pooled the results of five studies in which symptoms were reported during cooled dialysis. Their analysis demonstrated that patients undergoing cooled dialysis were 1.98 (95%CI: 0.38-3.57) times more likely to become symptomatic than patients dialyzed with standard dialysate temperatures. When the analysis omitted the study by Ayoub and Finlayson[19], because milder symptoms were reported there than in the other four studies, the result was non-significant, with symptoms occurring 1.5 (95%CI: -0.2-3.2) times more often with cooled dialysis than during standard dialysis.
A similar study by Jost et al[20] compared cooled dialysate to thermoneutral dialysate, specifically evaluating its efficacy in “problem” patients. A double-blinded, cross-over protocol was used to evaluate 12 patients, six of whom were prone to IDH and six known to have large interdialytic weight gains, defined as consistently gaining > 4 kg in the interdialytic period. Each patient served as their own control and was randomly assigned to one dialysis session at 35 °C and one at 37 °C. Blood pressures were significantly lower at 1, 2, and 3 h of dialysis at the thermoneutral dialysate temperature than at the cooled dialysate temperature. A total of 18 episodes of symptomatic hypotension occurred during the study period, 16 of which occurred in the IDH-prone group; furthermore, no episodes of symptomatic hypotension occurred during cooled dialysis (P < 0.01). The authors concluded that cooling dialysate significantly improved hemodynamic tolerance during dialysis and significantly reduced the incidence of IDH in patients prone to it. These studies added to the literature supporting cooled dialysate as an effective way of reducing IDH.
Cooled dialysate compared to other modalities used to minimize IDH
Dheenan and Henrich[21] were the first to compare cooled dialysate to other methods commonly employed to mitigate IDH. They used a single-blinded, cross-over protocol to evaluate 10 patients on chronic hemodialysis with a history of IDH. Patients were randomized to one-week periods (three dialysis sessions) of five dialysis protocols, assigned in a random and blinded fashion: standard dialysis with a dialysate sodium of 138 mEq/L (the control); high-sodium dialysate (a steady dialysate sodium of 144 mEq/L); sodium modeling using a step-function design (dialysate sodium declined from 152 to 140 mEq/L in the last 30 min of dialysis); ultrafiltration (one hour of isolated ultrafiltration in which 50% of the target weight loss was removed, followed by three hours of isovolemic dialysis); and cool-temperature dialysis with dialysate cooled to 35 °C (sodium concentration 140 mEq/L). Weight losses were indistinguishable between protocols, indicating that the volume of ultrafiltration was consistent across each. However, the sodium modeling and cooled dialysate protocols proved superior to the others and resembled each other in several respects. Both had significantly fewer hypotensive signs and symptoms per treatment and fewer hypotensive episodes per treatment than standard treatment. Both also required significantly fewer nursing interventions for IDH per treatment than the ultrafiltration and control protocols. The nadir MAP was significantly lower in the control and ultrafiltration groups, whereas the upright post-dialysis blood pressure was best preserved with sodium modeling and cooled dialysate.
Sodium modeling was tolerated by all but one patient, who developed hypertension, headache, and nausea; six of the 10 patients reported increased thirst, although this did not translate into increased interdialytic weight gain during the one-week follow-up period. Cooled dialysate, however, was not well tolerated: seven of 10 patients reported a “cold” sensation, and two were noted to be shivering on dialysis.
A similar study by Rezki et al[22] evaluated 16 patients in a two-phase protocol. The first phase consisted of three standard HD sessions with a sodium concentration of 140 mEq/L and a dialysate temperature of 37 °C, serving as each patient’s control. During the second phase, patients were dialyzed successively under the following conditions: Fixed dialysate sodium concentration at 144 mEq/L, sodium modeling from 152 to 138 mEq/L, one hour of ultrafiltration alone followed by three hours of standard dialysis, dialysis with cooled dialysate (T < 37 °C), and a combination of sodium modeling with cooled dialysate. Compared to the control protocol, there was a statistically significant decrease in the signs and symptoms of hypotension and in the incidence of IDH when patients were dialyzed with sodium modeling, cooled dialysate, or the combination protocol; fewer medical staff interventions were also required with the combination protocol or with cooled dialysate. There was no increase in subjective thirst or interdialytic weight gain with either protocol employing sodium modeling. In this study, four of the 16 patients noted shivering when dialyzed with cooled dialysate.
Both of these studies suggest that cooling the dialysate is as effective as sodium modeling at mitigating IDH. They also suggest that cooled dialysate may be poorly tolerated and associated with patient discomfort on HD. However, sodium modeling has been associated with a number of side effects, including worsened hypertension and increased interdialytic weight gain due to increased thirst[23]. Whether one method is superior at reducing IDH, or is better tolerated than the other, remains to be determined in larger trials with longer follow-up periods.
Effect of cooled dialysate on vulnerable vascular beds
One of the questions that arose when cooled dialysate was first introduced was whether vasoconstriction would also occur at the arteriolar level and potentially place vulnerable vascular beds at risk for end-organ injury. Since that time, it has become apparent that dialysis itself is a hemodynamic stressor[24] that triggers circulatory stress and consequently damages the vasculature of the heart, mesentery, and brain[25-27], among other organs. Two recent trials demonstrated that cooled dialysate imparts a protective effect in these organs.
Eldehni et al[25] hypothesized that ultrastructural injury to the white matter of the brain might be mitigated by cooling the dialysate and thereby reducing dialysis-induced circulatory stress. This was evaluated by randomizing 38 incident dialysis patients to dialyze for 12 mo at either 37 °C or 0.5 °C below their core body temperature; the latter was determined by averaging each patient’s tympanic-thermometer temperature over six sessions prior to commencing the trial. An individualized temperature was chosen because it is thought to be better tolerated than an arbitrary temperature of 35 °C[28]. A form of magnetic resonance imaging (MRI) called diffusion tensor imaging (DTI) was used to evaluate the structural integrity of the brain white matter at baseline and after 12 mo of thrice-weekly dialysis. DTI was chosen as an imaging modality because it has previously been used to detect clinically significant changes in cerebral small vessel disease[29]. Additionally, MAP extrema points were measured over the course of the 12 mo. MAP extrema points capture the frequency and amplitude of blood pressure fluctuations during dialysis; higher extrema points correlate with greater variation in organ perfusion and translate into detrimental perfusion of vulnerable vascular beds[25,30]. After 12 mo, patients dialyzed at 37 °C exhibited patterns of ischemic brain injury on MRI that were not noted in the cooled dialysate group. Patients dialyzed at 37 °C also had a notable worsening of their MAP extrema points that was not seen in the cooled dialysate group. Both of these results were statistically significant. The authors concluded that cooling dialysate minimized injurious perfusion of cerebral vascular beds and consequently decreased the degree of brain injury noted on DTI. An advantage of this study is its long-term follow-up over the course of one year.
However, despite having a larger sample size than earlier studies evaluating the effects of cooled dialysate, the study was still limited by its small size. It also suffered from a high dropout rate of 47.9%, although this was primarily due to the difficulty of recruiting incident HD patients; no dropouts were reported as a result of the intervention.
Odudu et al[24] used the same patient population and study design as Eldehni et al[25] to evaluate whether cooled dialysate would have cardioprotective effects over a 12-mo follow-up. Fifty-four incident dialysis patients were randomized to a dialysate temperature of either 37 °C or 0.5 °C below their core body temperature and followed for 12 mo. Tagged cardiac magnetic resonance imaging was performed at baseline and at 12 mo; this modality was chosen for its high reproducibility and its use as a reference-standard technique for evaluating regional left ventricular (LV) strain. While there was no statistically significant change in the study’s primary outcome (change in resting ejection fraction), several secondary outcomes were notable. The cooled dialysate group experienced a significant reduction in both LV mass and LV end-diastolic volume. The control group had significant reductions in peak systolic strain, diastolic function, and segmental LV strain, whereas these functions were preserved in the cooled dialysate group. As markers of subclinical cardiomyopathy, these findings suggest that cooled dialysate had a protective cardiac effect over the one-year study period. Lastly, aortic distensibility, an independent marker of future cardiovascular events, was preserved in the cooled dialysate group and significantly decreased in the control group. Whether these findings mean that cooled dialysate may one day be linked to a decreased risk of cardiovascular events in the dialysis population remains to be seen.