ABSTRACT

The hot corrosion process shows a strong temperature dependence. For most superalloys, the corrosion rate is maximal within the temperature range 1123 to 1173 K and decreases markedly at temperatures up to 1273 K. However, the time required to initiate hot corrosion attack decreases as temperature is increased (Fig. 6.32). For a fixed amount of sulfur in the gas, the SO3 pressure decreases as the temperature increases, resulting in a lower hot corrosion rate. Moreover, for the same ingestion rate of salt, less is deposited on test specimens as the temperature is increased. Accordingly, with a smaller salt deposit, less attack takes place at higher temperatures. The effect of alloy composition is still a matter of debate. It is generally agreed that the chromium content of the alloy is the most important factor, and for Ni-based alloys at least 15% Cr is required for good resistance to such corrosion processes. Much of the disagreement concerning the effects of other elements is possibly due to interactive effects among the alloy, scale, and salt. Certain alloying elements are found to have beneficial effects

[Figure: cyclic hot corrosion test, 1273 K, 1.0 atm, 1-h cycles, 5.0 mg/cm2 Na2SO4 deposit]

over certain composition ranges but can be deleterious over others. This is clearly demonstrated in Fig. 6.34 for the corrosion of Ni-30% Cr-6% Al and Ni-30% Cr alloys. Regarding the hot corrosion resistance of Ni-based superalloys, apart from the beneficial effect of chromium, Co and Ta slightly improve the resistance; Ti appears to have little effect, whereas Mo and W are detrimental, especially at higher temperatures.
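The claim above that the SO3 pressure falls as temperature rises follows from the exothermic gas-phase equilibrium SO2 + 1/2 O2 = SO3: raising the temperature shifts this equilibrium back toward SO2. A minimal Python sketch of the van't Hoff reasoning, using assumed round-number thermodynamic values (roughly -99 kJ/mol and -94 J/(mol K) are typical textbook figures; they are illustrative here, not taken from this text), shows the equilibrium constant falling across the temperature range discussed:

```python
import math

# Equilibrium: SO2 + 1/2 O2 <-> SO3 (exothermic, entropy-decreasing).
# DH and DS below are ASSUMED illustrative values, not data from the text.
R = 8.314        # gas constant, J/(mol K)
DH = -99_000.0   # assumed reaction enthalpy, J/mol
DS = -94.0       # assumed reaction entropy, J/(mol K)

def k_eq(T: float) -> float:
    """Equilibrium constant K = exp(-dG/RT), with dG = dH - T*dS."""
    dG = DH - T * DS
    return math.exp(-dG / (R * T))

# K (hence the SO3 fraction at fixed total sulfur) drops with temperature,
# consistent with the lower hot corrosion rate observed toward 1273 K.
for T in (1123, 1173, 1223, 1273):
    print(f"T = {T} K  ->  K = {k_eq(T):.3f}")
```

Since K multiplies the SO2 and O2 pressures to give the SO3 pressure, a smaller K at a fixed sulfur level means a lower SO3 pressure, which is the mechanism the passage invokes.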