A novel framework is proposed for detecting radiation-induced lung injury (RILI) in four-dimensional (4D) computed tomography (CT). The framework comprises 4D CT lung field segmentation, deformable image registration (DIR), extraction of textural and functional features, and classification of lung voxels with a deep three-dimensional (3D) convolutional neural network (CNN). Segmentation first extracts the lung fields at the exhale phase using our multiscale Gaussian adaptive shape prior and then propagates the labels to the remaining 4D CT phases with a newly developed adaptive shape model. The DIR step locally aligns consecutive phases of the respiratory cycle by solving the 3D Laplace equation to find voxel correspondences between isosurfaces of the fixed and moving lungs, with a generalized Gaussian Markov random field imposing anatomical consistency. In addition to common lung functionality features, such as ventilation and elasticity, specific regional textural features are estimated by modeling the segmented images as samples of a novel seventh-order contrast- and offset-invariant Markov-Gibbs random field. Finally, a deep 3D CNN distinguishes injured from normal lung tissue. The framework was evaluated on 4D CT data sets from 13 patients who underwent radiation therapy, and the experimental results demonstrate its promise.
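The Laplace-based correspondence step of the DIR stage can be illustrated with a minimal sketch: the potential is clamped to 0 on one lung surface and 1 on the other, Laplace's equation is solved in the shell between them, and gradient streamlines of the converged field link corresponding voxels on the two isosurfaces. The grid size, the spherical toy boundaries, and the plain Jacobi solver below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def solve_laplace_3d(inner, outer, n_iter=500):
    """Solve Laplace's equation in the shell between two boundary masks.

    inner, outer : boolean 3D arrays marking the two surfaces.
    Returns the potential field, clamped to 0 on `inner` and 1 on `outer`.
    """
    phi = np.zeros(inner.shape)
    phi[outer] = 1.0
    # Only voxels strictly between the surfaces are updated;
    # the boundary values stay fixed (Dirichlet conditions).
    free = ~(inner | outer)
    for _ in range(n_iter):
        # Jacobi update: each voxel becomes the mean of its six face neighbours.
        avg = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
             + np.roll(phi, 1, 1) + np.roll(phi, -1, 1)
             + np.roll(phi, 1, 2) + np.roll(phi, -1, 2)) / 6.0
        phi[free] = avg[free]
    return phi

# Toy example: two nested spheres stand in for the moving and fixed
# lung surfaces (hypothetical geometry, for illustration only).
n = 24
z, y, x = np.mgrid[:n, :n, :n]
r = np.sqrt((x - n / 2) ** 2 + (y - n / 2) ** 2 + (z - n / 2) ** 2)
inner = r <= 4          # "moving" surface, potential 0
outer = r >= 10         # "fixed" surface, potential 1
phi = solve_laplace_3d(inner, outer)

# Correspondences would then be traced along the normalized gradient of phi,
# integrating streamlines from one isosurface to the other.
gz, gy, gx = np.gradient(phi)
```

Because the converged field is harmonic, its streamlines never cross, which is what makes the resulting surface-to-surface correspondences one-to-one.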