ABSTRACT

Numerical methods are presented for handling two of the most important computational problems that arise in studying the parameter dependence of solutions to the periodically time-dependent Schrödinger equation, as approximated by a finite system of ordinary differential equations (ODEs): (1) lengthy integration times caused by the growth in the magnitude of terms in the system as the number of states increases, and (2) repeated integrations over a grid in parameter space. The new methods are independent of the numerical algorithm used to integrate the ODE system. The first method uses a simple diagonal transformation of the solution to control the magnitude of terms appearing in the differential system; the second relies on matrix interpolation together with a Taylor series expansion to reduce the number of integrations required in parameter space. Combining these two procedures yields a dramatic decrease in computation time for an example problem of an electron confined to a quantum well.
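To illustrate the first idea mentioned above, the following is a minimal sketch (not the paper's implementation) of a diagonal transformation applied to a hypothetical two-state model i dψ/dt = (D + V(t))ψ. Substituting ψ = exp(-iDt)φ removes the large diagonal terms, so the transformed system i dφ/dt = exp(iDt)V(t)exp(-iDt)φ contains only small entries and can be integrated with far fewer steps. All parameter values and names here are illustrative assumptions.

```python
import numpy as np

# Hypothetical 2-state model: H(t) = D + V(t), with a large diagonal part D
# (fast phases) and a small periodic coupling V(t). Values are illustrative.
D = np.diag([0.0, 50.0])               # large level spacing -> large diagonal terms
mu, omega = 0.1, 1.0                   # weak periodic drive

def V(t):
    c = mu * np.cos(omega * t)
    return np.array([[0.0, c], [c, 0.0]], dtype=complex)

def rhs_direct(t, psi):
    # Untransformed system: i dpsi/dt = (D + V(t)) psi
    return -1j * (D + V(t)) @ psi

def rhs_transformed(t, phi):
    # Substitute psi = exp(-i D t) phi; the large diagonal terms drop out:
    # i dphi/dt = exp(+i D t) V(t) exp(-i D t) phi
    U = np.diag(np.exp(1j * np.diag(D) * t))
    return -1j * (U @ V(t) @ U.conj()) @ phi

def rk4(f, y0, t0, t1, n):
    # Fixed-step classical Runge-Kutta integrator (any method would do:
    # the transformation is independent of the integrator).
    y, t = y0.astype(complex), t0
    h = (t1 - t0) / n
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

psi0 = np.array([1.0, 0.0], dtype=complex)
T = 2.0
psi_direct = rk4(rhs_direct, psi0, 0.0, T, 20000)
phi = rk4(rhs_transformed, psi0, 0.0, T, 2000)   # an order of magnitude fewer steps
psi_from_phi = np.diag(np.exp(-1j * np.diag(D) * T)) @ phi  # undo the transformation

print(np.abs(psi_direct - psi_from_phi).max())   # small: the two solutions agree
```

The point of the sketch is only that the transformed right-hand side has entries of size μ rather than of size |D|, so a fixed-step integrator tolerates a much larger step; the paper's actual transformation and test problem (the quantum-well electron) are more elaborate.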