The derivative formalizes the fundamental notion of a rate of change. When a relationship between quantities is linear, the rate of change is constant; calculus becomes important when the relationship is not linear.

Rates of change are everywhere and of interest throughout every scientific study, in the hard sciences and the soft sciences alike. Speed is the rate of change of distance per unit of time. Return on investment is a rate of change. A social scientist studying how happiness depends on income might be interested in the rate of change of happiness per unit increase in income.

Because the derivative was developed by different mathematicians and applied across many fields over the last 300 years, several notations for it remain in use. Two of its tools matter here in particular: the chain rule, which is essential to the study of neural networks, and the exponential function, which often shows up in them.
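As a small illustrative sketch (the function exp(x²) and the test point are my choices, not taken from the text), the chain rule lets us differentiate a composition of the exponential with another function, and we can check the result numerically against a finite-difference estimate of the rate of change:

```python
import math

def f(x):
    # Composite function f(x) = exp(x^2): the exponential applied to g(x) = x^2.
    return math.exp(x ** 2)

def f_prime(x):
    # Chain rule: d/dx exp(g(x)) = exp(g(x)) * g'(x), with g'(x) = 2x here.
    return math.exp(x ** 2) * 2 * x

# A central finite difference approximates the rate of change directly,
# giving an independent check on the chain-rule derivative.
x, h = 1.5, 1e-6
numeric = (f(x + h) - f(x - h)) / (2 * h)
print(f_prime(x), numeric)
```

The two printed values agree to several decimal places, which is what we expect when the chain rule has been applied correctly.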