Chapter
Schwartz Information Criteria
ABSTRACT
The Schwartz Information Criterion, also known as the Bayesian Information Criterion (BIC), is an adaptation of the Akaike Information Criterion (AIC) that addresses a shortcoming of that measure: as datasets from the same source grow larger, the models the AIC selects tend to grow quickly in complexity (i.e., in the number of parameters). The AIC is defined as -2 times the log-likelihood of the data plus twice the number of parameters in the model. Schwartz derived a related criterion that uses log(N) times the number of parameters instead of twice that number, where N is the sample size. Because this penalty term increases with the sample size, it greatly slows the growth of model complexity.
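As a small illustration of the two penalty terms described above, the following Python sketch computes the AIC and BIC from a model's log-likelihood; the log-likelihood value, parameter count, and sample size used in the example are hypothetical.

import math

def aic(log_likelihood, k):
    # AIC = -2 * log-likelihood + 2 * (number of parameters)
    return -2.0 * log_likelihood + 2.0 * k

def bic(log_likelihood, k, n):
    # BIC = -2 * log-likelihood + log(N) * (number of parameters)
    return -2.0 * log_likelihood + math.log(n) * k

# Hypothetical example: k = 5 parameters, N = 1000 observations.
# The BIC penalty (5 * log(1000) is about 34.5) is much larger than the
# AIC penalty (2 * 5 = 10), so the BIC favors simpler models as N grows.
ll = -1234.5  # assumed log-likelihood of the fitted model
print(aic(ll, 5))        # 2479.0
print(bic(ll, 5, 1000))  # about 2503.5

Both criteria are used the same way in practice: the model with the smaller value is preferred; only the size of the per-parameter penalty differs.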