Book

Tree-Based Methods for Statistical Learning in R
By Brandon M. Greenwell
      Edition 1st Edition
      First Published 2022
      eBook Published 23 June 2022
      Pub. Location New York
      Imprint Chapman and Hall/CRC
      DOI https://doi.org/10.1201/9781003089032
      Pages 404
      eBook ISBN 9781003089032
      Subjects Engineering & Technology, Mathematics & Statistics

CITATION

      Greenwell, B.M. (2022). Tree-Based Methods for Statistical Learning in R (1st ed.). Chapman and Hall/CRC. https://doi.org/10.1201/9781003089032

      ABSTRACT

Tree-Based Methods for Statistical Learning in R provides a thorough introduction to both individual decision tree algorithms (Part I) and ensembles thereof (Part II). Part I of the book brings several different tree algorithms into focus, both conventional and contemporary. A strong foundation in how individual decision trees work helps readers understand tree-based ensembles, which lie at the cutting edge of modern statistical and machine learning methodology.

The book follows up most ideas and mathematical concepts with code-based examples in the R statistical language, with an emphasis on using as few external packages as possible. For example, readers are shown how to write their own random forest and gradient tree boosting functions using simple for loops and basic tree-fitting software (like rpart and party/partykit). The core chapters also end with a detailed section on relevant software in both R and other open-source alternatives (e.g., Python, Spark, and Julia), along with example usage on real data sets. While the book mostly uses R, it is meant to be equally accessible and useful to non-R programmers.
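To give a flavor of the "for loop plus rpart" approach described above, here is a minimal sketch (not the book's own code; function names are illustrative) of bagged regression trees built from bootstrap samples:

```r
# Minimal sketch of bagging with a simple for loop and rpart; the
# function names (bagged_trees, predict_bagged) are illustrative, not
# from the book.
library(rpart)

bagged_trees <- function(formula, data, B = 100) {
  trees <- vector("list", length = B)
  n <- nrow(data)
  for (b in seq_len(B)) {
    boot <- data[sample(n, size = n, replace = TRUE), ]  # bootstrap sample
    trees[[b]] <- rpart(formula, data = boot)            # fit one tree
  }
  trees
}

# Predict by averaging the individual tree predictions (regression case)
predict_bagged <- function(trees, newdata) {
  preds <- sapply(trees, predict, newdata = newdata)  # n x B matrix
  rowMeans(preds)
}
```

A random forest differs from this sketch mainly in that each split considers only a random subset of predictors, which rpart alone does not do; the book covers how to bridge that gap.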

Readers of this book will gain a solid foundation in (and appreciation for) tree-based methods and how they can be used to solve the practical problems and challenges data scientists often face in applied work.

      Features:

      • Thorough coverage, from the ground up, of tree-based methods (e.g., CART, conditional inference trees, bagging, boosting, and random forests).

      • A companion website containing additional supplementary material and the code to reproduce every example and figure in the book.
      • A companion R package, called treemisc, which contains several data sets and functions used throughout the book (e.g., there’s an implementation of gradient tree boosting with LAD loss that shows how to perform the line search step by updating the terminal node estimates of a fitted rpart tree).
      • Interesting examples that are of practical use; for example, how to construct partial dependence plots from a fitted model in Spark MLlib (using only Spark operations), or post-processing tree ensembles via the LASSO to reduce the number of trees while maintaining, or even improving, performance.
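The LASSO post-processing idea mentioned in the last bullet can be sketched as follows (a hedged illustration, not the book's code): treat each tree's predictions as one column of a design matrix and fit a LASSO regression, so trees whose coefficients shrink to zero can be dropped from the ensemble.

```r
# Illustrative sketch of LASSO post-processing of a tree ensemble using
# glmnet; `trees` is assumed to be a list of fitted rpart models and
# `post_process` is a made-up name, not from the book.
library(glmnet)

post_process <- function(trees, data, y) {
  Z <- sapply(trees, predict, newdata = data)  # n x B matrix of tree predictions
  fit <- cv.glmnet(Z, y, alpha = 1)            # alpha = 1 selects the LASSO penalty
  betas <- coef(fit, s = "lambda.1se")[-1]     # drop the intercept row
  keep <- which(betas != 0)                    # trees with nonzero weight
  list(trees = trees[keep], fit = fit, keep = keep)
}
```

The surviving trees are combined using the fitted LASSO coefficients rather than a plain average, which is how pruning the ensemble can sometimes improve, not just preserve, accuracy.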

      TABLE OF CONTENTS

      chapter 1|36 pages

      Introduction

part I|140 pages

      Decision trees

chapter 2|72 pages

      Binary recursive partitioning with CART

      chapter 3|36 pages

      Conditional inference trees

      chapter 4|30 pages

      The hitchhiker's GUIDE to modern decision trees

part II|182 pages

      Tree-based ensembles

chapter 5|24 pages

      Ensemble algorithms

      chapter 6|26 pages

      Peeking inside the “black box”: post-hoc interpretability

      chapter 7|80 pages

      Random forests

      chapter 8|50 pages

      Gradient boosting machines
