Wednesday, September 25, 2013

K-Fold Cross validation: Random Forest vs GBM

K-Fold Cross validation: Random Forest vs GBM from Wallace Campbell on Vimeo.

In this video, I demonstrate how to use k-fold cross validation to obtain a reliable estimate of a model's out-of-sample predictive accuracy, and how to compare two different types of models (a Random Forest and a GBM). I use data from Kaggle's Amazon competition as an example.
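The workflow from the video can be sketched roughly as follows. This is a minimal illustration (not the code shown in the video), using scikit-learn and a synthetic dataset in place of the Kaggle Amazon data: each model is scored on the same k folds, and the mean fold accuracy serves as the out-of-sample estimate used to compare them.

```python
# Hedged sketch: k-fold cross validation comparing a Random Forest vs. a GBM.
# Synthetic data stands in for the Kaggle Amazon competition data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import KFold, cross_val_score

# Stand-in for the competition data (binary classification).
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Use the SAME folds for both models so the comparison is apples-to-apples.
cv = KFold(n_splits=5, shuffle=True, random_state=0)

models = [
    ("Random Forest", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("GBM", GradientBoostingClassifier(n_estimators=100, random_state=0)),
]

for name, model in models:
    # cross_val_score fits on k-1 folds and scores on the held-out fold,
    # k times; the mean is the out-of-sample accuracy estimate.
    scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean accuracy {scores.mean():.3f} (sd {scores.std():.3f})")
```

The key design choice is passing the same `cv` object to both models, so any difference in mean fold accuracy reflects the models rather than the fold assignment.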

3 comments:

  1. Can you specify how to get the Kaggle dataset being used here, pls?

    1. Yeah, you gotta have a Kaggle user account first. I used the Amazon competition data from: http://www.kaggle.com/c/amazon-employee-access-challenge/data. Might end up posting it to Github if I use that data set again.
