Plot learning curves with caret package and R

Gabriele B · Dec 4, 2013 · Viewed 9.9k times

I would like to study the tradeoff between bias and variance when tuning a model. I'm using the caret package for R, which lets me plot a performance metric (AUC, accuracy, ...) against the model's hyperparameters (mtry, lambda, etc.) and automatically picks the best-performing setting. This typically returns a good model, but if I want to dig further and choose a different bias/variance tradeoff I need a learning curve, not a performance curve.

For the sake of simplicity, let's say my model is a random forest, which has just one hyperparameter, 'mtry'.

I would like to plot the learning curves of both training and test sets. Something like this:

[Learning curve plot: error vs. training set size, with one curve for the training set and one for the test set; the red curve is the test set]

On the y axis I would put an error metric (number of misclassified examples, or something like that); on the x axis, either 'mtry' or the training set size.

Questions:

  1. Does caret have the functionality to iteratively train models on training set subsets of different sizes? If I have to code it by hand, how can I do that? (A rough sketch of the kind of loop I have in mind is below, after these questions.)

  2. If I want to put the hyperparameter on the x axis, I need all the models trained by caret::train, not just the final model (the one with maximum performance chosen after CV). Are these "discarded" models still available after train?
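For reference, the hand-coded loop I have in mind looks roughly like this (only a sketch: `dat` and its outcome column `Class` stand in for my data, and the fractions, hold-out split and fold count are arbitrary):

```r
library(caret)
library(randomForest)

# Sketch: learning curve over training set size.
# 'dat' and its outcome column 'Class' are placeholders for my data.
learning_curve_points <- function(dat, fractions = seq(0.2, 0.9, by = 0.1)) {
  # hold out a fixed test set once
  in_train  <- createDataPartition(dat$Class, p = 0.75, list = FALSE)
  train_all <- dat[in_train, ]
  test_set  <- dat[-in_train, ]

  out <- data.frame()
  for (p in fractions) {
    # take an increasing fraction of the training data
    idx <- createDataPartition(train_all$Class, p = p, list = FALSE)
    sub <- train_all[idx, ]

    fit <- train(Class ~ ., data = sub, method = "rf",
                 trControl = trainControl(method = "cv", number = 5))

    out <- rbind(out, data.frame(
      frac      = p,
      train_err = mean(predict(fit, sub) != sub$Class),           # resubstitution error
      test_err  = mean(predict(fit, test_set) != test_set$Class)  # hold-out error
    ))
  }
  out
}
```

This feels clunky, which is why I'm hoping caret already offers something like it.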

Answer

Stephen Henderson · Dec 4, 2013
  1. caret will iteratively fit and cross-validate lots of models for you if you set up the resampling with trainControl() and supply the candidate parameter values (e.g. mtry) as a tune grid, typically built with expand.grid(). Both of these are then passed as control options to the train() function (see the sketch after this list). The specifics of the tune grid parameters (e.g. mtry, ntree) will be different for each model type.

  2. Yes, the final train object keeps the cross-validated error rate (however you specified it) for every candidate parameter value in the grid, not just the winning one, so those "discarded" results are not lost.
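A minimal sketch of what that could look like for a random forest; the Sonar data from the mlbench package is used purely as a stand-in for your own data, and the grid values and fold count are arbitrary:

```r
library(caret)
library(mlbench)
data(Sonar)   # example data; substitute your own
set.seed(1)

# 10-fold CV; performance is recorded for every candidate mtry
ctrl <- trainControl(method = "cv", number = 10)

# candidate values of the random forest's tuning parameter
grid <- expand.grid(mtry = c(2, 4, 6, 8, 10, 15, 20, 30, 45, 60))

rf_fit <- train(Class ~ ., data = Sonar,
                method    = "rf",
                metric    = "Accuracy",
                trControl = ctrl,
                tuneGrid  = grid)

rf_fit$results   # resampled performance for every mtry, not just the best
plot(rf_fit)     # caret's performance-vs-parameter plot
```

Note that plot(rf_fit) shows performance against the tuning parameter; for a curve against training set size you would still loop over subsets yourself, along the lines of the sketch in the question.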

So you could specify, for example, 10-fold CV crossed with a grid of 10 values of mtry, which would be 100 model fits. You might want to go get a cup of tea, or possibly lunch.

If this sounds complicated ... there is a very good example here; caret is one of the best-documented packages around.