Fast Cross-Validation for Incremental Learning
Pooria Joulani, András György, Csaba Szepesvári
Department of Computing Science, University of Alberta, Edmonton, Alberta
July 11, 2015
Appearing in the International Joint Conference on Artificial Intelligence (IJCAI 2015)
◮ k-fold CV: running time penalty O(log k) instead of O(k)!
◮ Leave-One-Out in O(log n)!
◮ The speedup is independent of:
  ◮ the type of the learning problem (classification, regression, density estimation);
  ◮ the inner structure of the algorithm (e.g., QP, influence matrix, etc.);
  ◮ the loss function used for CV (accuracy, F-measure, etc.).
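To see where an O(log k) rather than O(k) penalty can come from, consider sharing training work across folds along a binary tree: each example is fed to one model per tree level, i.e., O(log k) models, instead of the k − 1 models of naive k-fold CV. The sketch below is an illustration of this prefix-sharing idea only, not the paper's exact algorithm; `RunningMean` and `tree_cv` are hypothetical names, and the toy learner simply predicts the running mean of the targets.

```python
from copy import deepcopy

class RunningMean:
    """Toy incremental learner: predicts the running mean of the targets."""
    def __init__(self):
        self.n, self.mean = 0, 0.0
    def update(self, x, y):
        self.n += 1
        self.mean += (y - self.mean) / self.n
    def predict(self, x):
        return self.mean

def tree_cv(folds, model=None):
    """k-fold CV (squared loss) with training shared along a binary tree
    over the folds. Invariant: `model` has been trained on every example
    outside `folds`. Each example is used in O(log k) updates overall."""
    if model is None:
        model = RunningMean()
    if len(folds) == 1:
        # Leaf: model was trained on all other folds; test on this one.
        held_out = folds[0]
        return sum((model.predict(x) - y) ** 2 for x, y in held_out) / len(held_out)
    mid = len(folds) // 2
    left, right = folds[:mid], folds[mid:]
    # The model tested on the left folds trains on the right folds' data
    # and vice versa; the shared prefix (data outside both) is reused.
    m_left, m_right = deepcopy(model), model
    for fold in right:
        for x, y in fold:
            m_left.update(x, y)
    for fold in left:
        for x, y in fold:
            m_right.update(x, y)
    # Weighted average over subtrees = mean of the per-fold errors.
    return (tree_cv(left, m_left) * len(left)
            + tree_cv(right, m_right) * len(right)) / len(folds)
```

Setting k = n (one example per fold) turns the same recursion into Leave-One-Out with O(log n) updates per example. Note the sketch ignores the cost of copying models, which a full treatment has to account for.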
◮ CV over the 0-1 loss.
◮ CV over the squared loss.
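For concreteness, these two CV losses can be computed as follows (a minimal sketch; the function names are illustrative, not from the paper):

```python
def zero_one_loss(preds, labels):
    """0-1 loss: fraction of misclassified examples."""
    return sum(p != y for p, y in zip(preds, labels)) / len(labels)

def squared_loss(preds, labels):
    """Squared loss: mean squared error of the predictions."""
    return sum((p - y) ** 2 for p, y in zip(preds, labels)) / len(labels)
```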