A closed testing procedure to select an appropriate method for updating prediction models

Yvonne Vergouwe, Daan Nieboer, Rianne Oostenbrink, Thomas P.A. Debray, Gordon D. Murray, Michael W. Kattan, Hendrik Koffijberg, Karel G.M. Moons, Ewout W. Steyerberg

Research output: Contribution to journal › Article › Academic › peer-review

15 Citations (Scopus)

Abstract

Prediction models fitted with logistic regression often show poor performance when applied in populations other than the development population. Model updating may improve predictions. Previously suggested methods vary in how extensively they update the model. We aim to define a strategy for selecting an appropriate update method that balances the amount of evidence for updating in the new patient sample against the danger of overfitting. We consider recalibration in the large (re-estimation of the model intercept), recalibration (re-estimation of intercept and slope), and model revision (re-estimation of all coefficients) as update methods. We propose a closed testing procedure that allows the extensiveness of the updating to increase progressively from a minimum (the original model) to a maximum (a completely revised model). The procedure involves multiple testing while approximately maintaining the chosen type I error rate. We illustrate this approach with three clinical examples: patients with prostate cancer, patients with traumatic brain injury, and children presenting with fever. The need for updating the prostate cancer model was completely driven by a different model intercept in the update sample (adjustment: 2.58). Separate testing of model revision against the original model showed statistically significant results, but led to overfitting (calibration slope at internal validation = 0.86). The closed testing procedure selected recalibration in the large as the update method, without overfitting. The advantage of the closed testing procedure was confirmed by the other two examples. We conclude that the proposed closed testing procedure may be useful in selecting appropriate update methods for previously developed prediction models.
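As a rough illustration of the nested comparisons the abstract describes, the sketch below (Python with numpy/scipy; the data names lp_orig, X, y, the helper functions, and the stepwise selection logic are illustrative assumptions, not the authors' exact closed testing procedure) fits the original model, recalibration in the large, recalibration, and full revision on an update sample, and then picks the least extensive update that the fully revised model does not significantly improve upon.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import chi2


def log_lik(lp, y):
    """Binomial log-likelihood for linear predictors lp and binary outcomes y."""
    p = 1.0 / (1.0 + np.exp(-lp))
    eps = 1e-12
    return float(np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps)))


def fit_offset_logistic(X, y, offset):
    """Maximum-likelihood logistic regression with a fixed offset term.

    Returns the fitted coefficients and the maximised log-likelihood.
    """
    def nll(beta):
        return -log_lik(offset + X @ beta, y)

    res = minimize(nll, np.zeros(X.shape[1]), method="BFGS")
    return res.x, -res.fun


def fitted_log_likelihoods(lp_orig, X, y):
    """Log-likelihoods of the original model and of the three update methods."""
    n = len(y)
    ones = np.ones((n, 1))
    ll_orig = log_lik(lp_orig, y)                                  # original model, nothing re-estimated
    _, ll_itc = fit_offset_logistic(ones, y, offset=lp_orig)       # recalibration in the large: intercept only
    _, ll_rec = fit_offset_logistic(np.column_stack([ones, lp_orig]),
                                    y, offset=np.zeros(n))         # recalibration: intercept and slope
    _, ll_rev = fit_offset_logistic(np.column_stack([ones, X]),
                                    y, offset=np.zeros(n))         # model revision: all coefficients
    return ll_orig, ll_itc, ll_rec, ll_rev


def lr_pvalue(ll_small, ll_big, df):
    """Likelihood-ratio test p-value for nested models."""
    return chi2.sf(max(2.0 * (ll_big - ll_small), 0.0), df)


def select_update(lp_orig, X, y, alpha=0.05):
    """Pick the least extensive update that the data support, stopping as soon
    as the fully revised model no longer improves significantly on it.
    Assumes X has more than one predictor column."""
    ll_orig, ll_itc, ll_rec, ll_rev = fitted_log_likelihoods(lp_orig, X, y)
    p = X.shape[1]  # number of predictors re-estimated under model revision
    if lr_pvalue(ll_orig, ll_rev, df=p + 1) >= alpha:
        return "original model"
    if lr_pvalue(ll_itc, ll_rev, df=p) >= alpha:
        return "recalibration in the large"
    if lr_pvalue(ll_rec, ll_rev, df=p - 1) >= alpha:
        return "recalibration"
    return "model revision"
```

In this sketch, lp_orig is the previously developed model's linear predictor evaluated in the new sample, X the matrix of its predictors, and y the observed binary outcome; the paper's actual closed testing procedure formalises how the type I error rate is approximately maintained across these multiple comparisons.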
Original language: English
Pages (from-to): 4529-4539
Journal: Statistics in Medicine
Volume: 36
Issue number: 28
DOI: 10.1002/sim.7179
Publication status: Published - 10 Dec 2017


Cite this

Vergouwe, Y., Nieboer, D., Oostenbrink, R., Debray, T. P. A., Murray, G. D., Kattan, M. W., ... Steyerberg, E. W. (2017). A closed testing procedure to select an appropriate method for updating prediction models. Statistics in medicine, 36(28), 4529-4539. https://doi.org/10.1002/sim.7179