Purpose
Interdisciplinary rounds (IDRs) in the intensive care unit (ICU) are increasingly recommended to support quality improvement, but uncertainty exists about how to assess the quality of IDRs. We developed, tested, and applied an instrument to assess the quality of IDRs in ICUs.

Materials and Methods
Delphi rounds were conducted to analyze videotaped patient presentations; the results were combined with a review of the previous literature. The IDR Assessment Scale was developed, statistically tested, and applied to 98 videotaped patient presentations during 22 IDRs in 3 adult ICUs in 2 hospitals in Groningen, The Netherlands.

Results
The IDR Assessment Scale comprised 19 quality indicators, subdivided into 2 domains: "patient plan of care" and "process." Indicators were classified as "essential" or "supportive." Interrater reliability for 9 videotaped patient presentations among at least 3 raters was satisfactory (κ = 0.85). Overall item score correlations between 3 raters were excellent (r = 0.80-0.94). Internal consistency across 98 videotaped patient presentations was acceptable (α = .78). Application to IDRs demonstrated that indicators could be rated unambiguously.

Conclusions
The quality of IDRs in the ICU can be reliably assessed for patient plan of care and process with the IDR Assessment Scale.
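To illustrate how an internal-consistency figure such as the reported α = .78 is obtained, the sketch below computes Cronbach's alpha with the standard formula. This is not the study's code, and the rating matrix is hypothetical example data, not data from the study.

```python
# Illustrative sketch (not the study's analysis code): Cronbach's alpha
# via the standard formula
#   alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)

def cronbach_alpha(ratings):
    """ratings: list of subjects, each a list of k item scores."""
    k = len(ratings[0])

    def variance(xs):  # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_vars = [variance([row[i] for row in ratings]) for i in range(k)]
    total_var = variance([sum(row) for row in ratings])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Hypothetical scores: 5 presentations rated on 4 indicators (0-2 scale)
scores = [
    [2, 2, 1, 2],
    [1, 1, 1, 1],
    [2, 1, 2, 2],
    [0, 1, 0, 1],
    [2, 2, 2, 1],
]
print(round(cronbach_alpha(scores), 2))  # → 0.81
```

In practice, reliability coefficients like these are usually computed with a statistics package rather than by hand; the formula above is shown only to make the reported figure concrete.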