Silt density index (SDI) testing is a widely accepted method for estimating the rate at which colloidal and particulate fouling will occur in water purification systems using reverse osmosis (RO) or nanofiltration (NF) membranes. However, the SDI has several deficiencies. For example, the SDI is not linearly related to particle concentration, is not based on any fouling mechanism, and is not corrected for temperature, pressure, or membrane resistance. Consequently, the accuracy and reproducibility of the SDI are often questioned. In this study, mathematical models were developed to investigate the sensitivity of the SDI to the following types of errors: errors due to inaccurate laboratory or field equipment, systematic errors, and errors resulting from testing artifacts and from personal observation and experience. The mathematical results were verified experimentally. Both the mathematical models and the experimental results show that the membrane resistance RM has the largest impact on the SDI results. The variation in RM allowed by ASTM is responsible for a deviation in the SDI between 2.29 and 3.98 at a level of SDI = 3. In addition, a 1 s error in measuring the time to collect the second sample, t2, results in a deviation of ±0.07 at SDI = 3. Testing artifacts and personal experience also influence the SDI results. The total error in measuring the SDI was estimated at ±2.11 in the field and only ±0.4 in the laboratory at a level of SDI = 3. Furthermore, several recommendations are given based on these theoretical results and our practical experience. This study demonstrates the sensitivity of the SDI to errors in RM and to equipment accuracy, and explains the difficulty of reproducing SDI results for the same water.
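The sensitivity figures above can be illustrated with the standard SDI definition, SDI = (1 − t1/t2) × 100 / T, where t1 and t2 are the times (in seconds) to collect a fixed sample volume at the start of the test and after an elapsed time T (in minutes, commonly 15). The sketch below is not the authors' model; it is a minimal finite-difference check, with illustrative values of t1 and t2 chosen so that SDI = 3, showing that a 1 s error in t2 shifts the SDI by roughly the magnitude reported in the abstract.

```python
def sdi(t1: float, t2: float, T: float = 15.0) -> float:
    """Silt density index from two sample-collection times.

    t1, t2: seconds to collect the fixed sample volume at the start
            of the test and after T minutes of filtration.
    T:      elapsed test time in minutes (15 min is typical).
    """
    return (1.0 - t1 / t2) * 100.0 / T


def sdi_error_t2(t1: float, t2: float, dt2: float = 1.0, T: float = 15.0) -> float:
    """Change in SDI caused by an error dt2 (seconds) in measuring t2."""
    return sdi(t1, t2 + dt2, T) - sdi(t1, t2, T)


# Illustrative times (assumed, not from the study): t1 = 33 s, t2 = 60 s
# give SDI = (1 - 33/60) * 100 / 15 = 3.0 exactly.
base = sdi(33.0, 60.0)
err = sdi_error_t2(33.0, 60.0, dt2=1.0)
print(f"SDI = {base:.2f}, shift from a 1 s error in t2 = {err:+.3f}")
```

For these assumed times the shift is about +0.06, consistent in magnitude with the ±0.07 deviation at SDI = 3 quoted above; the exact figure depends on the actual t1 and t2.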