The sensitivity of X-ray phase-contrast grating interferometry to electron density variations in the sample is determined by the minimum detectable refraction angle. In this article, a numerical framework is developed that allows for a realistic, quantitative determination of this sensitivity. The framework is validated against experimental results and then used to quantify several influences on the sensitivity, such as spatial coherence and the number of phase-step images. In particular, we identify the inter-grating distance that yields the highest sensitivity in parallel-beam geometry. This knowledge will help optimize existing synchrotron-based grating interferometry setups.