When estimating a regression function or its derivatives, local polynomials are an attractive choice due to their flexibility and asymptotic performance. Seifert and Gasser proposed ridging of local polynomials to overcome variance problems under random design while retaining their advantages. In this article we present a data-independent rule of thumb and a data-adaptive, spatially varying choice of the ridge parameter in local linear regression. Within a framework of penalized local least squares regression, the methods are generalized to higher-order polynomials, to estimation of derivatives, and to multivariate designs. The main message is that ridging is a powerful tool for improving the performance of local polynomials: the rule of thumb already offers drastic improvements, while data-adaptive ridging brings further, though modest, gains in mean squared error.
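To make the penalized local least squares idea concrete, the following is a minimal sketch of a ridged local linear estimator at a single point. It is an illustration only, not the authors' exact method: the kernel choice (Epanechnikov), the window bandwidth `h`, and the specific form of the penalty (a ridge term `r * b**2` on the local slope) are assumptions made for the example.

```python
import numpy as np

def ridged_local_linear(x0, x, y, h, ridge=0.0):
    """Ridged local linear estimate of m(x0).

    Minimizes sum_i w_i * (y_i - a - b*(x_i - x0))**2 + ridge * b**2
    over (a, b) and returns the intercept a, the estimate at x0.
    The ridge penalty on the slope b stabilizes the fit when few
    design points fall inside the window (sparse random design).
    This is an illustrative sketch, not the Seifert-Gasser rule.
    """
    # Epanechnikov kernel weights on the window of half-width h
    u = (x - x0) / h
    w = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)
    d = x - x0
    # Weighted moments; the ridge term inflates the slope's
    # quadratic coefficient, bounding the variance of the slope.
    S0 = np.sum(w)
    S1 = np.sum(w * d)
    S2 = np.sum(w * d**2) + ridge
    T0 = np.sum(w * y)
    T1 = np.sum(w * d * y)
    denom = S0 * S2 - S1**2
    if denom <= 0.0:
        return np.nan  # empty or degenerate window
    # Intercept of the penalized fit = regression estimate at x0
    return (S2 * T0 - S1 * T1) / denom
```

Because only the slope is penalized, the estimator still reproduces a linear trend exactly at points where the local design is symmetric; the penalty trades a small bias for a bounded variance where the design is sparse.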