The widespread use of quantile regression methods depends crucially on the existence of fast algorithms. Despite numerous algorithmic improvements, the computation time is still non-negligible because researchers often estimate many quantile regressions and use the bootstrap for inference. We suggest two new fast algorithms for the estimation of a sequence of quantile regressions at many quantile indexes. The first algorithm applies the preprocessing idea of Portnoy and Koenker (Stat Sci 12(4):279–300, 1997) but exploits a previously estimated quantile regression to guess the sign of the residuals. This step allows for a reduction in the effective sample size. The second algorithm starts from a previously estimated quantile regression at a similar quantile index and updates it using a single Newton–Raphson iteration.
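To illustrate the flavor of the second algorithm, the following is a minimal sketch of a one-step Newton-type update of a quantile regression fit from a nearby quantile index. It is not the paper's exact procedure: the function name `one_step_quantile_update`, the rule-of-thumb bandwidth, and the Powell-type uniform-kernel estimate of the Hessian are illustrative assumptions; the paper's own update may be specified differently.

```python
import numpy as np

def one_step_quantile_update(X, y, beta_prev, tau_new, bandwidth=None):
    """Single Newton-Raphson-style step toward the quantile regression
    coefficients at tau_new, starting from beta_prev, an estimate obtained
    at a nearby quantile index. Hypothetical sketch, not the paper's code."""
    n, p = X.shape
    resid = y - X @ beta_prev
    if bandwidth is None:
        # crude rule-of-thumb bandwidth (assumption, not from the paper)
        bandwidth = 1.06 * np.std(resid) * n ** (-1 / 5)
    # score (negative gradient) of the check loss at beta_prev for tau_new
    score = X.T @ (tau_new - (resid < 0).astype(float)) / n
    # Powell-type estimate of J = E[f(x'beta | x) x x'] with a uniform kernel,
    # used here in place of the (non-existent) Hessian of the check loss
    kernel_weights = (np.abs(resid) <= bandwidth).astype(float) / (2 * bandwidth)
    J = (X * kernel_weights[:, None]).T @ X / n
    # one Newton-Raphson iteration
    return beta_prev + np.linalg.solve(J, score)

# Usage sketch: update a fit at tau = 0.50 toward tau = 0.55 on simulated data.
rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
beta_prev = np.array([1.0, 2.0])          # stand-in for an estimate at tau = 0.50
beta_next = one_step_quantile_update(X, y, beta_prev, tau_new=0.55)
```

Because the starting value already solves a nearby problem, a single such iteration is typically enough to track the quantile regression process across a fine grid of quantile indexes, which is the source of the computational savings described above.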