We present a symbolic technique for the error analysis of numerical algorithms, which acts as a preprocessor to the actual numerical computation. The technique provides rigorous a priori bounds that are universal for all inputs in a prescribed domain. The method is an alternative to interval arithmetic in applications where speed and rigor, but not necessarily tight bounds, are the main concerns. The technique was developed to carry out a computer-assisted proof of chaos in the Lorenz equations.
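As a rough illustration of the general idea (a classical example, not the method of this paper), the flavor of an a priori, domain-wide error bound can be seen in the standard rounding-error analysis of Horner's rule: for all inputs with |x| <= 1, a single constant computed once from the coefficients bounds the floating-point error of every evaluation, so the numerical computation itself runs at full speed with no interval-arithmetic overhead. The function names and the choice of domain below are illustrative assumptions.

```python
# A priori error bound for Horner evaluation of a polynomial,
# valid for every x with |x| <= 1 (classical textbook analysis,
# shown only to illustrate "preprocessing" an error bound).

def a_priori_horner_bound(coeffs, u=2.0**-53):
    """Constant bounding |fl(p(x)) - p(x)| for all |x| <= 1.

    coeffs: a_0, ..., a_n in p(x) = sum a_i x^i.
    u: unit roundoff (binary64 by default).
    Uses the standard bound gamma_{2n} * sum |a_i| |x|^i,
    with |x|^i <= 1 on the assumed domain.
    """
    n = len(coeffs) - 1
    k = 2 * n
    gamma_k = k * u / (1 - k * u)  # standard gamma_k factor
    return gamma_k * sum(abs(a) for a in coeffs)

def horner(coeffs, x):
    """Plain floating-point Horner evaluation, no interval overhead."""
    r = 0.0
    for a in reversed(coeffs):
        r = r * x + a
    return r
```

The bound is computed once, before any numerical work; it is typically far looser than what interval arithmetic would report for a specific input, which matches the trade-off described above: universal and cheap rather than tight.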