We describe a method for removing the numerical errors that arise in the modeling
of linear evolution equations when the time derivative is approximated by a
finite-difference operator. The method is based on integral transforms realized as certain
Fourier integral operators, called time dispersion transforms, and we prove that,
under an assumption about the frequency content, it yields a solution with the
correct evolution throughout its entire lifespan. We demonstrate the method on a
model equation as well as on the simulation of elastic and viscoelastic wave
propagation.
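
As a minimal illustration of the idea, the following Python sketch removes time dispersion from a single trace produced by second-order (leapfrog) time stepping, for which a mode at physical angular frequency omega evolves at the numerical frequency xi = (2/dt) arcsin(omega dt/2); inverting this relation, the corrected trace is synthesized at the frequencies omega = (2/dt) sin(xi dt/2). The function name remove_time_dispersion and the direct O(n^2) synthesis are our own illustrative stand-ins for the time dispersion transforms, not the paper's implementation.

    import numpy as np

    def remove_time_dispersion(trace, dt):
        """Undo leapfrog time dispersion in a single trace (illustrative sketch).

        Assumes second-order time stepping, so a numerical angular frequency xi
        corresponds to the physical frequency omega = (2/dt) * sin(xi * dt / 2).
        """
        n = len(trace)
        xi = 2.0 * np.pi * np.fft.fftfreq(n, d=dt)   # numerical angular frequencies
        spectrum = np.fft.fft(trace)                 # spectrum of the FD-computed trace
        omega = (2.0 / dt) * np.sin(xi * dt / 2.0)   # mapped physical frequencies
        t = dt * np.arange(n)
        # Direct oscillatory-sum synthesis of the corrected trace:
        # u(t_n) = (1/n) * sum_k spectrum[k] * exp(i * omega[k] * t_n)
        return (np.exp(1j * np.outer(t, omega)) @ spectrum).real / n

In this simplified setting, applying the companion forward transform, with arcsin in place of sin, to the source wavelet before the simulation lets the two transforms bracket the finite-difference solver so that the time-stepping error cancels; the dense matrix here is only for clarity, and the oscillatory sum could instead be evaluated with a nonuniform FFT.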