Epoch

An Epoch is a full pass over the training set during model training. Thus a model that is trained for N epochs "sees" each training point N times during training. This is a common measure of how long a model is trained for. Another common measure is the number of iterations of the stochastic gradient descent algorithm, where in each iteration the model weights are updated using a single "batch" of data.
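
The relationship between epochs, batches, and iterations can be made concrete with a minimal sketch of a training loop. This is an illustrative example only: the dataset, batch size, and the update_weights callable (standing in for one stochastic gradient descent step) are assumed placeholders, not part of any particular library.

    import math

    def train(dataset, num_epochs, batch_size, update_weights):
        """Hypothetical training loop illustrating epochs vs. iterations."""
        iterations = 0
        iterations_per_epoch = math.ceil(len(dataset) / batch_size)
        for epoch in range(num_epochs):
            # One epoch = one full pass over the training set.
            for start in range(0, len(dataset), batch_size):
                batch = dataset[start:start + batch_size]
                update_weights(batch)  # One iteration = one weight update on one batch.
                iterations += 1
        # Each training point has now been seen num_epochs times, and
        # iterations == num_epochs * iterations_per_epoch.
        return iterations

Under these assumptions, training for N epochs on a dataset of size D with batch size B performs roughly N * ceil(D / B) iterations in total.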
Related concepts:
Gradient Descent