A digital delay line is implemented by allocating a buffer of values in memory.
The input and output points are represented by one or more pointers, which are incremented at each time step and wrapped back to the start of the buffer when they pass its end (a circular buffer).
If a delay length shorter than the full buffer is to be supported, it is necessary to use more than one pointer: the output (read) pointer trails the input (write) pointer by the desired delay in samples. A minimal sketch follows.
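Here is one way this might look in C, as a hedged sketch rather than any particular library's implementation; the names (`DelayLine`, `delay_init`, `delay_tick`) are hypothetical.

```c
#include <stdlib.h>

typedef struct {
    float  *buf;     /* sample storage (circular buffer)      */
    size_t  length;  /* total buffer size in samples          */
    size_t  write;   /* input (write) index                   */
    size_t  read;    /* output (read) index                   */
} DelayLine;

/* Allocate the buffer and position the read pointer `delay`
   samples behind the write pointer. Valid for
   1 <= delay <= length. Error handling omitted for brevity. */
void delay_init(DelayLine *d, size_t length, size_t delay) {
    d->buf    = calloc(length, sizeof(float));
    d->length = length;
    d->write  = 0;
    d->read   = (length - delay) % length;
}

/* One time step: read the delayed output, write the new input,
   then advance both pointers, wrapping at the buffer end. */
float delay_tick(DelayLine *d, float input) {
    float output = d->buf[d->read];  /* sample from `delay` ticks ago */
    d->buf[d->write] = input;        /* store the new input           */
    d->read  = (d->read  + 1) % d->length;
    d->write = (d->write + 1) % d->length;
    return output;
}
```

Note that when the delay equals the buffer length, the two indices coincide, so a single pointer suffices: the old sample is read out just before the new input overwrites it. A second pointer is only needed for delays shorter than the buffer.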
Memory allocation can be a relatively time-consuming operation in a realtime synthesis environment. Thus, in situations where the delay length may change over time, a buffer of some maximum size is usually allocated once during initialization, and the effective delay is varied within it.
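Under that scheme, changing the delay never reallocates memory; it only repositions the read pointer relative to the write pointer. A sketch, reusing the hypothetical `DelayLine` above:

```c
/* Change the delay length without reallocating: move the read
   pointer. `delay` must not exceed the buffer length chosen
   at initialization. */
void delay_set(DelayLine *d, size_t delay) {
    d->read = (d->write + d->length - delay) % d->length;
}
```

For example, a one-second buffer at 44.1 kHz might be created with `delay_init(&d, 44100, 22050)` and later shortened with `delay_set(&d, 11025)`, incurring no allocation in the audio thread.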