The need for a "preview" of the audio being cut arose with the pitch computer, which adjusted the groove pitch to allow more lateral real estate for grooves with higher modulation levels. It was initially accomplished by custom-modifying a tape recorder's tape path to fit enough tape between the preview head and the play head to provide a delay equivalent to one record revolution. And that's how it was done for years. But no longer.
Lexicon invented the first practical digital audio delay unit in 1971, the "Delta T-101", but it had neither sufficient delay time nor the audio specs to be used in mastering (60dB s/n, 10kHz bandwidth), so it was relegated to sound reinforcement. While specs improved with the next generation, it took quite a while before memory cost and density improved enough to provide sufficient delay (just under 2 seconds), which at 16-bit/44.1kHz PCM is almost 400K of RAM (four 8-bit bytes per stereo sample). Sounds like nothing now, but 1K was a lot back then, and fast memory was expensive.
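As a back-of-the-envelope check on those numbers, here is a minimal sketch (plain Python, purely illustrative) that works out the delay of one 33-1/3 rpm revolution and the memory needed to buffer it as 16-bit/44.1kHz stereo PCM; the variable names and the round 2-second figure are my own assumptions, not a description of any particular unit.

```python
# Back-of-the-envelope check of the delay and memory figures above.
# Purely illustrative; names and the round 2-second figure are assumptions.

rev_seconds = 60 / (100 / 3)        # one 33-1/3 rpm revolution = 1.8 s
sample_rate = 44_100                # samples per second per channel
bytes_per_frame = 4                 # stereo x 16 bits = four 8-bit bytes

delay_seconds = 2.0                 # "just under 2 seconds" of preview delay
ram_bytes = int(delay_seconds * sample_rate * bytes_per_frame)

print(f"one revolution: {rev_seconds:.2f} s")
print(f"buffer for {delay_seconds} s: {ram_bytes:,} bytes "
      f"(~{ram_bytes / 1024:.0f} KB)")
# -> one revolution: 1.80 s
# -> buffer for 2.0 s: 352,800 bytes (~345 KB)
```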
But then, 16-bit PCM wasn't exactly popular or affordable yet, because ADCs and DACs didn't exist as monolithic solutions; they were custom-built from large numbers of individual parts. So there were other coding techniques, such as "delta modulation", that were more practical, but they didn't perform well. I recall several iterations of digital delay that were barely tolerable for sound reinforcement, and 200ms was a lot of delay.
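For readers who haven't run into it, here is a minimal sketch of what 1-bit delta modulation does, assuming a fixed step size (the function names and step value are hypothetical, for illustration only). The encoder needs only a comparator and an integrator rather than a full 16-bit converter, which is what made it practical at the time, and the fixed step is also why it struggles: fast transients outrun the integrator (slope overload) and quiet passages sit in granular noise.

```python
# Minimal 1-bit delta modulation sketch, illustrative only.

def dm_encode(samples, step=0.01):
    """Encode samples (floats in -1..1) as one bit each."""
    estimate = 0.0
    bits = []
    for x in samples:
        bit = 1 if x > estimate else 0          # 1-bit "quantizer"
        estimate += step if bit else -step      # integrator tracks the input
        bits.append(bit)
    return bits

def dm_decode(bits, step=0.01):
    """Rebuild an approximation by running the same integrator."""
    estimate = 0.0
    out = []
    for bit in bits:
        estimate += step if bit else -step
        out.append(estimate)
    return out
```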
The analog "bucket brigade" devices introduced first invented in the early 1970s were CCDs (charge-coupled devices), where the analog input was sampled by storing it's instantaneous voltage in a capacitor, then clocking that charge through a chain of switches and capacitors and buffer amps with enough stages to provide useful delay. Clock speed was completely arbitrary, but of course tied to both delay time and maximum frequency, as well as noise, which was pretty bad. Decent audio BBD/CCD chips with lower distortion and noise, and higher stage count, didn't arrive until 1980 or so, and even then they were two chains of 512 stages...not enough for any significant delay at high enough clock speed for decent audio without cascading many, many chips. They were useful only for effects and short delays, and were never used for mastering preview functions.
What really pushed mastering labs to digital preview was digital recording. Recordings made, or at least mastered, on any digital system created the need for a digital preview system. A huge number of early digital recordings were made on a Sony PCM-1610/1630 system (a U-matic video deck plus a 16-bit/44.1kHz PCM converter) and needed a new form of delay because they never hit analog tape at all. Sony introduced a mastering preview delay unit that took data from the PCM-1610, clocked it through many, many memory chips, and fed it out to a DAC, which drove the lathe. The PCM-1610's own DAC output fed the lathe's pitch computer. The system was lossless.

When memory and ADC/DAC technology matured (about the same time as the Sony device), other digital delays were introduced. Once a digital preview system became more practical or economical than a custom-modified tape recorder, they started to become common. Consider that a 15ips master tape would require zig-zagging nearly 30" of tape between the preview and play heads without introducing additional flutter, and a 30ips tape would require almost 60" of tape to be somehow handled between preview and play. Since mastering operations were a very small market segment for tape recorder manufacturers, a lot of these were custom jobs. And that's where digital delay edged its way in: you could use a standard tape machine without mods.

The bulk of records mastered between the late 1970s and the demise of vinyl were cut through a digital delay, even if they were analog recordings. And that condition largely remains today. A quick survey of mastering labs' web sites showed mostly stock analog recorders and no mention of analog preview. In fact, after looking over about two dozen sites, I found only two with visibly modified analog machines in their studio photos. And none of the manufacturers of the machines in use are making machines today. The Ampex ATR-100 is popular (unmodified), but I found one lab using a modified MCI machine and another using a modified Studer A-80. What that tells us is that, to achieve the preview function, mastering labs are almost universally cutting through a digital delay.
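Those tape-path figures are just delay time multiplied by tape speed; a quick sketch, assuming the 1.8-second single-revolution delay worked out earlier:

```python
# Tape between preview and play heads = delay time x tape speed.

rev_seconds = 1.8                  # one 33-1/3 rpm revolution
for ips in (15, 30):               # tape speed in inches per second
    print(f"{ips} ips: {rev_seconds * ips:.0f} inches of tape between heads")
# -> 27" at 15 ips and 54" at 30 ips, hence the "nearly 30 / almost 60 inch" figures
```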
One more note on analog delay: the mechanical "delay lines" mentioned earlier were actually misnamed. The actual time delay to the first arrival was relatively short, because the metal springs or plates/sheets transmit sound faster than air, but because they also resonated and bounced internal reflections around, a semblance of reverb could be achieved, but not delay. In the later days of springs and plates, digital delays were inserted ahead of them as a pre-delay, making them more realistic.