Friday, April 18, 2008

why video looks crappy

Technological developments are driven by practical necessity. It is no surprise, then, that technology for recording an artform and technology for displaying artistic works operate in lockstep. When the two fall out of sync, the quality of works created in the earlier format becomes less readily apparent.

Consider, for example, the case of video. When video recording technology first became widespread in the 80s, during the VHS / Betamax wars, video playback was heavily affected by tape condition, which deteriorated with repeated use. As a consequence, both mainstream movies, shot on film and transferred to video for the home market, and independent movies, shot on video and marketed directly to video, were subjected to the same degenerative process. Obviously, many qualitative differences remained, yet certain basic parameters of visual quality (grain, depth, range of color / brightness, etc.) were distorted to the extent that many of the failings of the newer technology were masked.

As playback technology developed, however, and DVDs gained ascendancy, the playback media no longer suffered degradation with repeated play. Furthermore, resolution increased, and the failings of the newer recording technology became more readily apparent in home reproduction.

The consequence of this, of course, is that when we watch on DVD a movie originally shot on video, the distance between its quality and that of better financed productions is more apparent than it was at the time of the movie's production.
