When the server sends a video bitstream to the client terminal, a frame may be dropped (e.g., due to corruption of the bitstream by noise). If other frames depend on the dropped frame, the video will likely be displayed with glitches. For example, if the bitstream includes a series of P-frames and the first P-frame in the series is dropped, each subsequent P-frame will refer back to incomplete data (i.e., each will be based on information from the dropped frame). As a result, a glitch will appear in the display and persist until an I-frame is received. If the video has low entropy, the encoder on the server may generate I-frames relatively infrequently, so the glitch can persist for a relatively long period.

In the context of a remote display protocol session in which the desktop display itself is treated as a video, this low-entropy condition may be common. For example, during a PowerPoint presentation, consecutive slides may be sufficiently similar that the encoder does not generate an I-frame when the presentation is advanced. A glitch that occurs in this scenario could remain for the duration of the presentation.
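This error-propagation behavior can be illustrated with a minimal sketch. The code below simulates a simplified model in which each P-frame depends on the immediately preceding frame and an I-frame resets the dependency chain; the `decode_stream` helper and the example frame sequence are hypothetical and exist only for illustration, not as part of any actual codec implementation.

```python
# A minimal simulation of glitch propagation, assuming a simplified model in
# which each P-frame depends on the frame immediately before it and an I-frame
# is self-contained. The decode_stream helper is hypothetical.

def decode_stream(frames, dropped):
    """Return per-frame display status for a bitstream.

    frames:  list of frame types in decode order, e.g. ["I", "P", "P", ...]
    dropped: set of indices of frames lost in transit
    """
    corrupted = False  # does the reference chain contain missing data?
    status = []
    for i, ftype in enumerate(frames):
        if i in dropped:
            corrupted = True          # missing reference poisons the chain
            status.append("dropped")
        elif ftype == "I":
            corrupted = False         # I-frame is self-contained: chain resets
            status.append("clean")
        else:  # P-frame: depends on prior frames back to the last I-frame
            status.append("glitched" if corrupted else "clean")
    return status

# An I-frame followed by five P-frames; the first P-frame (index 1) is lost.
# Every later P-frame displays a glitch until the next I-frame arrives.
frames = ["I", "P", "P", "P", "P", "P", "I", "P"]
print(decode_stream(frames, dropped={1}))
# ['clean', 'dropped', 'glitched', 'glitched', 'glitched', 'glitched',
#  'clean', 'clean']
```

In this simplified model, dropping the first P-frame leaves every later P-frame glitched until the next I-frame resets the dependency chain. If the encoder rarely emits I-frames, as in the low-entropy presentation scenario above, the glitched interval grows accordingly.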