Local telephone lines use the same wires to send and receive, which results in a small amount of the outgoing signal being reflected back. This is useful for people talking on the phone, as it provides a signal to the speaker that their voice is making it through the system. However, this reflected signal causes problems for the modem, which is unable to distinguish between a signal from the remote modem and the echo of its own signal. This was why earlier modems split the signal frequencies into "answer" and "originate"; the modem could then ignore any signals in the frequency range it was using for transmission. Even with improvements to the phone system allowing higher speeds, this splitting of available phone signal bandwidth still imposed a half-speed limit on modems.
Echo cancellation eliminated this problem. Measuring the echo delays and magnitudes allowed the modem to tell if the received signal was from itself or the remote modem, and create an equal and opposite signal to cancel its own. Modems were then able to send over the whole frequency spectrum in both directions at the same time, leading to the development of 4,800 and 9,600 bit/s modems.
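The cancellation step described above can be sketched in a few lines. The sketch below is an idealized model with an invented, fixed echo delay and gain that the modem is assumed to know exactly; a real echo canceller measures and adaptively tracks these values.

```python
import random

random.seed(0)

# Hypothetical line parameters (illustrative, not from any standard).
ECHO_DELAY = 40   # samples of round-trip delay
ECHO_GAIN = 0.25  # fraction of the outgoing signal reflected back

near = [random.gauss(0, 1) for _ in range(1000)]  # our transmitted signal
far = [random.gauss(0, 1) for _ in range(1000)]   # remote modem's signal

def echo_of(signal):
    # The line reflects a delayed, attenuated copy of what we send.
    return [0.0] * ECHO_DELAY + [ECHO_GAIN * s for s in signal[:-ECHO_DELAY]]

# What actually arrives: the far-end signal plus our own echo.
received = [f + e for f, e in zip(far, echo_of(near))]

# Knowing its echo's delay and magnitude, the modem synthesizes an equal
# and opposite copy and subtracts it, leaving only the far-end signal.
recovered = [r - e for r, e in zip(received, echo_of(near))]

residual = max(abs(r - f) for r, f in zip(recovered, far))
print(residual < 1e-12)  # True: echo fully cancelled in this ideal model
```

With its own echo removed, the modem is free to transmit and receive across the same frequencies simultaneously, which is what lifted the half-bandwidth limit.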
Increases in speed have relied on increasingly sophisticated communications theory. The 1,200 and 2,400 bit/s modems used phase-shift keying (PSK), which could transmit two or three bits per symbol. The next major advance encoded four bits into a combination of amplitude and phase, known as Quadrature Amplitude Modulation (QAM).
The new V.27ter and V.32 standards were able to transmit 4 bits per symbol, at a rate of 1,200 or 2,400 baud, giving an effective bit rate of 4,800 or 9,600 bit/s. The carrier frequency was 1,650 Hz. For many years, most engineers considered this rate to be the limit of data communications over telephone networks.
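As a toy illustration of the QAM idea, the sketch below maps four bits onto one of sixteen amplitude/phase combinations and works out the resulting bit rate. The constellation layout is invented for illustration only and does not match the actual V.32 signal constellation.

```python
import cmath

def encode_16qam(bits):
    """Map 4 bits to a toy 16-point constellation: two bits pick one of
    four amplitude levels, two bits pick one of four phases.
    (Invented layout; real V.32 constellations differ.)"""
    assert len(bits) == 4
    amplitude = 1 + 2 * bits[0] + bits[1]           # levels 1..4
    phase = (2 * bits[2] + bits[3]) * cmath.pi / 2  # 0, 90, 180, 270 degrees
    return amplitude * cmath.exp(1j * phase)

# Four bits per symbol at 2,400 symbols per second (baud) gives the
# 9,600 bit/s figure quoted above.
print(4 * 2400)  # 9600
```

Note the distinction the article relies on: baud counts symbols per second, while the bit rate is baud multiplied by bits per symbol.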
Error correction and compression
Operation at these speeds pushed the limits of the phone lines, resulting in high error rates. This led to the introduction of error-correction systems built into the modems, made most famous by Microcom's MNP protocols. A string of MNP standards came out in the 1980s, each increasing the effective data rate by minimizing overhead, from about 75% of the theoretical maximum in MNP 1 to 95% in MNP 4. A later standard, MNP 5, added data compression, raising overall throughput above the modem's rated speed; a user could generally expect an MNP 5 modem to transfer at about 130% of the modem's normal data rate. Details of MNP were later released and became popular on a series of 2,400-bit/s modems, ultimately leading to the ITU V.42 and V.42bis standards. These were incompatible with MNP but similar in concept: V.42 provided error correction and V.42bis added compression.
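The efficiency figures above translate directly into effective throughput. The sketch below applies the approximate percentages quoted in the text to a 2,400 bit/s modem; the figures are illustrative, not measured values.

```python
rated = 2400  # bit/s, the modem's nominal line speed

# Approximate efficiency factors from the text: MNP 1 reached about 75%
# of the theoretical maximum, MNP 4 about 95%, and MNP 5's compression
# pushed effective throughput to roughly 130% of the rated speed.
for name, factor in [("MNP 1", 0.75), ("MNP 4", 0.95), ("MNP 5", 1.30)]:
    print(name, round(rated * factor), "bit/s")
```

Compression can exceed 100% of the rated speed because it reduces the number of bits that actually cross the line for a given amount of user data.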
Another common feature of these high-speed modems was fallback, or speed hunting, which allowed them to communicate with less-capable modems. During call initiation, the modem would transmit a series of signals and wait for the remote modem to respond, starting at high speeds and stepping progressively slower until there was an answer. Thus, two USR modems would connect at 9,600 bit/s, but when a user with a 2,400-bit/s modem called in, the USR would fall back to the shared 2,400-bit/s speed. The same would happen if a V.32 modem and an HST modem were connected: because they used different standards at 9,600 bit/s, they would fall back to their highest commonly supported standard, 2,400 bit/s. The same applied to a V.32bis modem and a 14,400-bit/s HST modem, which could still communicate with each other at 2,400 bit/s.
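The fallback procedure amounts to a search from the fastest to the slowest mutually supported standard. The speed table and `negotiate` function below are a hypothetical simplification: real modems probe with handshake tones rather than exchanging capability sets, and the choice of V.22bis as the common 2,400 bit/s standard is an assumption for illustration.

```python
# Hypothetical speed table (bit/s) for a few standards mentioned above.
SPEEDS = {"V.32": 9600, "HST": 9600, "V.22bis": 2400, "V.22": 1200}

def negotiate(local, remote):
    """Return the fastest standard both modems support, trying local
    standards from fastest to slowest, or None if nothing is shared."""
    for std in sorted(local, key=lambda s: -SPEEDS[s]):
        if std in remote:
            return std, SPEEDS[std]
    return None

# A V.32 modem calling an HST modem: both run at 9,600 bit/s, but with
# different standards, so they fall back to the shared 2,400 bit/s one.
print(negotiate({"V.32", "V.22bis"}, {"HST", "V.22bis"}))  # ('V.22bis', 2400)
```

Because the search starts at the top of each modem's capability list, two identical modems always settle on their fastest common standard, while mismatched pairs degrade gracefully instead of failing to connect.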