Chapter 22
Video Systems
Today’s video systems bear little resemblance to cable television (CATV) systems of yore. Those delivered a dozen or so channels to households in areas that could not receive signals off the air. Today, cable systems deliver at least 70 channels with a combination of entertainment and information services, bringing
broad bandwidths into homes and businesses and threatening to upset the traditional method of providing telephone service. Over the past few years, cable companies have rebuilt their networks into two-way broadband systems. The Internet was the first application, but once that was completed, some of the bandwidth
earmarked for IP was available for video-on-demand. Cable has captured more than half the broadband access market in the United States, and now it is poised to begin competing with the LECs to deliver telephone service. As the cable providers prepare to compete with the telephone companies, they are faced with competition of their own from satellite providers. Satellite video is digital and delivers better signal quality than the analog channels that the cable providers offer. Most cable providers also offer digital channels, and the broadcasters are converting their analog signals to digital. The FCC has mandated a transition to digital by the end of 2006 for broadcasters that use the airwaves, but that deadline will probably be extended. While entertainment is the prevalent use of video today, videoconferencing is an important facility for businesses, and desktop video will soon be routine.
Distance learning for schools and telemedicine for healthcare organizations are emerging applications. As with most technologies, the key to expansion has been the development of standards. For years, the videoconferencing manufacturers
produced equipment with proprietary standards, which meant they could not interoperate. Then ITU-T completed its work on H.320 standards, which enables equipment to interoperate over dial-up or dedicated digital facilities. For packet networks, H.323 standards enable video communication across IP and frame relay.
AT&T demonstrated its Picturephone at the 1964 New York World’s Fair, but it turned out not to be practical. Like many other product developments, however, it was merely ahead of its time. Conferencing equipment built into desktop computers is becoming a regular feature in most companies and many homes. With
the growth of broadband connections to most homes and businesses, video will become a common enhancement for telephone calls.
VIDEO TECHNOLOGY
Video signals in North America are generated under the National Television Systems Committee (NTSC) system. In Europe they are generated under two different and incompatible standards: PAL (phase alternating line) and SECAM (séquentiel couleur à mémoire). An analog video signal is formed by scanning an image with a video camera. As the camera scans the image, it creates a signal that varies in voltage with variations in the degree of blackness of the image. In the NTSC system, the picture is scanned in 525 horizontal lines, forming a raster composed of two fields of 262.5 lines each. The two fields are interlaced to form a frame (Figure 22-1). The frame repeats 30 times per second; the persistence of the human eye eliminates flicker. Since the two fields are interlaced, the screen is refreshed 60 times per second. On close inspection, a video screen is revealed to be a matrix of tiny dots. Each dot is called a picture element, abbreviated pixel. The resolution of a television picture is a function of the number of scan lines and pixels per frame, both of which affect the amount of bandwidth required to transmit a television signal. The NTSC system requires 4.2 MHz of bandwidth for satisfactory resolution. Because of the modulation system used and the need for guard bands between channels, the FCC assigns 6 MHz of bandwidth to analog television channels.
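The scan-timing figures above follow from simple arithmetic. The sketch below uses the nominal 30 frames per second rate; color NTSC actually runs slightly slower, at about 29.97 frames per second.

```python
# Nominal NTSC scan timing derived from the figures in the text.
LINES_PER_FRAME = 525
FRAMES_PER_SEC = 30            # nominal; color NTSC runs at ~29.97

fields_per_sec = FRAMES_PER_SEC * 2              # two interlaced fields per frame
line_rate_hz = LINES_PER_FRAME * FRAMES_PER_SEC  # horizontal scan rate

print(fields_per_sec)  # 60 screen refreshes per second
print(line_rate_hz)    # 15750 lines scanned per second
```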
The signal resulting from each scan line varies between a black and a white voltage level (Figure 22-2a). A horizontal synchronizing pulse is inserted at the beginning of each line. Vertical pulses synchronize the frames as Figure 22-2b
shows. Between frames the signal is blanked during a vertical synchronizing interval to allow the scanning trace to return to the upper left corner of the screen. A color television signal has two parts: the luminance signal and the chrominance signal. A black and white picture has only the luminance signal, which controls
the brightness of the screen in step with the sweep of the horizontal trace. The chrominance signal modulates subcarriers that are transmitted with the video signal. The color demodulator in the receiver is synchronized by a color burst consisting of eight cycles of a 3.58 MHz signal that is applied to the horizontal
synchronizing pulse (Figure 22-2a). When no picture is being transmitted, the scanning voltage rests at the black level, and the television receiver’s screen is black. Because the signal is amplitude-modulated analog, any noise pulses that are higher in level than the black signal level appear on the screen as snow. A high-quality transmission medium keeps the signal above the noise to preserve satisfactory picture quality. The degree of resolution in a television picture depends on bandwidth. Signals sent through a narrow bandwidth are fuzzy with washed-out color. The channel also must be sufficiently linear. Lack of linearity results in high-level signals being amplified at a different rate than low-level signals, which affects picture contrast. Another critical requirement of the transmission medium is its envelope delay characteristic. Envelope delay is the difference in propagation speed of the various frequencies in the video passband. If envelope delay is excessive, the chrominance signal arrives at the receiver out of phase with the luminance signal, and color distortion results.
Digital Television
As with telephone transmission, analog impairments can be avoided by converting the signal to digital. An uncompressed studio-quality signal is about 270 Mbps, so efficient use of the bandwidth means the digital signal must be compressed. Using MPEG-2 compression, as many as 10 digital channels can be
squeezed into the 6 MHz space formerly occupied by one analog channel. This greater channel efficiency is the reason the FCC has mandated the conversion to digital TV. MPEG-2 is the same compression algorithm that DVDs use. Although some slight decrease in quality results from the compression, it is hardly noticeable, and it overcomes many of the impairments, such as interference and ghosting, that affect analog transmission.
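The channel-packing arithmetic behind the tenfold gain can be sketched as follows. The figures are illustrative assumptions, not fixed by any standard: roughly 38.8 Mbps is the payload of a 256-QAM digital cable carrier in a 6 MHz slot, and 3.75 Mbps is a typical MPEG-2 rate for a standard-definition program.

```python
# Rough channel-packing arithmetic for one 6 MHz cable slot. Both rate
# figures are typical values, used here only for illustration.
qam256_payload_mbps = 38.8   # ~payload of a 256-QAM carrier in 6 MHz
sd_program_mbps = 3.75       # typical MPEG-2 standard-definition program

programs_per_slot = int(qam256_payload_mbps // sd_program_mbps)
print(programs_per_slot)  # ~10 digital programs where one analog channel fit
```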
Video Compression
Compression algorithms rely on the fact that a video signal contains considerable redundancy. Often large portions of background do not change between frames, but analog systems transmit these anyway. By removing redundancy, digital signal
processing can compress a video signal into a reasonably narrow bandwidth. The signal can be compressed to occupy as little as 64 Kbps, but at that rate the quality falls short even for videoconferences, which need at least 384 Kbps, or six 64 Kbps channels, for reasonable quality. T1 bandwidth is needed for a signal that is approximately equal to the quality of a home VCR. Many standards, both public and proprietary, have been developed for video compression. ITU’s H.261 is intended for videoconferencing and is discussed later in more detail. The Moving Picture Experts Group, a working group of the International Organization for Standardization and the International Electrotechnical Commission, develops MPEG standards for encoding both video and audio. The principal standards of concern in telecommunications are MPEG-1,
-2, and -4. Other standards describe multimedia content and delivery. MPEG-1 compresses a video signal into bandwidths of up to 1.5 Mbps. The resolution is 288 lines per frame, which is home VCR quality. It is satisfactory for broadcast use if the scene does not have too much action. MPEG-2 codes studio quality
video into bit streams of varying bandwidth. The most popular studio signal, known as D-1 or CCIR 601, is coded at 270 Mbps and can be compressed into about 3 Mbps. Scenes with high activity such as sports broadcasts require bit rates of about 5 or 6 Mbps. MPEG-3 was intended for HDTV, but it was discovered that MPEG-2 syntax was sufficient. MPEG-4 is an enhancement for multimedia transmission.
Video is compressed by predictive coding and by eliminating redundancy. Flicking by at 30 frames per second, much in a moving picture does not change from frame to frame. Interframe encoding recognizes when portions of a frame remain constant, and transmits only the changed portions. Intraframe encoding provides another element of compression. The picture is broken into blocks of 16 × 16 pixels. A block is transmitted only when pixels within that block have changed. Otherwise, the decoding equipment retains the previous block and forwards it to the receiver with each frame. Predictive coding analyzes the elements that are changing, and predicts what the next frame will be. If the transmitting and receiving codecs both use the same prediction algorithm, only changes from the prediction must be transmitted, not the complete frame. Approximately every two seconds, the entire frame is refreshed, but in intervening frames, only the changed pixels are transmitted. The use of predictive coding requires a high-quality transmission facility. Lost packets in video-over-IP can have a detrimental effect beyond the information loss of one frame because of the predictive nature of the coding algorithm.
Encoding systems are classified as lossless or lossy. Information in a lossless coding system can be restored bit for bit. A lossy encoding system transmits only enough to retain intelligibility, but not enough to ensure the integrity of the received signal. Lossy systems work well for sending still scenes, but motion can cause a tiling or smearing effect as the receiving codec attempts to catch up with the transmitter. The higher the compression and the more vigorous the motion, the greater is the smearing effect.
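The block-wise change detection described above can be sketched in a few lines of code. This is a toy illustration of transmitting only changed 16 × 16 blocks, not any real MPEG implementation; the frame representation (a list of pixel rows) is chosen purely for simplicity.

```python
# Toy sketch of block-wise conditional replenishment: split each frame
# into 16 x 16 pixel blocks and flag only the blocks whose pixels have
# changed since the previous frame.

BLOCK = 16

def block_differs(prev, curr, bx, by, height):
    """True if any pixel in the block whose top-left corner is (bx, by) changed."""
    for y in range(by, min(by + BLOCK, height)):
        if prev[y][bx:bx + BLOCK] != curr[y][bx:bx + BLOCK]:
            return True
    return False

def changed_blocks(prev, curr, width, height):
    """Return (row, col) indices of the blocks that must be retransmitted."""
    return [(by // BLOCK, bx // BLOCK)
            for by in range(0, height, BLOCK)
            for bx in range(0, width, BLOCK)
            if block_differs(prev, curr, bx, by, height)]

# A 32 x 32 frame in which a single pixel changes: only one of the
# four blocks needs to be sent.
prev = [[0] * 32 for _ in range(32)]
curr = [row[:] for row in prev]
curr[20][5] = 1
print(changed_blocks(prev, curr, 32, 32))  # [(1, 0)]
```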
High Definition Television (HDTV)
The present NTSC television standard was defined in 1941 when 525-line resolution was considered excellent quality and when such technologies as large-scale integration were hardly imagined. The standard was advanced for its time, but it is far from the present state of the art. Larger cities are running out of broadcast
channel capacity. Although not all channels are filled, co-channel interference prevents the FCC from assigning all available channels. Large television screens are becoming more the rule than the exception, and at close range the distance between scan lines is disconcerting. The 525 scan lines of broadcast television and the 4:3 width-to-height ratio of the screen, called its aspect ratio, limit picture quality. With wide-screen movies and the growing popularity of large-screen television sets, the definition of the current scanning system is considerably less than that of the original image. All of this leads to a transition to HDTV. HDTV has been introduced in most of the larger markets in the United States, but it has not been widely accepted yet because it requires replacement of existing television sets. Table 22-1 compares NTSC video with the new standard, which is known as Advanced Television Systems Committee (ATSC). The resolution of HDTV is far superior to NTSC. It has six times the number of pixels and supports wide screen television.
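The "six times" figure follows directly from the pixel counts. The numbers below are assumptions for illustration: 704 × 480 is one common digitization of the visible NTSC picture, and 1920 × 1080 is the highest-resolution ATSC HDTV format.

```python
# Pixel-count comparison behind the resolution claim.
ntsc_pixels = 704 * 480     # a common digitization of visible NTSC
hdtv_pixels = 1920 * 1080   # top ATSC HDTV format

print(round(hdtv_pixels / ntsc_pixels, 1))  # ~6.1 times the pixels
```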
Video on Demand
A driving factor for digital television is video on demand (VoD). Today’s CATV systems deliver all channels to every residence, and unwanted or unauthorized channels are trapped out or scrambled. VoD refers to a broad spectrum of services that are delivered over a broadband medium. Entertainment, information, and education services are examples of VoD services that the service provider transmits on order. Subscribers can choose movies or programs they want via an onscreen menu, and control the sessions with VCR-like functions such as stop, fast-forward, and rewind. VoD may save a trip to the neighborhood video rental store, but the main question is delivery. The two principal delivery methods are via broadband cable or over DSL. Speed is the limiting factor with DSL. Reasonable quality can be obtained
with the popular ADSL, but studio-quality HDTV requires at least 3 Mbps, which requires VDSL. The limiting factor here is its range, which is about 4000 ft (1200 m). Delivery over cable is not too different from a range standpoint; however, to achieve the broadband speeds required without compromising quality, the streaming video signal must be brought to a neighborhood center over fiber. Cable providers offer VoD to digital service subscribers, which so far are a minority. Digital television sets are readily available, but for now most television sets are analog, which means a converter is required. Video servers can also be an obstacle. The amount of data that must be delivered to fill digital pipes to thousands of simultaneous users requires servers capable of storing many terabytes of data.
CABLE TELEVISION SYSTEMS
Cable television systems have three major components: headend equipment, trunk cable, and feeder and drop cable. Figure 22-3 is a block diagram of a conventional CATV system. All channels that originate at the headend are broadcast to all stations, which means the operator must scramble or block premium channels the user is not paying for. Headend equipment generates local video signals and receives signals from a variety of sources including off-the-air broadcasts, communication satellites, or microwave relay or fiber-optic connections from other providers. The analog signals from the headend are modulated to a channel within the bandwidth of the cable, which may be as great as 1 GHz.
Headend equipment applies the signal to a trunk cable to carry the signal to local distribution systems. The trunk cable is equipped with broadband amplifiers that are equalized to carry the entire bandwidth. Amplifiers have about 20 dB of gain and are placed at intervals of approximately 500 m. Amplifiers known as bridgers couple the signal to feeder cables. Amplifiers contain automatic gain control circuitry to compensate for variations in cable loss. Power is applied to amplifiers over the coaxial center conductor. To continue essential services during power outages and amplifier failures, the cable operator provides redundant amplifiers and backup battery supplies. Because the CATV signals operate on the same frequencies as many radio services, the cable and amplifiers must be free of signal leakage since a leaking cable can interfere with another service or vice versa. As with other analog transmission media, noise and distortion are cumulative through successive amplifier stages. Analog impairments limit the serving radius of 70-channel CATV to about 8 km from the headend.
Bridger amplifiers split feeder or distribution cable from the trunk cable. Multiple feeders are coupled with splitters or directional couplers, which match the impedance of the cables.
The feeder cable is smaller and less expensive and has higher loss than trunk cable. Subscriber drops connect to the feeder cable through taps, which are passive devices that isolate the feeder cable from the drop. The tap must have enough isolation that shorts and opens at the television set do not affect other signals on the cable.
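The cumulative noise penalty of an amplifier cascade can be estimated with the standard rule that n identical amplifiers degrade the carrier-to-noise ratio by 10·log10(n) dB. The 60 dB single-amplifier figure below is an illustrative assumption, not a value from the text.

```python
import math

# Carrier-to-noise degradation through a cascade of identical broadband
# amplifiers: each doubling of the cascade costs about 3 dB.
def cascade_cnr_db(single_amp_cnr_db, n_amps):
    return single_amp_cnr_db - 10 * math.log10(n_amps)

# 8 km of trunk at ~500 m amplifier spacing is on the order of 16 amplifiers.
print(round(cascade_cnr_db(60.0, 16), 1))  # a 60 dB amplifier cascades to ~48 dB
```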
Hybrid Fiber Coax (HFC)
As cable technology improved, the bandwidth increased and more channels were added. Some operators added upstream channels to provide additional revenue-generating services such as alarm monitoring, but the real revolution came with Internet access. Although the bulk of Internet traffic flow is downstream, a substantial amount of upstream bandwidth is needed. The conventional CATV model has either limited or no upstream bandwidth, and routing every channel past every subscriber does not work for Internet access because of the shared nature of the medium. As Internet traffic increases, the response time increases, generally in proportion to the number of subscribers.
The answer is to rebuild cable networks with an HFC model similar to the one in Figure 22-4. Both entertainment and access bandwidth are brought to neighborhood centers on a fiber backbone. The entertainment channels are applied to the distribution cable as always, but the Internet channels are shared among a smaller number of subscribers. The response time on the shared portion of the bandwidth is limited by controlling the number of subscribers that a node serves. The HFC architecture gives cable companies control over shared bandwidth to make it suitable for other services.
One logical candidate is telephone service. Once the cable infrastructure is in place, the incremental per-subscriber cost of VoIP is small. As shown in Figure 22-4, the IP bandwidth is delivered to a router at the headend. Voice packets are routed to a media gateway, which is controlled by a softswitch that can be located anywhere. This method of delivering telephone service has certain advantages that exist today by regulatory fiat, but may not persist. Congress has elected to exempt IP services from the many taxes and fees that it imposes on telephone service. Although it is impossible to foresee what Congress will do in the future, nothing suggests that this exemption is permanent. Another issue is exemption from equal access.
Today the cable companies are not required to permit equal access to their facilities, which means competing service providers cannot rely on cable as an access medium. Either of these would affect the profitability of local telephone service over cable.
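The effect of node size on shared Internet bandwidth is simple division. The figures below are illustrative assumptions: one downstream channel of roughly 38 Mbps shared by a 500-home node with 20 percent of the cable modems active at once.

```python
# Back-of-envelope sharing arithmetic for an HFC node.
def per_user_mbps(downstream_mbps, homes_on_node, active_fraction):
    """Average throughput per active user on a shared downstream channel."""
    return downstream_mbps / (homes_on_node * active_fraction)

print(round(per_user_mbps(38.0, 500, 0.2), 2))  # 0.38 Mbps per active user
```

Halving the node size doubles the average throughput per active user, which is why operators control response time by limiting the subscribers per node.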
VIDEOCONFERENCING EQUIPMENT
Videoconferencing has considerable appeal, primarily as a substitute for travel, but it has yet to become a mainstream application. The primary drawback has been cost. Not long ago, the cameras, monitors, codec, and formal conference room setup could easily exceed $100,000, and the payback was difficult to quantify. Recently, equipment costs have dropped to a fraction of their previous levels and a conference unit can now be placed almost anywhere. The transmission costs, however, remain high. A reasonable videoconference requires 384 Kbps, or six B channels carried on three BRI lines. Add usage costs to that and conferences, particularly multipoint conferences, are still expensive. A two-way videoconference over IP requires about 400 Kbps of full-duplex
bandwidth. With a stable private network in place, the quality is as good as ISDN. The Internet is not a suitable medium where true conference-quality video is required, so most external conferences will need an ISDN gateway.
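The ISDN channel arithmetic behind the 384 Kbps figure is worth spelling out: six 64 Kbps B channels, which an inverse multiplexer bonds from three 2B+D BRI lines.

```python
# Channel arithmetic for a 384 Kbps ISDN videoconference.
B_CHANNEL_KBPS = 64
CONFERENCE_KBPS = 384

b_channels = CONFERENCE_KBPS // B_CHANNEL_KBPS  # B channels to bond
bri_lines = b_channels // 2                     # each BRI carries two B channels

print(b_channels, bri_lines)  # 6 B channels on 3 BRI lines
```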
A videoconference facility has some or all of the following subsystems integrated into a unit:
• Video codec
• Audio equipment
• Video production and control equipment
• Graphics equipment
• Document hard copy equipment
• Communications
A full description of these systems is beyond the scope of this book, but we will discuss them briefly to illustrate the composition of a full videoconferencing facility. Personal video communications equipment is readily available and far less elaborate than a formal conference facility; if the network has sufficient bandwidth and QoS, it will likely be the application that enables videoconferencing to fulfill its promise.
Video Codec
The codec converts the analog video signal to digital and compresses it for transmission. At the distant end the process is reversed. Codecs for dial-up conferences must support H.261. Many also support H.263 coding for IP conferences.
Audio Equipment
Most analysts believe that audio equipment is the most important part of a videoconference facility. In large videoconferences it is often impossible to show all participants, but it is important that everyone hear and be heard clearly. Audio equipment consists of microphones, speakers, and amplifiers placed strategically
around the room. Sometimes speaker telephones are used, but with less satisfactory results than codec audio. The codec robs a portion of the bandwidth to transmit the audio, so some products allow the operator to reduce the amount of audio bandwidth as a way of improving the video. When IP conferencing is used, the audio must be multiplexed on the bit stream because lip sync cannot be preserved with a separate audio channel.
Video and Control Equipment
Video equipment consists of two or three cameras and associated control equipment. The main camera usually is mounted at the front of the room and often automatically follows the voice of the speaker. Zoom, tilt, and azimuth controls are mounted on a console, where the conference participants can operate them with a joystick. A second camera mounts overhead for graphic displays. The facility sometimes includes a third mobile camera that is operated independently. A switch at the console operator’s position selects the camera. Usually, one monitor shows the picture from the distant end and another shows the picture at the near end. In single-monitor conferences the near end can be
viewed in a window using the monitor’s picture-in-picture feature.
Digitizing and encoding equipment compresses full motion video or creates freeze-frames. In addition, encryption equipment may be included for security. Other equipment can freeze a full motion display for a few seconds while the participants send a graphic image over the circuit. Sometimes digital storage equipment
enables participants to transmit presentation material ahead of time so graphics transmission does not waste conference time.
Graphics Equipment
Videoconference facilities may include graphics-generating equipment to construct diagrams. Some systems provide desktop computer input so that tools in the computer can be used for generating graphics.
Communications
Digital communication facilities are required for videoconferences. ISDN is required for connections over the PSTN. Two 64 Kbps B channels and a separate signaling channel provide enough bandwidth for an acceptable conference, but at least 384 Kbps is required for conference quality. Where an IP network with sufficient bandwidth and QoS is available, it is a good medium for conferences over the internal network. Frame relay is an excellent platform for videoconferencing if the access bandwidth is sufficient.
ITU-T H.320 Video Standards
Before the ITU-T H.320 standards were developed, manufacturers used proprietary standards, which meant that both ends of a videoconferencing session had to use the same equipment. Now, interoperability is assured by use of standards from the H.320 family. Table 22-2 lists the standards included under H.320. The amount of bandwidth supplied in the transmission facilities must be in multiples of 64 Kbps, known as P × 64 (pronounced P times 64). P can be from 1 to 30 64 Kbps channels, i.e., from a single DS-0 up to a full E1. Two options are offered. The full common intermediate format (CIF) offers frames of 288 lines by 352 pixels. This is approximately half the resolution of commercial television, which is 525 lines by 480 pixels. The second alternative is quarter CIF (QCIF), which is 144 lines by 176 pixels. The coding method is the discrete cosine transform (DCT) algorithm.
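The P × 64 rate ladder and the two picture formats can be tabulated directly from the figures above:

```python
# The H.320 "P x 64" rate ladder and the two H.261 picture formats.
RATES_KBPS = [p * 64 for p in range(1, 31)]  # P = 1 (one DS-0) to 30 (full E1)

CIF = (352, 288)    # pixels per line x lines per frame
QCIF = (176, 144)   # half the width and half the height of CIF

print(RATES_KBPS[0], RATES_KBPS[5], RATES_KBPS[-1])  # 64 384 1920
print((CIF[0] * CIF[1]) // (QCIF[0] * QCIF[1]))      # QCIF has 1/4 the pixels: 4
```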
H.323 Video Standards
The public Internet is not a sufficiently stable medium for reliable high-quality videoconferences, but for some it is good enough. Even better is the enterprise IP network or frame relay. The network can be designed with sufficient bandwidth and equipped with QoS protocols that make videoconferencing over IP an excellent alternative and considerably less costly than dial-up. We discussed H.323 signaling in Chapter 12. Those protocols work for either voice or video. The benefit of H.323 is its simplicity once it is set up and working. The standard RJ-45 Ethernet jack is ubiquitous, and can accept video without concern about how many channels are needed. A rollabout video unit can be hooked to any jack just like a laptop computer.
One of the driving applications is desktop videoconferencing. The equipment is contained in a standard desktop computer that has a small video camera mounted on top of the monitor. The conferencing equipment is mounted on a board that plugs into a computer expansion slot, or it is an external box that plugs into a board in the computer. If the network is designed for video, the result is an economical method of conducting a personal videoconference. Conferences are spontaneous, with no need to schedule a conference room. The addition of a picture to the call allows the parties to pick up the nonverbal content of a session by seeing expressions. The screen is generally too small and the camera angle too narrow for group conferences, but for small groups it is excellent. Most products permit the users to share and view computer files over the network. As desktop video gains acceptance, it will be an effective tool for enhancing the quality of telephone calls and for enabling users to collaborate on files.
VIDEO APPLICATION ISSUES
Entertainment is likely to remain the primary driving force behind video into the future, but business, healthcare, and educational uses of television will be increasingly important. As CATV provides a broadband information pipeline into a substantial portion of American households, the growth of nonentertainment services
is expected.
Security
Television security applications take two forms: alarm systems and closed circuit television (CCTV) for monitoring unattended areas from a central location. Many businesses use CCTV for intrusion monitoring, and it is also widely used for intraorganizational information telecasts. Alarm services have principally relied on telephone lines to relay alarms to a center, which requires a separate line or automatic dialer. The expense of these devices can be saved by routing alarms over a CATV upstream channel, but doing so requires a terminal to interface with the alarm unit. As described earlier, a computer in the headend scans the alarm terminals and forwards alarm information to a security agency as instructed by the user. H.323 video can also be used to deliver narrowband IP video for security monitoring.
IP Services
Many CATV companies offer two-way data communications over their systems. As discussed in Chapter 8, the cable company allocates bandwidth for upstream and downstream using the DOCSIS protocol. VoD is delivered over IP bandwidth, using upstream channels to order services and control the delivery.
Control Systems
Two-way CATV systems offer the potential of controlling many functions in households and businesses. For example, utilities can use the system to poll remote gas, electric, and water meters to save the cost of manual meter reading. Power companies can use the system for load control. During periods of high demand, electric water heaters can be turned off and restored when reduced demand permits. A computer at the headend can remotely control a variety of household services such as appliances and environmental equipment. CATV companies themselves can use the system to register channels that viewers are watching and to bill for service consumed. They also can use the equipment to control addressable converters to unscramble a premium channel at the viewers’ request.
Opinion Polling
Experiments with opinion polling over CATV have been conducted. For example, CATV has been used to enable viewers to evaluate the television program they have just finished watching. A system that allows viewers to watch a political body in action and immediately express their opinions has great potential in a participatory democracy.
Streaming Video
The QoS requirements that may preclude using the Internet for the video transmission medium do not apply to streaming video. Enterprises can use the Internet for one-way broadcasts to employees and customers. Streaming video has endless
applications in training, education, entertainment, and other such purposes.
Videoconferencing
Companies that had never considered videoconferencing are investigating it more closely as the economics become more compelling. The first line of justification is generally replacement for travel, but as organizations adopt video as a way of doing business, the need for economic justification disappears. Videoconferencing makes it economical for more people to participate directly because travel is eliminated. An added benefit where formal conference rooms are provided is scheduling discipline: when users must reserve a facility, meetings must begin and end on schedule.
Evaluation Considerations
As the cost of equipment drops, companies may find they have backed into the videoconferencing business without a plan. The result may be under-utilized equipment and network facilities, or the inverse, which is an overloaded network. Needs and expectations should be thoroughly assessed before embarking on
videoconferencing. The issues discussed in this section should be considered and the necessary controls imposed to increase the chances of success.
Type of Transmission Facilities
The initial issue to resolve is the telecommunications medium. BRI ISDN service is the default method where conferences use the PSTN. For higher bandwidths, inverse multiplexing may be required. Some units have the inverse multiplexer built in, while others use an external i-mux. Most digital PBXs can support PRI on the PSTN side and BRI toward the videoconferencing endpoints. Where IP bandwidth is available, H.323 video is an excellent alternative. The typical company in the market for H.323 video will have an existing internal network with spare capacity, or one to which capacity can be added inexpensively. A major advantage of IP conferencing is the fact that equipment can be relocated easily by plugging into a live Ethernet jack. The terminal equipment may be interchangeable for H.320 conferences. If external conferencing is required, an H.323-to-H.320 gateway will be needed. If the video and audio signals are separate across the gateway, lip sync problems are apt to result.
Single- or Multipoint Conferences
Videoconferences are classed as point-to-point or multipoint. With terrestrial facilities, the distance and number of points served have a significant effect on transmission costs. Large companies with a significant amount of multipoint conferencing can often justify the cost of an MCU. Companies that use multipoint conferencing only occasionally can use bridging services offered by the major
IXCs. The IXCs offer “meet-me” conferencing in which conferees dial into a bridge. The control unit receives inputs from all locations, and sends each location the image that has seized the transmitting channel. The transmitting channel is allocated by one of three methods: under control of the conference leader, under
time control in which each location gets a share of the time, or by switching automatically to the location that is currently talking. The latter method is the most common, but it requires a disciplined approach.
Videoconferencing Equipment
The following issues should be evaluated in selecting videoconferencing equipment:
• Will the equipment used be videoconferencing appliances or PC-based?
• Will the conference be set up in formal conference rooms or from desktops?
• What level of quality is needed? Is a highly compressed signal satisfactory? Does the system offer full 30 frames per second or some lower factor?
• Can the system support multiple video formats (NTSC, SECAM, PAL)?
• Does the system support still graphics?
• How easily can information be brought into the conference? Is information sharing fundamental to the equipment or is it an add-on?
• Will fixed or portable equipment be used? Do the applications require frequent equipment relocation?
• Does audio ride on the video facility? Is wideband audio available?
• Is single-point or multipoint communication required? If multipoint, will the user supply its own multipoint control unit or use facilities offered by the IXC?
System Integration
Videoconference equipment is sometimes an assembly of units made by different manufacturers. To ensure compatibility, it is advisable to obtain equipment from a vendor who can integrate it into a complete system.
Security
The type of information being transmitted over the channel must be considered. If proprietary information is discussed during conferences, encryption of both video and audio signals may be required, particularly if IP is the transmission protocol.
Public or Private Facilities
Private videoconference facilities have a significant advantage over public access systems. Public facilities are unavailable in many localities, which may preclude holding many videoconferences. The travel time to a public facility offsets some
of the advantages of videoconference. Unless a private facility is used frequently, however, public facilities are usually the most cost-effective option.
Today’s video systems bear little resemblance to cable television (CATV) systems of yore. Those delivered a dozen or so channels to households in areas that could not receive signals off the air. Today, cable systems deliver at least 70 channels with a combination of entertainment and information services, bringing
broad bandwidths into homes and businesses and threatening to upset the traditional method of providing telephone service. Over the past few years, cable companies have rebuilt their networks into two-way broadband systems. The Internet was the first application, but once that was completed, some of the bandwidth
earmarked for IP was available for video-on-demand. Cable has captured more than half the broadband access market in the United States, and now it is poised to begin competing with the LECs to deliver telephone service. As the cable providers prepare to compete with the telephone companies, they are faced with competition of their own from satellite providers. Satellite video is digital and delivers better signal quality than the analog channels that the cable providers offer. Most cable providers also offer digital channels, and the broadcasters are converting their analog signals to digital. The FCC has mandated a transition to digital by the end of 2006 for broadcasters that use the airways, but that deadline will probably be extended. While entertainment is the prevalent use of video today, videoconferencing is an important facility for businesses and desktop video will soon be routine.
Distance learning for schools and telemedicine for healthcare organizations are emerging applications. As with most technologies, the key to expansion has been the development of standards. For years, the videoconferencing manufacturers
produced equipment with proprietary standards, which meant they could not interoperate. Then ITU-T completed its work on H.320 standards, which enables equipment to interoperate over dial-up or dedicated digital facilities. For packet networks, H.323 standards enable video communication across IP and frame relay.
AT&T demonstrated its Picturephone at the 1964 New York World’s Fair, but it turned out not to be practical. Like many other product developments, however, it was merely ahead of its time. Conferencing equipment built into desktop computers is becoming a regular feature in most companies and many homes. With
the growth of broadband connections to most homes and businesses, video will become a common enhancement for telephone calls.
VIDEO TECHNOLOGY
Video signals in North America are generated under the National Television Systems Committee (NTSC) system. In Europe they are generated under two different and incompatible standards: PAL (phase alternating line) and SECAM (séquentiel couleur à mémoire).

An analog video signal is formed by scanning an image with a video camera. As the camera scans the image, it creates a signal that varies in voltage with variations in the degree of blackness of the image. In the NTSC system, the image is scanned in 525 horizontal lines, forming a raster composed of two fields of 262.5 lines each. The two fields are interlaced to form a frame (Figure 22-1). The frame repeats 30 times per second; the persistence of the human eye eliminates flicker. Since the two fields are interlaced, the screen is refreshed 60 times per second.

On close inspection, a video screen is revealed to be a matrix of tiny dots. Each dot is called a picture element, abbreviated pixel. The resolution of a television picture is a function of the number of scan lines and pixels per frame, both of which affect the amount of bandwidth required to transmit a television signal. The NTSC system requires 4.2 MHz of bandwidth for satisfactory resolution. Because of the modulation system used and the need for guard bands between channels, the FCC assigns 6 MHz of bandwidth to analog television channels.
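The raster arithmetic above can be checked directly. A short sketch using the nominal figures from the text (broadcast color NTSC actually runs at 30/1.001 ≈ 29.97 frames per second):

```python
# Nominal NTSC raster arithmetic, using the monochrome values from the text.
LINES_PER_FRAME = 525
FIELDS_PER_FRAME = 2          # interlaced: two fields per frame
FRAME_RATE = 30               # frames per second (nominal)

lines_per_field = LINES_PER_FRAME / FIELDS_PER_FRAME
field_rate = FRAME_RATE * FIELDS_PER_FRAME      # screen refreshes per second
line_rate = LINES_PER_FRAME * FRAME_RATE        # horizontal scans per second

print(lines_per_field)   # 262.5 lines per field
print(field_rate)        # 60 refreshes per second
print(line_rate)         # 15750 lines per second (nominal H rate)
```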
The signal resulting from each scan line varies between a black and a white voltage level (Figure 22-2a). A horizontal synchronizing pulse is inserted at the beginning of each line. Vertical pulses synchronize the frames as Figure 22-2b
shows. Between frames the signal is blanked during a vertical synchronizing interval to allow the scanning trace to return to the upper left corner of the screen. A color television signal has two parts: the luminance signal and the chrominance signal. A black and white picture has only the luminance signal, which controls
the brightness of the screen in step with the sweep of the horizontal trace. The chrominance signal modulates subcarriers that are transmitted with the video signal. The color demodulator in the receiver is synchronized by a color burst consisting of eight cycles of a 3.58 MHz signal that is applied to the horizontal
synchronizing pulse (Figure 22-2a). When no picture is being transmitted, the scanning voltage rests at the black level, and the television receiver's screen is black. Because the signal is amplitude-modulated analog, any noise pulses that are higher in level than the black signal level appear on the screen as snow. A high-quality transmission medium keeps the signal above the noise to preserve satisfactory picture quality.

The degree of resolution in a television picture depends on bandwidth. Signals sent through a narrow bandwidth are fuzzy with washed-out color. The channel also must be sufficiently linear. Lack of linearity results in high-level signals being amplified at a different rate than low-level signals, which affects picture contrast. Another critical requirement of the transmission medium is its envelope delay characteristic. Envelope delay is the difference in propagation speed of the various frequencies in the video passband. If envelope delay is excessive, the chrominance signal arrives at the receiver out of phase with the luminance signal, and color distortion results.
Digital Television
As with telephone transmission, analog impairments can be avoided by converting the signal to digital. An uncompressed studio-quality signal is about 270 Mbps, so efficient use of the bandwidth means the digital signal must be compressed. Using MPEG-2 compression, as many as 10 digital channels can be
squeezed into the 6 MHz space formerly occupied by one analog channel. This greater channel efficiency is the reason the FCC has mandated the conversion to digital TV. MPEG-2 is the same compression algorithm that DVDs use. Although compression causes a slight decrease in quality, it is hardly noticeable, and digital transmission overcomes many of the impairments, such as interference and ghosting, that affect analog transmission.
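The arithmetic behind the ten-to-one channel gain can be sketched as follows. The 38.8 Mbps payload figure assumes 256-QAM cable modulation (ITU-T J.83 Annex B), and 3.8 Mbps is a typical standard-definition MPEG-2 rate; neither number comes from the text.

```python
# How roughly ten MPEG-2 programs fit where one analog channel sat.
# Assumption: a 6 MHz cable channel with 256-QAM modulation carries
# about 38.8 Mbps of payload after forward error correction.
channel_payload_mbps = 38.8
mpeg2_program_mbps = 3.8        # typical standard-definition stream

programs = int(channel_payload_mbps // mpeg2_program_mbps)
print(programs)  # 10 programs per 6 MHz slot
```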
Video Compression
Compression algorithms rely on the fact that a video signal contains considerable redundancy. Often large portions of background do not change between frames, but analog systems transmit these anyway. By removing redundancy, digital signal
processing can compress a video signal into a reasonably narrow bandwidth. The signal can be compressed to occupy as little as 64 Kbps, but at that rate the quality falls short even for videoconferences, which need at least 384 Kbps, or six 64 Kbps channels, for reasonable quality. T1 bandwidth is needed for a signal approximately equal in quality to a home VCR.

Many standards, both public and proprietary, have been developed for video compression. ITU-T's H.261 is intended for videoconferencing and is discussed later in more detail. The Moving Picture Experts Group (MPEG), a working group of the International Organization for Standardization and the International Electrotechnical Commission, develops MPEG standards for encoding both video and audio. The principal standards of concern in telecommunications are MPEG-1, -2, and -4. Other standards describe multimedia content and delivery. MPEG-1 compresses a video signal into bandwidths of up to 1.5 Mbps. The resolution is 288 lines per frame, which is home VCR quality. It is satisfactory for broadcast use if the scene does not have too much action. MPEG-2 codes studio-quality video into bit streams of varying bandwidth. The most popular studio signal, known as D-1 or CCIR 601, is coded at 270 Mbps and can be compressed into about 3 Mbps. Scenes with high activity such as sports broadcasts require bit rates of about 5 or 6 Mbps. MPEG-3 was intended for HDTV, but it was discovered that MPEG-2 syntax was sufficient. MPEG-4 is an enhancement for multimedia transmission.

Video is compressed by eliminating redundancy and by predictive coding. At 30 frames per second, much in a moving picture does not change from frame to frame. Interframe encoding recognizes when portions of a frame remain constant and transmits only the changed portions: the picture is broken into blocks of 16 × 16 pixels, and a block is transmitted only when pixels within it have changed. Otherwise, the decoder retains the previous block and repeats it in each frame. (Intraframe encoding provides another element of compression by removing redundancy within a single frame.) Predictive coding analyzes the elements that are changing and predicts what the next frame will be. If the transmitting and receiving codecs both use the same prediction algorithm, only changes from the prediction must be transmitted, not the complete frame. Approximately every two seconds the entire frame is refreshed, but in intervening frames only the changed pixels are transmitted.

The use of predictive coding requires a high-quality transmission facility. Because of the predictive nature of the coding algorithm, lost packets in video-over-IP can have a detrimental effect beyond the information loss of one frame.

Encoding systems are classified as lossless or lossy. Information in a lossless coding system can be restored bit-for-bit. A lossy encoding system transmits only enough to retain intelligibility, but not enough to ensure the integrity of the received signal. Lossy systems work well for sending still scenes, but motion can cause a tiling or smearing effect as the receiving codec attempts to catch up with the transmitter. The higher the compression and the more vigorous the motion, the greater the smearing effect.
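The interframe block-transmission idea can be sketched in a few lines. This is an illustration only, omitting the motion compensation, transform coding, and quantization that real codecs add:

```python
# Minimal sketch of interframe "conditional replenishment": the frame is
# split into 16 x 16 pixel blocks, and only blocks that changed since the
# previous frame are sent.
BLOCK = 16

def changed_blocks(prev, curr):
    """Return (row, col) indices of 16 x 16 blocks that differ."""
    h, w = len(curr), len(curr[0])
    changed = []
    for r in range(0, h, BLOCK):
        for c in range(0, w, BLOCK):
            if any(prev[y][x] != curr[y][x]
                   for y in range(r, r + BLOCK)
                   for x in range(c, c + BLOCK)):
                changed.append((r // BLOCK, c // BLOCK))
    return changed

prev = [[0] * 64 for _ in range(64)]       # 64 x 64 frame = 4 x 4 blocks
curr = [row[:] for row in prev]
for y in range(16):                        # only the top-left block changes
    for x in range(16):
        curr[y][x] = 200

print(changed_blocks(prev, curr))  # [(0, 0)] -> 1 of 16 blocks sent
```

A real encoder would also compare against a small threshold rather than exact equality, so that noise alone does not force block retransmission.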
High Definition Television (HDTV)
The present NTSC television standard was defined in 1941 when 525-line resolution was considered excellent quality and when such technologies as large-scale integration were hardly imagined. The standard was advanced for its time, but it is far from the present state of the art. Larger cities are running out of broadcast
channel capacity. Although not all channels are filled, co-channel interference prevents the FCC from assigning all available channels. Large television screens are becoming more the rule than the exception, and at close range the distance between scan lines is disconcerting. The 525 scan lines of broadcast television and the 4:3 width-to-height ratio of the screen, called its aspect ratio, limit picture quality. With wide-screen movies and the growing popularity of large-screen television sets, the definition of the current scanning system is considerably less than that of the original image. All of this leads to a transition to HDTV.

HDTV has been introduced in most of the larger markets in the United States, but it has not been widely accepted yet because it requires replacement of existing television sets. Table 22-1 compares NTSC video with the new standard, which is known as Advanced Television Systems Committee (ATSC). The resolution of HDTV is far superior to NTSC: it has six times the number of pixels and supports wide-screen television.
Video on Demand
A driving factor for digital television is video on demand (VoD). Today’s CATV systems deliver all channels to every residence, and unwanted or unauthorized channels are trapped out or scrambled. VoD refers to a broad spectrum of services that are delivered over a broadband medium. Entertainment, information, and education services are examples of VoD services that the service provider transmits on order. Subscribers can choose movies or programs they want via an onscreen menu, and control the sessions with VCR-like functions such as stop, fast-forward, and rewind. VoD may save a trip to the neighborhood video rental store, but the main question is delivery. The two principal delivery methods are via broadband cable or over DSL. Speed is the limiting factor with DSL. Reasonable quality can be obtained
with the popular ADSL, but studio-quality HDTV requires at least 3 Mbps, which calls for VDSL. The limiting factor here is range, which is about 4000 ft (1200 m). Delivery over cable is not too different from a range standpoint; however, to achieve the broadband speeds required without compromising quality, the streaming video signal must be brought to a neighborhood center over fiber. Cable providers offer VoD to digital service subscribers, which so far are a minority. Digital television sets are readily available, but for now most television sets are analog, which means a converter is required. Video servers can also be an obstacle. The amount of data that must be delivered to fill digital pipes to thousands of simultaneous users requires servers capable of storing many terabytes of data.
CABLE TELEVISION SYSTEMS
Cable television systems have three major components: headend equipment, trunk cable, and feeder and drop cable. Figure 22-3 is a block diagram of a conventional CATV system. All channels that originate at the headend are broadcast to all stations, which means the operator must scramble or block premium channels the user is not paying for. Headend equipment generates local video signals and receives signals from a variety of sources including off-the-air broadcasts, communication satellites, and microwave relay or fiber-optic connections from other providers. The analog signals from the headend are modulated to a channel within the bandwidth of the cable, which may be as great as 1 GHz. Headend equipment applies the signal to a trunk cable to carry the signal to local distribution systems.

The trunk cable is equipped with broadband amplifiers that are equalized to carry the entire bandwidth. Amplifiers have about 20 dB of gain and are placed at intervals of approximately 500 m. Amplifiers known as bridgers couple the signal to feeder cables. Amplifiers contain automatic gain control circuitry to compensate for variations in cable loss. Power is applied to amplifiers over the coaxial center conductor. To continue essential services during power outages and amplifier failures, the cable operator provides redundant amplifiers and backup battery supplies. Because the CATV signals operate on the same frequencies as many radio services, the cable and amplifiers must be free of signal leakage, since a leaking cable can interfere with another service or vice versa. As with other analog transmission media, noise and distortion are cumulative through successive amplifier stages. Analog impairments limit the serving radius of 70-channel CATV to about 8 km from the headend.

Bridger amplifiers split feeder or distribution cable from the trunk cable. Multiple feeders are coupled with splitters or directional couplers, which match the impedance of the cables.
The feeder cable is smaller and less expensive and has higher loss than trunk cable. Subscriber drops connect to the feeder cable through taps, which are passive devices that isolate the feeder cable from the drop. The tap must have enough isolation that shorts and opens at the television set do not affect other signals on the cable.
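The trunk figures above imply a simple noise budget. A sketch, assuming unity-gain spans (cable loss per 500 m span equal to the 20 dB amplifier gain) and identical amplifiers, whose noise contributions add as 10 * log10(N):

```python
import math

# Trunk-cascade sketch using the figures in the text: 20 dB amplifiers
# spaced every 500 m, headend serving radius about 8 km. Cascading N
# identical amplifiers raises the noise floor by 10*log10(N) dB, one
# reason the reach of analog CATV is limited.
SPAN_M = 500
GAIN_DB = 20          # assumed equal to the cable loss per span
RADIUS_M = 8000

n_amps = RADIUS_M // SPAN_M
snr_penalty_db = 10 * math.log10(n_amps)

print(n_amps)                     # 16 amplifiers in cascade
print(round(snr_penalty_db, 1))   # 12.0 dB more noise than one amplifier
```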
Hybrid Fiber Coax (HFC)
As cable technology improved, the bandwidth increased and more channels were added. Some operators added upstream channels to provide additional revenue-generating services such as alarm monitoring, but the real revolution came with Internet access. Although the bulk of Internet traffic flow is downstream, a substantial amount of upstream bandwidth is needed. The conventional CATV model has either limited or no upstream bandwidth, and routing every channel past every subscriber does not work for Internet access because of the shared nature of the medium. As Internet traffic increases, the response time increases, generally in proportion to the number of subscribers.

The answer is to rebuild cable networks with an HFC model similar to the one in Figure 22-4. Both entertainment and access bandwidth are brought to neighborhood centers on a fiber backbone. The entertainment channels are applied to the distribution cable as always, but the Internet channels are shared among a smaller number of subscribers. The response time on the shared portion of the bandwidth is limited by controlling the number of subscribers that a node serves.

The HFC architecture gives cable companies control over shared bandwidth to make it suitable for other services. One logical candidate is telephone service. Once the cable infrastructure is in place, the incremental per-subscriber cost of VoIP is small. As shown in Figure 22-4, the IP bandwidth is delivered to a router at the headend. Voice packets are routed to a media gateway, which is controlled by a softswitch that can be located anywhere. This method of delivering telephone service has certain advantages that exist today by regulatory fiat, but may not persist. Congress has elected to exempt IP services from the many taxes and fees that it imposes on telephone service. Although it is impossible to foresee what Congress will do in the future, nothing suggests that this exemption is permanent. Another issue is exemption from equal access.
Today the cable companies are not required to permit equal access to their facilities, which means competing service providers cannot rely on cable as an access medium. Either of these would affect the profitability of local telephone service over cable.
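The effect of node size on shared Internet bandwidth can be illustrated with a rough calculation. The downstream channel capacity used here assumes 256-QAM modulation and is an illustration, not a figure from the text:

```python
# Why HFC limits the number of subscribers per node: the downstream IP
# channel is shared, so the average per-subscriber throughput falls as
# node size grows.
CHANNEL_MBPS = 38.8   # assumed payload of one 256-QAM downstream channel

def kbps_per_subscriber(subscribers):
    """Average share of one downstream channel, in Kbps."""
    return CHANNEL_MBPS * 1000 / subscribers

for node_size in (2000, 500, 125):
    print(node_size, round(kbps_per_subscriber(node_size)), "Kbps average")
```

Averages overstate the pain somewhat, since not all subscribers are active at once, but the trend is the point: smaller nodes mean better response time.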
VIDEOCONFERENCING EQUIPMENT
Videoconferencing has considerable appeal, primarily as a substitute for travel, but it has yet to become a mainstream application. The primary drawback has been cost. Not long ago, the cameras, monitors, codec, and formal conference room setup could easily exceed $100,000, and the payback was difficult to quantify. Recently, equipment costs have dropped to a fraction of their previous levels, and a conference unit can now be placed almost anywhere. The transmission costs, however, remain high. A reasonable videoconference requires 384 Kbps, or six 64 Kbps B channels (three BRI lines). Add usage costs to that and conferences, particularly multipoint conferences, are still expensive.

A two-way videoconference over IP requires about 400 Kbps of full-duplex bandwidth. With a stable private network in place, the quality is as good as ISDN. The Internet is not a suitable medium where true conference-quality video is required, so most external conferences will need an ISDN gateway.
A videoconference facility has some or all of the following subsystems integrated into a unit:
• Video codec
• Audio equipment
• Video production and control equipment
• Graphics equipment
• Document hard copy equipment
• Communications
A full description of these systems is beyond the scope of this book, but we will discuss them briefly here to illustrate the composition of a full videoconferencing facility. Personal video communications equipment is readily available and is far less elaborate than a formal conference facility. If the network has sufficient bandwidth and QoS, it will likely be the application that enables videoconferencing to fulfill its promise.
Video Codec
The codec converts the analog video signal to digital and compresses it for transmission. At the distant end the process is reversed. Codecs for dial-up conferences must support H.261. Many also support H.263 coding for IP conferences.
Audio Equipment
Most analysts believe that audio equipment is the most important part of a videoconference facility. In large videoconferences it is often impossible to show all participants, but it is important that everyone hear and be heard clearly. Audio equipment consists of microphones, speakers, and amplifiers placed strategically
around the room. Sometimes speakerphones are used, but with less satisfactory results than codec audio. The codec robs a portion of the bandwidth to transmit the audio, so some products allow the operator to reduce the amount of audio bandwidth as a way of improving the video. When IP conferencing is used, the audio must be multiplexed on the bit stream because lip sync cannot be preserved with a separate audio channel.
Video and Control Equipment
Video equipment consists of two or three cameras and associated control equipment. The main camera usually is mounted at the front of the room and often automatically follows the voice of the speaker. Zoom, tilt, and azimuth controls are mounted on a console, where the conference participants can control them from a panel with a joystick. A second camera mounts overhead for graphic displays. The facility sometimes includes a third mobile camera that is operated independently. A switch at the console operator’s position selects the camera. Usually, one monitor shows the picture from the distant end and another shows the picture at the near end. In single-monitor conferences the near end can be
viewed in a window using the monitor’s picture-in-picture feature.
Digitizing and encoding equipment compresses full motion video or creates freeze-frames. In addition, encryption equipment may be included for security. Other equipment can freeze a full motion display for a few seconds while the participants send a graphic image over the circuit. Sometimes digital storage equipment
enables participants to transmit presentation material ahead of time so graphics transmission does not waste conference time.
Graphics Equipment
Videoconference facilities may include graphics-generating equipment to construct diagrams. Some systems provide desktop computer input so that tools in the computer can be used for generating graphics.
Communications
Digital communication facilities are required for videoconferences. ISDN is required for connections over the PSTN. Two 64 Kbps B channels and a separate signaling channel provide enough bandwidth for an acceptable conference, but at least 384 Kbps is required for conference quality. Where an IP network with sufficient bandwidth and QoS is available, it is a good medium for conferences over the internal network. Frame relay is an excellent platform for videoconferencing if the access bandwidth is sufficient.
ITU-T H.320 Video Standards
Before the ITU-T H.320 standards were developed, manufacturers used proprietary standards, which meant that both ends of a videoconferencing session had to use the same equipment. Now, interoperability is assured by use of standards from the H.320 family. Table 22-2 lists the standards included under H.320.

The amount of bandwidth supplied in the transmission facilities must be in multiples of 64 Kbps, known as P × 64 (pronounced P times 64). P can range from 1 to 30 64-Kbps channels, i.e., from a single DS-0 up to a full E1. Two picture formats are offered. The full common intermediate format (CIF) offers frames of 288 lines by 352 pixels. This is approximately half the resolution of commercial television, which is 525 lines by 480 pixels. The second alternative is quarter CIF (QCIF), which is 144 lines by 176 pixels. The coding method is based on the discrete cosine transform (DCT) algorithm.
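A quick calculation shows why H.320 codecs must compress so aggressively. The 12-bits-per-pixel figure assumes 4:2:0 chroma sampling, which is typical of H.261-class codecs but is an assumption here:

```python
# Raw CIF video at 30 frames/s versus a P = 6 (384 Kbps) connection.
width, height = 352, 288        # CIF frame
bits_per_pixel = 12             # 4:2:0 sampling: 8 luma + 4 chroma average
fps = 30

raw_bps = width * height * bits_per_pixel * fps
channel_bps = 6 * 64_000        # P x 64 with P = 6

print(raw_bps)                          # 36495360 bps of raw video
print(round(raw_bps / channel_bps))     # ~95:1 compression required
```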
H.323 Video Standards
The public Internet is not a sufficiently stable medium for reliable high-quality videoconferences, but for some it is good enough. Even better is the enterprise IP network or frame relay. The network can be designed with sufficient bandwidth and equipped with QoS protocols that make videoconferencing over IP an excellent alternative and considerably less costly than dial-up. We discussed H.323 signaling in Chapter 12. Those protocols work for either voice or video.

The benefit of H.323 is its simplicity once it is set up and working. The standard RJ-45 Ethernet jack is ubiquitous, and can accept video without concern about how many channels are needed. A rollabout video unit can be hooked to any jack just like a laptop computer.

One of the driving applications is desktop videoconferencing. The equipment is contained in a standard desktop computer that has a small video camera mounted on top of the monitor. The conferencing equipment is mounted on a board that plugs into a computer expansion slot, or it is an external box that plugs into a board in the computer. If the network is designed for video, the result is an economical method of conducting a personal videoconference. Conferences are spontaneous, with no need to schedule a conference room. The addition of a picture to the call allows the parties to pick up the nonverbal content of a session by seeing expressions. The screen is generally too small and the camera angle too narrow for group conferences, but for small groups it is excellent. Most products permit the users to share and view computer files over the network. As desktop video gains acceptance, it will be an effective tool for enhancing the quality of telephone calls and for enabling users to collaborate on files.
VIDEO APPLICATION ISSUES
Entertainment is likely to remain the primary driving force behind video into the future, but business, healthcare, and educational uses of television will be increasingly important. As CATV provides a broadband information pipeline into a substantial portion of American households, the growth of nonentertainment services
is expected.
Security
Television security applications take two forms: alarm systems and closed-circuit television (CCTV) for monitoring unattended areas from a central location. Many businesses use CCTV for intrusion monitoring, and it is also widely used for intraorganizational information telecasts. Alarm services have principally relied on telephone lines to relay alarms to a center, which requires a separate line or automatic dialer. The expense of these devices can be saved by routing alarms over a CATV upstream channel, but doing so requires a terminal to interface with the alarm unit. As described earlier, a computer in the headend scans the alarm terminals and forwards alarm information to a security agency as instructed by the user. H.323 video can also be used to deliver narrowband IP video for security monitoring.
IP Services
Many CATV companies offer two-way data communications over their systems. As discussed in Chapter 8, the cable company allocates bandwidth for upstream and downstream using the DOCSIS protocol. VoD is delivered over IP bandwidth, using upstream channels to order services and control the delivery.
Control Systems
Two-way CATV systems offer the potential of controlling many functions in households and businesses. For example, utilities can use the system to poll remote gas, electric, and water meters to save the cost of manual meter reading. Power companies can use the system for load control. During periods of high demand, electric water heaters can be turned off and restored when reduced demand permits. A computer at the headend can remotely control a variety of household services such as appliances and environmental equipment. CATV companies themselves can use the system to register channels that viewers are watching and to bill for service consumed. They also can use the equipment to control addressable converters to unscramble a premium channel at the viewers’ request.
Opinion Polling
Experiments with opinion polling over CATV have been conducted. For example, CATV has been used to enable viewers to evaluate the television program they have just finished watching. A system that allows viewers to watch a political body in action and immediately express their opinions has great potential in a participatory democracy.
Streaming Video
The QoS requirements that may preclude using the Internet for the video transmission medium do not apply to streaming video. Enterprises can use the Internet for one-way broadcasts to employees and customers. Streaming video has endless
applications in training, education, entertainment, and other such purposes.
Videoconferencing
Companies that had never considered videoconferencing are investigating it more closely as the economics become more compelling. The first line of justification is generally replacement for travel, but as organizations adopt video as a way of doing business, the need for economic justification disappears. Videoconferencing makes it economical for more people to participate directly because travel is eliminated. A major advantage of videoconferencing where formal conference rooms are provided is scheduling. When users must reserve a facility, meetings must
begin and end on schedule, which is an added benefit.
Evaluation Considerations
As the cost of equipment drops, companies may find they have backed into the videoconferencing business without a plan. The result may be under-utilized equipment and network facilities, or the inverse, which is an overloaded network. Needs and expectations should be thoroughly assessed before embarking on
videoconferencing. The issues discussed in this section should be considered and the necessary controls imposed to increase the chances of success.
Type of Transmission Facilities
The initial issue to resolve is the telecommunications medium. BRI ISDN service is the default method where conferences use the PSTN. For higher bandwidths, inverse multiplexing may be required. Some units have the inverse multiplexer built in, while others use an external i-mux. Most digital PBXs can support PRI on the PSTN side and BRI toward the videoconferencing endpoints.

Where IP bandwidth is available, H.323 video is an excellent alternative. The typical company in the market for H.323 video will have an existing internal network with spare capacity, or one to which capacity can be added inexpensively. A major advantage of IP conferencing is the fact that equipment can be relocated easily by plugging into a live Ethernet jack. The terminal equipment may be interchangeable for H.320 conferences. If external conferencing is required, an H.323-to-H.320 gateway will be needed. If the video and audio signals are separate across the gateway, lip sync problems are apt to result.
Single- or Multipoint Conferences
Videoconferences are classed as point-to-point or multipoint. With terrestrial facilities, the distance and number of points served have a significant effect on transmission costs. Large companies with a significant amount of multipoint conferencing can often justify the cost of an MCU. Companies that use multipoint conferencing only occasionally can use bridging services offered by the major
IXCs. The IXCs offer “meet-me” conferencing in which conferees dial into a bridge. The control unit receives inputs from all locations and sends each location the image from the site that has seized the transmitting channel. The transmitting channel is allocated by one of three methods: under the control of the conference leader, under time control in which each location gets a share of the time, or by switching automatically to the location that is currently talking. The last method is the most common, but it requires a disciplined approach by the participants.
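The voice-activated method can be sketched as follows. This is a hypothetical illustration of the switching logic, not any vendor's MCU algorithm; the hold-over threshold shows why discipline matters, since without it the bridge would flip between sites on every interjection:

```python
# Illustrative voice-activated switching for a multipoint bridge:
# broadcast the image of the site that is currently talking, but
# switch only when another site is clearly louder than the current one.
# The 6 dB threshold is an assumed figure, not a standard value.

def select_broadcaster(audio_levels, current, threshold_db=6):
    """audio_levels: dict mapping site name -> audio level in dB.
    Returns the site whose image the bridge should transmit."""
    loudest = max(audio_levels, key=audio_levels.get)
    if (loudest != current and
            audio_levels[loudest] - audio_levels.get(current, 0) > threshold_db):
        return loudest
    return current

# The current site keeps the channel against a marginally louder site...
print(select_broadcaster({"NY": 40, "Chicago": 42, "LA": 20}, current="NY"))
# ...but loses it when another site is clearly louder.
print(select_broadcaster({"NY": 30, "Chicago": 42, "LA": 20}, current="NY"))
```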
Videoconferencing Equipment
The following issues should be evaluated in selecting videoconferencing equipment:
• Will the equipment used be videoconferencing appliances or PC-based?
• Will the conference be set up in formal conference rooms or from desktops?
• What level of quality is needed? Is a highly compressed signal satisfactory? Does the system offer full 30 frames per second or some lower rate?
• Can the system support multiple video formats (NTSC, SECAM, PAL)?
• Does the system support still graphics?
• How easily can information be brought into the conference? Is information sharing fundamental to the equipment or is it an add-on?
• Will fixed or portable equipment be used? Do the applications require frequent equipment relocation?
• Does audio ride on the video facility? Is wideband audio available?
• Is single-point or multipoint communication required? If multipoint, will the user supply its own multipoint control unit or use facilities offered by the IXC?
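The frame-rate question in the list above has a direct bandwidth consequence: at the same per-frame quality, running the codec below 30 fps reduces the bit rate roughly in proportion. A rough, illustrative calculation (the figures are ballpark, not vendor specifications):

```python
# Rough rule of thumb: video bit rate scales approximately with
# frame rate at constant per-frame quality. Illustrative only.

def video_kbps(full_rate_kbps, fps, full_fps=30):
    """Approximate bit rate when the codec runs below full frame rate."""
    return full_rate_kbps * fps / full_fps

print(video_kbps(384, 30))  # 384.0 kbps at full motion
print(video_kbps(384, 15))  # 192.0 kbps at 15 fps
```

This is why a lower frame rate can make conferencing feasible on a single BRI, at the cost of jerkier motion.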
System Integration
Videoconference equipment is sometimes an assembly of units made by different manufacturers. To ensure compatibility, it is advisable to obtain equipment from a vendor who can integrate it into a complete system.
Security
The type of information being transmitted over the channel must be considered. If proprietary information is discussed during conferences, encryption of both video and audio signals may be required, particularly if IP is the transmission protocol.
Public or Private Facilities
Private videoconference facilities have a significant advantage over public access systems. Public facilities are unavailable in many localities, which may preclude holding many videoconferences. The travel time to a public facility also offsets some of the advantages of videoconferencing. Unless a private facility is used frequently, however, public facilities are usually the most cost-effective option.
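The "used frequently" qualification can be made concrete with a simple break-even comparison between the fixed cost of owning a room and the per-session fee for a public facility. The dollar figures below are hypothetical inputs, not quoted rates:

```python
# Break-even analysis: private room (fixed monthly cost) versus
# public facility (per-conference fee). Figures are hypothetical.

def breakeven_conferences_per_month(private_monthly_cost, public_fee_per_conf):
    """Conferences per month at which a private facility
    becomes cheaper than renting public facilities."""
    return private_monthly_cost / public_fee_per_conf

# e.g., a $3,000/month private room versus $500 per public session:
print(breakeven_conferences_per_month(3000, 500))  # 6.0
```

Below the break-even volume, the public option wins on cost, though travel time and availability still weigh against it.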