Method and system for conducting video conferences

FIELD: communications engineering; possible use for conducting multipoint video/audio conferences that use transport protocols of the TCP/IP stack in a star topology (point-to-multipoint) over broadcasting networks.

SUBSTANCE: method results in reduction of the required total traffic capacity of the data transmission channels between remote videoconferencing terminals and the server for setting up multipoint video conferences (MCU) due to the use of filtering mechanisms for the data streams which transmit video images and voice information.

EFFECT: increased efficiency.

2 cl, 6 dwg

 

The invention relates to communications engineering, and more particularly to a method and system for conducting multipoint video/audio conferences that use transport protocols of the TCP/IP stack in a star (point-to-multipoint) topology over broadcasting networks.

Many different videoconferencing systems currently exist that implement corresponding methods. In each of them, an indispensable condition is the use of a codec that converts the image signal into a digital signal.

For example, patent EP 0353945, class H04N 7/15, dated 07.02.1990, "Method of establishing a multipoint audio/video connection", describes a specialized system designed to work with a specific satellite system, AT&T SKYNET, or with similar systems, in which the terminal devices (codecs) are connected directly to a satellite modem, optionally through an intermediate encryption device.

However, these known solutions do not make it possible to organize multipoint video conferences using standard videoconferencing equipment that relies on transport protocols of the TCP/IP stack and on standard signaling protocols for establishing connections, such as H.323, SIP, SCCP and others adopted in IP networks. In addition, they do not allow the videoconferencing equipment to be placed in an arbitrary IP network served by a satellite station, at a considerable distance from that station. Moreover, even with sufficient bandwidth, a broadcasting satellite channel cannot simultaneously serve several different video conferences involving different head-end and terminal devices (codecs).

Solutions are known that implement multipoint broadcasting using the industry-standard multicast technology, for example as described in French patent 2780229, class H04N 7/15, dated 24.12.1999, "Multipoint system for voice conferencing". When a multipoint video conference is organized using multipoint conferencing servers (hereinafter MCU, Multipoint Control Unit; called an assembly server in that patent), audio/video information is transmitted from the terminal devices (terminals, or codecs) of the participants to the MCU, processed on the MCU, and the integrated audio/video stream is sent from the MCU to the target terminals.

Multicast (RFC 1112) refers to a specially allocated range of IP addresses intended for group (point-to-multipoint) delivery: when an address of this type is used, the same data packet can be received by several devices. Data packets are sent only in the direction of recipient devices that have expressed interest in receiving them. In networks with complex topology, packets with multicast addresses may be duplicated (replicated) by routers where recipients are present. To route packets addressed to a multicast group, routing protocols are used that differ from the routing protocols for unicast connections, in which packets are addressed to a specific device rather than to a group of devices (point-to-point links).
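To make the group-delivery mechanism concrete, the following minimal sketch shows how a receiver can join a multicast group and read datagrams sent to it, using Python's standard socket module; the group address and port are arbitrary placeholders, not values prescribed by the invention.

    import socket
    import struct

    GROUP = "239.1.2.3"   # placeholder multicast group address
    PORT = 5004           # placeholder UDP port for an RTP-like stream

    # Create a UDP socket and allow several receivers on the same host/port.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))

    # Ask the local stack (and, via IGMP, the routers) to deliver packets sent to the group.
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP), socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

    while True:
        data, sender = sock.recvfrom(2048)
        print(f"received {len(data)} bytes of the shared stream from {sender}")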

In such a scheme, the integrated data streams sent to the individual terminals are identical in content and differ only in the identity of the recipient (in TCP/IP networks this identity is the combination of IP address/protocol/port number). As a result, in transport networks (for example, satellite networks with a head-end station, or cable TV networks supporting TCP/IP), the bandwidth from the MCU is used very inefficiently: the MCU sends into the broadcast network as many identical data streams as there are participants, one per participant; all of these streams reach every target terminal, yet each terminal actually accepts only one of them.

This situation is not critical for modern high-speed local area networks, but it becomes very significant when, for example, rather expensive satellite channels are used. Medium-quality videoconferencing requires a bandwidth of 512 kbit/s per subscriber, so a video conference with 10 participants requires about 5 Mbit/s of bandwidth in the direction from the MCU; moreover, this capacity is actually needed only for the relatively short duration of the conference and stands idle the rest of the time.

Another significant drawback of such systems is that multipoint broadcasting relies on standard multicast technology, which makes it impossible to broadcast a conference using an MCU that does not support multicast, and does not allow devices of a different class (H.323 or SIP MCUs) that lack multicast support to be used in this topology.

Patent RU 2240657 C1 proposes a method and system for conducting video conferences that avoids these disadvantages; however, it has a number of significant limitations:

1) All videoconferencing terminals participating in the conference receive the same video/audio stream regardless of the conference mode, which is extremely inconvenient for voice-activated conferences. In that mode the server sends all participants the image of the participant who is speaking at the given moment, while the speaker, in turn, receives the image of the participant who spoke before him; thus all participants have the impression of being engaged in a dialogue. The limitation is that, with this technology, the participant who is currently speaking receives his own image.

2) The number of videoconferencing terminals participating in the conference is limited by the number of connections supported by the MCU, which is inefficient from the point of view of multicast usage, since a considerably larger number of terminals could receive the video/audio streams without increasing the required bandwidth from the MCU to the terminals.

3) The proposed method provides no opportunity to save bandwidth on the reverse communication channels, i.e. in the direction from the videoconferencing terminals to the MCU, which is very critical for networks based on satellite communication channels. Thus, for the above example of a conference with 10 participants connected at 512 kbit/s each, a satellite resource is required that can transfer data at 512 kbit/s from the MCU to each of the terminals and at 10·512 kbit/s in the opposite direction. The total bandwidth used for a video conference with four subscribers simultaneously present on the screen (in this mode an identical video stream is transmitted to all terminals in the network, unlike the voice-activated mode) therefore amounts to 5.5 Mbit/s. This gives an advantage compared with the traditional technology, but still uses the transmission medium inefficiently, since of the 10 reverse video/audio streams only the 4 from which the image transmitted by the server is formed are actually needed.

The task addressed by the claimed invention is to provide a method and system for conducting video conferences that are free of these shortcomings.

The technical result achieved by solving this task is a significant reduction in the required total bandwidth of the data transmission channels between the remote videoconferencing (VC) terminals and the multipoint videoconferencing server (MCU), through the use of filtering mechanisms for the data streams that carry video and voice information.

Another technical result achieved by the claimed invention is the ability to serve a much larger (almost unlimited) number of videoconferencing terminals that only receive video/audio from the server ("passive" mode) than the multipoint videoconferencing server itself supports; the server's capacity limits only the number of terminals operating in "active" mode (both receiving and transmitting information). This is achieved using mechanisms that modify the signaling information.

A further technical result achieved by the claimed invention is the ability to change the set of "passive"/"active" participants not only when the conference is organized, but also while it is being held.

These technical results are achieved by the proposed method and system for conducting a video conference according to the independent claims, as well as the dependent claims, in all the alternatives set out in the claims.

The basic principle used in the present technology is forced filtering of all audio/video data streams sent by the MCU into the broadcast network towards the subscriber devices (codecs), except for the streams that carry information different from the others, combined with address translation on the receiving side so that each subscriber codec subsequently accepts the resulting data stream as if it were addressed to it. The reverse channels used for transmitting data from the subscriber codecs to the MCU can be organized in any manner; this is not essential for the proposed technology, since the information transmitted over them is unique to each subscriber codec.
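As an illustration only (the addresses, ports and function names below are hypothetical and do not form part of the claimed method), the central-node side of this principle can be sketched as a translation table keyed by the destination identity of each MCU output flow: flows carrying distinct content are redirected to the multicast group, and everything else is dropped before it enters the broadcast network.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class FlowKey:
        """A flow is identified by destination IP address, protocol and port."""
        dst_ip: str
        proto: str
        dst_port: int

    # Hypothetical translation table: only flows carrying distinct content are kept
    # and redirected to the multicast group, on the port the codecs listen on.
    TRANSLATE = {
        FlowKey("10.0.0.21", "udp", 49170): ("239.1.2.3", 5004),  # example video flow
        FlowKey("10.0.0.21", "udp", 49172): ("239.1.2.3", 5006),  # example audio flow
    }

    def handle_mcu_packet(key: FlowKey, payload: bytes):
        """Forward a packet from the MCU into the broadcast network, or filter it out."""
        target = TRANSLATE.get(key)
        if target is None:
            return None                      # duplicate flow: filtered, never transmitted
        new_ip, new_port = target
        return (new_ip, new_port, payload)   # one shared copy, readdressed to the multicast group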

According to the proposed method of conducting a video conference between at least three videoconferencing devices:

- TCP/IP is used as the transport protocol;

- a network that supports the delivery of messages to multicast addresses is used as the transport network;

- at the central node, a multipoint conferencing server is used for processing and generating the audio/video streams, supporting the standard signaling protocols for establishing connections for the exchange of audio/video information in IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol;

- at the peripheral points, videoconferencing devices are used as the target devices, supporting the standard signaling protocols for establishing connections for the transfer of audio/video streams in IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol;

- at the central node, the addresses of the audio/video streams coming from the videoconferencing server are translated so that, for each stream that carries information different from the others, the destination address is replaced by the address of a multicast group and the port by the one on which the videoconferencing devices expect to receive the corresponding type of audio/video information, while the remaining streams are filtered out and not passed into the transport network;

- at the peripheral points, the addresses of the audio/video streams coming from the central node are translated so that, for the streams that should be displayed at a given point at a given time, the destination address is replaced by the address of the videoconferencing device installed at that point;

- at the peripheral points, the audio/video streams are filtered so that only a specified set of points relays them towards the central node, while all other points filter out their audio/video streams and do not pass them into the transport network (a simplified sketch of these peripheral-point rules is given after this list).
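The peripheral-point rules listed above can be sketched as follows (a simplified illustration; the device address and the active/passive flag are hypothetical configuration values, and in practice these rules are realised by the routing, filtering and address-translation units described further below).

    # Hypothetical configuration of one peripheral point.
    LOCAL_CODEC = ("192.168.1.10", 5004)    # address/port of the videoconferencing device here
    MULTICAST_GROUP = "239.1.2.3"
    POINT_IS_ACTIVE = False                 # only "active" points may send towards the centre

    def from_center(dst_ip: str, dst_port: int, payload: bytes):
        """Multicast -> unicast: readdress the shared stream to the local codec."""
        if dst_ip == MULTICAST_GROUP:
            ip, port = LOCAL_CODEC
            return (ip, port, payload)      # the codec accepts the stream as addressed to it
        return None                         # streams not meant for this point are ignored

    def towards_center(payload: bytes):
        """Reverse direction: passive points filter out their own audio/video entirely."""
        if POINT_IS_ACTIVE:
            return payload                  # relayed towards the central node
        return None                         # filtered, never enters the transport network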

At the peripheral points, for each stream carrying information that differs from the others, the destination address may likewise be replaced by the address of a multicast group and the port by the one on which the videoconferencing server expects to receive the corresponding type of audio/video information.

The audio/video streams are delivered to the videoconferencing device connected at a subscriber point according to the standard protocols (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are addressed with different combinations of IP address/port/protocol.

The number of terminals connected to the conference may be greater than the number of ports of the videoconferencing server.

For connecting the videoconferencing devices, a signaling protocol may be used that differs from the one used by the videoconferencing server.

The set of target videoconferencing devices can be changed during the conference.

During the conference, different audio/video streams may be translated to the address of a given videoconferencing device at different points in time.

The maximum number of videoconferencing devices participating in the same conference does not depend on the number of clients supported by the videoconferencing server.

Any network that supports broadcasting in accordance with TCP/IP may serve as the transport network, for example a satellite network, a data network based on a cable TV network, or a city broadband access network.

Also proposed is a system for conducting a video conference between at least three videoconferencing devices, in which:

- TCP/IP is used as the transport protocol;

- a network that supports the delivery of messages to multicast addresses is used as the transport network;

- at the central node, a multipoint conferencing server is used for processing and generating the audio/video streams, supporting the standard signaling protocols for establishing connections for the exchange of audio/video information in IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol;

- at the peripheral points, videoconferencing devices are used as the target devices, supporting the standard signaling protocols for establishing connections for the transfer of audio/video streams in IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol;

- address translation at the central node for the audio/video streams coming from the videoconferencing server is arranged so that, for each stream carrying information different from the others, the destination address is replaced by the address of a multicast group and the port by the one on which the videoconferencing devices expect to receive the corresponding type of audio/video information, while streams carrying identical information are filtered out and not passed into the transport network;

- address translation at the peripheral points for the audio/video streams coming from the central node is arranged so that, for the streams that should be displayed at a given point at a given time, the destination address is replaced by the address of the videoconferencing device installed at that point;

- filtering of the audio/video streams at the peripheral points is arranged so that only a specified set of points can transfer them towards the central node, while all other points filter out their audio/video streams and do not pass them into the transport network.

For each stream carrying information that differs from the others, the destination address may be replaced by the address of a multicast group and the port by the one on which the videoconferencing server expects to receive the corresponding type of audio/video information.

The audio/video streams are delivered to the videoconferencing device connected at a subscriber point according to the standard protocols (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are addressed with different combinations of IP address/port/protocol.

The system may contain a unit that modifies the signaling information so that the number of terminals connected to the conference can be greater than the number of ports of the videoconferencing server.

For connecting the videoconferencing devices, a signaling protocol may be used that differs from the one used by the videoconferencing server.

The set of target videoconferencing devices can be changed during the conference.

During the conference, the translation rules for the multicast audio/video streams at the peripheral points can be changed so that, for the streams that should be displayed at a given point at a given time, the multicast group address is replaced with the address of the videoconferencing device installed at that point.

The maximum number of videoconferencing devices participating in the same conference does not depend on the number of clients supported by the videoconferencing server.

Any network that supports broadcasting in accordance with TCP/IP may serve as the transport network, such as a satellite network, a data network based on a cable TV network, or a city broadband access network.

The elements of the system implementing the proposed method are:

1) a mechanism for establishing connections between the remote videoconferencing terminals and a special server, which transforms and multiplies the signaling information flows (SIP, H.323 or other) so that the connections are established on behalf ("in the name") of the MCU;

2) a mechanism for supporting terminal operation in "passive" mode, which performs selective filtering, based on address, protocol and port, of the data packets transmitted from the terminal to the MCU, and selective address translation, based on address, protocol and port, of the data packets transferred from the MCU to the peripheral devices;

3) a mechanism for controlling the devices that implement the routing, filtering and NAT functionality at the peripheral points, allowing the audio/video streams of the remote videoconferencing terminals to be enabled/disabled in real time or on a programmed schedule, so that only the audio/video streams of the active terminals are passed to the MCU, which yields significant bandwidth savings (see the sketch following this list);

4) a mechanism for controlling the videoconferencing terminals and the MCU, providing the operator with access to their user functions;

5) a mechanism for connecting the videoconferencing terminals to the video relay devices using the protocols adopted for the videoconferencing terminals (SIP, H.323, etc.), with subsequent conversion to the protocol used for connection to the special server, except when these protocols coincide.
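A minimal sketch of the control mechanism of item 3 is given below: a controller that switches peripheral points between active and passive states by installing or removing reverse-stream filters. The class and method names are hypothetical, and a real implementation would drive the routing/filtering/NAT units rather than an in-memory set.

    class RelayController:
        """Toggles which peripheral points may send audio/video towards the MCU."""

        def __init__(self, points):
            self.points = list(points)          # all peripheral points in the conference
            self.active = set()                 # points currently allowed to transmit

        def set_active(self, point: str) -> None:
            """Remove the reverse-stream filter for this point."""
            self.active.add(point)

        def set_passive(self, point: str) -> None:
            """Re-install the reverse-stream filter for this point."""
            self.active.discard(point)

        def may_transmit(self, point: str) -> bool:
            return point in self.active

    # Example: 10 points in the conference, only 4 of them active at a time.
    ctl = RelayController(f"point-{i}" for i in range(10))
    for p in ["point-0", "point-1", "point-2", "point-3"]:
        ctl.set_active(p)
    ctl.set_passive("point-2")                  # operator swaps a participant mid-conference
    ctl.set_active("point-7")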

These mechanisms can be implemented in separate devices or embedded in the videoconferencing devices. For simplicity, the case is considered below in which they are implemented in separate devices, referred to as video relay devices.

A distinctive feature of the proposed method and system is that the total bandwidth used is fixed: it depends only on the type of conference and is independent of the total number of participants. Thus, for the above example of a conference with 10 participants connected at 512 kbit/s and 4 participants simultaneously present on screen, the required bandwidth in the direction from the MCU to the terminals is 512 kbit/s, and in the direction from the terminals to the MCU it is 4·512 kbit/s, i.e. the total bandwidth required for the conference is reduced to about 2.5 Mbit/s. Thanks to the processing of the signaling by the proposed method, increasing the number of participants to 20 changes neither the bandwidth requirements of the network nor the number of MCU ports in use.

The main requirement on the network infrastructure for the effective use of this method is the availability of a broadcast forward channel from one (any) of the points participating in the video conference, conventionally called the central point, to the other participants. The use of such a broadcast channel is in itself known from the literature and from patents. Audio/video streams arrive at this central point only from those points that the remaining participants should see (i.e. from points operating in "active" mode). Any communication channels that use TCP/IP as the transport protocol can serve as the reverse channels; multicast support on these channels is optional.

The management server of the video relay devices controls the devices so that all of them receive the audio/video information transmitted over the satellite channel, while only the currently "active" participants transmit. Standard conference-management features between the "active" participants (such as voice-activated mode or continuous presence) are provided by a standard videoconferencing server (MCU) that supports the required number of active participants. The total number of participants can be much larger than the MCU itself supports.

The general scheme of video/audio distribution in a videoconferencing network built according to the proposed method is presented in figure 1 (for voice-activated mode) and figure 2 (for continuous presence).

In the "Central point" 1 (which may or may not be a Central office, is selected from the point of view of the network topology) is a device videocelebrity 2, the server holding the multipoint videoconferencing 3 (MCU). Optional in the "Central point" can be installed and the codec(s), plug(s) directly to the MCU.

At the peripheral points there are video relay devices 2 and videoconferencing terminals, which may be in "passive" 4 or "active" 5 mode.

Several information streams are used to transfer the video/audio, depending on the conference mode. In voice-activated mode (figure 1), a multicast stream from the MCU to all passive terminals 6 is used, together with unique streams from the MCU to each active terminal and from each active terminal to the MCU 7, 8. In continuous-presence mode, a multicast stream from the MCU to all terminals 6 and reverse streams from the active terminals to the MCU 7, 8 are used.

The video relay device consists of the following functional blocks:

- an operator interface that allows the conference parameters to be set, the active participants to be selected, participants to be switched between the active/inactive states, etc.;

- an MCU control unit, which transmits the conference settings to the MCU in accordance with the parameters entered via the operator interface;

- a control unit for the remote videoconferencing terminals;

- a control unit for the remote video relay devices, which transform the received broadcast audio/video stream into a stream addressed to the local videoconferencing terminal, enable/disable the transmission of audio/video data towards the central point depending on the participant's status (active/inactive) and, if necessary, convert the control protocol into one acceptable to the connected terminal (SIP, H.323 or other);

- a control unit for the special server;

- a routing unit;

- a filtering unit;

- an address translation unit;

- a diagnostics unit;

- a special server unit, which serves the remote points on behalf ("in the name") of the MCU, i.e. establishes the connections as if they were made directly to the MCU.

In addition, the management system of the video relay devices provides other support functions and on-line diagnostics of all network devices.

Management of the network of video relay devices is organized in such a way that the operator can be located at any convenient point that has connectivity with the "central" point.

Thanks to a module that converts the protocols adopted for controlling videoconferencing terminals (SIP, H.323, etc.) into the control protocol used between the video relay devices, any terminal that supports the standard control protocols can be connected to a video relay device. This feature makes it possible to hold a video conference with the participation of terminals that use different control protocols (for example, H.323 and SIP).

Thus, the proposed method and system for organizing multipoint video conferences offer a potential for significant savings in total bandwidth. For example, with a total of 10 participants and continuous-presence mode with 4 active participants, the total required bandwidth equals 5·M (where M is the transfer rate of a single audio/video stream): 4 streams from the active participants plus one broadcast stream to all participants, whereas in the standard configuration the total bandwidth would be 20·M (one stream from the MCU to each participant and one from each participant to the MCU). That is, even for low-quality videoconferencing at 256 kbit/s, the total bandwidth saving is 20·256 − 5·256 = 3.84 Mbit/s, with a total bandwidth of only 1.28 Mbit/s in use. A worked version of this comparison is sketched below.
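As a worked illustration of this comparison (a plain calculation, not part of the claimed method), the totals for the standard and the proposed schemes can be computed as follows:

    def bandwidth_comparison(participants: int, active: int, rate_kbit: int) -> dict:
        """Total bandwidth (kbit/s) in continuous-presence mode.

        Standard scheme: one stream from the MCU to every participant and one
        stream from every participant back to the MCU, i.e. 2 * N * M.
        Proposed scheme: one broadcast stream to all participants plus one
        reverse stream from each active participant, i.e. (1 + A) * M.
        """
        standard = 2 * participants * rate_kbit
        proposed = (1 + active) * rate_kbit
        return {"standard": standard, "proposed": proposed, "saving": standard - proposed}

    # Example from the description: 10 participants, 4 active, 256 kbit/s per stream.
    print(bandwidth_comparison(10, 4, 256))
    # {'standard': 5120, 'proposed': 1280, 'saving': 3840}   i.e. 3.84 Mbit/s saved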

The proposed method and system make it possible to organize multipoint video conferences both in voice-activated mode and in continuous-presence mode. In the first mode, the "active" participants are defined in advance (two or more; their number is less than the total number of participants but does not exceed the number supported by the MCU), and the terminal whose image is transmitted to all other participants is selected on the basis of who is currently speaking. During the conference, the composition of active/passive participants may be changed by the operator. In the second mode, the video images from the pre-defined participants are transmitted simultaneously in split-screen mode (again, their number is less than the total number of participants but does not exceed the number supported by the MCU), and the voice streams from the active participants are mixed. During the conference, the composition of active/passive participants can likewise be changed by the operator.

Typical algorithms for preparing and conducting a videoconferencing session using the system in both modes are shown in figure 3.

In voice-activated mode, the currently speaking subscriber should see not his own image but that of the previous speaker, so each active subscriber is sent his own media stream in which, depending on the state of his terminal (speaking/silence), either the current or the previous speaker's audio/video is transmitted. Since there are also passive clients, to whom only the current speaker must always be delivered, a third stream, never addressed to the speakers, is used for broadcasting. In this case:

- at all peripheral nodes corresponding to passive videoconferencing terminals, the standard multicast → unicast translation tables are activated for the audio and video streams ("RTP streams") distributed from the center to the multicast group, substituting the address of the corresponding videoconferencing terminal for the multicast group address;

- at all peripheral nodes corresponding to passive videoconferencing terminals, standard filters are activated that prohibit the transmission of audio and video streams from the videoconferencing terminal;

- at the central node, the standard translation table for outgoing RTP streams into multicast is activated, in order to broadcast the audio/video to all passive videoconferencing terminals;

- the command is given to the MCU to connect the participants to the conference;

- the signaling control module connects the participants; thanks to the processing of the signaling, the number of participants connected to the conference may be greater than the number of MCU ports in use;

- on the basis of the port data for the initially active videoconferencing terminals, port translation tables for the inbound RTP streams are formed and activated at the central node, so that the MCU receives these streams on its standard ports.

Thus, all participants except the speaker see and hear the current speaker, while the speaker sees the previous speaker (a simplified sketch of this selection logic is given below).
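The selection rule just described can be shown with the following minimal sketch (participant names are hypothetical; in the actual system the choice is realised by the MCU together with the translation tables described above):

    def stream_for(terminal: str, current_speaker: str, previous_speaker: str) -> str:
        """Whose audio/video a given terminal receives in voice-activated mode."""
        if terminal == current_speaker:
            return previous_speaker     # the speaker sees the participant who spoke before him
        return current_speaker          # everyone else sees and hears the current speaker

    # Hypothetical participants: "A" is speaking now, "B" spoke before.
    for t in ["A", "B", "C", "D"]:
        print(t, "receives the stream of", stream_for(t, "A", "B"))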

If one or more active participants need to be changed, the operator console issues the appropriate command, the execution of which, at the peripheral nodes, results in the following:

- for nodes switching to the passive state, the translation tables in use are deactivated, standard filters prohibiting the transfer of audio and video streams from the videoconferencing terminal are activated, and the standard multicast → unicast translation tables are activated for the RTP streams distributed from the center to the multicast group;

- for nodes switching to the active state, the filters for the reverse RTP streams are removed and translation tables are installed in accordance with the addresses and ports of the participants that are switching to passive mode.

In continuous-presence mode, both active and passive subscribers should see (windows with the images of the active participants) and hear (the mixed voice) one and the same thing. To form the broadcast multicast channel for transmission to all participants, the RTP streams destined for one of the clients are used; the other RTP streams transmitted from the center are not used. In this case:

- at all peripheral nodes with videoconferencing terminals, the standard multicast → unicast translation tables are activated for the audio and video streams distributed from the center to the multicast group, substituting the address of the corresponding videoconferencing terminal for the multicast group address;

- at all peripheral nodes corresponding to passive videoconferencing terminals, standard filters are activated that prohibit the transfer of audio and video streams from the videoconferencing terminal;

- at the central node, the standard translation table for outgoing RTP streams into multicast is activated, in order to broadcast the audio/video to all passive videoconferencing terminals;

- the command is given to the MCU (or a codec with an embedded MCU) to connect the participants to the conference;

- the signaling control module connects the participants; thanks to the processing of the signaling, the number of participants connected to the conference may be greater than the number of MCU ports in use;

- on the basis of the port data for the initially active videoconferencing terminals, port translation tables for the inbound RTP streams are formed and activated at the central node, so that the MCU receives these streams on its standard ports.

If these actions are executed successfully, the conference begins; if a fault is detected, an error message is issued to the operator.

Thus, all participants receive the same RTP streams and therefore see and hear the same thing, i.e. the active participants in the windows and the mixed voice.

If one or more active participants need to be changed, the operator console issues the appropriate command, the execution of which, at the peripheral nodes, results in the following:

- for nodes switching to the passive state, the translation tables used for the RTP streams towards the center are deactivated and standard filters prohibiting the transfer of audio and video streams from the videoconferencing terminal are activated;

- for nodes switching to the active state, the filters for the reverse RTP streams are removed and translation tables for the reverse RTP streams are installed in accordance with the addresses and ports used by the videoconferencing terminals that are switching to passive mode.

1. A method of conducting a video conference between at least three videoconferencing devices, in which:

TCP/IP is used as the transport protocol;

a network that supports the delivery of messages to multicast addresses is used as the transport network;

at a central node, a multipoint videoconferencing server is used for processing and generating the audio/video streams, supporting the standard signaling protocols for establishing connections for the exchange of audio/video over IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol;

at peripheral points, videoconferencing devices are used as the target devices, supporting the standard signaling protocols for establishing connections for the transfer of audio/video streams over IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol,

characterized in that:

at the central node, the addresses of the audio/video streams coming from the videoconferencing server to the terminal videoconferencing devices are translated so that, for each stream carrying information, the destination address is replaced by the address of a multicast group and the port by the one on which a videoconferencing device, whether in active or passive mode, expects to receive the corresponding audio or video stream;

at the peripheral points, the addresses of the audio/video streams coming from the central node are translated so that, for the streams that should be displayed at a given point at a given time, the destination address is replaced by the address of the videoconferencing device installed at that point;

at the peripheral points, the audio/video streams are filtered so that only a specified set of peripheral points transmits them towards the central node, while all other peripheral points filter out their audio/video streams and do not pass them into the transport network.

2. The method according to claim 1, characterized in that, at the peripheral points, for each stream carrying information, the destination address is replaced by the address of a multicast group and the port by the one on which the videoconferencing server expects to receive the corresponding audio/video stream from the active client.

3. The method according to any one of claims 1 and 2, characterized in that the audio/video streams are delivered to the videoconferencing device connected at a subscriber point according to the standard protocols (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are addressed with different combinations of IP address/port/protocol.

4. The method according to any one of claims 1 and 2, characterized in that the number of terminals connected to the conference may be greater than the number of ports of the videoconferencing server.

5. The method according to any one of claims 1 and 2, characterized in that a signaling protocol different from the one used by the videoconferencing server is used for connecting the videoconferencing devices.

6. The method according to any one of claims 1 and 2, characterized in that the set of target videoconferencing devices is changed during the conference.

7. The method according to any one of claims 1 and 2, characterized in that, during the conference, different audio/video streams are translated to the address of the videoconferencing device at different points in time.

8. The method according to any one of claims 1 and 2, characterized in that the maximum number of videoconferencing devices participating in the same conference does not depend on the number of clients supported by the videoconferencing server.

9. The method according to any one of claims 1 and 2, characterized in that the transport network is any network that supports broadcasting in accordance with TCP/IP, such as a satellite network, a data network based on a cable TV network, or a city broadband access network.

10. A system for conducting a video conference between at least three videoconferencing devices, in which:

TCP/IP is used as the transport protocol;

a network that supports the delivery of messages to multicast addresses is used as the transport network;

at a central node, a multipoint conferencing server is used for processing and generating the audio/video streams, supporting the standard signaling protocols for establishing connections for the exchange of audio/video over IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol;

at peripheral points, videoconferencing devices are used as the target devices, supporting the standard signaling protocols for establishing connections for the transfer of audio/video streams over IP networks (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are transmitted with different combinations of IP address/port/protocol,

characterized in that:

address translation at the central node for the audio/video streams coming from the videoconferencing server to the terminal videoconferencing devices is arranged so that, for each stream carrying information, the destination address is replaced by the address of a multicast group and the port by the one on which a videoconferencing device, whether in active or passive mode, expects to receive the corresponding audio or video stream;

address translation at the peripheral points for the audio/video streams coming from the central node is arranged so that, for the streams that should be displayed at a given point at a given time, the destination address is replaced by the address of the videoconferencing device installed at that point;

filtering of the audio/video streams at the peripheral points is arranged so that only a specified set of peripheral points can transfer them towards the central node, while all other peripheral points filter out their audio/video streams and do not pass them into the transport network.

11. The system of claim 10, wherein, for each stream carrying information, the destination address is replaced by the address of a multicast group and the port by the one on which a videoconferencing device, whether in active or passive mode, expects to receive the corresponding audio or video stream.

12. The system according to any one of claims 10 and 11, characterized in that the audio/video streams are delivered to the videoconferencing device connected at a subscriber point according to the standard protocols (H.323, SIP, SCCP and others), in which the signaling and the audio/video streams are addressed with different combinations of IP address/port/protocol.

13. The system according to any one of claims 10 and 11, characterized in that it contains a unit that modifies the signaling information so that the number of terminals connected to the conference can be greater than the number of ports of the videoconferencing server.

14. The system according to any one of claims 10 and 11, characterized in that a signaling protocol different from the one used by the videoconferencing server is used for connecting the videoconferencing devices.

15. The system according to any one of claims 10 and 11, characterized in that the set of target videoconferencing devices is changed during the conference.

16. The system according to any one of claims 10 and 11, characterized in that, during the conference, the translation rules for the multicast audio/video streams at the peripheral points are modified so that, for the streams that should be displayed at a given point at a given time, the multicast group address is replaced by the address of the videoconferencing device installed at that point.

17. The system according to any one of claims 10 and 11, characterized in that the maximum number of videoconferencing devices participating in the same conference does not depend on the number of clients supported by the videoconferencing server.

18. The system according to any one of claims 10 and 11, characterized in that the transport network is any network that supports broadcasting in accordance with TCP/IP, such as a satellite network, a data network based on a cable TV network, or a city broadband access network.



 
