Embedding a session description message in a Real-time Transport Control Protocol (RTCP) message

FIELD: information technologies.

SUBSTANCE: a session description message, which describes an audiovisual presentation streamed to a recipient, is embedded in at least some Real-time Transport Control Protocol (RTCP) messages sent from a source of audiovisual content to the recipient. Such a message can be associated, for example, with one of a plurality of parts of audiovisual content included in a playlist of audiovisual content that is streamed from the device to the recipient. According to some aspects, the RTCP message that embeds the session description message comprises at least three fields: a first field containing data identifying the RTCP message as being of a type that embeds a session description message; a second field containing data that is the session description message for the multimedia presentation; and a third field containing data identifying the length of the RTCP message, generated by summing the lengths of the first, second and third fields.

EFFECT: a session description message is embedded in a Real-time Transport Control Protocol message.

11 cl, 20 dwg

 

Technical field

The present invention relates to the transmission of audiovisual information and data, and more particularly to embedding a session description message in an RTCP message.

Prior art

Streaming content, such as streaming audio, video and/or text, is becoming increasingly popular. The term "streaming" is commonly used to indicate that the data representing the content is transmitted over a network to a client computer on an "on demand" basis rather than being delivered in its entirety before playback. Thus, the client computer renders the streamed content as it is received from a network server, rather than waiting for the complete file to be delivered.

The widespread availability of streaming multimedia content enables many types of content that were not previously available over the Internet or other computer networks. "Live" content (content transmitted directly from the scene of an event) is one of the most significant examples of such content. Using streaming multimedia, an audio, video or audio/visual coverage of noteworthy events can be sent over the Internet as the events unfold. Similarly, television and radio stations can transmit their content directly over the Internet.

The Session Description Protocol (SDP), described in Network Working Group Request for Comments (RFC) 2327, is a text-based format used to describe the properties of multimedia presentations, referred to as "sessions", and the properties of one or more audiovisual streams contained within a presentation. SDP was designed as an application-level protocol intended to describe multimedia sessions for the purposes of session announcement, session invitation and other forms of multimedia session initiation. SDP can be used in conjunction with other protocols, such as the Real Time Streaming Protocol (RTSP) or the Hypertext Transfer Protocol (HTTP), to describe and/or negotiate the properties of a multimedia session used to deliver streamed data.

However, in many situations it is difficult to get the SDP information from the network server to the client computer. For example, the network server may stream several multimedia presentations to the client computer, such as presentations listed in a playlist. When the network server switches from streaming one presentation to the next, the SDP information for the next presentation is often difficult to make available to the client computer. It would therefore be useful to have an additional mechanism by which the SDP information can be made available to the client computer.

Streamed content can also be multicast (sent to many recipients). Conventional approaches to multicasting streamed content typically involve providing the content to be multicast to a server, which then multicasts the content over a network (i.e., without feedback from the clients receiving the streams). The server typically multicasts the content as multiple streams in multiple formats (for example, different bit rates, languages, coding schemes, etc.). Clients connected to the network can then receive the stream(s) appropriate to their resources. To allow clients to choose which stream(s) to receive, one multicasting approach requires the server to publish a file that provides multicast information allowing clients to access the content streams. Maintaining and publishing this file is typically a manual process with relatively high administrative cost. Further, if the maintenance and publication are not kept up, clients can encounter problems, which can lead to dissatisfaction of the client (user). Another problem with this approach is that clients need to keep their multicast information updated so that they can properly access the content. This problem is exacerbated for clients that do not have an appropriate feedback channel over which to request updates (for example, clients with unidirectional satellite links).

The invention

Embedding a session description message in a Real-time Transport Control Protocol (RTCP) message is described. A session description message, which describes an audiovisual presentation that is streamed to a recipient, is embedded in at least some of the RTCP messages sent from a source of audiovisual content to the recipient.

According to some aspects, the RTCP message that embeds the session description message includes at least three fields. The first field contains data that identifies the RTCP message as being of a type that embeds a session description message. The second field contains data that is the session description message for the audiovisual presentation. The third field contains data that identifies the length of the RTCP message, generated by summing the length of the first field, the length of the second field and the length of the third field.

According to other aspects, the RTCP message is created in a device, such as a server device. The session description message embedded in the RTCP message is associated with one of a plurality of parts of audiovisual content in a playlist of audiovisual content that is streamed from the device to a recipient.

According to other aspects, a multimedia presentation is multicast using an announcement channel that includes information describing the presentation, along with multiple channels carrying a variety of multimedia data streams to accommodate (meet the requirements of) clients with varying multimedia resources. Clients can use this announcement channel to select the channel(s) appropriate to their multimedia resources.

According to other aspects, the channels are created in a predetermined manner (for example, with pre-selected logical addresses, pre-selected ports, IP addresses, and so on) so that clients can connect to a channel immediately, without (or concurrently with) connecting to the announcement channel, to reduce start-up delay.

According to other aspects, an acceleration channel can be created that provides data blocks containing the current block of the multimedia presentation along with a pre-selected number of previous blocks at a bit rate that is "faster than real time" (that is, at a bit rate greater than the bit rate of the multimedia streams). This feature allows clients with suitable resources to buffer enough data to begin presenting the multimedia data to users more quickly. Alternatively, the acceleration channel need not be "faster than real time"; instead the client can connect to the acceleration channel and to another channel that multicasts the multimedia data, so that in effect the client receives the multimedia data at a bit rate that is faster than real time.

Brief description of drawings

The same reference numbers are used throughout the document to refer to like components and/or features.

Fig. 1 illustrates an exemplary network environment that can be used to stream audiovisual information using a session description message embedded in an RTCP message, as described below.

Fig. 2 illustrates exemplary client and server devices that can stream audiovisual content using a session description message embedded in an RTCP message, as described below.

Fig. 3 illustrates exemplary client and server devices in a multicasting environment that can stream audiovisual content using a session description message embedded in an RTCP message, as described below.

Fig. 4 illustrates exemplary client and server devices in an environment with a server-side playlist that can stream audiovisual content using a session description message embedded in an RTCP message, as described below.

Fig. 5 illustrates an exemplary format of an RTCP message with an embedded session description message.

Fig. 6 illustrates an exemplary format of a session description message.

Fig. 7 is a flow diagram illustrating an exemplary process for embedding a session description message in an RTCP message when a playlist is used.

Fig. 8 is a flow diagram illustrating an exemplary process for receiving a session description message in an RTCP message when a playlist is used.

Fig. 9 is a block diagram illustrating a system for multicasting multimedia presentations, according to one implementation.

Fig. 10 is a flow diagram illustrating the operation of a server in the system of Fig. 9, according to one implementation.

Fig. 11 is a diagram illustrating exemplary channels, according to one implementation.

Fig. 12 is a flow diagram illustrating the operation of a client in the system of Fig. 9, according to one implementation.

Fig. 12A is a flow diagram illustrating the operation of a client in the system of Fig. 9, according to another implementation.

Fig. 13 is a flow diagram illustrating a server according to Fig. 9, according to one implementation.

Fig. 14 is a flow diagram illustrating the operation of the configuration controller of Fig. 13, according to one implementation.

Fig. 15 is a block diagram illustrating the server of Fig. 9, according to another implementation.

Fig. 16 is a flow diagram illustrating the operation of the server with the accelerated-stream generator of Fig. 15, according to one implementation.

Fig. 17 is a flow diagram illustrating the operation of a client when receiving an accelerated stream, according to one implementation.

Fig. 17A is a flow diagram illustrating the operation of a client when receiving an accelerated stream, according to another implementation.

Fig. 18 is a block diagram illustrating an exemplary computing environment suitable for implementing the above-mentioned embodiments.

Detailed description of the preferred embodiment

The following describes embedding a session description message in a Real-time Transport Control Protocol (RTCP) message. A multimedia presentation, or a presentation of a single type of audiovisual information, is streamed from a source of audiovisual content, such as a server device, to a recipient, such as a client device, using Real-time Transport Protocol (RTP) packets. Control information relating to the presentation being streamed is also passed from the source of audiovisual content to the recipient using RTCP messages. In at least some of the RTCP messages a session description message is embedded, which describes the presentation being streamed.

In the discussion herein, reference is made to a multimedia presentation that is streamed from a source of audiovisual content to a recipient. The source of audiovisual content can be any source of audiovisual content, an example of which is a server device. The recipient can be any recipient of audiovisual content, an example of which is a client device. Additionally, it should be understood that although the discussion may refer to multimedia presentations being streamed, presentations of a single type of audiovisual information can also be streamed in the same manner as described herein for multimedia presentations.

Fig. 1 illustrates an example network environment 100 that can be used to stream audiovisual information using a session description message embedded in an RTCP message, as described below. In environment 100, a number (a) of client computing devices 102(1), 102(2), ..., 102(a) are coupled to a number (b) of server computing devices 104(1), 104(2), ..., 104(b) via a network 106. Network 106 is intended to represent any of a variety of conventional network topologies and types (including wired and/or wireless networks), employing any of a variety of conventional network protocols (including public and/or proprietary protocols). Network 106 may include, for example, the Internet, and possibly at least portions of one or more LANs (local area networks).

Computing devices 102 and 104 can each be any of a variety of conventional computing devices, including desktop PCs, workstations, mainframe computers, Internet appliances, gaming consoles, cellular phones, personal digital assistants (PDAs), etc. One or more of devices 102 and 104 can be the same type of device or, alternatively, devices of different types.

Server devices 104 can make any of a variety of data available for streaming to clients 102. The term "streaming" is used to indicate that the data representing the audiovisual information is transmitted over the network to a client device and that playback of the content can begin before the content is delivered in its entirety (e.g., with the data being transmitted on an "as needed" basis rather than being pre-delivered in its entirety before playback). The data may be publicly available or, alternatively, restricted (for example, restricted to only certain users, available only if an appropriate fee is paid, etc.). The data may be any of a variety of one or more types of content, such as audio, video, text, animation, etc. Additionally, the data may be pre-recorded or, alternatively, "live" (for example, a digital representation of a concert captured as the concert is performed and made available for streaming shortly after capture).

A client device 102 can receive streamed audiovisual information from a server 104 that stores the streamed audiovisual content as a file, or alternatively from a server 104 that receives the streamed audiovisual information from some other source. For example, server 104 may receive the streamed audiovisual information from another server that stores the streamed audiovisual content as a file, or may receive the streamed audiovisual information from some other source (for example, an encoder that is encoding a "live" event).

As used herein, "streamed audiovisual information" refers to the streaming of one or more types of audiovisual information from one device to another (e.g., from a server device 104 to a client device 102). Audiovisual streams may include any of a variety of types of content, such as one or more of audio, video, text, and so on.

Fig. 2 illustrates example client and server devices that can stream audiovisual content using a session description message embedded in an RTCP message, as described below. Multiple different protocols are typically followed at the client device 102 and the server device 104 in order to stream audiovisual content from server device 104 to client device 102. These different protocols can be responsible for different aspects of the streaming process. Although not shown in Fig. 2, one or more additional devices (e.g., firewalls, routers, gateways, bridges, etc.) may be situated between client device 102 and server device 104.

In the example of Fig. 2, an application layer protocol 150, a transport protocol 152, and one or more delivery channel protocols 154 are used as part of the streaming process. Additional protocols not shown in Fig. 2 may also be used (for example, there may be additional protocol(s) between application layer protocol 150 and transport protocol 152). Application layer protocol 150 is an application-level protocol for controlling the delivery of data with real-time properties. Application layer protocol 150 provides an extensible framework to enable controlled, on-demand delivery of real-time data, such as streamed audio and video content. Application layer protocol 150 is a control protocol for initiating and directing delivery of streamed multimedia from audiovisual information servers. Examples of application layer protocol 150 include the Real Time Streaming Protocol (RTSP), as described in Network Working Group Request for Comments (RFC) 2326, April 1998, and the Hypertext Transfer Protocol (HTTP), as described in Network Working Group Request for Comments (RFC) 1945, May 1996, or Network Working Group Request for Comments (RFC) 2068, January 1997.
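As a purely illustrative sketch of how such an application-layer protocol can be used to obtain a session description, the following Python fragment issues an RTSP DESCRIBE request (per RFC 2326) and separates the SDP body from the response headers; the server name and presentation URL are hypothetical:

```python
import socket

# Hypothetical RTSP server and presentation, shown only to illustrate how an
# application-layer protocol can carry an SDP description (RFC 2326, DESCRIBE method).
HOST, PORT = "media.example.com", 554
request = (
    "DESCRIBE rtsp://media.example.com/presentation RTSP/1.0\r\n"
    "CSeq: 1\r\n"
    "Accept: application/sdp\r\n"
    "\r\n"
)

with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(request.encode("ascii"))
    response = sock.recv(65536).decode("ascii", errors="replace")

# The response headers are followed by a blank line and then the SDP body.
headers, _, sdp_body = response.partition("\r\n\r\n")
print(headers)
print(sdp_body)
```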

Application layer protocol 150 uses transport protocol 152 to deliver real-time data, such as streaming audio and video. Transport protocol 152 defines the packet format for audiovisual streams. Transport protocol 152 provides end-to-end network transport functions suitable for applications transmitting real-time data, such as audio, video or simulation data, over multicast or unicast network services. Examples of transport protocol 152 include the Real-time Transport Protocol (RTP) and the Real-time Transport Control Protocol (RTCP), as described in Network Working Group Request for Comments (RFC) 3550, July 2003. Other versions, such as future draft or standardized versions of RTP and RTCP, may also be used. RTP does not itself address resource reservation and does not guarantee quality of service for real-time services. The data transport is augmented by a control protocol (RTCP) to allow monitoring of the data delivery in a manner scalable to large multicast networks, and to provide some control and identification functionality.

The RTCP protocol groups one or more control messages together into a unit referred to as an RTCP packet. In one or more RTCP packets a control message is embedded that includes a session description message. The session description message describes the properties of the multimedia presentation being streamed from server device 104 to client device 102. The audiovisual data streamed from server device 104 to client device 102 thus includes the session description message.

Transport protocol 152 uses delivery channel protocol(s) 154 for the transport connections. Delivery channel protocol(s) 154 include(s) one or more channels for transporting data packets from server device 104 to client device 102. Each channel is typically used to send data packets for a stream of one type of audiovisual information, although in alternate embodiments a single channel may be used to send data packets for streams of multiple types of audiovisual information. Examples of delivery channel protocols 154 include Transmission Control Protocol (TCP) packets and User Datagram Protocol (UDP) packets. TCP guarantees delivery of data packets, while UDP does not guarantee delivery of data packets. Typically, delivery of data packets using TCP is more reliable but also takes more time than delivery of data packets using UDP.

Fig. 3 illustrates example client and server devices in a multicasting environment that can stream audiovisual content using a session description message embedded in an RTCP message, as described below. In certain embodiments, the protocols 150, 152 and 154 of Fig. 2 are included in the client and server devices of Fig. 3, but are not illustrated. Additionally, although not shown in Fig. 3, one or more additional devices (e.g., firewalls, routers, gateways, bridges, etc.) may be situated between client device 102 and server device 104.

A streaming module 182 of server device 104 streams the same multimedia presentation to each of multiple client devices 102(1), 102(2), ..., 102(x). Each client device 102 has a streamed audiovisual information player 184 that receives the streamed multimedia presentation and processes the received stream at the client device 102, typically playing back the multimedia presentation at the client device 102. The same data is streamed to each client device 102 at approximately the same time, allowing server device 104 to stream only a single instance of the multimedia presentation at any given time, with the different client devices 102 "listening" for the presence of that streamed presentation.

The streamed audiovisual information 186 includes RTCP messages having one or more session description messages embedded therein. The same session description message can be broadcast multiple times throughout the streaming of the multimedia presentation, thereby allowing a new client device 102 to begin listening to the streamed audiovisual information after streaming has begun and still receive the session description message describing the multimedia presentation. By embedding the session description message in RTCP messages of the audiovisual stream 186, client devices 102 need not listen to a separate stream or broadcast, potentially from a device other than server device 104, in order to receive the session description message.

Co-pending patent application No. 10/693,430, filed October 24, 2003, entitled "Methods and Systems for Self-Describing Multicasting of Multimedia Presentations", which is hereby incorporated by reference, describes an example of such a multicasting environment.

Fig. 4 illustrates example client and server devices in an environment with a server-side playlist that can stream audiovisual content using a session description message embedded in an RTCP message, as described below. In certain embodiments, the protocols 150, 152 and 154 of Fig. 2 are included in the client and server devices of Fig. 4, but are not illustrated. Additionally, although not shown in Fig. 4, one or more additional devices (e.g., firewalls, routers, gateways, bridges, etc.) may be situated between client device 102 and server device 104.

A streaming module 202 of server device 104 streams a multimedia presentation as streamed audiovisual information 204 to a streamed audiovisual information player 206 at client device 102. The streamed audiovisual information player 206 receives the streamed multimedia presentation and processes the received stream at client device 102, typically playing back the multimedia presentation at client device 102.

Server device 104 includes a playlist 208 that identifies a set (y) of audiovisual content parts 210(1), 210(2), ..., 210(y). In certain embodiments, playlist 208 includes multiple entries, each entry identifying one of the multiple audiovisual content parts 210. Alternatively, playlist 208 may identify a single piece of audiovisual content, although in such situations the single piece of audiovisual content could simply be referred to itself rather than employing a playlist. Client device 102 is able to select a single resource for playback, this resource identifying playlist 208. Streaming module 202 accesses the identified playlist 208, then accesses the individual audiovisual content parts 210 and streams these parts 210 to client device 102. Thus, client device 102 can access a single resource and yet have multiple different pieces of audiovisual content streamed to it from server device 104. Because playlist 208 is maintained at, and used by, server device 104 to identify the parts of audiovisual content, rather than at client device 102, playlist 208 may also be referred to as a server-side playlist.

Each audiovisual content part 210 includes one or more audiovisual streams. Different audiovisual content parts 210 may include different numbers of audiovisual streams. Each audiovisual content part 210 is typically a multimedia presentation. The manner in which a "part" of content is defined can vary from implementation to implementation and be based on the type of audiovisual information. For example, for musical audio and/or video content, each song may be a part of content. Content may be separated at natural boundaries (e.g., different songs) or, alternatively, arbitrarily (for example, every five minutes of content is a part). For stored content, the different parts of content may be stored as separate files or, alternatively, as the same file.

Although illustrated as two separate illustrations in Fig. 3 and 4, it should be understood that the audiovisual content parts referenced by a server-side playlist, as illustrated in Fig. 4, can be multicast as illustrated in Fig. 3.

As shown in Fig. 2, 3 and 4, at the transport level the data to be streamed from server device 104 to client device 102 is embedded in RTP packets, and control information relating to the data being streamed and to the RTP packets is embedded in one or more control messages in RTCP packets.

Typically, an RTCP packet consists of several different types of messages. The first message in an RTCP packet is a Receiver Report or a Sender Report. The second message is an SDES (Source Description) message. The SDES message contains one or more textual meta-data items. The SDES message contains a CNAME (canonical name) item. The CNAME item is a permanent transport-level identifier of the source of audiovisual content and provides a mapping (correspondence) between the RTP synchronization source (SSRC) number and a text string. The SSRC is the source of a stream of packets (RTP and RTCP). The CNAME is used so that a sender or receiver participating in multiple RTP sessions belonging to the same presentation can use different SSRC values in each RTP session while keeping the CNAME value the same.
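The SDES/CNAME layout just described can be sketched as follows; this is a minimal illustration following RFC 3550 (packet type 202 for SDES, item type 1 for CNAME), and the SSRC and CNAME values are arbitrary examples rather than values defined by this description:

```python
import struct

def build_sdes_cname(ssrc: int, cname: str) -> bytes:
    """Build a minimal RTCP SDES packet with a single CNAME item (RFC 3550, section 6.5)."""
    item = bytes([1, len(cname)]) + cname.encode("ascii")    # type=1 (CNAME), length, text
    chunk = struct.pack("!I", ssrc) + item + b"\x00"          # null octet terminates the item list
    while len(chunk) % 4:                                     # pad the chunk to a 32-bit boundary
        chunk += b"\x00"
    length = (4 + len(chunk)) // 4 - 1                        # length in 32-bit words minus one
    header = struct.pack("!BBH", (2 << 6) | 1, 202, length)   # V=2, P=0, SC=1; PT=202 (SDES)
    return header + chunk

packet = build_sdes_cname(0x12345678, "user@host.example.com")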

An additional type of message that can be included in an RTCP packet is a control message having a session description message embedded in it. The session description message describes properties of the multimedia presentation being streamed from server device 104 to client device 102. Different audiovisual formats or protocols can be used for such session description messages. An example of such a format is the Session Description Protocol (SDP), Network Working Group Request for Comments (RFC) 2327, April 1998. In certain embodiments described herein, the session description message is a message conforming to the SDP format described in RFC 2327.

Although a variety of different formats can be used to describe the properties of a multimedia presentation, one or more session description messages that can include identifier(s) of those properties are sent from server device 104 to client device 102. A single session description message may be sent by server device 104 for a particular multimedia presentation or, alternatively, multiple session description messages may be sent. If multiple session description messages are sent, this set of messages may include the same information, different information, or "overlapping" information.

The session description message includes, for example, one or more of the following: identification of the various channels used for multicasting the multimedia presentation; a description of each of the audiovisual streams available in the multimedia presentation (e.g., an indication of the stream type (e.g., video or audio), the bit rate of each audiovisual stream, the language used in the stream, and so on); error correction information; security/authentication information; encoding information; digital rights management (DRM) information; etc.

It should be noted that in some situations the session description message may be split or fragmented across multiple RTCP control messages. Such situations can arise, for example, when the session description message is very large. Each of these RTCP control messages is included in a separate RTCP packet, and each contains a portion or fragment of the full session description message. Client device 102, after receiving all of the portions or fragments, can combine them together to reconstruct the session description message.

Fig. 5 illustrates an example format of an RTCP control message 250 with an embedded session description message. RTCP message 250 is discussed below as including multiple fields (also referred to as portions), each storing different data. It should be understood that these fields may appear in a different order than the order in which they are described below and shown in Fig. 5. Additionally, although sizes or lengths of these fields (e.g., in bits) are discussed below, it should be understood that these are only examples and that the fields may alternatively be larger or smaller than these example sizes or lengths. In certain embodiments, RTCP message 250 includes all of the fields shown in Fig. 5. In additional embodiments, RTCP message 250 includes fewer than all of the fields shown in Fig. 5, or may include additional fields not shown in Fig. 5.

The fields of RTCP message 250 can be viewed as grouped into three groups: a header 290, an RTP state block 292 and a session description message 284. Header 290 includes various information about RTCP message 250. RTP state block 292 is optional and, when included, is used to identify RTP-specific information about a stream of the multimedia presentation described in the session description message (e.g., to determine the SSRC and the starting RTP sequence number of the stream of the session description message). Typically, one RTP state block 292 is associated with, and included in, RTCP message 250 for each audiovisual stream in the multimedia presentation. Session description message 284 is the session description message embedded in RTCP message 250.

Field 252 V (version) is a 2-bit field that identifies the version of RTP, which is the same in RTCP packets as in RTP packets. For example, the version defined by RFC 3550 is 2.

Field 254 P (padding) is a single bit that, if set (for example, having a value of 1), indicates that RTCP message 250 contains some additional padding octets at the end that are not part of the control information. This padding is included in length field 262 but should otherwise be ignored. The amount of padding is indicated within the padding itself. In certain embodiments, the additional padding is specified in octets, and the last octet of the padding contains a count of how many padding octets are included (including itself) and thus should be ignored.

Field 256 C (compression) is a single bit that, if set (for example, having a value of 1), indicates that the data in SDP data field 284 has been compressed. Different compression types can be used, such as Zlib compression, as described in the ZLIB Compressed Data Format Specification version 3.3, Network Working Group Request for Comments (RFC) 1950, May 1996.

Field 258 Res (reserved) is a 4-bit reserved field. In certain embodiments, field 258 Res must be set to zero.

Field 260 PT (payload type) of the header is an 8-bit field that is set to a value (e.g., 141) indicating that RTCP message 250 embeds a session description message.

Field 262 length is a 16-bit field that identifies the length of RTCP message 250. This length can be generated by summing the lengths of the various fields of RTCP message 250, including any header and any padding. In certain embodiments, the length is expressed in 32-bit values minus one.
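As a purely illustrative example of this computation, if the header, any RTP state blocks, the SDP data and any padding together occupy 96 octets, that is 24 32-bit values, and field 262 would therefore carry the value 23.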

Field 264 SDPMsgHash (SDP message hash) is a 16-bit field used to identify the session description message included in RTCP message 250 and the address (e.g., IP address) of the sender (for example, server device 104). In certain embodiments, the identifier in field 264 is computed as a checksum over the session description message and the address, so that if either changes, the value of the identifier in the field also changes. In certain embodiments, the value of field 264 SDPMsgHash is computed in the same manner as the message identifier hash described in the Session Announcement Protocol (SAP), Network Working Group Request for Comments (RFC) 2974, October 2000. If the session description message is fragmented across multiple RTCP messages, as discussed below, the value of field 264 SDPMsgHash in each fragment should be identical.

Field 266 F (more fragments) is a single bit that, if set (e.g., having a value of 1), indicates that the session description message has been fragmented across multiple RTCP messages and that the current RTCP message does not contain the last fragment of the session description message. If field 266 F is not set (e.g., has a value of 0), then either the session description message was not fragmented (the entire session description message is included in RTCP message 250), or the session description message was fragmented and RTCP message 250 contains the last fragment of the session description message.

Field 268 FragSeqNum (fragment sequence number) is a 15-bit field used to identify the different fragments of the session description message. Fragments of the session description message are assigned identifiers in some manner known to both server device 104 and client device 102. For example, identifiers may be assigned sequentially starting with the value 0, so that the first fragment is 0, the second is 1, the third is 2, and so on. If RTCP message 250 does not contain a fragment of the session description message (i.e., RTCP message 250 contains the entire session description message), field 268 FragSeqNum must be set to 0.
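As a minimal sketch of how a recipient might use fields 264, 266 and 268 to reassemble a fragmented session description, the following helper (its name and data structure are illustrative, not part of the message format) collects fragments until the fragment whose F bit is clear and all earlier fragments have arrived:

```python
# Hypothetical reassembly helper: fragments are keyed by the SDPMsgHash of field 264 and the
# FragSeqNum of field 268; the fragment whose F bit (field 266) is clear marks the end.
fragments = {}

def on_sdp_fragment(sdp_msg_hash, frag_seq_num, more_fragments, sdp_data):
    """Store one fragment; return the complete SDP message once every fragment has arrived."""
    entry = fragments.setdefault(sdp_msg_hash, {"parts": {}, "last": None})
    entry["parts"][frag_seq_num] = sdp_data
    if not more_fragments:
        entry["last"] = frag_seq_num                 # F clear: this is the final fragment
    last = entry["last"]
    if last is not None and all(n in entry["parts"] for n in range(last + 1)):
        del fragments[sdp_msg_hash]
        return b"".join(entry["parts"][n] for n in range(last + 1))
    return None                                      # fragments still outstanding
```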

Field 270 NumRtpState (number of RTP state blocks) is a 16-bit field used to specify the number of RTP state blocks contained in RTCP message 250. Each RTP state block has a size of 14 bytes. The NumRtpState field is set to 0 when no RTP state blocks are present. In the illustrated example, RTCP message 250 contains one RTP state block 292. If multiple RTP state blocks are present, fields 274, 276, 278, 280 and 282 are included for each of the RTP state blocks. If the session description message is fragmented across multiple RTCP messages 250, then only the RTCP message 250 containing the first fragment of the session description message should contain the RTP state block(s).

Field 272 is a 1-bit field that is not set (for example, has a value of 0) if field 274 PT contains a valid RTP payload type number. If field 272 is not set, the information in RTP state block 292 applies only to the RTP payload type number identified in field 274 PT and the SDP Flow ID identified in field 276 Flow ID. If field 272 is set (for example, has a value of 1), field 274 PT should be ignored, and RTP state block 292 applies to all RTP packets for the SDP Flow ID identified in the Flow ID field, regardless of the RTP payload type.

Field 274 PT is a 7-bit field indicating the RTP payload type number to which the information in RTP state block 292 applies. If field 272 is set (for example, has a value of 1), field 274 PT is not used and must be set to 0.

Field 276 Flow ID (SDP flow ID) is a 24-bit field identifying the SDP Flow ID to which the information in RTP state block 292 applies. Each audiovisual stream is streamed in a separate RTP session. These RTP sessions are assigned numbers using the "a=mid:" attribute, as described in Grouping of Media Lines in the Session Description Protocol (SDP), Network Working Group Request for Comments (RFC) 3388, December 2002. Field 276 Flow ID identifies the particular "m=" entry in the session description message whose "a=mid" attribute value (in accordance with RFC 3388) is the same as the value of this field.

Field 278 SSRC (synchronization source) is a 32-bit field specifying the value of the RTP SSRC field used for the audiovisual stream identified by field 276 Flow ID. If field 272 is not set (for example, has a value of 0), the SSRC field applies only to RTP packets for that audiovisual stream that use the RTP payload type specified in field 274 PT.

Field 280 RtpTime (RTP time) is a 32-bit field specifying the value that the RTP Timestamp field of an RTP packet would have if the packet were sent exactly at the beginning of the audiovisual stream identified by field 276 Flow ID. For example, if the timeline of the audiovisual presentation begins at time T, the value of field 280 RtpTime is the value the RTP Timestamp field would have for an RTP packet sent exactly at time T, even if no RTP packet actually exists at that time for the audiovisual stream identified by RTP state block 292.

Field 282 RtpSeq (RTP sequence) is a 16-bit field specifying the value of the RTP sequence number field of the first RTP packet that is sent for the audiovisual stream identified by field 276 Flow ID. If field 272 is not set (for example, has a value of 0), field 282 RtpSeq applies only to RTP packets for that audiovisual stream that use the RTP payload type specified in field 274 PT.

Field 284 SDP data contains the session description message embedded in RTCP message 250. In situations where the session description message is fragmented, SDP data field 284 contains only part of the session description message (e.g., a single fragment of the session description message). In certain embodiments, the session description message is a complete SDP description in UTF-8 format.
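The layout of Fig. 5 can be summarized in a short packing sketch. The sketch assumes no RTP state blocks and no compression, uses the example payload type value 141, and follows the "length in 32-bit values minus one" and padding rules described above; the function name and parameter choices are illustrative only:

```python
import struct

RTCP_PT_SDP = 141   # example payload-type value indicating an embedded session description message

def build_sdp_rtcp_message(sdp_text: str, sdp_msg_hash: int,
                           more_fragments: bool = False, frag_seq_num: int = 0) -> bytes:
    """Sketch of message 250: header 290 plus SDP data 284; no RTP state blocks 292, no compression."""
    sdp_data = sdp_text.encode("utf-8")                    # field 284 (complete message or one fragment)
    fixed = 10                                             # V/P/C/Res, PT, length, SDPMsgHash,
                                                           # F/FragSeqNum, NumRtpState
    pad = (-(fixed + len(sdp_data))) % 4                   # pad the whole message to 32-bit values
    if pad:
        sdp_data += b"\x00" * (pad - 1) + bytes([pad])     # last padding octet carries the pad count
    byte0 = (2 << 6) | ((1 if pad else 0) << 5)            # V=2, P bit, C=0, Res=0
    length = (fixed + len(sdp_data)) // 4 - 1              # field 262: 32-bit values minus one
    f_seq = ((1 if more_fragments else 0) << 15) | (frag_seq_num & 0x7FFF)
    return struct.pack("!BBHHHH", byte0, RTCP_PT_SDP, length,
                       sdp_msg_hash & 0xFFFF, f_seq, 0) + sdp_data
```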

Fig. 6 illustrates an example format of a session description message. Although illustrated as a specific example in Fig. 6, the session description message can be formatted with the fields or portions in different orders or, alternatively, spread across different messages.

Session description message 320 includes a session-level description portion 322 and zero or more media-level description portions 324. Session-level description portion 322 includes one or more fields of data that apply to the whole session and to all of the audiovisual streams that are part of the session. Each media-level description portion 324, on the other hand, includes one or more fields of data that apply only to a single audiovisual stream.

The data fields in a media-level description portion 324 describe properties for a particular audiovisual stream. These properties may be in addition to the properties described in session-level description portion 322, or instead of properties described in session-level description portion 322. For example, one or more properties in a particular media-level description portion 324 may override, for the particular audiovisual stream associated with that media-level description portion 324, properties identified in session-level description portion 322.

Session description message 320 and the structure of message 320 are described in additional detail below with reference to SDP. It should be noted that these specific structures are only examples and that the session description message can take many different forms.

Session-level description portion 322 begins with a particular field, referred to as the protocol version field. Similarly, each media-level description portion 324 begins with a particular field, referred to as the media name and transport address field. In certain embodiments, multiple fields of the same type can be included in a session description message (e.g., a single session description message can have two or more attribute fields).

Table I below illustrates example fields that may be included in session-level description portion 322. Table I includes a name for each example field, the abbreviation or type for each example field, and a brief description of each example field. In certain embodiments, the protocol version field, the owner/creator and session identifier field, the session name field, and the time description field are required, while all other fields of Table I are optional.

Table I
Name | Type | Description
Protocol version | v= | The version of SDP
Source | o= | The originator of the session (for example, the user's name and the address of the user's host) plus a session identifier and session version number
Session name | s= | The name of the session
Session information | i= | Information about the session
Description URI | u= | A pointer to additional information about the session
E-mail address | e= | The e-mail address of the person responsible for the session
Phone number | p= | The phone number of the person responsible for the session
Connection information | c= | Connection data describing the connection for the session, such as the network type, the address type and the connection address
Bandwidth information | b= | The proposed bandwidth to be used by the session
Time description | | See Table II below
Time zone adjustment | z= | Specifies adjusted times and offsets to account for daylight saving time
Encryption key | k= | Indicates the mechanism to be used to obtain the encryption key for the session by external means or from an encoded encryption key
Attribute | a= | A session attribute that extends SDP

Table II below illustrates the time description field in additional detail. Table II includes a name for each field of the time description field, the abbreviation or type for each such field, and a brief description of each such field. The field specifying the times the session is active is required, while the field specifying zero or more repeat times is optional.

TABLE II
Name | Type | Description
Time the session is active | t= | The start and stop times for the session
Zero or more repeat times | r= | Specifies repeat times for the session

Table III below illustrates example fields that may be included in a media-level description portion 324. Table III includes a name for each example field, the abbreviation or type for each example field, and a brief description of each example field. In certain embodiments, the media announcement field is required, while all other fields of Table III are optional.

TABLE III
Name | Type | Description
Media announcement | m= | The media type of the audiovisual stream, the transport port to which the audiovisual stream is sent, the transport protocol for the audiovisual stream, and the media format(s) for the audiovisual stream
Media title | i= | Information about the audiovisual stream (for example, a label for the audiovisual stream)
Connection information | c= | Connection data describing the connection for the audiovisual stream, such as the network type, the address type and the connection address
Bandwidth information | b= | The proposed bandwidth to be used by the audiovisual stream
Encryption key | k= | Indicates the mechanism to be used to obtain the encryption key for the audiovisual stream by external means or from an encoded encryption key
Attribute | a= | An attribute of the audiovisual stream that extends SDP
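Purely as an illustration of the fields summarized in Tables I-III, a small session description might look as follows; the originator, addresses, ports and payload formats are hypothetical values, not ones prescribed by this description:

```
v=0
o=mediaserver 2890844526 2890842807 IN IP4 10.0.0.1
s=Example presentation
i=A streamed audio/video presentation
c=IN IP4 224.2.17.12/127
b=AS:384
t=0 0
a=recvonly
m=audio 49170 RTP/AVP 0
i=Primary audio stream
a=mid:1
m=video 51372 RTP/AVP 31
i=Primary video stream
b=AS:256
a=mid:2
```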

Fig. 7 is a flow diagram illustrating an example process 350 for embedding a session description message in an RTCP message when a server-side playlist is used. Fig. 7 illustrates acts performed by a source of audiovisual content, such as server device 104 (e.g., of Fig. 1, 2, 3 or 4).

Initially, the next part of audiovisual content in the playlist is identified (step 352). When playback of the parts of audiovisual content is just beginning, this next part is the first part identified in the playlist. Additionally, each time the end of one part of content is reached (for example, the entire part of content has been streamed to client device 102, even though playback of that part at client device 102 is most likely not yet finished), the next part of audiovisual content is the part that follows the part whose end was reached. It should be noted that this next part may follow in the order specified by the playlist, or alternatively the user may be able to navigate to different parts within the playlist (for example, the user may be able to request that a particular part in the playlist be skipped or "jumped over").

Information describing the identified part of audiovisual content is then obtained (step 354). This information can be obtained in one or more different manners. One way in which this information can be obtained is by retrieval from a file or record. In certain embodiments, at least some of the information is stored in a file or record associated with the identified part of audiovisual content. This file or record is accessed at step 354 to retrieve the stored information.

Another way in which this information can be obtained is by receiving it from a human user. In certain embodiments, at least some of the information is obtained from a human user. These user inputs are used at step 354 as at least some of the information to be included in the session description message.

Another way in which this information can be obtained is by automatic detection. In certain embodiments, at least some of the information can be obtained automatically by a computing device analyzing the source of the identified part of audiovisual content or the identified part of audiovisual content itself. This automatically detected information is used at step 354 as at least some of the information to be included in the session description message.

An RTCP message with a session description message that includes the obtained information is then created (step 356). In certain embodiments, this RTCP message is in the form of RTCP message 250 of Fig. 5, discussed above. The created RTCP message is then sent to the recipient designated for the next part of audiovisual content (step 358). The recipient designated for the next part of audiovisual content is the device to which the audiovisual content is being streamed (for example, a client device 102 of Fig. 1, 2, 3 or 4). The created RTCP message is included in an RTCP packet, which is included as part of the audiovisual information streamed to the designated recipient.
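A compressed sketch of process 350 might look as follows. The build_sdp_rtcp_message helper is the illustrative packer sketched earlier in connection with Fig. 5, the CRC-based hash merely stands in for whatever checksum a particular implementation uses for field 264, and the callables passed in represent server-specific mechanisms for steps 354 and 358:

```python
import zlib

def stream_playlist(playlist, sender_addr, get_sdp, send_rtcp, stream_part):
    """Sketch of process 350; get_sdp, send_rtcp and stream_part stand in for server-specific mechanisms."""
    for content_part in playlist:                                   # step 352: identify the next part
        sdp_text = get_sdp(content_part)                            # step 354: obtain the description
        # Illustrative stand-in for field 264: a 16-bit checksum over the SDP text and sender address.
        msg_hash = zlib.crc32(sdp_text.encode() + sender_addr.encode()) & 0xFFFF
        rtcp_msg = build_sdp_rtcp_message(sdp_text, msg_hash)       # step 356 (packer sketched above)
        send_rtcp(rtcp_msg)                                         # step 358: send to the recipient
        stream_part(content_part)                                   # then stream the part itself via RTP
```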

It should be noted that situations can arise in which the number of audiovisual streams being streamed for two different parts of audiovisual content identified in the playlist differs. For example, a first part of audiovisual content identified in the playlist may have two streams (for example, an audio stream and a video stream), while a second part of audiovisual content identified in the playlist may have three streams (for example, an audio stream, a video stream and a text subtitle stream). Additionally, when the audiovisual data is streamed using UDP, each audiovisual stream typically uses a different UDP channel, received by the recipient on a different UDP port. If the recipient has only two ports open for the first part of audiovisual content (e.g., one port for the audio stream and one port for the video stream), there may be no port available at the recipient to receive the text subtitle stream of the second part of audiovisual content.

Such situations can be resolved in different manners. In certain embodiments, such situations are resolved by streaming the additional stream(s) of audiovisual information over an already open HTTP connection using TCP. An indication is included in RTCP message 250 (for example, as an additional RTP state block 292 for each additional audiovisual stream) that the additional stream(s) of audiovisual information are streamed in this manner.

In other embodiments, such situations are resolved by having the recipient open one or two additional ports, often referred to as group ports. Each of these group ports can be used to receive any of the audiovisual streams that the server device sends to the recipient. An indication is included in RTCP message 250 (for example, as an additional RTP state block 292 for each additional audiovisual stream) indicating on which of the additional group ports the additional stream(s) of audiovisual information are streamed.

In other embodiments, such situations are resolved by the server device sending a session description message to the recipient (for example, in an RTCP message 250) that identifies all of the audiovisual streams available for the second part of audiovisual content. The server device then waits for the recipient's response selecting which of the audiovisual streams the recipient desires to receive. The recipient makes the selection (e.g., automatically or based on user input at the recipient) and sends to the server device an indication of which stream(s) of audiovisual information were selected and on which ports the selected stream(s) of audiovisual information should be streamed.

Fig. 8 is a flow diagram illustrating an example process 380 for receiving a session description message in an RTCP message when a server-side playlist is used. Fig. 8 illustrates acts performed by a recipient of the audiovisual information, such as client device 102 (e.g., of Fig. 1, 2, 3 or 4).

Initially, an RTCP message is received from the source of audiovisual content (step 382). The source of audiovisual content is, for example, a server device 104 of Fig. 1, 2, 3 or 4.

The session description message for the next part of audiovisual content in the playlist is extracted from the RTCP message (step 384). When streaming of the parts of audiovisual content in the playlist is just beginning, this next part of audiovisual content is the first part of audiovisual content in the playlist. After streaming of at least one part of audiovisual content has begun, the next part of audiovisual content is the next part identified in the playlist. It should be noted that this next part may follow in the order specified by the playlist, or alternatively the user may be able to navigate to a different part within the playlist (for example, the user may be able to request that a particular part of the playlist be skipped or "jumped over"). It should also be noted that the session description message for the next part of audiovisual content is typically received before playback of the current part of audiovisual content finishes (allowing client device 102 to begin playing the next part of audiovisual content immediately when playback of the current part of audiovisual content is completed).

The extracted session description message is then used in processing the next part of audiovisual content (step 386). This processing typically includes playing back the next part of audiovisual content at client device 102.
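A corresponding sketch of step 384 on the recipient side might parse the Fig. 5 layout as follows and hand the resulting fragment to a reassembly helper such as the one sketched earlier; the function name is illustrative and error handling is omitted:

```python
import struct

def parse_sdp_rtcp_message(data: bytes):
    """Sketch of step 384: extract the embedded session description data from one RTCP message 250."""
    byte0, pt, length = struct.unpack_from("!BBH", data, 0)
    if pt != 141:                                    # not the example session-description payload type
        return None
    sdp_msg_hash, f_seq, num_rtp_state = struct.unpack_from("!HHH", data, 4)
    more_fragments = bool(f_seq >> 15)               # field 266 F
    frag_seq_num = f_seq & 0x7FFF                    # field 268 FragSeqNum
    start = 10 + 14 * num_rtp_state                  # skip the header and any 14-byte RTP state blocks
    end = 4 * (length + 1)                           # field 262: length in 32-bit values minus one
    if byte0 & 0x20:                                 # P bit set: last octet gives the padding count
        end -= data[end - 1]
    return sdp_msg_hash, frag_seq_num, more_fragments, data[start:end]
```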

Fig. 9 illustrates a system 500 for multicasting multimedia presentations according to one implementation. In this implementation, system 500 includes a content source 502, a server 504 and clients 506i-506x that are coupled to server 504 via a network 508. Network 508 can be any suitable wired network (including fiber-optic) or wireless network (e.g., RF or free-space optical). In one implementation, network 508 is the Internet, but in other implementations network 508 may be a local area network (LAN), a campus network, etc.

In this implementation, server 504 includes an announcement generator 510. As will be described in more detail below, embodiments of announcement generator 510 generate streams containing information regarding the multimedia presentations to be multicast over network 508. The operation of this implementation of system 500 in multicasting multimedia presentations is described below in conjunction with Fig. 10 - Fig. 12.

Fig. 10 illustrates the operational flow of a server of system 500 of Fig. 9 in multicasting a multimedia presentation according to one implementation. Referring to Fig. 9, content source 502 and server 504 operate as follows to multicast a multimedia presentation.

At step 524, server 504 receives a multimedia presentation via a connection 512. In this implementation, server 504 receives the multimedia presentation from content source 502 via connection line 512. In particular, content source 502 provides the multimedia content to be multicast over network 508. The multimedia content can be generated in any suitable manner. For example, the multimedia content may be pre-recorded/generated content that is then stored in a data store (not shown), or it may come live from the scene, being captured (for example, using a video camera, microphone, etc.) and encoded (encoder not shown).

In a typical application, the multimedia presentation will include multiple streams. For example, a multimedia presentation may include a video stream, an audio stream, another video stream encoded at a lower bit rate, and another audio stream encoded at a lower bit rate. In other applications, the multimedia presentation may have more or fewer streams than in this example application. Thus, in this implementation, server 504 receives the multimedia presentation in the form of one or more streams at step 524.

At step 526, server 504 generates an announcement stream and multicasts the announcement stream over network 508 via connection line 514. In this implementation, announcement generator 510 of server 504 generates the announcement stream. In some implementations announcement generator 510 can be configured by an administrator, while in other implementations announcement generator 510 can be configured to process the stream(s) received at step 524 and extract information from the stream(s) to form the announcement stream.

In some embodiments, the server 504 multicasts the announcement stream on a dedicated announcement channel (i.e., a channel carrying no announcements related to other multimedia presentations). As used in this context, a channel may be a logical address, such as an Internet Protocol (IP) multicast address and port. Thus, a client can join a channel by listening on the logical address and port associated with the channel. Clients can learn the logical address in any suitable manner, such as, but not limited to, e-mail invitations, postings on Web sites, and conventional Session Announcement Protocol (SAP) multicasts (for example, as defined in IETF RFC 2974, "Session Announcement Protocol"). In embodiments that use SAP multicasts to announce the multimedia presentation, the SAP multicast need not include the detailed information describing the presentation, which can instead be provided in the "in-line" announcement stream (described in more detail below).
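As a minimal illustration of joining such a channel, the Python sketch below subscribes to a hypothetical announcement channel at 231.0.0.1:5000 (the address and port are assumptions borrowed from the example channel layout given later) and waits for one announcement datagram.

```python
# Sketch: join a multicast "announcement channel" identified by an IP multicast
# address and port, and listen for announcement datagrams.
import socket
import struct

ANNOUNCE_GROUP = "231.0.0.1"   # assumed pre-selected announcement channel address
ANNOUNCE_PORT = 5000           # assumed pre-selected announcement channel port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", ANNOUNCE_PORT))

# Ask the kernel to join the multicast group on the default interface.
mreq = struct.pack("4s4s", socket.inet_aton(ANNOUNCE_GROUP), socket.inet_aton("0.0.0.0"))
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

data, sender = sock.recvfrom(65535)   # one announcement datagram
print("announcement from", sender, len(data), "bytes")
```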

In some embodiments, the announcement stream is multicast "in-line" with the stream containing the multimedia data. For example, the multimedia data stream may be multicast using packets according to the Real-time Transport Protocol (RTP), and the announcement stream may be multicast using packets according to the Real-time Transport Control Protocol (RTCP). In one embodiment, RTP is as defined in Request for Comments (RFC) 3550 of the Internet Engineering Task Force (IETF), July 2003 (which includes the specification for RTCP). In this embodiment, RTP has been extended to support announcement data in RTCP packets. Further refinements of these announcements can be sent "in-line" in the same RTP packets (or packets/datagrams of another protocol) as the multimedia data. In other embodiments, the announcement channel may be out-of-band (for example, when the announcement channel is multicast using SAP).
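A server-side counterpart to the extraction sketch shown earlier might pack the SDP text into an RTCP-style message as follows; the type value and the type/length/payload layout are again illustrative assumptions reflecting the three-field structure described in this document rather than the packet formats of RFC 3550.

```python
# Sketch: build an RTCP-style message that embeds an SDP session description,
# with a length field computed as the sum of the lengths of all fields.
import struct

SDP_EMBED_TYPE = 204  # assumed type value, as in the parsing sketch above

def build_sdp_rtcp_message(sdp_text: str) -> bytes:
    payload = sdp_text.encode("utf-8")
    total_len = 1 + 2 + len(payload)  # type field + length field + SDP payload
    return struct.pack("!BH", SDP_EMBED_TYPE, total_len) + payload

sdp = (
    "v=0\r\n"
    "o=- 0 0 IN IP4 231.0.0.1\r\n"
    "s=Example presentation\r\n"
    "m=video 5004 RTP/AVP 96\r\n"
)
message = build_sdp_rtcp_message(sdp)
print(len(message), "bytes, announced length:", struct.unpack("!BH", message[:3])[1])
```

A receiver can recover the SDP text from such a message with the extraction routine sketched earlier, since both sketches assume the same layout.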

The announcement stream contains information describing the multimedia presentation, such as, for example, identification of the various channels used to multicast the multimedia presentation; descriptions of the streams that may be carried on each of the channels (e.g., indicating the type of stream (e.g., video or audio), the bit rate of the stream, the language used in the stream, and so on); error-correction information; security/authentication information; encoding information; digital rights management (DRM) information; etc. In one embodiment, the announcement stream is multicast repeatedly for the duration of the multimedia presentation so that clients joining at different times can receive the information describing the multimedia presentation. A client that receives this presentation-describing information via the announcement stream can then determine which channels are suitable to join, based on consideration of its own resources.

At step 528, the server 504 multicasts one or more streams selected from the stream(s) of the multimedia presentation received at step 524. In some scenarios, the server 504 multicasts all of the streams received at step 524. In some embodiments, an administrator can configure the server 504 to multicast particular streams on pre-selected channels. In one embodiment, the server 504 supports at least an announcement channel, a video channel, and an audio channel. More typically, the server 504 also supports additional channels of video and audio streams at different bit rates in order to serve clients with varying resources available for processing the multimedia presentation.

For example, as shown in Fig. 11, the server 504 may be configured to support an announcement channel 532, an acceleration channel 534 (described below with reference to Fig. 15 and 16), a high-quality video channel 536, a high-quality audio channel 538, an applications channel 540, alternative-language channels 542_1 to 542_N, and alternative-bit-rate channels 544_1 to 544_M (for audio and/or video streams). In one embodiment, the applications channel 540 can be used to multicast data used by application programs that are expected to execute locally on the clients (for example, an audiovisual media player, or other applications that may require an additional plug-in module to use the multicast application data, such as Microsoft PowerPoint®). Depending on the streams provided by the content source 502, the configuration of the server 504, and the pre-selected channel definitions, the server 504 may map a stream to only one channel, to multiple channels, or to no channel at all. For example, if the multimedia presentation from the content source 502 includes a stream in English and a stream in Spanish, the server 504 may be configured to map the Spanish stream to all of the channels 542_1 to 542_N, to only channel 542_1, or to no channel at all.

In some embodiments, the channel assignment is selected in advance. For example, in embodiments in which each channel has its own IP address, the channels can be a set of sequential IP addresses within the IP address space assigned for multicasting (i.e., the range from IP address 224.0.0.0 to IP address 239.255.255.255). Thus, the announcement channel 532 can be assigned the IP address 231.0.0.1, the acceleration channel 534 can be assigned the IP address 231.0.0.2, the high-quality video channel 536 can be assigned the IP address 231.0.0.3, and so on. Similarly, the channels can be a sequential group of ports at a single IP address. Thus, in an RTP-based embodiment, the announcement channel 532 can be assigned port 231.0.0.1:5000, the acceleration channel 534 can be assigned port 231.0.0.1:5002, the high-quality video channel 536 can be assigned port 231.0.0.1:5004, and so on (so that ports 5001, 5003, and 5005 could be used for RTCP packets).
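Because the layout is deterministic, both the server configuration and a client can compute every channel address from a base address and port. The sketch below reproduces the port-based example above; the channel names and base values are assumptions used only for illustration.

```python
# Sketch: derive a pre-selected channel layout from a base multicast address and RTP port.
# Even ports carry media (RTP); the next odd port is reserved for RTCP, as in the example above.
BASE_ADDRESS = "231.0.0.1"
BASE_RTP_PORT = 5000
CHANNEL_ORDER = ["announcement", "acceleration", "hq_video", "hq_audio", "applications"]

def channel_layout(base_address, base_rtp_port, names):
    layout = {}
    for index, name in enumerate(names):
        rtp_port = base_rtp_port + 2 * index
        layout[name] = {"address": base_address, "rtp_port": rtp_port, "rtcp_port": rtp_port + 1}
    return layout

for name, chan in channel_layout(BASE_ADDRESS, BASE_RTP_PORT, CHANNEL_ORDER).items():
    print(name, chan["address"], chan["rtp_port"], "(RTCP", str(chan["rtcp_port"]) + ")")
```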

The approaches used by the above-described embodiments of the system 500 have several advantages. For example, because the announcement stream is multicast on a dedicated channel, a client can receive the presentation-describing information more quickly, thereby advantageously reducing startup delay. In contrast, the conventional SAP multicasting approach typically has a large startup delay, because SAP multicasts typically announce a large number of multicasts, which tends to reduce the frequency with which announcements for a particular multimedia presentation are multicast (which in turn tends to increase the delay).

Further, these embodiments of the system 500 do not require clients to have a back channel to the server 504, thereby providing greater flexibility in delivering multimedia presentations to the desired audience.

In addition, these embodiments of the system 500 eliminate the need for the server to provide a "multicast information file" required in the conventional systems described above, and thus to bear the costs of maintaining and publishing that file.

In addition, because in some embodiments the streams are multicast on a set of channels that is predictable, clients can choose to join a specific channel without waiting to receive and process the presentation-describing information on the announcement channel 532. For example, an aggressive client (usually a client with relatively large resources) can choose to join the high-quality video and high-quality audio channels 536 and 538 at the same time as, or instead of, joining the announcement channel 532, thus reducing startup delay if the client indeed has the resources to handle the streams with no data loss. For example, a client with greater resources can be a client having a computing platform with a high-speed CPU and larger buffering resources, connected to a high-speed computer network with relatively large bandwidth. A high-speed CPU and large buffering resources greatly reduce the risk of data loss.

Fig. 12 illustrates the operation of the client 506_1 (Fig. 9) when receiving a multimedia presentation that is multicast by the server 504 (Fig. 9), according to one embodiment. Clients 506_2 to 506_X (Fig. 9) can operate essentially identically. With reference to Fig. 9, 11, and 12, the client 506_1 operates as follows when receiving a multimedia presentation.

At step 562, the client 506_1, having already received the logical address of the announcement channel for the multimedia presentation, joins the announcement channel 532. As described above for one embodiment, the server 504 repeatedly multicasts the presentation-describing information on the dedicated announcement channel. Thus, the client 506_1 can receive the presentation-describing information relatively quickly compared to conventional systems, which typically multicast information describing a relatively large number of multimedia and/or other types of presentations.

At step 564, the client 506_1 then joins one or more channels that carry the multimedia data streams described in the received announcement stream. In one embodiment, the client 506_1 can determine which channel(s) to join to obtain the best experience given the resources available to the client 506_1. The client 506_1 can then receive the selected stream(s) of the multimedia presentation.

Fig. 12A illustrates the steps performed by the client 506_1 (Fig. 9) when receiving a multimedia presentation that is multicast by the server 504 (Fig. 9), according to another embodiment. Clients 506_2 to 506_X (Fig. 9) can operate essentially identically. With reference to Fig. 9, 11, and 12A, the client 506_1 operates as follows when receiving a multimedia presentation. In this embodiment, the client 506_1 performs step 562 (joining the announcement channel 532, as described above) and step 570 essentially simultaneously.

At step 570, the client 506_1 also joins one or more pre-selected channels of the multimedia presentation in addition to the announcement channel 532. As described above for one embodiment, the server 504 may be configured to multicast the streams on pre-selected channels in a predictable way. In this embodiment, the client 506_1 can take advantage of the pre-selected channel assignment to join the desired channels without having to receive the presentation-describing information from the announcement channel 532. For example, in one scenario, the client 506_1 has relatively large resources for receiving and processing the multimedia presentation and is capable of handling a typical high-quality video stream and high-quality audio stream. With these resources, the client 506_1 can be configured to immediately join channels 536 and 538 to receive the high-quality video and high-quality audio streams in order to reduce startup delay, with a relatively high expectation that the client 506_1 can properly handle those streams.

At decision step 572, the client 506_1 determines whether it can optimally process the stream(s) received from the channel(s) it joined at step 570, given the resources available to the client 506_1. In one embodiment, the client 506_1 uses the presentation-describing information received on the announcement channel 532 to determine whether its resources are sufficient to process the streams received on those channels. For example, the streams on the channels joined at step 570 may have bit rates (which are described in the announcement stream) that are too high for the client 506_1 to process without data loss (which can lead to choppy audio for audio streams or blocking artifacts during video playback for video streams). If the client 506_1 decides at step 572 that it can optimally process the stream(s) of the pre-selected channel(s), the client 506_1 continues to receive the stream(s) from the channel(s) it joined at step 570 until the multicast of the multimedia presentation ends.

However, in this embodiment, if the client 506_1 decides at step 572 that it is not able to optimally process the stream(s) of the pre-selected channel(s), the sequence of operations proceeds to step 564 (described above with reference to Fig. 12). At step 564, using the presentation-describing information received at step 562, the client 506_1 joins one or more other channels carrying multimedia streams that the client 506_1 can optimally process. At step 574 in this embodiment, the client 506_1 can leave the pre-selected channels joined at step 570. The client 506_1 continues to receive the stream(s) from the channel(s) joined at step 564 until the multicast of the multimedia presentation ends or the client chooses to leave the channel.
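The decision at steps 572 and 564 can be reduced to comparing the bit rates announced for each channel against the receive rate the client can sustain. A minimal sketch of such a selection is shown below, assuming the announcement stream has already been parsed into a list of channel entries with announced bit rates (all names and numbers are illustrative).

```python
# Sketch: choose the best video/audio channels whose combined announced bit rate
# fits within the client's sustainable receive rate (all names/values assumed).
def select_channels(announced, budget_bps):
    """announced: list of dicts with 'name', 'kind' ('video'/'audio'), 'bitrate_bps'."""
    chosen = {}
    for kind in ("video", "audio"):
        candidates = sorted(
            (c for c in announced if c["kind"] == kind),
            key=lambda c: c["bitrate_bps"],
            reverse=True,  # prefer the highest quality that still fits
        )
        for candidate in candidates:
            used = sum(c["bitrate_bps"] for c in chosen.values())
            if used + candidate["bitrate_bps"] <= budget_bps:
                chosen[kind] = candidate
                break
    return chosen

announced = [
    {"name": "hq_video", "kind": "video", "bitrate_bps": 4_000_000},
    {"name": "lo_video", "kind": "video", "bitrate_bps": 700_000},
    {"name": "hq_audio", "kind": "audio", "bitrate_bps": 256_000},
    {"name": "lo_audio", "kind": "audio", "bitrate_bps": 64_000},
]
print(select_channels(announced, budget_bps=1_000_000))  # picks lo_video + hq_audio
```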

Fig. 13 illustrates some of the components of the server 504 (Fig. 9) according to one embodiment. In this embodiment, in addition to the announcement generator 510 (described above with reference to Fig. 9), the server 504 includes a configuration controller 582, a configurable stream-assignment module 584, a source interface 586, and a network interface 588. In some embodiments, these elements are software modules or components that can be executed by the computing environment of the server 504.

The source interface 586 is configured to receive one or more multimedia streams from the content source 502 (Fig. 9) over connection 512. The configurable stream-assignment module 584 is configured to receive the streams from the source interface 586, the announcement stream from the announcement generator 510, and control information from the configuration controller 582. In this embodiment, the configurable stream-assignment module 584 acts like a switch in mapping or directing one or more streams received from the source interface 586 to the multicast channel(s). The network interface 588 multicasts the selected streams over the network 508 (Fig. 9). In some embodiments, the configuration controller 582 configures the configurable stream-assignment module 584 to map the received stream(s) of the multimedia presentation to the channel(s). In addition, in some embodiments, the configuration controller 582 issues commands to the announcement generator 510 to generate the announcements. The operation of one embodiment of the configuration controller 582 is described below with reference to Fig. 13 and 14.

Fig. 14 illustrates the sequence of operations of the configuration controller 582 (Fig. 13) when multicasting a multimedia presentation according to one embodiment. With reference to Fig. 13 and 14, one embodiment of the configuration controller 582 carries out the multicast of the multimedia presentation as described below.

At step 602, the configuration controller 582 in this embodiment receives configuration information from an administrator. The administrator can manually provide the configuration information to the configuration controller 582 of the server 504. This configuration information may identify each of the channels in terms of a logical address and include presentation-describing information (described above). For example, the information may include the type(s) of audiovisual stream(s) of the multimedia presentation(s) to be multicast; bit rates; the language(s); error-correction information; security/authentication information; encoding information; digital rights management (DRM) information; etc.

In alternative embodiments, the configuration controller 582 can be configured to extract the presentation-describing information directly from the streams themselves (for example, from header or metadata information included in the streams) after they are received from the content source 502 (Fig. 9) through the source interface 586.

At step 604, the configuration controller 582 configures the stream-assignment module 584 to map the announcement stream from the announcement generator 510 and the multimedia data stream(s) from the source interface 586 onto the channels, as specified in the presentation-describing information. The announcement stream is thereby mapped to the announcement channel used by the server 504 for multicasting the multimedia presentation.

At step 606, the configuration controller 582 passes the presentation-describing information for the stream(s) to the announcement generator 510. As described above, the announcement generator 510 generates an announcement stream that includes the presentation-describing information.

As described above, the channel assignment can be selected in advance. For example, the client may be given a logical address (e.g., a URL) to join the multicast multimedia presentation. In one embodiment, the first logical address is pre-selected for the announcement stream. In this example, the next sequential logical address is pre-selected for the acceleration channel, while the next sequential logical address after that is pre-selected for the high-quality video stream, and so on, as shown in the embodiment of Fig. 11. The configuration controller 582 configures the stream-assignment module 584 to map the announcement stream and the multimedia streams according to this pre-selected channel layout.

Fig. 15 illustrates some of the components of the server 504 (Fig. 9) in another embodiment. In this alternative embodiment, the server 504 is essentially similar to the embodiment of Fig. 13, except that it includes an accelerated-stream generator 702. In one embodiment, the accelerated-stream generator 702 is configured to generate a stream in which each multicast block of multimedia data contains the current sub-block of multimedia data and a pre-selected number of previous sub-blocks of data. For example, the accelerated stream can be multicast so that each datagram contains the current frame(s) of the multimedia presentation and the frame(s) of the previous five seconds. In this embodiment, the accelerated-stream generator 702 passes the accelerated stream to the configurable stream-assignment module 584 for mapping (assignment) onto a dedicated acceleration channel, such as the acceleration channel 534 (Fig. 11). However, in other embodiments, a datagram of the acceleration channel need not include the current frame(s).

Fig. 16 illustrates the sequence of operations of the server 504 with the accelerated-stream generator 702 (Fig. 15) according to one embodiment. With reference to Fig. 15 and 16, this embodiment of the server 504 operates as described below.

At step 802, the accelerated-stream generator 702 generates a block of multimedia data to be multicast over the network 508 (Fig. 9). In this embodiment, the accelerated-stream generator 702 forms the block from the current sub-block of data of the multimedia presentation and the previous Z sub-blocks of data of the multimedia presentation. As mentioned above, the block may be a datagram or packet, and the sub-blocks may be frames of multimedia data. In one embodiment, Z is chosen so as to ensure that the block (that is, a packet or datagram) contains a key frame needed to play or decode the multimedia data. In other embodiments, Z is chosen regardless of whether this guarantees that the block contains a key frame.

At step 804, the block of multimedia data generated at step 802 is multicast over the network 508 (Fig. 9). In this embodiment, the accelerated-stream generator 702 sends the block of multimedia data to the configurable stream-assignment module 584, which then maps the block to the acceleration channel. The server 504 then multicasts the block of multimedia data over the network 508 (Fig. 9) through the network interface 588. In one embodiment, the server 504 multicasts the block at a bit rate that is "faster than real time" (that is, at a bit rate higher than the bit rate of the "underlying" (main) multimedia data). This approach advantageously allows a client with relatively large resources to join the acceleration channel and quickly fill the buffer of its media player on receiving the block, so that rendering or playback can begin more quickly. This benefit is enhanced in embodiments in which the multicast block of multimedia data includes a key frame. Alternatively, the bit rate at which the server 504 multicasts the block need not be "faster than real time". This approach can be used in applications in which the client is simultaneously joined to the acceleration channel and to another channel that multicasts the multimedia data, so that in effect the client receives the multimedia data at a bit rate that is "faster than real time".

If more multimedia data is to be multicast, the sequence of operations returns to step 802, as shown at decision step 806. Thus, for example, using the example above with multimedia frames transported in datagrams, the next datagram may include the next frame of multimedia data, plus the frame added in the previous datagram, plus the previous (Z-1) frames. In this embodiment, each block (e.g., datagram) is therefore a sliding window over the current sub-block (e.g., frame) and the previous Z frames, where Z is chosen to be large enough to ensure that each block has enough information to minimize the time required for a client media player to start rendering/playing the multimedia presentation. As mentioned above, in some embodiments Z can be chosen so as to ensure that each block contains a key frame.
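A minimal sketch of this sliding-window block formation (steps 802-806) is shown below; the frame objects and the value of Z are placeholders, and a real generator would additionally honor the key-frame guarantee discussed above.

```python
# Sketch: accelerated-stream block formation - each block carries the current frame
# plus the previous Z frames (a sliding window), as described for steps 802-806.
from collections import deque

def accelerated_blocks(frames, z):
    """Yield one block per incoming frame once at least Z previous frames exist."""
    window = deque(maxlen=z + 1)  # previous Z frames + the current frame
    for frame in frames:
        window.append(frame)
        if len(window) == z + 1:   # wait until Z prior frames are available
            yield list(window)     # block = [oldest ... current]

frames = [f"frame{i}" for i in range(8)]
for block in accelerated_blocks(frames, z=3):
    print(block)
# Each successive block slides forward by one frame, e.g.
# ['frame0', 'frame1', 'frame2', 'frame3'], then ['frame1', ..., 'frame4'], and so on.
```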

In one embodiment, blocks of video and audio data are multicast alternately on the same channel if the multimedia presentation includes both audio and video streams. In other embodiments, separate acceleration channels can be used for the audio and video streams.

At the beginning of the multimedia presentation, in one embodiment, the accelerated-stream generator waits until at least Z sub-blocks of multimedia data have been multicast on the non-accelerated channel(s) before forming the data block at step 802.

Fig. 17 illustrates the sequence of operations of a client when receiving the accelerated stream according to one embodiment. At step 902, the client (for example, one of the clients 506_1 to 506_X in Fig. 9) joins the acceleration channel. In some scenarios, the acceleration channel is part of the pre-selected channel assignment, and the client can join it at the same time as, or at a different time from, joining the announcement channel. As described above, the acceleration channel can advantageously be used by a client with relatively large resources for receiving and processing multimedia presentations so that the client can reduce its startup delay.

At step 904, the client receives one or more blocks of multimedia data on the acceleration channel. In one embodiment, each block of multimedia data is generated as described above with reference to Fig. 16. The client can then process each block of multimedia data to begin the playback or rendering process relatively quickly. In one scenario, the client receives a block of video data and a block of audio data, where the video data contains a key frame so that the client can begin the rendering/playback process as soon as possible. As described above, in other embodiments the block need not contain a key frame.

At step 906, the client can then join non-accelerated channels, such as the high-quality video channel 536 and the high-quality audio channel 538. In one embodiment, the non-accelerated channels that the client joins are pre-selected using the pre-selected channel assignment described above. In other embodiments, the client joins the channel(s) based on the presentation-describing information contained in the announcement stream. At step 908, the client leaves the acceleration channel. In one embodiment, the client leaves the acceleration channel immediately after receiving the block or blocks of multimedia data needed to start the rendering/playback process or processes.
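Steps 902-908 amount to a short bootstrap sequence: join the acceleration channel, buffer blocks until enough data (ideally a key frame) has arrived, then join the regular channel(s) and leave the acceleration channel. The sketch below simulates that sequence; the network I/O is stubbed out, and in practice the join/leave operations would use the multicast socket calls shown earlier.

```python
# Sketch: client bootstrap via the acceleration channel (steps 902-908), simulated.
class SimulatedPlayer:
    def __init__(self):
        self.buffered = []

    def buffer(self, block):
        self.buffered.append(block)

    def has_key_frame(self):
        return any(frame.startswith("KEY") for block in self.buffered for frame in block)

def bootstrap(player, accel_blocks):
    # Step 902: join the acceleration channel (simulated by iterating accel_blocks).
    for block in accel_blocks:            # step 904: buffer blocks until a key frame arrives
        player.buffer(block)
        if player.has_key_frame():
            break
    # Step 906: join the non-accelerated channel(s); step 908: leave the acceleration channel.
    return "playback can start with", len(player.buffered), "buffered block(s)"

blocks = [["frame1", "frame2"], ["frame2", "KEY_frame3"], ["KEY_frame3", "frame4"]]
print(*bootstrap(SimulatedPlayer(), blocks))
```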

Although steps 902 to 908 are described as being executed sequentially in the sequence of operations of Fig. 17 (as in the other sequences of operations described here), the steps can be performed in a different order than shown, with some steps performed more than once, with some steps performed at the same time, or combinations thereof. For example, in some embodiments, steps 902 and 906 are executed in parallel so that the client joins the accelerated and non-accelerated channels simultaneously. Step 904 is performed after step 902, and after steps 904 and 906 the sequence moves on to step 908.

Fig. 17A illustrates an example scenario in which a client can join the acceleration channel and some pre-selected channels and then join other channels (for example, on the basis of the announcement information received on the announcement channel). In this example, the client joins the acceleration channel (i.e., step 902) simultaneously with joining one or more pre-selected non-accelerated channels (step 906). The client then receives one or more blocks of multimedia data on the acceleration channel (step 904), as well as multimedia data and announcement data from the non-accelerated channels. Based on the announcement information received on the announcement channel, the client may decide to leave (disconnect from) the pre-selected channel(s) and join other non-accelerated channels (i.e., steps 572, 564, and 574).

The various multicasting approaches described above may be implemented in computing environments with servers and clients. An exemplary computing environment suitable for use as a server or client is described below with reference to Fig. 18.

Fig. 18 illustrates a general computing environment 1000 that can be used to perform the described methods. The computing environment 1000 is only one example of a computing environment and is not intended to suggest any limitation on the use or functionality of the computer and network architectures. The computing environment 1000 should not be interpreted as having any dependency or requirement relating to any one or combination of the components illustrated in the exemplary computing environment 1000.

The computing environment 1000 includes a general-purpose computing device in the form of a computer 1002. Components of the computer 1002 may include, but are not limited to, one or more processors or processing units 1004, a system memory 1006, and a system bus 1008 that couples various system components, including the processor 1004, to the system memory 1006.

The system bus 1008 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, a Peripheral Component Interconnect (PCI) bus (also known as a Mezzanine bus), PCI Express, Universal Serial Bus (USB), Secure Digital (SD) bus, or IEEE 1394 (i.e., FireWire).

The computer 1002 may include a variety of machine-readable media. Such media can be any available media that is accessible to the computer 1002, and includes both volatile and non-volatile media, and removable and non-removable media.

The system memory 1006 includes machine-readable media in the form of volatile memory, such as random access memory (RAM) 1010, and/or non-volatile memory, such as read-only memory (ROM) 1012 or flash memory (flash RAM). A basic input/output system (BIOS) 1014, containing the basic routines that help to transfer information between elements within the computer 1002, such as during startup, is stored in the ROM 1012 or flash memory. The RAM 1010 typically contains data and/or program modules that are immediately accessible to and/or currently being operated on by the processing unit 1004.

The computer 1002 may also include other removable/non-removable, volatile/non-volatile computer storage media. By way of example, Fig. 18 illustrates a hard disk drive 1016 for reading from and writing to a non-removable, non-volatile magnetic medium (not shown), a magnetic disk drive 1018 for reading from and writing to a removable, non-volatile magnetic disk 1020 (e.g., a "floppy disk"), and an optical disk drive 1022 for reading from and/or writing to a removable, non-volatile optical disk 1024 such as a CD-ROM, DVD-ROM, or other optical media. The hard disk drive 1016, magnetic disk drive 1018, and optical disk drive 1022 are each connected to the system bus 1008 by one or more interfaces 1025. Alternatively, the hard disk drive 1016, magnetic disk drive 1018, and optical disk drive 1022 can be connected to the system bus 1008 by one or more other interfaces (not shown).

The drives and their associated machine-readable media provide non-volatile storage of machine-readable instructions, data structures, program modules, and other data for the computer 1002. Although the example illustrates a hard disk 1016, a removable magnetic disk 1020, and a removable optical disk 1024, it should be noted that other types of machine-readable media that can store data accessible to a computer, such as magnetic cassettes or other magnetic storage devices, flash memory cards, CD-ROM, digital versatile disks (DVD) or other optical storage, random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and the like, can also be used to implement the exemplary computing system and environment.

Any number of program modules may be stored on the hard disk 1016, magnetic disk 1020, optical disk 1024, ROM 1012, and/or RAM 1010, including, by way of example, an operating system 1026, one or more application programs 1028, other program modules 1030, and program data 1032. Each of the operating system 1026, one or more application programs 1028, other program modules 1030, and program data 1032 (or some combination thereof) may implement all or part of the resident components that support a distributed file system.

A user can enter commands and information into the computer 1002 via input devices such as a keyboard 1034 and a pointing device 1036 (for example, a "mouse"). Other input devices 1038 (not shown specifically) may include a microphone, joystick, game pad, satellite dish, serial port, scanner, and/or the like. These and other input devices are connected to the processing unit 1004 via input/output interfaces 1040 that are coupled to the system bus 1008, but they may be connected by other interface and bus structures, such as a parallel port, game port, or universal serial bus (USB).

A monitor 1042 or other type of display device may also be connected to the system bus 1008 via an interface, such as a video adapter 1044. In addition to the monitor 1042, other output peripheral devices can include components such as speakers (not shown) and a printer 1046, which may be connected to the computer 1002 via the input/output interfaces 1040.

The computer 1002 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computing device 1048. By way of example, the remote computing device 1048 can be a PC, a portable computer, a server, a router, a network computer, a peer device or other common network node, and so on. The remote computing device 1048 is illustrated as a portable computer that can include many or all of the elements and features described herein relative to the computer 1002. Alternatively, the computer 1002 may also operate in a non-networked environment.

Logical connections between the computer 1002 and the remote computer 1048 are depicted as a local area network (LAN) 1050 and a general wide area network (WAN) 1052. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets, and the Internet.

When implemented in a LAN networking environment, the computer 1002 is connected to the local network 1050 via a network interface or adapter 1054. When implemented in a WAN networking environment, the computer 1002 typically includes a modem 1056 or other means for establishing communications over the WAN 1052. The modem 1056, which may be internal or external to the computer 1002, can be connected to the system bus 1008 via the input/output interfaces 1040 or other appropriate mechanisms. It is to be appreciated that the illustrated network connections are exemplary and that other means of establishing at least one communication link between the computers 1002 and 1048 can be used.

In a networked environment, such as that illustrated with the computing environment 1000, program modules depicted relative to the computer 1002, or portions thereof, may be stored in a remote memory storage device. By way of example, remote application programs 1058 reside on a memory device of the remote computer 1048. For purposes of illustration, application programs and other executable program components, such as the operating system, are illustrated here as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computing device 1002 and are executed by at least one data processor of the computer.

Various modules and techniques may be described herein in the general context of computer-executable instructions, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. These program modules and the like may be executed as native code or may be downloaded and executed, for example, in a virtual machine or other just-in-time compilation execution environment. Typically, the functionality of the program modules may be combined or distributed as desired in various embodiments.

An implementation of these modules and techniques may be stored on or transmitted across some form of machine-readable media. Machine-readable media can be any available media that can be accessed by a computer. By way of example, and not limitation, machine-readable media may comprise "computer storage media" and "communication media".

"Computer storage media" includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as machine-readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.

"Communication media" typically embodies machine-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism. Communication media also includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. Combinations of any of the above are also included within the scope of machine-readable media.

Throughout this description, reference has been made to "one embodiment", "an embodiment", or "an exemplary embodiment", meaning that a particular described feature, structure, or characteristic is included in at least one embodiment of the present invention. Thus, use of such phrases may refer to more than one embodiment. Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.

One skilled in the relevant art may recognize, however, that the invention may be practiced without one or more of the specific details, or with other methods, resources, materials, etc. In other instances, well-known structures, resources, or operations have not been shown or described in detail merely to avoid obscuring aspects of the invention.

While example embodiments and applications have been illustrated and described, it is to be understood that the invention is not limited to the precise configuration and resources described above. Various modifications, changes, and variations apparent to those skilled in the art may be made in the arrangement, operation, and details of the methods and systems of the present invention disclosed herein without departing from the scope of the claimed invention.

Although the description above uses language that is specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not limited to the specific features or acts described. Rather, the specific features and acts are disclosed as exemplary forms of implementing the invention.

1. Machine-readable media having stored thereon a plurality of machine-readable instructions that, when executed by a computing device, cause the computing device to perform a method of generating and transmitting messages describing a multimedia session, the method comprising the steps of:
identifying a part of audiovisual content that is to be streamed from said computing device to a receiving computing device requesting that part of the audiovisual content;
obtaining information describing the identified part of the audiovisual content;
generating a Real-time Transport Control Protocol (RTCP) message that embeds the obtained information about the identified part of the audiovisual content, the RTCP message comprising:
a first field containing data that identifies the RTCP message as being of a type that embeds a session description message;
a second field containing data that is the session description message for the audiovisual presentation, wherein said session description message is a session description message according to the Session Description Protocol (SDP), the SDP information of said audiovisual presentation being made available to the client device through said RTCP message without the client receiving a separate SDP session description message;
a third field containing data identifying the length of the RTCP message;
a fourth field containing data that identifies the version of RTP (Real-time Transport Protocol) used for streaming the audiovisual presentation;
a fifth field containing data that identifies whether the RTCP message contains additional padding octets;
a sixth field containing data that identifies whether the data in the second field is compressed;
a seventh field containing data that identifies the session description message and the address of the sender of the session description message;
an eighth field containing data that identifies the number of RTP-state blocks contained in the RTCP message;
a ninth field containing data identifying whether the data in an RTP-state block of the RTCP message applies to all RTP packets having a particular SDP stream identifier, or only to RTP packets having a particular RTP payload type number;
a tenth field containing data that identifies the RTP payload type number for the RTP-state block of the RTCP message;
an eleventh field containing data that identifies the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a twelfth field containing data that identifies the source of the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a thirteenth field containing data that identifies the value that the RTP Timestamp field of an RTP packet for the stream of audiovisual data of the audiovisual presentation would have if that RTP packet were sent at the beginning of the audiovisual presentation;
a fourteenth field containing data that identifies the value of the RTP sequence number field of the first RTP packet that is sent for the stream of audiovisual data of the audiovisual presentation;
a fifteenth field containing data that indicates that the RTCP message contains a fragment of the session description message;
a sixteenth field containing data that identifies the fragment;
wherein the length of the RTCP message in the third field is generated by summing the length of the first field, the length of the second field, the length of the third field, the length of the fourth field, the length of the fifth field, the length of the sixth field, the length of the seventh field, the length of the eighth field, the length of the ninth field, the length of the tenth field, the length of the eleventh field, the length of the twelfth field, the length of the thirteenth field, the length of the fourteenth field, the length of the fifteenth field, and the length of the sixteenth field; and
transmitting the RTCP message to the receiving computing device.

2. The machine-readable media of claim 1, wherein the audiovisual presentation comprises a multimedia presentation.

3. The machine-readable media of claim 1, wherein the data in the seventh field comprises a checksum computed over the session description message and the address of the sender of the session description message.

4. Machine-readable media having stored thereon executable computer instructions that, when executed by one or more processors of a computing device, cause said one or more processors of the computing device to perform a method of receiving audiovisual content using messages describing a multimedia session, the method comprising the steps of:
receiving a Real-time Transport Control Protocol (RTCP) message from a source of audiovisual content;
extracting from the RTCP message a session description message associated with one of a plurality of parts of audiovisual content in a playlist of audiovisual content streamed from the source of audiovisual content to this device, wherein said session description message is a session description message according to the Session Description Protocol (SDP), the SDP information about the one of the plurality of parts of audiovisual content being made available to the client without the client receiving a separate SDP session description message; and
processing the one of the plurality of parts of audiovisual content based, at least in part, on said session description message, wherein the RTCP message comprises:
a first field containing data that identifies the RTCP message as being of a type that embeds a session description message;
a second field containing data that is the session description message for the audiovisual presentation, wherein said session description message is a session description message according to the Session Description Protocol (SDP), the SDP information of the audiovisual presentation being made available to the client device through said RTCP message without the client receiving a separate SDP session description message;
a third field containing data identifying the length of the RTCP message;
a fourth field containing data that identifies the version of RTP (Real-time Transport Protocol) used for streaming the audiovisual presentation;
a fifth field containing data that identifies whether the RTCP message contains additional padding octets;
a sixth field containing data that identifies whether the data in the second field is compressed;
a seventh field containing data that identifies the session description message and the address of the sender of the session description message;
an eighth field containing data that identifies the number of RTP-state blocks contained in the RTCP message;
a ninth field containing data identifying whether the data in an RTP-state block of the RTCP message applies to all RTP packets having a particular SDP stream identifier, or only to RTP packets having a particular RTP payload type number;
a tenth field containing data that identifies the RTP payload type number for the RTP-state block of the RTCP message;
an eleventh field containing data that identifies the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a twelfth field containing data that identifies the source of the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a thirteenth field containing data that identifies the value that the RTP Timestamp field of an RTP packet for the stream of audiovisual data of the audiovisual presentation would have if that RTP packet were sent at the beginning of the audiovisual presentation;
a fourteenth field containing data that identifies the value of the RTP sequence number field of the first RTP packet that is sent for the stream of audiovisual data of the audiovisual presentation;
a fifteenth field containing data that indicates that the RTCP message contains a fragment of the session description message;
a sixteenth field containing data that identifies the fragment; and
wherein the length of the RTCP message in the third field is generated by summing the length of the first field, the length of the second field, the length of the third field, the length of the fourth field, the length of the fifth field, the length of the sixth field, the length of the seventh field, the length of the eighth field, the length of the ninth field, the length of the tenth field, the length of the eleventh field, the length of the twelfth field, the length of the thirteenth field, the length of the fourteenth field, the length of the fifteenth field, and the length of the sixteenth field.

5. The machine-readable media of claim 4, wherein the RTCP message is part of an RTCP packet.

6. The machine-readable media of claim 4, wherein the method further comprises playing the one of the plurality of parts of audiovisual content at the device.

7. The machine-readable media of claim 4, wherein the method further comprises repeating the receiving, extracting, and processing for each of the other parts of audiovisual content in the plurality of parts of audiovisual content.

8. A method of sending a message describing a multimedia session, the method being implemented in a device and comprising the steps of:
creating a Real-time Transport Control Protocol (RTCP) message that includes a session description message, the session description message being associated with one of a plurality of parts of audiovisual content in a playlist of audiovisual content streamed from the device to a client device, wherein the session description message is a session description message according to the Session Description Protocol (SDP) that contains SDP information about the one of the plurality of parts of audiovisual content; and
sending the RTCP message to the client device, the SDP information about the one of the plurality of parts of audiovisual content being made available at the client device without a separate SDP session description message being sent from the device;
wherein the RTCP message comprises:
a first field containing data that identifies the RTCP message as being of a type that embeds a session description message;
a second field containing data that is the session description message for the audiovisual presentation, wherein said session description message is a session description message according to the Session Description Protocol (SDP), the SDP information of the audiovisual presentation being made available to the client device through said RTCP message without the client receiving a separate SDP session description message;
a third field containing data identifying the length of the RTCP message;
a fourth field containing data that identifies the version of RTP (Real-time Transport Protocol) used for streaming the audiovisual presentation;
a fifth field containing data that identifies whether the RTCP message contains additional padding octets;
a sixth field containing data that identifies whether the data in the second field is compressed;
a seventh field containing data that identifies the session description message and the address of the sender of the session description message;
an eighth field containing data that identifies the number of RTP-state blocks contained in the RTCP message;
a ninth field containing data identifying whether the data in an RTP-state block of the RTCP message applies to all RTP packets having a particular SDP stream identifier, or only to RTP packets having a particular RTP payload type number;
a tenth field containing data that identifies the RTP payload type number for the RTP-state block of the RTCP message;
an eleventh field containing data that identifies the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a twelfth field containing data that identifies the source of the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a thirteenth field containing data that identifies the value that the RTP Timestamp field of an RTP packet for the stream of audiovisual data of the audiovisual presentation would have if that RTP packet were sent at the beginning of the audiovisual presentation;
a fourteenth field containing data that identifies the value of the RTP sequence number field of the first RTP packet that is sent for the stream of audiovisual data of the audiovisual presentation;
a fifteenth field containing data that indicates that the RTCP message contains a fragment of the session description message;
a sixteenth field containing data that identifies the fragment; and
wherein the length of the RTCP message in the third field is generated by summing the length of the first field, the length of the second field, the length of the third field, the length of the fourth field, the length of the fifth field, the length of the sixth field, the length of the seventh field, the length of the eighth field, the length of the ninth field, the length of the tenth field, the length of the eleventh field, the length of the twelfth field, the length of the thirteenth field, the length of the fourteenth field, the length of the fifteenth field, and the length of the sixteenth field.

9. The method of claim 8, further comprising the steps of:
creating a second RTCP message that includes a second session description message, the second session description message being associated with a second one of the plurality of parts of audiovisual content in the playlist; and
sending the second RTCP message to the client device.

10. The method of claim 8, wherein the device is a server device.

11. A system for transmitting and receiving audiovisual content using a message describing a multimedia session, the system comprising:
a server device; and
a client device;
wherein the server device is configured to:
create a Real-time Transport Control Protocol (RTCP) message that includes a session description message, the session description message being associated with one of a plurality of parts of audiovisual content in a playlist of audiovisual content streamed from the server device to the client device, wherein
the session description message is a session description message according to the Session Description Protocol (SDP) that contains SDP information about the one of the plurality of parts of audiovisual content; and
send the RTCP message to the client device; and wherein the client device is configured to:
receive the RTCP message from the server device;
extract the session description message from the RTCP message, the SDP information about the one of the plurality of parts of audiovisual content being made available at the client device without the client receiving a separate SDP session description message from the server device; and
process the one of the plurality of parts of audiovisual content based, at least in part, on the session description message,
wherein the RTCP message comprises:
a first field containing data that identifies the RTCP message as being of a type that embeds a session description message;
a second field containing data that is the session description message for the audiovisual presentation, wherein said session description message is a session description message according to the Session Description Protocol (SDP), the SDP information of the audiovisual presentation being made available to the client device through said RTCP message without the client receiving a separate SDP session description message;
a third field containing data identifying the length of the RTCP message;
a fourth field containing data that identifies the version of RTP (Real-time Transport Protocol) used for streaming the audiovisual presentation;
a fifth field containing data that identifies whether the RTCP message contains additional padding octets;
a sixth field containing data that identifies whether the data in the second field is compressed;
a seventh field containing data that identifies the session description message and the address of the sender of the session description message;
an eighth field containing data that identifies the number of RTP-state blocks contained in the RTCP message;
a ninth field containing data identifying whether the data in an RTP-state block of the RTCP message applies to all RTP packets having a particular SDP stream identifier, or only to RTP packets having a particular RTP payload type number;
a tenth field containing data that identifies the RTP payload type number for the RTP-state block of the RTCP message;
an eleventh field containing data that identifies the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a twelfth field containing data that identifies the source of the stream of audiovisual data of the audiovisual presentation to which the RTP-state block of the RTCP message applies;
a thirteenth field containing data that identifies the value that the RTP Timestamp field of an RTP packet for the stream of audiovisual data of the audiovisual presentation would have if that RTP packet were sent at the beginning of the audiovisual presentation;
a fourteenth field containing data that identifies the value of the RTP sequence number field of the first RTP packet that is sent for the stream of audiovisual data of the audiovisual presentation;
a fifteenth field containing data that indicates that the RTCP message contains a fragment of the session description message;
a sixteenth field containing data that identifies the fragment; and
wherein the length of the RTCP message in the third field is generated by summing the length of the first field, the length of the second field, the length of the third field, the length of the fourth field, the length of the fifth field, the length of the sixth field, the length of the seventh field, the length of the eighth field, the length of the ninth field, the length of the tenth field, the length of the eleventh field, the length of the twelfth field, the length of the thirteenth field, the length of the fourteenth field, the length of the fifteenth field, and the length of the sixteenth field.



 


EFFECT: enhanced servicing speed at lower throughput of main-line communication channels.

2 cl, 3 dwg

FIELD: technology for transferring images, can be used for concurrent transmitting and receiving within frequency band of one digital television image signal, N synchronous television signals of multi-gradation images, formed using, for example, television cameras or other sources of black and white, colored, spectrum-zonal, volumetric, or other multi-gradation images, and also can be used for concurrent recording and conservation of N synchronous television signals of multi-gradation images, and can be possibly utilized for construction of video-surveillance systems, television systems for remote probing of Earth surface and multimedia video-information transfer systems.

SUBSTANCE: method for transferring synchronous television signals of multi-gradation images includes, at transferring side, operations for processing and transformation of N synchronous analog television signals to N synchronous digital television signals, by means of their analog-digital transformation, forming of digital synchronization signals, transmission of signals via communication channel, and at receiving side, reversed operations over signals are performed to in comparison with transmitting side, during which after forming of digital synchronization signals, aforementioned signals are joined as one multi-level resulting signal S by means of analog-digital transformation of N synchronous digital signals, further, multi-level resulting signals S is added to synchronization signal, amplified and transferred along communication channel, and at receiving side multi-level resulting signal S is amplified, then synchronization signals are separated from it, controlling and clock pulses are formed, then, operation of analog-digital transformation of multi-level resulting signal S is performed and N synchronous digital television signals are formed in serial code, after that, operations over signals are performed, being reversed relatively to those at transferring side, namely, each digital signal, represented by serial code, is transformed to parallel n-byte binary code, then, forming of N source analog synchronous television signals of multi-gradation images is performed by means of their digital-analog transformation.

EFFECT: possible concurrent transmission of N synchronous analog television signals of multi-gradation images within frequency band of one analog signals, transformed into television form.

2 dwg, 1 tbl

FIELD: communications.

SUBSTANCE: video-phone communication is based on cable television networks. At client side attachment and transmitting camera are mounted. In cable networks commutation blocks are positioned - automatic phone stations, provided with input devices. Clients are distributed among client networks, in which all clients are connected to same two-wire line. Each client has transmission frequencies and receipt frequencies for television signals assigned.

EFFECT: higher efficiency.

3 dwg

The invention relates to a method and apparatus for a television game program

The invention relates to the field of video equipment, in particular to method and system, according to which data transfer is performed by a telecommunications network which at least partly consists of a mobile radio network

The invention relates to the field of education, audiovisual systems, communication systems and computer networks

FIELD: physics; computer engineering.

SUBSTANCE: invention relates to presentation of a remote terminal service (TS) application. Invented are systems and methods of presenting a unified type of remotely installed applications, to which a user has access, based on a terminal sever. A client computer device generates one or more shortcuts for one or more corresponding applications. Each of the applications is installed on one or more install points of an intranet. The client computer device is external with respect to the intranet. One or more shortcuts are combined, thereby presenting a combined type of applications. The combined type is transparent from the point of view of whether the applications are managed by different information sources in the intranet and/or configured for TS server based remote execution of different information sources of one or more install points.

EFFECT: easier launching of remotely installed applications by a user.

21 cl, 5 dwg

Network zones // 2363041

FIELD: information technology.

SUBSTANCE: invention relates to computer security and, more specifically, to managing connection to computer applications and managing data transfer using predefined network zones to networks with varying properties. A computer assigns networks to network zones, based on predefined properties for each zone and/or network properties. An application program installed on the computer provides the computer with preference information, which indicates the network zone, network strategy or properties most suitable for the application program. After execution of the application program, the computer limits network contact of the application program with the network (networks) assigned to network zone (network zones), identified as the preferred network zone (network zones) or identified by preferred network property or properties from preference information from the application program.

EFFECT: increased security of computers and data transferred on a network (networks).

51 cl, 7 dwg

FIELD: computer engineering.

SUBSTANCE: invention relates to computer engineering and is meant for use as intelligent information technology when making a local network. In the operating state, the local network contains one switching master local network station (LNS), several switching and non-switching slave local network stations and one information transmission line or several duplicated with shared access and centralised determined management of message exchange between legitimate subscribers of all local network stations. The method is realised through composition, sending on the network, receiving from the network and breaking down two types of control frames, data frames of one type and three types of response frames. In the information process senders of types of frames and their sequence is determined taking into account status of the network, corresponding to the operating or first anomalous state, arising with simultaneous appearance in the network of more than one switching master local network station during switching, due to failure, of at least one switching slave local network station to master state.

EFFECT: improved quality of local network through intelligent composition of six types of frames.

FIELD: information technology.

SUBSTANCE: system of digital media in a computer network consists of massive servers for the adaptation of streams of content of the provider (computer-aided manufacturing system), appropriating IP addresses of a computer network, access to which is possible through many network terminals (STB or personal computers), containing a player of content, the module of demand of the content is connected with the servers of administration of access of the subscriber in the local computer network and the server giving session keys for the protection of the operating words of the provider. By means of the SK on the CAM system controlled by the computer network, encoding CW by means of which the content of the provider is protected, placed further in rights usage management a stream of the content, further the control of access over network terminals (CT) is organised for the subscribers to the IP addresses appointed for adapted streams of content of the provider, by control facilities and configuration. In this case to realise the flexibility according to the tariff plans not achievable with the use of the chip of cards of conditional access of extended CAS.

EFFECT: possibility of relaying of the protected content by the provider in a computer network with the retention of control over subscribers from the side of the provider of the content.

60 cl, 3 dwg

FIELD: physics, computer equipment.

SUBSTANCE: invention is related to automatics and computer equipment and may be used in building of distributed systems of program control of technological processes, robots and robotic complexes, and also subsystems of logical control of multilevel hierarchical automated control systems and multiprocessor systems of wide class. Device contains M*N single-type modules, which are combined into matrix structure, where N - number of modules in line of network matrix structure, M - number of lines, at that every module of microcontroller network includes unit of program memory, address register, register of commands, multiplexer of logical conditions, address commutator, unit of synchronisation, elements OR, register of match vector, buffer register, decoders of synchronisation top number, unit of OR elements, univibrators, delay element, group of synchronisation control module units, multiplexer of synchronisation channels, register of configuration, multiplexers, decoder of synchronisation channels, generator of synchronisation control signals, two groups of AND elements.

EFFECT: expansion of microcontroller network application by lower complexity of intermodule interface.

4 cl, 10 dwg

FIELD: physics.

SUBSTANCE: invention pertains to the field of increasing information security in computer networks, in particular to the system of preventing intrusion into network servers. The filter (302) for analysing incoming commands processes commands directed to the web-server, through comparison with a previously entered set of commands for the web-server, which are stored in comparison tables. Intercepted commands for which no correspondence is found during comparison with the previously entered commands are deleted. In that case the intercepted commands are not transmitted to the web-server. The system can contain extra algorithms for recursive checking of files in catalogues, located lower than the root directory of the web-server, as well as means of monitoring web-server resources and lowering of the level of use of the web-server resources when the last threshold value is exceeded.

EFFECT: protection of data of the web-server.

32 cl, 7 dwg

FIELD: physics, computing.

SUBSTANCE: invention relates to local area networks. The technical effect is the increase in the local area network quality by generating seven frame types, decrease in the quantity of frame types and overhead costs for the data packet (DP) transmission implementation, and provision of automatic network restoration from a failure/malfunction state when more than one master local area network stations (LNSs) appear in the network or the master LNS disappears from the network, correspondingly. The method of local area network operation with the data communication line with common access and centralised determinated message communication control based on their separation into DPs and the DP transmission in information frames between the addressed subscribers of all its LNSs is implemented using three types of control frames (CF1, CF2, CF3), an information frame (IF), and three types of response frames (RF1, RF2, RF3), CF1 containing the first start combination (SC1), a discriminator field, an individual LNS address field, an addressing modifier (AM), a control command (CC) field, and a monitoring bit; the difference of CF2 from CF1 is the contents of the CC field and the presence of a service word following the CC field; CF3 contains SC1, a discriminator field, an individual DP recipient address field, an individual DP sender address field, a DP length field, an AM, a communication method modifier (CMM), and a monitoring bit, the IF contains the second start combination, a DP field, and a resolvable checksum word, RF1 contains SC1, a discriminator field, an LNS status word, and a monitoring bit, the difference of RF2 from RF1 is the presence of a service word following the LNS status word field, RF3 is CF1 or CF2 or CF3 with inversed values of bits in the discriminator field.

EFFECT: increase in local area network quality, decrease in quantity of frame types and overhead costs for implementation of data packet transmission, and provision of automatic network restoration in event of malfunction.

FIELD: physics; communications.

SUBSTANCE: invention relates to the authentication in communications. The method and device for speech encryption at cellular authentication in the extensible authentication protocol format is put forward.

EFFECT: providing of usual authentication and setting format in communications.

11 cl, 10 dwg

FIELD: computer engineering, possible use for creating multi-processor multi-thread computers.

SUBSTANCE: method for organization of multi-processor computer includes parallel execution of a thread of computations by means of distributed representation of thread descriptor stored in virtual memory, execution of primary selection of architecture commands by means of thread monitors, generation of graph for information dependencies of transactions, which are serially outputted through network into execution clusters, active thread is transferred to resident queue of transactions awaiting completion and next active thread is selected, by sequencers of execution clusters transactions are received and their commands and aforementioned graph are copied to registry file of cluster, execution-ready commands are copied to priority-ordered secondary selection queues, aforementioned selection and transfer of complete commands to the cluster are performed, graph is corrected based on these, on basis of correction results, the finalized command is added to either secondary selection queue or transaction completion result is transferred to monitor, thread is moved to queue for completed threads with correction of thread descriptor representation root, where completed thread is removed from waiting queue, and completion reason is outputted as a result available for software analysis.

EFFECT: fully hardware-based realization of multi-program control over threads with priority-based exclusion with precision up to an individual command.

FIELD: technology for solving network analysis problems provided by probability graphs.

SUBSTANCE: device contains clock impulse generator, cycle counter, input field, group of AND elements, output counter block, delay element, AND element, OR element, control signal block, second AND elements, pseudo-random series generators, pseudo-random series generators, pseudo-random series generators.

EFFECT: expanded functional capabilities due to modeling of various communication channel conditions, increased trustworthiness of results due to modeling of various types of failures of communication units in process of operation.

2 dwg

FIELD: computer science.

SUBSTANCE: device has matrix of m rows and n columns of homogeneous environment elements, block for finding maximum, adder, memory block, m blocks for counting units, block for estimation of channels load level, containing two pulse generators, two row selection decoders, unary value selection decoder, element selection decoder, element selection multiplexer, channel load decoder, two comparison elements, m channel load counters, two groups of m OR elements, two groups of m forbidding elements, current column counter, group of m AND elements, third group of m OR elements, two groups of m triggers, two row counters, two column counters, two OR elements, delay elements, counter of next column.

EFFECT: broader functional capabilities.

5 dwg

Up!