Systems and methods for channel switching

FIELD: information technologies.

SUBSTANCE: a channel switch frame (CSF) is generated from one or more network abstraction layer (NAL) units as a random access point (RAP) frame, and adjacent frames are transmitted that include the CSF and a frame that is not a RAP frame, each of them having the same frame identification number.

EFFECT: video encoding and decoding of channel switch frames (CSF), enabling acquisition and resynchronization of a video stream while preserving compression efficiency.

46 cl, 21 dwg

 

Cross-references to related applications

This patent application claims the benefit of commonly owned U.S. Provisional Application No. 60/865822, entitled "SYSTEMS AND METHODS FOR CHANNEL SWITCHING", filed November 14, 2006. The provisional application is incorporated herein by reference.

This application also incorporates by reference in their entirety the commonly owned U.S. Patent Applications No. 11/527306, filed September 25, 2006, and No. 11/528303, filed September 26, 2006.

The technical field to which the invention relates

This invention relates to multimedia signal processing and, in particular, to methods of encoding and decoding channel switch frames (CSF) that enable acquisition and resynchronization of a video stream while maintaining compression efficiency.

Background art

Multimedia processors, such as video encoders, can encode multimedia data using encoding methods based on international standards such as the Moving Picture Experts Group (MPEG)-1, -2 and -4 standards, the International Telecommunication Union (ITU)-T H.263 standard, and the ITU-T H.264 standard and its counterpart, ISO/IEC MPEG-4 Part 10, i.e., Advanced Video Coding (AVC), each of which is incorporated herein by reference in its entirety. These encoding methods generally aim at compressing multimedia data for transmission and/or storage. Compression, in the broadest sense, can be viewed as the process of removing redundancy from multimedia data.

A video signal can be described as a sequence of images, which include frames (complete images) or fields (for example, an interlaced video stream consists of fields of alternating odd or even rows of an image). As used herein, the term "frame" refers to an image, a frame, or a field. Video coding methods compress video signals by applying lossless or lossy compression algorithms to each frame. Intra-frame coding refers to encoding a frame using only that frame. Inter-frame coding refers to encoding a frame based on other, so-called "reference" frames. For example, video signals often exhibit temporal redundancy, in which frames near one another in the temporal sequence of frames have at least some areas that fully or at least partially coincide with each other.

Multimedia processors, such as video encoders, can encode a frame by dividing it into blocks or "macroblocks" of, for example, 16×16 pixels. The encoder may further divide each macroblock into sub-blocks. Each sub-block may in turn contain additional sub-blocks. For example, the sub-blocks of a macroblock may include 16×8 and 8×16 sub-blocks. The sub-blocks of the 8×16 sub-blocks may include 8×8 sub-blocks, which may contain 4×4 sub-blocks, and so forth. As used herein, the term "block" refers to either a macroblock or a sub-block.
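As an illustration of this block hierarchy, the following minimal sketch (hypothetical helper, not part of the patent or any standard) enumerates the sub-block splits named above:

```python
# A minimal sketch (illustrative names) of the partitioning described above:
# a 16x16 macroblock may be split into 16x8, 8x16 or 8x8 sub-blocks, and an
# 8x8 sub-block may be split further, down to 4x4.
def sub_partitions(width, height):
    """Return the splits of a block into halves/quarters, per the sizes above."""
    splits = []
    if width >= 8:
        splits.append((width // 2, height))       # e.g. 16x16 -> two 8x16
    if height >= 8:
        splits.append((width, height // 2))       # e.g. 16x16 -> two 16x8
    if width >= 8 and height >= 8:
        splits.append((width // 2, height // 2))  # e.g. 16x16 -> four 8x8
    return splits

print(sub_partitions(16, 16))  # [(8, 16), (16, 8), (8, 8)]
print(sub_partitions(8, 8))    # [(4, 8), (8, 4), (4, 4)]
```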

Encoders take advantage of the temporal redundancy between successive frames by applying inter-frame coding algorithms based on motion compensation. Motion compensation algorithms identify portions of one or more reference frames that at least partially match a block. The block may be shifted within the frame relative to the matching area of the reference frame(s). This shift is characterized by one or more motion vectors. Any differences between the block and the partially matching area of the reference frame(s) can be characterized by one or more residuals. The encoder can encode the frame as data comprising one or more motion vectors and residuals for a particular partitioning of the frame. A particular partitioning of blocks for encoding the frame can be selected by approximately minimizing a cost function that, for example, weighs encoding bit cost against the distortion, or perceived distortion, of the frame content resulting from the encoding.
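The cost-function selection can be pictured with a small numeric sketch. The candidate partitionings and the distortion and bit figures below are invented for illustration; only the Lagrangian form cost = distortion + λ·bits reflects the text above:

```python
# A hedged illustration of the cost function mentioned above: the encoder
# picks the partitioning whose Lagrangian cost (distortion plus
# lambda-weighted bit cost) is smallest. All numbers are made up.
def rd_cost(distortion, bits, lam):
    return distortion + lam * bits

candidates = {
    # partition: (distortion from residuals, bits for vectors + residuals)
    "16x16": (1400, 96),
    "two 16x8": (1150, 150),
    "four 8x8": (900, 260),
}
lam = 4.0
best = min(candidates, key=lambda p: rd_cost(*candidates[p], lam))
print(best)  # the partitioning with the lowest rate-distortion cost
```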

Inter-frame coding allows more efficient compression than intra-frame coding. However, inter-frame coding can create problems when reference data (e.g., reference frames or reference fields) are lost due to channel errors, etc. In addition to loss through errors, reference data may also be unavailable upon initial acquisition or re-acquisition of an inter-coded video frame. In such cases, decoding of inter-coded data may be impossible or may lead to undesired errors and error propagation. These scenarios can result in loss of synchronization of the video stream.

The most general form of frame that enables resynchronization of the video signal is the independently decodable intra-coded frame. In the MPEG-x and H.26x standards, a so-called group of pictures (GOP) contains an intra-coded frame (also called an I-frame) and temporally predicted P-frames, or forward/backward predicted B-frames, that reference the I-frame and/or other P- and/or B-frames within the GOP. Longer GOPs are desirable for increased compression, but shorter GOPs allow faster acquisition and resynchronization. Increasing the number of I-frames permits faster acquisition and resynchronization, but at the expense of lower compression.

Therefore, there is a need for methods of encoding and decoding channel switch frames (CSF) that enable acquisition and resynchronization of the video stream while maintaining compression efficiency.

The invention

Described herein are methods of encoding and decoding channel switch frames (CSF) that enable acquisition and resynchronization of the video stream while maintaining compression efficiency. According to one aspect, an apparatus comprises a processor configured to generate a channel switch frame (CSF) from one or more network abstraction layer (NAL) units to enable random access points in a coded bitstream.

Another aspect includes a computer program product comprising a machine-readable medium with instructions causing a computer to generate a channel switch frame (CSF) from one or more network abstraction layer (NAL) units to enable random access points in a coded bitstream.

Another aspect includes a device comprising a processor configured to decode one or more adjacent frames, each of which has the same frame identification number, where the first of the adjacent frames is a random access point (RAP) frame and the second frame is not a RAP frame.

Additional aspects of the invention will become more apparent from the detailed description of the invention, particularly when taken together with the accompanying drawings.

Brief description of the drawings

FIG. 1 is a block diagram of an exemplary multimedia communication system according to certain configurations;

FIG. 2A is a block diagram of an exemplary encoding device that can be used in the system of FIG. 1;

FIG. 2B is a block diagram of an exemplary decoding device that can be used in the system of FIG. 1;

FIG. 3 shows an exemplary relationship between sync layer messages and a real-time media stream output to or by a device in an FLO network;

FIG. 4 shows exemplary protocol layers for real time in an FLO network;

FIGS. 5A-5B show alternative exemplary relationships between sync layer packets and media frames;

FIG. 6 is an exemplary state machine for sync layer processing of an individual flow in the device;

FIG. 7 is an exemplary channel switch frame (CSF);

FIG. 8 is an exemplary channel switch frame (CSF) with 3 NAL units;

FIG. 9 is a channel switch frame generator;

FIG. 10 is a process for decoding a bitstream with CSF frames;

FIG. 11 is a sync header generator that creates a sync header;

FIG. 12A is an additional fields generator of the sync header generator;

FIG. 12B is an adaptation type generator of the sync header generator;

FIG. 13A is a common media header assembler;

FIG. 13B is a media-specific header assembler;

FIG. 14 is a video sync layer directory assembler;

FIG. 15 is a VSL record assembler;

FIG. 16 is a flowchart of sync layer processing of media frames in the device;

FIG. 17 is an example of a bitstream generated by the network with identical frame identification numbers for adjacent frames.

The images in the drawings are shown in simplified form for illustrative purposes and are not to scale. To facilitate understanding, the same reference numerals are used where possible to designate identical elements common to all drawings, with indices added where necessary to distinguish certain elements.

The accompanying drawings show exemplary configurations of the invention and should not be construed as limiting the scope of the invention, which admits other, equally effective configurations. It is contemplated that features or blocks of one configuration may advantageously be incorporated in other configurations without further recitation.

Detailed description of the invention

Abbreviations

The following abbreviations are used in the description below:

FLO - Forward Link Only;

IDR - instantaneous decoding refresh;

IEC - International Electrotechnical Commission;

IETF - Internet Engineering Task Force;

ISO - International Organization for Standardization;

ITU - International Telecommunication Union;

ITU-T - ITU Telecommunication Standardization Sector;

NAL - network abstraction layer;

RBSP - raw byte sequence payload;

TIA - Telecommunications Industry Association;

TM3 - terrestrial mobile multimedia multicast;

UINT - unsigned integer;

RAP - random access point;

PTS - presentation time stamp.

As used herein, the term "exemplary" means "serving as an example, instance, or illustration". Any configuration or technical solution described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other configurations or technical solutions, and the terms "core", "engine", "machine", "processor" and "processing unit" are used interchangeably.

The described methods can be used for wireless communication, computing, personal electronic devices, etc. An exemplary application of these methods to wireless communication is described below.

The following detailed description is directed to certain specific configurations of the invention. However, the invention can be embodied in many different ways, as defined and covered by the claims. In the description, reference is made to the drawings, in which like parts are designated with like reference numerals throughout.

Video signals can be characterized as sequences of images, frames and/or fields, any of which may further include one or more slices. As used herein, the term "frame" has a broad meaning and can cover one or more frames, fields, images and/or slices.

Configurations include systems and methods that facilitate channel switching in a multimedia transmission system. Multimedia data may include one or more of the following: television or motion picture video, audio, still images, text, or any other suitable type of audio-visual data.

FIG. 1 shows a block diagram of an exemplary multimedia communication system 100 according to certain configurations. The system 100 includes an encoding device 110 coupled to a decoding device 150 via a network 140. In one example, the encoding device 110 receives a multimedia signal from an external source 102 and encodes the signal for transmission over the network 140.

In this example, the encoding device 110 comprises a processor 112 coupled to a memory 114 and a transceiver 116. The processor 112 encodes data from the multimedia data source and delivers it to the transceiver 116 for transmission over the network 140.

In this example, the decoding device 150 comprises a processor 152 coupled to a memory 154 and a transceiver 156. Although the decoding device 150 may have a transceiver 156 for both transmission and reception, the decoding device 150 needs only a receiver, such as receiver 158. The processor 152 may include one or more general-purpose processors and/or digital signal processors. The memory 154 may include one or more solid-state or disk-based storage devices. The transceiver 156 is configured to receive multimedia data over the network 140 and deliver it to the processor 152 for decoding. In one example, the transceiver 156 includes a wireless transceiver. The network 140 may comprise one or more wired or wireless communication links, including one or more of the following: Ethernet, telephone (e.g., plain old telephone service (POTS)), cable, power-line and fiber-optic systems, and/or a wireless system comprising one or more of the following: a code division multiple access (CDMA or CDMA2000) system, a frequency division multiple access (FDMA) system, an orthogonal frequency division multiplexing (OFDM) system, a time division multiple access (TDMA) system such as GSM/GPRS (General Packet Radio Service)/EDGE (Enhanced Data GSM Environment), a TETRA (Terrestrial Trunked Radio) mobile communication system, a wideband code division multiple access (WCDMA) system, a high data rate system (1xEV-DO or 1xEV-DO Gold Multicast), an IEEE 802.11 system, a MediaFLO system, a DMB system, a DVB-H system, etc.

FIG. 2A shows a block diagram of an exemplary encoding device 110 that can be used in the system 100 of FIG. 1 according to certain configurations. In this configuration, the encoding device 110 comprises an inter-coding element 118, an intra-coding element 120, a reference data generating element 122, and a transmitting element 124. The inter-coding element 118 encodes portions of video using inter-frame coding with temporal prediction (for example, motion-compensated prediction), referring to other portions of video data located in other temporal frames. The intra-coding element 120 encodes portions of video using intra-frame coding, such that those portions can be decoded independently, without reference to other temporally located video data. In some configurations, the intra-coding element 120 may use spatial prediction to take advantage of redundancy in other video data located within the same temporal frame.

The reference data generator 122, according to one aspect, creates data indicating where the intra-coded and inter-coded video data created by the coding elements 120 and 118, respectively, are located. For example, the reference data may include identifiers of sub-blocks and/or macroblocks that are used by a decoder to locate a position within a frame. The reference data may also include a frame sequence number used to locate a frame within a video frame sequence.

The transmitter 124 transmits the inter-coded data, the intra-coded data, and, in some configurations, the reference data over a network, such as the network 140 of FIG. 1. The data can be transmitted over one or more communication links. The term "link" is used in a general sense and may include any communication channel, including but not limited to wired or wireless networks, virtual channels, optical links, etc. In some configurations, the intra-coded data is transmitted on a base layer communication link and the inter-coded data is transmitted on an enhancement layer communication link. In some configurations, the intra-coded data and the inter-coded data are transmitted over the same communication link. In some configurations, one or more of the inter-coded data, the intra-coded data and/or the reference data may be transmitted over a side-band communication link. For example, a side-band link such as the Supplemental Enhancement Information (SEI) messages of H.264 or the user_data messages of MPEG-2 can be used. In some configurations, one or more of the intra-coded data, the inter-coded data and/or the reference data is transmitted over a virtual channel. A virtual channel may comprise data packets containing an identifiable packet header that identifies the data packet as belonging to the virtual channel. Other forms of identifying a virtual channel, such as frequency division, time division, code spreading, etc., are known in the art.

FIG. 2B shows a block diagram of an exemplary decoding device 150 that can be used in the system 100 of FIG. 1 according to certain configurations. In this configuration, the decoding device 150 comprises a receiving element 158, a selective decoding element 160, a reference data determining element 162, and one or more reference data availability detectors, such as a channel switch detector element 164 and an error detector element 166.

The receiver 158 receives encoded video data (e.g., data encoded by the encoding device 110 of FIGS. 1 and 2A). The receiver 158 may receive the encoded data over a wired or wireless network, such as the network 140 of FIG. 1. The data can be received over one or more communication links. In some configurations, the intra-coded data is received on a base layer link and the inter-coded data is received on an enhancement layer link. In some configurations, the intra-coded data and the inter-coded data are received over the same link. In some configurations, one or more of the inter-coded data, the intra-coded data and/or the reference data may be received over a side-band communication link. For example, a side-band link such as the SEI messages of H.264 or the user_data messages of MPEG-2 can be used. In some configurations, one or more of the intra-coded data, the inter-coded data and/or the reference data is received over a virtual channel. A virtual channel may comprise data packets containing an identifiable packet header that identifies the data packet as belonging to the virtual channel. Other forms of identifying a virtual channel are known in the art.

The selective decoder 160 decodes the received inter-coded and intra-coded video data. In some configurations, the received data comprises an inter-coded version of a portion of video data and an intra-coded version of that portion. Inter-coded data can be decoded after the reference data from which it was predicted has been decoded. For example, data encoded using motion-compensated prediction comprises a motion vector and a frame identifier identifying the location of the reference data. If the portion of the frame identified by the motion vector and the frame identifier of the inter-coded version is available (e.g., already decoded), the selective decoder 160 can decode the inter-coded version. If, however, the reference data is not available, the selective decoder 160 can decode the intra-coded version.

The reference data determiner 162, according to one aspect, identifies received reference data that indicates where the intra-coded and inter-coded video data in the received encoded video data are located. For example, the reference data may include identifiers of sub-blocks and/or macroblocks that are used by the selective decoder 160 to locate a position within a frame. The reference data may also include a frame sequence number used to locate a frame within a video frame sequence. Using the received reference data allows the decoder to determine whether the reference data upon which the inter-coded data depends is available.

Reference data availability can be affected by a user switching channels in a multi-channel communication system. For example, multiple video broadcasts may be available to the receiver 158 over one or more communication links. If the user commands the receiver 158 to switch to a different broadcast channel, reference data for the inter-coded data of the new channel may not be immediately available. The channel switch detector 164 detects that a channel switch command has been issued and signals the selective decoder 160. The selective decoder 160 can then use information obtained from the reference data determiner to identify that the reference data of the inter-coded version is unavailable, locate the nearest intra-coded version, and selectively decode the identified intra-coded version.

Reference data availability can also be affected by errors in the received video data. The error detector 166 can use error detection techniques (e.g., forward error correction) to identify uncorrectable errors in the bitstream. If there are uncorrectable errors in the reference data upon which the inter-coded version depends, the error detector 166 can signal the selective decoder 160, identifying which video data is affected by the errors. The selective decoder 160 can then determine whether to decode the inter-coded version (e.g., if the reference data is available) or the intra-coded version (e.g., if the reference data is unavailable).

In some configurations, one or more elements of the encoding device 110 of FIG. 2A may be rearranged and/or combined. The elements may be implemented by hardware, software, firmware, middleware, microcode, or any combination thereof. In some configurations, one or more elements of the decoding device 150 of FIG. 2B may be rearranged and/or combined. The elements may be implemented by hardware, software, firmware, middleware, microcode, or any combination thereof.

Video

Certain configurations described here can be implemented using MediaFLO™ video coding for delivering real-time video services in TM3 systems, based on the FLO Air Interface Specification, "Forward Link Only (FLO) Air Interface Specification for Terrestrial Mobile Multimedia Multicast", published as Technical Standard TIA-1099, which is incorporated herein by reference in its entirety. Certain configurations define the bitstream syntax and semantics and the decoding processes for delivering these services over the FLO Air Interface layers 412.

The description given here forms, at least in part, a compatibility standard for FLO multimedia multicast systems, enabling a compliant FLO device 304 to receive the service(s) via any FLO network 302 (FIG. 3) conforming to this standard.

Normative references

ITU-T Recommendation H.264 and/or International Standard ISO/IEC 14496-10, Advanced Video Coding (referred to herein as the "H.264/AVC standard"), is incorporated herein by reference in its entirety and may be referred to for particular aspects.

For the configurations described here, the definitions in clause 3 of the H.264/AVC standard apply. In addition, for an exemplary configuration described here, a channel switch frame (CSF) is defined as a coded picture comprising a sequence parameter set and/or a picture parameter set and/or an instantaneous decoding refresh (IDR) picture. A channel switch frame (CSF) can be encapsulated in an independent transport protocol packet to enable random access points in the coded bitstream or to facilitate error recovery. Channel switch frames (CSF) are defined below.

The conventions used here for operators, notation, mathematical functions, syntax elements, tables, and processes correspond to those specified in clause 5 of the H.264/AVC standard.

Certain configurations described here include a description of the scope, normative references, definitions, abbreviations and organization of the invention, as well as a description of the bitstream syntax, semantics and decoding processes.

Low-complexity bitstream format and decoding for multimedia broadcast

The description here sets forth, among other things, an exemplary bitstream format and decoding process that provide a low-complexity extension for multimedia broadcast. A bitstream conforming to the low-complexity extension described in this specification conforms to the profiles of section A.2 of the H.264/AVC standard with the following additional restrictions and extensions: 1) sequence parameter sets can have profile_idc equal to 66 or 88; 2) sequence parameter sets can have constraint_set0_flag equal to 0; 3) sequence parameter sets can have constraint_set1_flag equal to 1; 4) sequence parameter sets can have constraint_set2_flag equal to 0; 5) B slice types may be present; and/or 6) slices for B pictures can have nal_ref_idc equal to 0 (idc denotes the profile index).

According to another aspect of this configuration, a bitstream conforming to the low-complexity extension described in this specification conforms to the profiles of section A.2 of the H.264/AVC standard with the following restrictions and extensions: 1) sequence parameter sets can have profile_idc equal to 66 or 88; 2) sequence parameter sets can have constraint_set0_flag equal to 1; 3) sequence parameter sets can have constraint_set1_flag equal to 0; 4) sequence parameter sets can have constraint_set2_flag equal to 1; 5) B slice types may be present; and/or 6) slices for B pictures can have nal_ref_idc equal to 0.
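A minimal sketch of validating these restrictions against parsed sequence parameter set fields follows; the field names are from H.264/AVC, while the dictionary representation, the profile_b flag distinguishing the two restriction sets, and the helper itself are assumptions for illustration:

```python
# A hedged sketch checking the low-complexity bitstream restrictions listed
# above against parsed SPS fields; the SPS-as-dict representation is assumed.
def check_low_complexity_sps(sps, profile_b=False):
    """profile_b selects the second set of restrictions described above."""
    ok = sps["profile_idc"] in (66, 88)
    if profile_b:
        ok &= sps["constraint_set0_flag"] == 1
        ok &= sps["constraint_set1_flag"] == 0
        ok &= sps["constraint_set2_flag"] == 1
    else:
        ok &= sps["constraint_set0_flag"] == 0
        ok &= sps["constraint_set1_flag"] == 1
        ok &= sps["constraint_set2_flag"] == 0
    return ok

print(check_low_complexity_sps(
    {"profile_idc": 66, "constraint_set0_flag": 0,
     "constraint_set1_flag": 1, "constraint_set2_flag": 0}))  # True
```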

Channel switch frames

FIG. 7 shows an exemplary channel switch frame (CSF) 700. To enable channel changes in a MediaFLO™ environment and to facilitate error recovery, the coding elements 120 according to certain configurations can insert channel switch frames (CSF). The channel switch frame (CSF) 700 can comprise up to 3 or more NAL units, NAL1 ... NALX, denoted by reference numerals 702 and 704, where X can be 2 or more. However, the CSF 700 can also consist of a single NAL unit.

FIG. 8 shows an exemplary CSF 800 with 3 NAL units. In this example, the CSF 800 comprises 3 NAL units 802, 804 and 806. If 3 NAL units are used, then in some situations, when available, they may appear in the bitstream in the following order: a sequence parameter set (SPS) 812, a picture parameter set (PPS) 814, and an instantaneous decoding refresh (IDR) 816. The IDR NAL unit can be a low-quality IDR NAL unit.

This CSF arrangement is shown in Table 1. Table 1 identifies the NAL unit types currently used for the CSF 800. In this exemplary configuration, the NAL unit types include the numbers 7, 8 and 5. In other circumstances, however, the IDR NAL type 5 can be replaced by the I-frame (coded slice) NAL type 1. RBSP denotes the raw byte sequence payload, shown in the column headed "RBSP syntax structure". The nal_unit_type column gives the NAL unit types used here for the CSF. Column C gives the supported data partitions; for example, the numbers 2, 3 and 4 represent data partitions A, B and C. The number 1 also represents the coded slice NAL unit 1, and the number 0 is unspecified.

Table 1
NAL units and RBSP syntax for the channel switch frame

Content of NAL unit           | RBSP syntax structure                    | nal_unit_type | C
Sequence parameter set       | seq_parameter_set_rbsp()                 | 7             | 0
Picture parameter set        | pic_parameter_set_rbsp()                 | 8             | 1
Coded slice of an IDR picture | slice_layer_without_partitioning_rbsp() | 5             | 2, 3
Coded slice                  | slice_layer_without_partitioning_rbsp() | 1             | 2, 3, 4

The syntax, semantics and decoding processes for these NAL units conform to the H.264/AVC standard.
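For illustration, a sketch of classifying the NAL units of a CSF per Table 1 is shown below. The one-byte NAL unit header layout (forbidden_zero_bit, nal_ref_idc, nal_unit_type) is as defined in H.264/AVC; the surrounding helper is an assumption:

```python
# A minimal sketch classifying CSF NAL units per Table 1. The header byte
# layout is from H.264/AVC: 1 bit forbidden_zero_bit, 2 bits nal_ref_idc,
# 5 bits nal_unit_type.
CSF_NAL_TYPES = {
    7: "sequence parameter set",
    8: "picture parameter set",
    5: "coded slice of an IDR picture",
    1: "coded slice (I-frame substitute for the IDR)",
}

def classify_nal(nal_bytes):
    header = nal_bytes[0]
    nal_ref_idc = (header >> 5) & 0x3
    nal_unit_type = header & 0x1F
    return nal_unit_type, nal_ref_idc, CSF_NAL_TYPES.get(nal_unit_type, "other")

# 0x67 = 0b0_11_00111: nal_ref_idc = 3, nal_unit_type = 7 (SPS)
print(classify_nal(bytes([0x67])))  # (7, 3, 'sequence parameter set')
```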

Channel switch frame parameter specifications

The bitstream semantics of channel switch frames (CSF) impose requirements on a number of syntax elements, variables and functions that differ from the requirements of the H.264/AVC standard.

FIG. 9 shows a channel switch frame (CSF) generator 900. The CSF generator 900 includes an SPS generator 902, a PPS generator 904, an IDR generator 906 and an I-frame generator 908. The following requirements differ from the H.264/AVC standard. The SPS generator 902 causes the NAL unit representing the sequence parameter set (SPS) of the CSF 800 to have pic_order_cnt_type equal to 0. In addition, the SPS generator 902 causes the NAL unit representing the sequence parameter set (SPS) of the CSF 800 to have the flag gaps_in_frame_num_value_allowed_flag equal to 0.

The PPS generator 904 creates the NAL unit representing the corresponding PPS. The I-frame generator 908 creates the I-frame NAL unit. The IDR generator 906 creates the NAL unit representing the IDR such that the syntax element pic_order_cnt_lsb of the IDR picture can be nonzero. PicOrderCnt() of the IDR picture equals PicOrderCnt() of the corresponding P slice. In addition, the syntax element frame_num of the IDR picture can be nonzero; frame_num of the IDR picture equals frame_num of the corresponding P slice. frame_num of the next picture may be equal to (frame_num + 1) % MaxFrameNum.

Thus, the IDR generator includes an IDR picture order count (POC) setter 910, which sets the POC value of the IDR NAL unit equal to the POC value of the P slice. The IDR generator also includes an IDR picture frame number setter 912, which sets the frame number of the picture equal to the frame number of the P slice. In some instances, the IDR generator also provides nonzero values of the picture frame number and the POC counter. The encoder 110 tracks the frame number in block 916, where the frame_num of the next picture may be equal to (frame_num + 1) % MaxFrameNum.

The encoder 110 may track the value of the variable PrevRefFrameNum so that it can be set equal to the CSF frame_num minus 1.
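The frame-number and picture-order-count rules above can be summarized in a short sketch. The structure below is illustrative; only the relations frame_num(IDR) = frame_num(P slice), PicOrderCnt(IDR) = PicOrderCnt(P slice), next frame_num = (frame_num + 1) % MaxFrameNum, and PrevRefFrameNum = CSF frame_num - 1 come from the text:

```python
# A hedged sketch of the CSF field rules stated above; helper names and the
# dict representation are illustrative, not from the patent.
MAX_FRAME_NUM = 1 << 4  # illustrative; in practice derived from the SPS

def make_csf_fields(p_slice_frame_num, p_slice_poc):
    csf = {
        "frame_num": p_slice_frame_num,    # equals frame_num of the P slice
        "pic_order_cnt_lsb": p_slice_poc,  # PicOrderCnt() equals the P slice's
    }
    # The tracked variable PrevRefFrameNum may be set to the CSF frame_num
    # minus 1, and the next picture continues the numbering sequence:
    prev_ref_frame_num = csf["frame_num"] - 1
    next_frame_num = (csf["frame_num"] + 1) % MAX_FRAME_NUM
    return csf, prev_ref_frame_num, next_frame_num

print(make_csf_fields(p_slice_frame_num=9, p_slice_poc=18))
# ({'frame_num': 9, 'pic_order_cnt_lsb': 18}, 8, 10)
```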

Decoding the channel switch frame

FIG. 10 shows a process 1000 for decoding a bitstream containing CSF frames. The process 1000 for decoding an I slice, as specified in clause 8 of the H.264/AVC standard, can be used to decode a channel switch frame (CSF) when the IDR NAL unit is replaced by a NAL unit with an I slice (coded slice, NAL type 1) created by the I-frame generator 908. Pictures in the requested channel that precede the channel switch frame (CSF) in output (display) order can be discarded. The output order of subsequently decoded pictures is unchanged. Pictures following the CSF cannot use any picture output before the CSF as a reference frame.

In the various configurations below, the blocks of a flowchart are performed in the depicted order, or these blocks or portions thereof can be performed concurrently, in parallel, or in a different order.

Thus, the decoding process 1000 begins at block 1002, where the bitstream with pictures is decoded. Block 1002 is followed by block 1004, where it is determined whether a CSF is detected. If not ("NO"), the process loops back from block 1004 to block 1002, where decoding of the bitstream continues.

However, if the determination at block 1004 is "YES", then at block 1006 the CSF is decoded according to the I-slice and/or NAL-unit-type protocols. Block 1006 is followed by block 1008, where it is determined whether any pictures of the requested channel precede the CSF in output order. If the determination is "YES", those pictures are discarded at block 1010, and block 1010 is followed by block 1012. However, if the determination at block 1008 is "NO", block 1008 is followed by block 1012. At block 1012, it is determined whether any pictures of the requested channel follow the CSF in output order. If the determination is "YES", then at block 1014 the pictures preceding the CSF in output order are set as non-reference frames. However, if the determination is "NO", the process loops back from block 1012 to block 1002. The process likewise loops back from block 1014 to block 1002, where standard decoding continues. A non-reference frame can be set by flushing the frame or by forcibly marking the frame as a non-reference frame.
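A compact sketch of the decision logic of blocks 1008-1014 follows. The picture records and the helper are stand-ins; only the two rules, discard pictures that precede the CSF in output order or mark them as non-reference so nothing after the switch predicts from them, come from the process above:

```python
# A minimal sketch of blocks 1008-1014 of process 1000; data shapes are
# assumptions for illustration.
def process_csf(pictures, csf_output_order, discard=True):
    kept = []
    for pic in pictures:
        if pic["output_order"] < csf_output_order:
            if discard:
                continue                   # block 1010: drop the picture
            pic["is_reference"] = False    # block 1014: mark non-reference
        kept.append(pic)
    return kept

pics = [{"id": "P1", "output_order": 3, "is_reference": True},
        {"id": "B1", "output_order": 4, "is_reference": False}]
print(process_csf(pics, csf_output_order=5))                 # []
print(process_csf(pics, csf_output_order=5, discard=False))  # marked non-ref
```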

Sync layer

The MediaFLO™ system can deliver at least three types of content: real-time, non-real-time, and IP datacast (e.g., multicast, unicast, etc.). A multicast device network interface (MDNI) for delivering real-time services is shown in FIG. 4.

FIG. 3 shows an exemplary relationship between sync layer messages and a real-time media stream output to or by a device 304 in an FLO network 302. An exemplary FLO network 302 can support continuous real-time delivery of streaming content to the device 304. Each stream can be delivered separately, and related streams can be identified as belonging to a common service or set of services through the use of system information. The network 302 can additionally provide data enabling devices to synchronize the real-time media streams with one another and to align them with the presentation timing requirements of the content. The layer that combines the media streams with this synchronization data is known as the sync layer 406.

A device 304 requiring access to a real-time service uses the system information to locate the service. After processing metadata related to the service, such as the title and the nominal presentation mode currently available for the service, the device 304 can select the appropriate stream and play back the received stream. The timing and presentation synchronization of these streams can be governed by appropriate protocols.

Protocol architecture

FIG. 4 shows exemplary protocol layers 400 for real time in the FLO network 302. A real-time service can use the services of the framing layer 408 and the stream encryption/decryption layer 410. It can consist of at least two sublayers: the media codec layer 404 and the sync layer 406. The real-time application layer 402 is shown at the top of the protocol layers 400.

The media codec layer 404 supports media-specific codecs, which are outside the scope of this configuration. A media codec supplies a sequence of media frames to the sync layer 406 in the network. Each media frame can be identified by a presentation time stamp (PTS), which generally specifies the time at which the frame is to be presented, and an associated frame identifier (frame ID), which identifies the relative position of the frame within the sequence of frames in a superframe. A video source codec can create multiple media frames with the same PTS and frame ID within a superframe.

For certain media types, notably video, the media codec layer 404 in the network 302 also supplies metadata to the sync layer 406, which the sync layer 406 in the device 304 may use to assist in acquiring and recovering the sequence of media frames to be delivered to the media codec layer 404 of the device 304.

The sync layer 406 is responsible for adapting media frames as required for each media type and for providing media synchronization and presentation timing. The sync layer 406 transports a sequence of sync layer packets. A sync layer packet carries either a media frame or an adaptation frame, as described below. A sync layer packet carrying a media frame is formed by appending a sync header (SH) to the media frame. The sync header (SH) consists of a media type, a common media header, and a media-specific header, as described in more detail below.

Additionally, the sync layer 406 can convey certain metadata specific to each media type. This metadata is conveyed in two ways. First, media-specific extensions can be included in the sync header of a sync layer packet. Second, sync layer packets can be used to convey adaptation frames, which are created within the sync layer 406 and interleaved with the sync layer packets carrying media frames in the same stream. Different types of adaptation frames are identified by an application ID in the sync header of the adaptation frame.

FIGS. 5A and 5B show alternative exemplary relationships between sync layer packets and media frames according to certain configurations. FIG. 5A shows an exemplary sync layer packet 500 encoded by the encoding device 110. The sync layer packet 500 contains, for example, a plurality of variable-length media frames 504, 506 and 510. Each media frame 504, 506 and 510 is preceded by a corresponding sync header (SH) 502. The sync header (SH) 502 includes three components: a media type (MT) 520, a common media header (CMH) 522, and a media-specific header (MSH) 524, which are described in detail below.

In the example of FIG. 5A, an adaptation frame 508 is inserted between media frames 506 and 510. The adaptation frame 508 is preceded by a sync header (SH) 512 having two components: a media type (MT) 530 and an adaptation type (AT) 532.

FIG. 5B shows a second exemplary sync layer packet 550 encoded by the encoding device 110. The sync layer packet 550 contains, for example, a plurality of variable-length media frames 564, 570 and 580. Each media frame 564, 570 and 580 is preceded by a sync header (SH) 560 and an adaptation header (AH) 562. In the example of FIG. 5B, an adaptation frame 574 is inserted between frames 570 and 580. The adaptation frame 574 is preceded by a sync header (SH) 572.
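The packet shapes of FIGS. 5A-5B can be sketched as simple records. The class names and byte placeholders below are assumptions for illustration; only the layout, a sync header in front of every variable-length media or adaptation frame, comes from the figures:

```python
# A minimal sketch of the FIG. 5A layout: an SH (media type + header fields)
# precedes each variable-length media frame, and adaptation frames are
# interleaved in the same stream under their own SH.
from dataclasses import dataclass

@dataclass
class SyncHeader:
    media_type: int        # 00 video, 01 audio, 10 timed data, 11 adaptation
    header_fields: bytes   # CMH (+ MSH) for media; adaptation type otherwise

@dataclass
class SyncPacket:
    header: SyncHeader
    body: bytes            # variable-length media frame or adaptation frame

stream = [
    SyncPacket(SyncHeader(0b00, b"<CMH+MSH>"), b"<video frame 504>"),
    SyncPacket(SyncHeader(0b11, b"<AT>"), b"<adaptation frame 508>"),
    SyncPacket(SyncHeader(0b00, b"<CMH+MSH>"), b"<video frame 510>"),
]
print(len(stream), stream[1].header.media_type)  # 3 3
```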

Real-time stream configuration options

For streams carrying real-time service data, the stream configuration options can be configured as follows: 1) FASB_ALLOWED is "not selected"; 2) CHECKSUM_ACTIVE is "configurable"; and 3) STREAM_ENCRYPTION_ACTIVE is "configurable".

Media codec and framing layer interfaces

A real-time service may consist of streaming components of more than one type, such as video, audio and text used for commentary or closed captioning, possibly with streams in multiple languages and even many combinations thereof. Each streaming component can be transported in a separate stream, or multiple streaming components can be transported in a single stream.

Referring to FIG. 3, each type of content is encoded and formatted accordingly. Three types of streaming content are supported, although those skilled in the art will recognize that the concepts presented here can be extended to other types: video (e.g., H.264); audio (e.g., HE-AAC version 2); and/or timed data (e.g., 3GPP PSS timed text).

Adaptation frames 508 or 574 conveying metadata associated with a stream are considered a fourth content type.

The media codec interface in the network 302 delivers a sequence of media frames 504, 506, 510, 564, 570 and 580 to the sync layer 406. In the device 304, the sync layer 406 delivers a sequence of media frames (e.g., 504, 506 and 510) to the media codec. Media frames (e.g., 504, 506 and 510) may be aligned on byte boundaries when passing through the interface between the sync layer 406 and the media codec layer 404, in both the device 304 and the network 302.

The sync layer 406 in the network 302 appends a sync layer header (e.g., 502) to media frames (e.g., 504, 506 and 510) to create sync packets, interleaves them with sync packets conveying adaptation frames 508, and delivers the resulting sync packets to the framing layer 408 for transmission. Sync packets carrying video media frames can be transmitted in either the base layer modulation component or the enhancement layer modulation component, as specified by the media codec layer 404 for video. Other sync packets can be transmitted in the base layer component.

The sync layer 406 of the device 304 delivers media frames (e.g., 504, 506 and 510) to the media codec layer 404 in increasing order of frame ID within each superframe. Some additional restrictions apply to the delivery order of video media frames when there is more than one video media frame with the same frame ID.

The maximum size of a media frame (e.g., 504, 506 and 510) may not exceed PMAX_RT bytes, where PMAX_RT is a configurable FLO system parameter that can be set to support many different media frame sizes.

The following description defines, for each media type, the adaptation of the service packets provided by the media codec for transport through the sync layer 406, and the media-specific interaction of the sync layer 406 with the framing layer 408.

Video

Network media codec interface

Video frames can be created at any of the nominal rates specified in Table 8 below. The nominal frame rate can vary between superframes, for example because content from different sources may enter the network at different frame rates. For each superframe, the media codec layer 404 can indicate to the sync layer 406 the number of media frames it wishes to present to the user. Video frames consist of an integer number of bytes; therefore, byte alignment of media frames transporting a video frame is not required.

The video codec layer 404 may present video frames to the sync layer 406 in decoding order. The media codec layer 404 can provide the following metadata with each video frame to the sync layer 406: 1) the PTS and frame ID; 2) the frame rate associated with the frame, identifying the nominal rate at which frames are to be presented to the user; 3) whether the frame is a random access point (RAP), which the device 304 may use to acquire the video stream; 4) whether the frame is a reference frame; 5) whether the frame carries essential video information or additional video information; and/or 6) whether the frame is intended for transmission in the base layer component or the enhancement layer component. The criterion for determining whether video information is essential or additional is determined by the video codec layer 404.

The frame ID can be set to zero for the first frame in a superframe. It can either remain the same or increment for each subsequent video frame presented to the sync layer 406, up to a value equal to the number of media frames to be presented by the device 304.

Delivery of frames with the same frame ID through the interface is subject to a number of restrictions. The first restriction is that if the media codec layer 404 creates one or more RAP frames and one or more alternative frames with the same frame ID, it may present the RAP frame(s) to the sync layer 406 before the alternative frames. The second restriction is that if the media codec layer 404 creates two frames with the same frame ID that differ only in video quality level, the low-quality frame can be transmitted in the base layer component and the high-quality frame in the enhancement layer component.
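A sketch of these two restrictions follows. The record layout and the helper are assumptions; only the two rules, RAP frames precede alternative frames sharing their frame ID, and of two frames differing only in quality the low-quality one maps to the base layer and the high-quality one to the enhancement layer, come from the text:

```python
# A hedged sketch of the two network-side restrictions described above;
# the dict shapes and field names are illustrative.
def order_and_map(frames):
    # frames: list of dicts with 'frame_id', 'rap', 'quality' ('lo'/'hi')
    ordered = sorted(frames, key=lambda f: (f["frame_id"], not f["rap"]))
    for f in ordered:
        f["layer"] = "enhancement" if f["quality"] == "hi" else "base"
    return ordered

frames = [{"frame_id": 7, "rap": False, "quality": "hi"},
          {"frame_id": 7, "rap": True,  "quality": "lo"}]
print(order_and_map(frames))  # RAP first; lo -> base, hi -> enhancement
```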

Network framing layer interface

The sync layer 406 can group sync packets carrying video frames according to whether they belong to the base layer component or the enhancement layer component. Each group can be processed separately.

The sync layer 406 may present the sync packets of each group to the framing layer 408 in increasing order of frame ID. Two sync packets with the same frame ID in the same component can be presented to the framing layer 408 in the order in which they were received from the media codec layer 404.

Device framing layer interface

The device 304 can recover the sync packets transmitted in the base layer and the enhancement layer and, through joint processing, can also recover the order in which they are to be delivered through the media codec interface of the device.

Device media codec interface

The sync layer 406 of the device 304 may present video media frames (e.g., 504, 506 and 510) to the media codec layer 404 in decoding order, determined based on the frame ID, subject to additional recommendations (all or some of which may be omitted in alternative configurations). The first recommendation is that if the sync layer 406 detects a video media frame with the RAP flag set (a RAP frame) and one or more frames that are not RAP frames, all with the same frame ID, then one of two conditions is further evaluated. The first condition is that if the sync layer 406 has not acquired the video stream, it can deliver the RAP frame to the media codec interface (MCI) and may discard the frames that are not RAP frames. Under the second condition, otherwise, the sync layer 406 can discard the RAP frame and can deliver the frame(s) that are not RAP frames through the media codec interface (MCI), as required. The RAP frame may be a CSF.

The second recommendation is that if the sync layer 406 detects two video media frames with identical sync layer headers (SH), it can deliver the frame received in the enhancement layer to the media codec layer 404 and discard the frame received in the base layer.

The third recommendation is that if the sync layer 406 detects a video media frame with essential video information and a second video media frame with the same frame ID and additional video information, two further conditions are considered. Under the first condition of the third recommendation, if the media codec layer 404 cannot process the additional video information, the sync layer 406 may discard that video media frame and deliver the video media frame with the essential video information to the media codec layer 404. Under the second condition of the third recommendation, if the first condition does not apply, the sync layer 406 can deliver both video media frames to the media codec layer 404.

FIG. 16 shows a flowchart of a process 1600 of sync layer 406 processing for the media codec layer 404 in the device 304. The process 1600 begins at block 1602, where video media frames are presented by the sync layer 406 to the media codec layer 404 in decoding order based on the frame identification number. Block 1602 is followed by block 1604, where it is determined whether there are two adjacent frames with the same identification number, where one frame is a RAP frame (e.g., a CSF) and the other is not a RAP frame. If the determination is "NO", the process loops back from block 1604 to block 1602. However, if the determination is "YES", block 1604 is followed by block 1606, where it is determined whether the sync layer 406 has acquired the video stream. If "YES", block 1606 is followed by block 1608, where the RAP frame is discarded and the frames that are not RAP frames are delivered to the MCI as required. If "NO", block 1606 is followed by block 1610, where the RAP frame is delivered to the MCI and the frames that are not RAP frames are discarded.
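The core rule of process 1600 can be sketched as follows; the data shapes are illustrative stand-ins:

```python
# A minimal sketch of blocks 1604-1610: among adjacent frames sharing one
# frame ID, deliver the RAP frame (e.g., a CSF) only when the stream is not
# yet acquired; otherwise deliver the non-RAP frames and drop the RAP frame.
def select_frames(group, stream_acquired):
    raps = [f for f in group if f["rap"]]
    others = [f for f in group if not f["rap"]]
    if raps and others:                             # block 1604
        return others if stream_acquired else raps  # blocks 1608 / 1610
    return group

group = [{"id": "CSF", "rap": True}, {"id": "P2", "rap": False}]
print(select_frames(group, stream_acquired=False))  # deliver the CSF
print(select_frames(group, stream_acquired=True))   # deliver P2, drop CSF
```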

FIG. 17 shows an example bitstream 1700 created by the network 302 in which consecutive frames have the same identification numbers. The bitstream 1700 is similar to the bitstream shown and described in connection with FIG. 5A. As an example, media frame 1704 includes a P-frame (1) for the channel designated CH-CNN. Media frame 1704 includes a sync header (SH) 1702, which is similar to the previously described sync header (SH) 502.

In this example, assume that a CSF has been inserted to effect a channel change, for example to the channel CH-ESPN. The CSF is represented by media frame 1708 and includes a sync header (SH) 1706. The CSF is a RAP frame and has a CMH 1720 with a frame identification number. For illustrative purposes, an adaptation frame with its corresponding SH is shown after the CSF (media frame 1708). Media frame 1712 is designated as a frame that is not a RAP frame, and it is preceded by a sync header (SH) 1710. In the bitstream 1700, media frames 1708 and 1712 are adjacent. The CSF is intended to effect the channel switch, for example to the channel CH-ESPN. To effect the channel change, media frame 1712 is a P-frame (2) and has, in the CMH of its sync header 1710, a frame identification number coinciding with the frame identification number in the sync header (SH) 1706 of the CSF (media frame 1708).

Media frame 1712 is followed by media frame 1716, with sync header 1714. Media frame 1716 may be a B-frame. In output order, the B-frame precedes the P-frame. Thus, the B-frame is discarded or dropped (see FIG. 10).

Referring to FIG. 10, media frame 1704 must be set as a non-reference frame. Because a channel change occurs, frame 1704 may not serve as a reference frame for frames of the other channel. Media frame 1704 can be set as a non-reference frame or removed from the buffer. However, other means of preventing the media frame from being used as a reference frame may be employed.

Audio

Network media codec interface

Audio frames are created at a fixed rate according to the audio codec type. However, the audio frame rate may not be an integer multiple of the superframe rate. For each superframe, the media codec layer 404 can indicate to the sync layer 406 the number of media frames it wishes to present.

A frame ID can be associated with each audio frame presented to the sync layer 406. The frame ID can be assigned either by the media codec layer 404 or by the sync layer 406. The frame ID can be set to zero for the first audio frame in a superframe. The value can be incremented for each subsequent audio frame presented to the sync layer 406, up to a value equal to the number of media frames presented by the device 304.

The media codec layer 404 in the network 302 may present audio frames to the sync layer 406 in the order of their creation. Audio frames may consist of a non-integer number of bytes. The media codec layer 404 can achieve byte alignment by the means specified for the audio codec type.

The media codec layer 404 can provide metadata to the sync layer 406 with each audio frame. This metadata includes: the frame ID, if assigned by the media codec layer 404; an indication of whether the frame is a RAP frame; and an indication of whether the frame contains essential audio data or additional audio data. The criterion for determining whether audio information is essential or additional is determined by the media codec layer 404.

Network framing layer interface

Sync packets containing audio frames can be transmitted in the modulation component directed by the media codec layer 404. Audio frames received in each modulation component can be presented to the framing layer 408 in the order of their creation.

Device framing layer interface

The sync layer 406 of the device 304 may process sync packets in the order of their receipt through the framing layer interface.

Device media codec interface

The sync layer 406 of the device 304 may present audio frames to the media codec layer 404 in the order in which they are extracted from the sync packets.

Timed data content

Network media codec interface

Timed data frames are created at a variable rate. Typically, but not necessarily, there is one timed data frame per superframe in a timed data stream, as seen in FIG. 3.

A frame ID can be associated with each timed data frame presented to the sync layer 406. The frame ID can be assigned either by the media codec layer 404 or by the sync layer 406. The frame ID can be set to zero for the first timed data frame in a superframe. The value can be incremented for each subsequent timed data frame presented to the sync layer, up to a value equal to the number of media frames presented by the device.

The media codec layer 404 in the network may present timed data frames to the sync layer 406 in the order of their creation. Timed data frames can consist of a non-integer number of bytes. Byte alignment can be achieved by the means specified for the timed data type. The metadata, if any, provided by the media codec layer 404 to the sync layer 406 with each timed data frame depends on the data type.

Network framing layer interface

Sync packets containing timed data frames can be transmitted in the modulation component directed by the media codec layer 404. Timed data frames received in each modulation component can be presented to the framing layer in the order of their creation.

Device framing layer interface

The sync layer 406 of the device can process sync packets in the order of their receipt through the framing layer interface.

Device media codec interface

The sync layer 406 of the device may present timed data frames to the media codec layer 404 in the order in which they are extracted from the sync packets.

Sync layer acquisition

FIG. 6 shows an exemplary state machine 600 for sync layer 406 processing of an individual flow in the device according to certain configurations. The state machine 600 illustrates the transitions between states, as well as the processing performed in each state.

Acquiring state

The device 304 may enter the acquiring state 606 in any of the following cases: 1) acquisition of the FLO signal, denoted by reference 602; 2) receipt from the framing layer 408 of an indication of a flow ID change, denoted by reference 612; 3) loss of the FLO signal, denoted by reference 610, while in the acquired state 614; or 4) detection of a media frame with errors, also denoted by reference 610, while in the acquired state 614. Errors can be signaled by the framing layer 408 or detected by a cyclic redundancy check (CRC), if CRC processing is configured. In addition, a transition to the acquiring state 606 is possible upon receipt of a frame that is not a RAP frame (denoted by reference 604).

In the case of a video stream transmission, the device 304 may use information provided by the video sync layer directory, if available, to determine the nature of the media frames affected by the error. The device 304 may then be able to determine what error recovery procedures are possible without transitioning to the acquiring state.

Acquired state

Upon receipt of a RAP frame without errors (denoted by reference 608), the device 304 may transition to the acquired state 614. It remains in this state as long as no frame errors are detected (denoted by reference 616).

While in the acquired state 614, the device 304 can process media frames provided by the framing layer 408. Valid media frames can be delivered to the media codec layer 404.
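State machine 600 can be sketched as follows; the event names are illustrative, while the transitions, an error-free RAP frame moves the layer to the acquired state, and signal loss, a flow ID change, or a frame error returns it to the acquiring state, are those described above:

```python
# A minimal sketch of state machine 600 with illustrative event names.
ACQUIRING, ACQUIRED = "acquiring", "acquired"

def next_state(state, event):
    if state == ACQUIRING:
        # Only an error-free RAP frame completes acquisition (reference 608).
        return ACQUIRED if event == "rap_frame_ok" else ACQUIRING
    # state == ACQUIRED: references 610 and 612 return to acquiring.
    if event in ("signal_lost", "flow_id_changed", "frame_error"):
        return ACQUIRING
    return ACQUIRED

state = ACQUIRING
for event in ["non_rap_frame", "rap_frame_ok", "frame_ok", "frame_error"]:
    state = next_state(state, event)
    print(event, "->", state)
```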

Sync header (SH)

FIG. 11 shows the sync header generator 1110, which creates the sync header 1100. The sync header generator 1110 includes a media type generator 1130 that creates media type codes. The media type codes are inserted into the sync header 1100 format and include 00 for video in block 1132, 01 for audio in block 1134, 10 for timed data in block 1136, and 11 for adaptation in block 1140. The sync header generator 1110 also includes an additional fields generator 1150. The sync header generator 1110 also creates an adaptation type (AT), as shown in FIG. 12A, using the adaptation type generator 1160 shown in FIG. 12B.

The sync header 1100 consists of a media type field 1102 followed by additional fields 1104 whose format depends on the value of the media type field created by the media type generator 1130. The additional fields generator 1150 is shown in FIG. 12A.

The general format of the sync header 1100 is shown in Table 2. The tables list the field name, field type, and field presence. Field presence indicates whether the field is required, conditional, and so on. Field type indicates whether the field is an unsigned integer (UINT), variable, bits, and so on.

Table 2
General format of the sync layer header
Field name | Field type | Field presence
MEDIA_TYPE (1102) | UINT(2) | Required
Additional fields (1104) | Variable | Required

MEDIA_TYPE

The MEDIA_TYPE field 1102 identifies the type of media frame carried by the sync layer packet, or indicates that the sync layer packet carries an adaptation frame. The values defined for the MEDIA_TYPE field are listed in Table 3.

Table 3
Values defined for MEDIA_TYPE
Name | Value
VIDEO | 00
AUDIO | 01
TIMED_DATA | 10
ADAPTATION | 11
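
As an illustration, the 2-bit MEDIA_TYPE codes of Table 3 can be captured in a small lookup; this is a sketch, and the Python rendering below is only a convenient transcription of the table:

    # MEDIA_TYPE codes from Table 3 (2-bit field 1102 of the sync header).
    MEDIA_TYPE = {
        "VIDEO": 0b00,
        "AUDIO": 0b01,
        "TIMED_DATA": 0b10,
        "ADAPTATION": 0b11,
    }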

Additional fields

The accompanying figure shows the additional fields generator 1150 included in the sync header generator 1110. The additional fields generator 1150 includes a common media header assembler 1200 and a media-specific header assembler 1202.

The format of the additional fields 1104 depends on the value of the media type field 1102. The common media header assembler 1200 generates the common media header (CMH) in accordance with the elements of Table 4. The media-specific header assembler 1202 forms the media-specific header (MSH) according to the elements of Table 4. The general format of the additional header fields for a sync packet carrying media frames with video, audio, or timed data is shown in Table 4.

Table 4
General format of the additional sync layer header fields for media frames
Field name | Field type | Field presence
Common media header | Bit(22) | Required
Media-specific header | Variable | Conditional

The general format of the additional header fields for a sync packet carrying adaptation frames is shown in Table 5. The accompanying figure shows the adaptation type generator 1160, which is included in the sync header generator 1110.

Table 5
General format of the additional sync layer header fields for adaptation frames
Field name | Field type | Field presence
ADAPTATION_TYPE | UINT(6) | Required

Common media header

The accompanying figure shows the common media header assembler 1200. The CMH assembler 1200 includes a PTS generator 1302, a frame_id generator 1304, an information_level_flag generator 1306, and a RAP_flag generator 1308. The general format of the common media header (CMH) for sync layer packets carrying media frames is shown in Table 6. The common media header (CMH) provides a variety of information. The CMH information includes: 1) timestamp and media frame ID information; 2) random access points in the continuous data stream, which support rapid acquisition of audio, video, and timed-text streams; 3) indication of unreferenced frames, which allows media frames to be dropped without processing in certain circumstances (e.g., fast forward); and 4) a quality level indicator.

The general format of the common media header generated by the CMH assembler 1200 is shown in Table 6.

Table 6
General format of the common media header
Field name | Field type | Field presence
PTS | UINT(14) | Required
FRAME_ID | UINT(6) | Required
INFORMATION_LEVEL_FLAG | Bit(1) | Required
RAP_FLAG | Bit(1) | Required
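
For illustration, the 22-bit CMH of Table 6 could be packed as in the sketch below. The helper is hypothetical, and the bit ordering (PTS in the most significant bits) is an assumption made for the example rather than the normative wire format:

    def pack_cmh(pts_ms, frame_id, information_level_flag, rap_flag):
        """Pack the 22-bit common media header of Table 6:
        PTS(14) | FRAME_ID(6) | INFORMATION_LEVEL_FLAG(1) | RAP_FLAG(1)."""
        assert 0 <= pts_ms < (1 << 14) and 0 <= frame_id < (1 << 6)
        return ((pts_ms << 8) | (frame_id << 2)
                | ((information_level_flag & 1) << 1) | (rap_flag & 1))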

The individual CMH fields are defined below.

PTS

The PTS field is the presentation timestamp of the media frame and is created by the PTS generator 1302. This field is specified in milliseconds. The PTS field is added to the superframe time to obtain the actual presentation time of the media frame.

FRAME_ID

FRAME_ID is the number of the media frame within the superframe and is generated by the frame_id generator 1304. The number is set to 0 for the first media frame in the superframe and is incremented by one for each subsequent media frame that has a different PTS value.
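
A minimal sketch of this numbering rule follows; the function name and list-based interface are illustrative assumptions:

    def assign_frame_ids(pts_values):
        """Assign FRAME_ID values within one superframe: start at 0 and
        increment only when the PTS differs from the previous frame."""
        ids, current = [], 0
        for i, pts in enumerate(pts_values):
            if i > 0 and pts != pts_values[i - 1]:
                current += 1
            ids.append(current)
        return ids

    # Two frames with the same PTS (e.g., a CSF and the non-RAP frame it
    # accompanies) share a FRAME_ID:
    # assign_frame_ids([0, 33, 33, 66]) -> [0, 1, 1, 2]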

INFORMATION_LEVEL_FLAG

INFORMATION_LEVEL_FLAG is a bit indicating whether the media frame carries essential information for the current media frame or additional information that can be combined with the essential information. INFORMATION_LEVEL_FLAG is created by the information_level_flag generator 1306 according to the following conditions. If the media frame carries essential information (condition 1), INFORMATION_LEVEL_FLAG may be set to 0. If the media frame carries additional information (condition 2), INFORMATION_LEVEL_FLAG may be set to 1. If the media codec does not support an additional information level (condition 3), INFORMATION_LEVEL_FLAG may be set to 0 and the field may be ignored by the device.

RAP_FLAG

The RAP_FLAG indicates whether the media frame is a random access point; it is created by the RAP_flag generator 1308. The device 304 may use the RAP_FLAG during reacquisition or channel switching to determine whether it can begin accessing the media stream at the given media frame. The RAP_flag generator 1308 creates the RAP_FLAG according to the following conditions. If (condition 1) MEDIA_TYPE is set to VIDEO or AUDIO and the media frame is a random access point, RAP_FLAG may be set to 1. If (condition 2) MEDIA_TYPE is set to VIDEO or AUDIO and the media frame is not a random access point, RAP_FLAG may be set to 0. If (condition 3) MEDIA_TYPE is set to TIMED_DATA, RAP_FLAG may be set to 1 in all media frames.
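
The conditions for both CMH flags reduce to short predicates, as in this illustrative sketch (the function names are assumptions):

    def information_level_flag(carries_essential, codec_supports_additional):
        """INFORMATION_LEVEL_FLAG per conditions 1-3 above."""
        if not codec_supports_additional:
            return 0  # condition 3: field set to 0 and ignored
        return 0 if carries_essential else 1  # conditions 1 and 2

    def rap_flag(media_type, is_random_access_point):
        """RAP_FLAG per conditions 1-3 above."""
        if media_type == "TIMED_DATA":
            return 1  # condition 3: always 1 for timed data
        return 1 if is_random_access_point else 0  # conditions 1 and 2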

Media-specific headers

The accompanying figure shows the media-specific header assembler 1202. The media-specific header (MSH) assembler 1202 generates the media-specific header formats for sync layer packets carrying media frames, in accordance with the media type. The media types are audio, video, timed data, and adaptation. For the video media type, the MSH assembler 1202 includes a frame rate generator 1322, an unreferenced_frame_flag generator 1324, and a reserved field generator 1326.

Video

The media-specific header (MSH) for a sync layer packet carrying video media frames is the video media header. The format of the video media header is specified in Table 7.

Table 7
Video media header
Field name | Field type | Field presence
FRAME_RATE | UINT(3) | Required
UNREFERENCED_FRAME_FLAG | Bit(1) | Required
RESERVED | UINT(4) | Required

The individual fields of the video media header are defined below.

FRAME_RATE

The FRAME_RATE field is the rate at which frames are generated by the network; it is produced by the frame rate generator 1322 in accordance with the values in Table 8. The values defined for FRAME_RATE are shown in Table 8.

Table 8
Values defined for FRAME_RATE
Frame rate (fps) | Value
24000/1001 (23.976) | 000
24 | 001
25 | 010
30000/1001 (29.97) | 011
30 | 100
50 | 101
60000/1001 (59.94) | 110
60 | 111

The FRAME_RATE value is the nominal frame rate (in frames per second) when the full stream is received. For example, if the stream is sent using both a base layer and an enhancement layer, FRAME_RATE is the rate after both data streams are completely decoded. The actual display rate may differ. For example, a device receiving only the base layer of a transmission may display frames at a lower rate.
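
The mapping of Table 8 can be expressed as a lookup from the 3-bit code to the nominal rate; this sketch is purely illustrative:

    # 3-bit FRAME_RATE codes of Table 8 mapped to nominal rates in fps.
    FRAME_RATE_FPS = {
        0b000: 24000 / 1001,  # 23.976
        0b001: 24.0,
        0b010: 25.0,
        0b011: 30000 / 1001,  # 29.97
        0b100: 30.0,
        0b101: 50.0,
        0b110: 60000 / 1001,  # 59.94
        0b111: 60.0,
    }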

UNREFERENCED_FRAME_FLAG

The UNREFERENCED_FRAME_FLAG is a bit indicating whether the media frame is used as a reference for the reconstruction of other media frames; it is generated by the unreferenced_frame_flag generator 1324 based on the following conditions. If the media frame is a reference frame (condition 1), UNREFERENCED_FRAME_FLAG may be set to 0. If the media frame is not a reference frame (condition 2), UNREFERENCED_FRAME_FLAG may be set to 1.

RESERVED

The RESERVED bits may be set to 0; they are produced by the reserved field generator 1326 as needed.

Audio

The media-specific header assembler 1202 does not create a media-specific header for sync layer packets carrying audio media frames. However, the media-specific header assembler 1202 may be modified to provide such an MSH for audio.

Timed data

The media-specific header assembler 1202 includes a timed_data_type generator 1332. The media-specific header for a sync layer packet carrying timed-data media frames is the timed data media header. The format of the timed data media header created by the timed_data_type generator 1332 is shown in Table 9.

Table 9
Format of the timed data media header
Field name | Field type | Field presence
TIMED_DATA_TYPE | UINT(8) | Required

TIMED_DATA_TYPE

The TIMED_DATA_TYPE field identifies the specific type of data in a TIMED_DATA media frame; it is generated by the timed_data_type generator 1332. The values defined for TIMED_DATA_TYPE are specified in Table 10.

Table 10
Values defined for TIMED_DATA_TYPE
Name | Value
CHARACTER_TEXT | 0
Values 1 to 255 | Reserved

ADAPTATION_TYPE

The accompanying figure shows the adaptation type generator 1160, which is included in the sync header generator 1110. The adaptation type generator 1160 includes a video_sync_layer directory assembler 1220. The ADAPTATION_TYPE field specifies the type of adaptation data in the adaptation frame. The values defined for the ADAPTATION_TYPE field are given in Table 11.

Table 11
Values defined for ADAPTATION_TYPE
Name | Value
video_sync_layer_directory | 1
All other values | Reserved

Adaptation frames

The structure of the body of an adaptation frame (e.g., 508) depends on the adaptation type. The adaptation frame body for each adaptation type specified in Table 11 is described below.

Video sync layer (VSL) directory

The video_sync_layer directory assembler 1220 creates the video sync layer directory, which is an optional adaptation frame and can be used by the sync layer 406 in the device to assist the error recovery performed by the video codec. For example, it allows the sync layer 406 to determine whether a lost or corrupted frame was intended as a reference frame. This information allows the codec to decide whether to process subsequent frames up to the next reference frame or to drop them.

The video_sync_layer directory assembler 1220, shown in the accompanying figure, includes a VSL_records module 1402, a RAP_flag_bits module 1412, a U_frame_flag_bits module 1422, and a reserved bits module 1432, which together create the video_sync_layer directory. The video_sync_layer directory, when present, can be transported as a sync layer adaptation frame in the base layer component of the flow transporting the video to which it applies. The directory should be transmitted at least once per superframe. The format of the video_sync_layer directory is specified in Table 12.

Table 12
Video sync layer (VSL) directory
Field name | Field type | Field presence
VSL_RECORDS | VSL_RECORD_TYPE | Required
RAP_FLAG_BITS | BIT(60) | Required
U_FRAME_FLAG_BITS | BIT(60) | Required
RESERVED | BIT(variable) | Conditional
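
Under the field widths of Table 12, the directory can be assembled as in the following sketch. It assumes, purely for illustration, that records and flag fields are handled as strings of '0'/'1' characters and that the RESERVED field is the minimal zero padding to a byte boundary:

    def assemble_vsl_directory(record_bits, rap_flag_bits, u_frame_flag_bits):
        """Concatenate VSL_RECORDS, RAP_FLAG_BITS(60) and U_FRAME_FLAG_BITS(60)
        per Table 12, then zero-pad (RESERVED) to a byte boundary."""
        assert len(rap_flag_bits) == 60 and len(u_frame_flag_bits) == 60
        bits = record_bits + rap_flag_bits + u_frame_flag_bits
        bits += "0" * (-len(bits) % 8)  # RESERVED: minimal byte alignment
        return int(bits, 2).to_bytes(len(bits) // 8, "big")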

VSL_RECORD

The accompanying figure shows the VSL records module 1402. The VSL records module 1402 includes a more_VSL_records module 1502, a frame_rate module 1504, a num_frames module 1506, a first_frame_PTS module 1508, and a last_frame_PTS module 1510.

The more_VSL_records module 1502 can create and output one or more VSL_RECORD entries for the directory. The VSL_RECORD format is specified in Table 13.

Table 13
VSL_RECORD format
Field name | Field type | Field presence
MORE_VSL_RECORDS | Bit(1) | Required
FRAME_RATE | UINT(3) | Required
NUM_FRAMES | UINT(6) | Required
FIRST_FRAME_PTS | UINT(14) | Required
LAST_FRAME_PTS | UINT(14) | Required

MORE_VSL_RECORDS

The more_VSL_records module 1502 creates the MORE_VSL_RECORDS flag, which may be set to 0 if the current VSL_RECORD is the last one in the video sync layer directory.

The more_VSL_records module 1502 creates the MORE_VSL_RECORDS flag, which may be set to 1 if the current VSL_RECORD is not the last one in the video sync layer directory.

The number of VSL_RECORD entries in the video sync layer directory can be one more than the number of changes in the nominal video frame rate within the superframe.
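
For example, this rule can be stated as a one-line check (an illustrative sketch, with the nominal rate listed per frame):

    def expected_vsl_record_count(nominal_rates_per_frame):
        """One more VSL_RECORD than the number of nominal frame rate changes."""
        changes = sum(1 for a, b in zip(nominal_rates_per_frame,
                                        nominal_rates_per_frame[1:]) if a != b)
        return changes + 1

    # expected_vsl_record_count([30, 30, 30, 60, 60]) -> 2 records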

FRAME_RATE

The frame_rate module 1504 creates and forms the FRAME_RATE field, which provides the frame rate information applicable to the VSL_RECORD. Table 8 lists the values defined for the FRAME_RATE field.

NUM_FRAMES

The num_frames module 1506 creates the NUM_FRAMES field, which specifies the number of video media frames with distinct frame ID values, at the frame rate given by the FRAME_RATE field, in the block of consecutive video media frames starting at FIRST_FRAME_PTS within the superframe.

FIRST_FRAME_PTS

The first_frame_PTS module 1508 creates the FIRST_FRAME_PTS timestamp, which is the PTS of the first video media frame in the block of consecutive video media frames at the frame rate specified by FRAME_RATE.

LAST_FRAME_PTS

The last_frame_PTS module 1510 forms LAST_FRAME_PTS, which is the PTS of the last video media frame in the block of consecutive video media frames at the frame rate specified by FRAME_RATE, starting at FIRST_FRAME_PTS.
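
Putting the five fields together, a 38-bit VSL_RECORD per Table 13 might be serialized as below; the field order follows Table 13, while the helper itself and the bit-string representation are hypothetical:

    def pack_vsl_record(more, frame_rate_code, num_frames,
                        first_frame_pts, last_frame_pts):
        """Serialize one VSL_RECORD (Table 13) as a '0'/'1' string:
        MORE(1) | FRAME_RATE(3) | NUM_FRAMES(6) | FIRST_PTS(14) | LAST_PTS(14)."""
        fields = [(more, 1), (frame_rate_code, 3), (num_frames, 6),
                  (first_frame_pts, 14), (last_frame_pts, 14)]
        return "".join(format(value, f"0{width}b") for value, width in fields)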

RAP_FLAG_BITS

The RAP_flag_bits module 1412 creates the RAP_FLAG_BITS. The video sync layer directory contains 60 RAP_FLAG_BITS, corresponding to the maximum of 60 video media frames per superframe. Each bit of the RAP_FLAG_BITS field corresponds to a particular video media frame, identified by frame ID, up to a value equal to the number of individual video frames in the superframe. The least significant bit corresponds to the first video media frame covered by the first VSL_RECORD. The RAP_FLAG_BITS covered by the first VSL_RECORD are followed by the RAP_FLAG_BITS covered by the second and subsequent VSL_RECORD entries, if any, in the order of their transmission.

Each bit in the RAP_FLAG_BITS field of the video sync layer directory can be set to 1 if the corresponding video media frame is a random access point and is not accompanied by a non-RAP frame with the same frame ID. Otherwise, the bit is set to 0. The bits following the bit in RAP_FLAG_BITS that corresponds to the last transmitted video media frame of the superframe can be set to 0.

U_FRAME_FLAG_BITS

The U_frame_flag_bits module 1422 creates the 60 U_FRAME_FLAG_BITS, corresponding to the maximum of 60 video media frames per superframe. Each bit of the U_FRAME_FLAG_BITS field corresponds to a particular video media frame, identified by frame ID, up to a value equal to the number of individual video media frames in the superframe. The least significant bit corresponds to the first video media frame covered by the first VSL_RECORD. The U_FRAME_FLAG_BITS covered by the first VSL_RECORD are followed by the U_FRAME_FLAG_BITS covered by the second and subsequent VSL_RECORD entries, if any, in the order of their transmission.

Each bit in the U_FRAME_FLAG_BITS field of the video sync layer directory can be set to 1 if the corresponding video frame is not a reference frame. Otherwise, the bit is set to 0. The bits following the bit in U_FRAME_FLAG_BITS that corresponds to the last transmitted frame of the superframe can be set to 0.
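
Both 60-bit fields can be built with the same helper, setting bit i (least significant bit first) for frame ID i and leaving trailing bits at 0, as the rules above require; the sketch below is illustrative:

    def build_flag_bits(flags_by_frame_id, total_bits=60):
        """Build RAP_FLAG_BITS or U_FRAME_FLAG_BITS as a '0'/'1' string.
        flags_by_frame_id[i] is the flag for frame ID i; the least
        significant bit maps to the first video media frame, and unused
        trailing bits remain 0."""
        value = 0
        for frame_id, flag in enumerate(flags_by_frame_id):
            if flag:
                value |= 1 << frame_id
        return format(value, f"0{total_bits}b")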

RESERVED

The U_FRAME_FLAG_BITS field is followed by the minimum number of RESERVED bits, created by the reserved bits module 1432, required to align the end of the video sync layer directory to a byte boundary. The network may set the reserved bits in the video sync layer directory to 0.

Those skilled in the art will appreciate that information and signals may be represented using any of a variety of different technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.

Those skilled in the art will further appreciate that the various illustrative logical blocks, modules, and algorithm steps described in connection with the examples disclosed herein may be implemented as electronic hardware, firmware, computer software, middleware, microcode, or any combination thereof. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosure.

The various illustrative logical blocks, components, modules, and circuits described in connection with the examples disclosed herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.

The blocks of a method or algorithm described in connection with the examples disclosed herein may be embodied directly in hardware, in one or more software modules executed by one or more processing elements, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a CD-ROM, or any other form or combination of storage media known in the art. An illustrative storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC). The ASIC may reside in a wireless modem. In the alternative, the processor and the storage medium may reside as discrete components in a wireless modem.

In one or more exemplary configurations, the functions described herein may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over a computer-readable medium as one or more instructions or code. Computer-readable media include both computer storage media and communication media, including any medium that facilitates transfer of a computer program from one place to another. A storage medium may be any available medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above should also be included within the scope of computer-readable media.

The preceding description of the disclosed examples is provided to enable any person skilled in the art to make or use the disclosed methods and apparatus. Various modifications to these examples will be readily apparent to those skilled in the art, and the principles defined herein may be applied to other examples, and additional elements may be added.

1. A device for encoding video data, comprising a processor operative to generate a channel switch frame (CSF) from one or more network abstraction layer (NAL) units to enable random access points in a coded bitstream, wherein one of the NAL units comprises a low-quality instantaneous decoding refresh (IDR) NAL unit with a non-zero frame identification number.

2. The device according to claim 1, wherein the IDR NAL unit contains a non-zero picture order count (POC) value.

3. The device according to claim 1, wherein the non-zero frame identification number of the IDR NAL unit is equal to the frame number of a corresponding P-slice.

4. The device according to claim 1, wherein the processor is operative to generate the CSF with at least two additional NAL units, the at least two additional NAL units comprising a sequence parameter set (SPS) NAL unit and a picture parameter set (PPS) NAL unit.

5. The device according to claim 1, wherein the processor is operative to generate the CSF with a single NAL unit, the single NAL unit comprising the low-quality instantaneous decoding refresh (IDR) NAL unit.

6. The device according to claim 1, wherein the CSF includes an I-frame NAL unit.

7. The device according to claim 1, further comprising a transmitter for broadcasting the CSF as a random access point (RAP) frame.

8. The device according to claim 1, wherein the CSF is operative to cause switching from one channel to another channel or to facilitate error recovery.

9. An integrated circuit for encoding video data, comprising a processor operative to generate a channel switch frame (CSF) from one or more network abstraction layer (NAL) units to enable random access points in a coded bitstream, wherein one of the NAL units comprises a low-quality instantaneous decoding refresh (IDR) NAL unit with a non-zero frame identification number.

10. The integrated circuit according to claim 9, wherein the IDR NAL unit contains a non-zero picture order count (POC) value.

11. The integrated circuit according to claim 9, wherein the non-zero frame identification number of the IDR NAL unit is equal to the frame number of a corresponding P-slice.

12. The integrated circuit according to claim 9, wherein the processor is operative to generate the CSF with at least two additional NAL units, the at least two additional NAL units comprising a sequence parameter set (SPS) NAL unit and a picture parameter set (PPS) NAL unit.

13. The integrated circuit according to claim 9, wherein the processor is operative to generate the CSF with a single NAL unit, the single NAL unit comprising the low-quality instantaneous decoding refresh (IDR) NAL unit.

14. The integrated circuit according to claim 9, wherein the CSF includes an I-frame NAL unit.

15. The integrated circuit according to claim 9, further comprising a transmitter for broadcasting the CSF as a random access point (RAP) frame.

16. The integrated circuit according to claim 9, wherein the CSF is operative to cause switching from one channel to another channel or to facilitate error recovery.

17. A machine-readable medium comprising machine-executable instructions which, when executed, cause a computer to generate a channel switch frame (CSF) from one or more network abstraction layer (NAL) units to enable random access points in a coded bitstream and to cause switching from one channel to another channel or to facilitate error recovery, wherein one of the NAL units comprises a low-quality instantaneous decoding refresh (IDR) NAL unit with a non-zero frame identification number.

18. The machine-readable medium according to claim 17, wherein the IDR NAL unit contains a non-zero picture order count (POC) value.

19. The machine-readable medium according to claim 17, wherein the non-zero frame identification number of the IDR NAL unit is equal to the frame number of a corresponding P-slice.

20. The machine-readable medium according to claim 17, wherein the instructions to generate the CSF include instructions to generate the CSF with at least two additional NAL units, the at least two additional NAL units comprising a sequence parameter set (SPS) NAL unit and a picture parameter set (PPS) NAL unit.

21. The machine-readable medium according to claim 17, wherein the instructions to generate the CSF include instructions to generate the CSF with a single NAL unit, the single NAL unit comprising the low-quality instantaneous decoding refresh (IDR) NAL unit.

22. The machine-readable medium according to claim 17, wherein the instructions to generate the CSF include instructions to generate an I-frame NAL unit.

23. A device for decoding video data, comprising a processor operative to decode one or more adjacent video media frames from a sequence of consecutive video media frames, wherein each of the adjacent video media frames has the same frame identification (ID) number, which identifies the relative location of the one or more adjacent video media frames in the sequence of consecutive video media frames, and wherein the first frame of the adjacent video media frames is a random access point (RAP) frame and the second frame of the adjacent video media frames is a non-RAP frame.

24. The device according to claim 23, wherein the processor is operative to decode only one of the adjacent video media frames having the same frame ID number.

25. The device according to claim 23, wherein the processor is operative to decode the RAP frame and to drop frames that precede the RAP frame in output order.

26. The device according to claim 25, wherein, to decode the RAP frame, the processor is operative to decode a channel switch frame comprising one or more network abstraction layer (NAL) units.

27. A machine-readable medium comprising machine-executable instructions which, when executed, cause a computer to decode one or more adjacent video media frames from a sequence of consecutive video media frames, wherein each of the adjacent video media frames has the same frame identification (ID) number, which identifies the relative location of the one or more adjacent video media frames in the sequence of consecutive video media frames, and wherein the first frame of the adjacent video media frames is a random access point (RAP) frame and the second frame of the adjacent video media frames is a non-RAP frame.

28. The machine-readable medium according to claim 27, wherein the instructions for decoding include instructions to decode only one of two adjacent video media frames having the same frame ID number.

29. The machine-readable medium according to claim 27, wherein the instructions for decoding include instructions to decode the RAP frame and to drop frames that precede the RAP frame in output order.

30. The machine-readable medium according to claim 29, wherein the instructions for decoding the RAP frame include instructions to decode a channel switch frame comprising one or more network abstraction layer (NAL) units.

31. A method of decoding video data, comprising the step of decoding one or more adjacent video media frames from a sequence of consecutive video media frames, wherein each of the adjacent video media frames has the same frame identification (ID) number, which identifies the relative location of the one or more adjacent video media frames in the sequence of consecutive video media frames, and wherein the first frame of the adjacent video media frames is a random access point (RAP) frame and the second frame of the adjacent video media frames is a non-RAP frame.

32. The method according to claim 31, wherein the decoding step includes decoding only one of two adjacent video media frames having the same frame ID number.

33. The method according to claim 31, further comprising the step of providing both the first frame and the second frame of the adjacent video media frames to a decoder.

34. The method according to claim 31, wherein the decoding step includes decoding the RAP frame and dropping frames that precede the RAP frame in output order.

35. The method according to claim 34, wherein decoding the RAP frame includes decoding a channel switch frame comprising one or more network abstraction layer (NAL) units.

36. A method of encoding video data, comprising the step of encoding one or more adjacent frames, each of which has the same frame identification (ID) number, wherein the first frame of the adjacent frames is a random access point (RAP) frame and the second frame is a non-RAP frame.

37. A system for encoding and decoding video data, comprising: an encoder operative to encode one or more adjacent video media frames from a sequence of consecutive video media frames, wherein each of the adjacent video media frames has the same frame identification (ID) number, which identifies the relative location of the one or more adjacent video media frames in the sequence of consecutive video media frames, and wherein the first frame of the adjacent video media frames is a random access point (RAP) frame and the second frame of the adjacent video media frames is a non-RAP frame; and a decoder operative to decode the one or more adjacent video media frames to change channels or to facilitate error recovery.

38. A system for encoding and decoding video data, comprising: an encoder operative to generate and broadcast a channel switch frame (CSF) from one or more network abstraction layer (NAL) units to enable random access points in a coded bitstream, wherein one of the NAL units is a low-quality instantaneous decoding refresh (IDR) NAL unit with a non-zero frame identification number; and a decoder operative to decode the CSF to cause switching from one channel to another channel or to facilitate error recovery.

39. A method of encoding video data, comprising the step of generating a channel switch frame (CSF) from one or more network abstraction layer (NAL) units to enable random access points in a coded bitstream, wherein one of the NAL units is a low-quality instantaneous decoding refresh (IDR) NAL unit with a non-zero frame identification number.

40. The method according to claim 39, wherein the IDR NAL unit contains a non-zero picture order count (POC) value.

41. The method according to claim 39, wherein the generating step further includes generating the CSF with at least two additional NAL units, the at least two additional NAL units comprising a sequence parameter set (SPS) NAL unit and a picture parameter set (PPS) NAL unit.

42. The method according to claim 39, wherein the generating step includes generating the CSF with a single NAL unit, the single NAL unit comprising the low-quality instantaneous decoding refresh (IDR) NAL unit.

43. The method according to claim 39, wherein the generating step includes generating the CSF with an I-frame NAL unit.

44. The method according to claim 39, further comprising the step of broadcasting the CSF as a random access point (RAP) frame to effect channel switching.

45. The method according to claim 44, further comprising receiving the CSF and switching from one channel to another channel in response to the received CSF.

46. The method according to claim 44, further comprising receiving the CSF and recovering from errors in the coded bitstream in response to the received CSF.



 
