RussianPatents.com

Video data filtration with help of multiple filters

IPC classes for Russian patent Video data filtration with help of multiple filters (RU 2521081):
Other patents in the same IPC classes:
Method and system for wireless transmission of audio data in wireless networks / 2519817
Invention relates to wireless communication and particularly to transmission of audio data in wireless networks which transmit digital video and digital audio in high-definition multimedia interface (HDMI) format. The method involves obtaining position information of audio packets within an HDMI frame; transmitting digital audio information including the position information from a data source device to a data sink device via a wireless communication medium. At the data sink device, an HDMI frame is reconstructed by inserting received audio packets into horizontal and vertical blanking periods of the HDMI frame.
Image display device, remote controller and control method thereof / 2519599
Invention relates to an image display device, a remote controller and a control method thereof. Disclosed is a method of controlling an image display device which includes: displaying an infrared (IR) blaster menu screen, receiving a selection input for selecting one of electronic devices included in the IR blaster menu screen, and transmitting key IR format information about the selected electronic device or device information about the selected electronic device to a remote controller in accordance with the selection input.
Motion vector predictive encoding method, motion vector predictive decoding method, moving picture encoding apparatus, moving picture decoding apparatus, and programmes thereof / 2519526
Invention relates to computer engineering. A motion vector predictive encoding method in a moving picture encoding scheme involves performing a motion search for a block to be encoded, using an encoded reference picture; setting a plurality of blocks located in predetermined positions relative to the position of the block to be encoded as primary candidate blocks, and determining N primary candidate reference motion vectors from motion vectors used in encoding the primary candidate blocks; calculating degrees of reliability of the primary candidate reference motion vectors using encoded picture information for each of the primary candidate reference motion vectors; selecting upper M primary candidate reference motion vectors having higher degrees of reliability from the N primary candidate reference motion vectors as secondary candidate reference motion vectors; and calculating a predictive motion vector of the block using the secondary candidate reference motion vectors, and encoding a residual between the motion vector and the predictive motion vector.
Motion vector predictive encoding method, motion vector predictive decoding method, moving picture encoding apparatus, moving picture decoding apparatus, and programmes thereof / 2519525
Invention relates to predictive encoding/decoding of motion vectors of moving pictures. The moving picture encoding apparatus includes a primary candidate reference motion vector determination unit which sets N primary candidate reference motion vectors; a degree of reliability calculation unit which calculates the reliability of each primary candidate reference motion vector, quantitatively representing its effectiveness in motion vector prediction of the block to be decoded, using encoded or decoded picture information; a reference motion vector determination unit which selects M (M<N) secondary candidate reference motion vectors in accordance with the degrees of reliability of the N primary candidate reference motion vectors; and a motion vector encoding unit which calculates a predictive motion vector of the block to be encoded using the M secondary candidate reference motion vectors with high reliability.
Method for alphabetical image representation / 2519445
Method for alphabetical representation of images includes a step for primary conversion of an input image to a multi-centre scanning (MCS) format, constructed according to the rules of a plane-filling curve (PFC). The initial MCS cell is a discrete square consisting of nine cells (3×3=9), having its own centre and its own four faces (sides). Scanning of the initial MCS cell is performed from the centre to the edge of the square while bypassing the other cells in a circle. The path with a bypass direction to the left from the centre of the square and then around the circle, clockwise, is the priority path for scanning and displaying images.
Method of controlling access to set of channels for receiving or decoding device (versions) / 2519395
Invention relates to computer engineering. A method of controlling access to a set of channels using a receiver/decoder comprising a security module (SC), each channel being encrypted by a specific channel control word (CW1, CW2), each channel having a channel identifier and transmitting access control messages ECM containing at least the current channel control word and the channel access conditions. The method comprises the following steps: tuning to a first channel having a first channel identifier (ID1); transmitting the ID1 to the SC; receiving first access control messages ECM1 containing a first control word (CW1); transmitting the first access control messages ECM1 to the SC; decrypting the first access control messages ECM1 and verifying the channel access conditions; if the access conditions are met; transmitting the CW1 to the receiver/decoder; storing of the CW1 and the ID1 in the SC; tuning to a second channel having a second channel identifier ID2; transmitting the ID2 to the SC; calculating, by the SC, the second control word (CW2) by performing the following steps: calculating a root control word (RK) with an inverse cryptographic function F-1 using the CW1 and the ID1; calculating the CW2 with the cryptographic function F using the RK and the ID2; transmitting the CW2 to the receiver/decoder.
Video encoding method and apparatus, and video decoding method and apparatus / 2519280
Invention relates to video encoding and decoding. Disclosed is a video encoding method which comprises the steps of: dividing a current picture into at least one maximum coding unit; determining a coded depth, at which the final encoding result is to be output, for at least one division region obtained by dividing the region of the maximum coding unit according to depths, by encoding the at least one division region based on a depth which increases in proportion to the number of times the region of the maximum coding unit is divided; and outputting image data constituting the final encoding result for the at least one division region, together with encoding information on the coded depth and a prediction mode for the at least one maximum coding unit.
Method of encoding video and apparatus for encoding video based on coding units defined according to tree structure, and method of decoding video and apparatus for decoding video based on coding units defined according to tree structure / 2518996
Method of encoding video comprises the steps of: breaking down a video image into one or more maximum coding units; encoding the image based on coding units according to depths obtained via hierarchical breakdown of each of the one or more maximum coding units; and outputting the data encoded based on the coding units having a tree structure, information on the coded depths and the encoding mode, and information on the structure of the coding units indicating the size and variable depth of the coding units.
Method and apparatus for encoding and decoding image and method and apparatus for decoding image using adaptive coefficient scan order / 2518935
Method of encoding an image using an adaptive coefficient scan order comprises: projecting coefficients of a current block to a reference axis, from among a horizontal axis and a vertical axis, along a first straight line perpendicular to a second straight line with a predetermined angle α from the reference axis; scanning the coefficients of the current block in an arrangement order of the projected coefficients projected to the reference axis; and entropy-encoding information about the predetermined angle α and the scanned coefficients.
Hypothetical reference decoder for scalable video coding / 2518904
Invention relates to a hypothetical reference decoder (HRD) for scalable video coding (SVC). The invention proposes to modify the H.264/AVC HRD standard for use with the SVC of advanced video coding (AVC). That implementation defines HRD constraints for each interoperability point of SVC. The changes for spatial, temporal and SNR scalability are shown. There are also changes to the related HRD parameters that are shown. At least one implementation proposes the SVC-HRD rules as modifications to the AVC-HRD rules. A user may use the proposed SVC-HRD rules to build an SVC-HRD and test a bitstream for SVC compliance.
Method and apparatus for generating recommendation for content element / 2420908
Like or dislike of a content element played on a personalised content channel is determined based on feedback from the user; the profile is updated based on the determined like or dislike, wherein that profile is associated with the personalised content channel and contains a plurality of attributes and attribute values associated with said content element, where during the update, if a like has been determined, a classification flag associated with each of said attributes and attribute values is set; the degree of liking is determined for at least one next content element based on said profile; and that at least one next content element is selected for playing on the personalised content channel based on the calculated degree of liking.
Method to grant license to client device corresponding to coded content and system of conversion to manage digital rights, applying this method / 2421806
Method of a conversion system operation to manage digital rights to grant a license to a client's device corresponding to coded content consists in the following. The first content of the first type of digital rights content and the first license corresponding to the first content are converted to manage digital rights in order to generate the second content of the second type of digital rights content and the second license corresponding to the second content. A license request is received, corresponding to the second content distributed by means of superdistribution to a third party. The second license corresponding to the second content distributed by means of superdistribution is requested from a server corresponding to the second management of digital rights. The second license corresponding to the second content distributed by means of superdistribution is received and sent to a third party.
Server device, method of license distribution and content receiving device / 2447585
In response to a request from an IPTV (Internet Protocol television) client terminal for a licence to play back encrypted content, the network server of the television service sets, in a random manner, the time at which the main licence may be requested, within a period starting at the broadcast time and ending at a preset time. It transmits to the IPTV client terminal information about the main-licence request time, together with a temporary licence comprising a temporary content key valid for playback of the broadcast content from the start of the broadcast until the preset time. The licence server transmits the main licence, including the main content key for full playback of the content, in response to a main-licence request made by the IPTV client terminal based on the request-time information.
Connecting devices to multimedia sharing service / 2449353
Multimedia content purchasing system comprising: a memory area associated with a multimedia service; a multimedia server connected to the multimedia service via a data communication network; a portable computing device associated with a user; and a processor associated with the portable computing device, said processor being configured to execute computer-executable instructions for: establishing a connection to the multimedia server when the multimedia server and the portable computing device are within a predefined proximity; authenticating the multimedia server and the user with respect to the authenticated multimedia server; transmitting digital content distribution criteria; receiving, in response, promotional copies of one or more of the multimedia content items and associated metadata; and purchasing, when the multimedia server and the portable computing device are outside the predefined proximity, at least one of said one or more multimedia content items.
Device and method to process and read file having storage of media data and storage of metadata / 2459378
Device (600) for processing data packets (110; 112) stored in a media data container (104) and related meta information stored in a meta data container (106), the related meta information including transport timing information and location information indicating where the stored data packets are located in the media data container (104). The device comprises a processor (602) for deriving, based on the stored data packets (110; 112) and the stored related meta information (124; 128), decoding information (604; 704) for the media payload of the stored data packets (110; 112), the decoding information (604; 704) indicating at which moment in time which payload of the stored data packets is to be reproduced.
Integrated interface device and method of controlling integrated interface device / 2465740
Provided is an integrated interface device for performing a hierarchical operation for specifying a desired content list. The interface device has a function to display a content list, content specified by the content list, or the like by efficiently using a vacant area in the lower part of the display by displaying icons which display a hierarchical relationship, for example, "display in a row", in the upper part of the screen, thereby freeing a large space in the lower part of the display.
Method and system to generate recommendation for at least one additional element of content / 2475995
A personalised content channel makes it possible to play multiple elements of content (programmes) meeting multiple selection criteria. At least one additional element of content, which meets fewer of the criteria, is recommended by a recommendation mechanism (107). In one embodiment, at least one recommended additional element of content is selected, and the multiple selection criteria are adjusted by a planner (109) on the basis of at least one characteristic of the selected recommended additional element of content.
Wireless transmission system, relay device, wireless recipient device and wireless source device / 2480943
Wireless transmission system includes: a device (1) which wirelessly transmits AV content and a plurality of wireless recipient devices (5, 6) for reproducing the transmitted AV content. The device (1) for transmitting content has a group identification table which stores a group identifier for identification of a group formed by the wireless recipient device (5, 6). The device (1) adds the group identifier extracted from the group identification table to a control command for controlling recipient devices (5, 6) and wirelessly transmits the control command having the group identifier. The recipient devices (5, 6) receive the wirelessly transmitted control command from the device (1) if the corresponding group identifier has been added to the control command. The device (1) for transmitting content consists of a wired source device and a relay device which is connected by wire to the wired source device, and the relay device is wirelessly connected to the wireless recipient device and mutually converts the wired control command transmitted to the wired source device, and the wireless control command transmitted to the wireless recipient device, wherein the wired source device and the relay device are connected via HDMI (High-Definition Multimedia Interface).

FIELD: process engineering.

SUBSTANCE: invention relates to filtering video data with the help of multiple filters. The proposed method comprises receiving and decoding, in a video decoder, multiple filters embedded in a video data bit stream. A particular filter is selected from said multiple filters on the basis of information embedded in the video data bit stream. The particular filter is applied to at least a portion of the decoded video data of the bit stream to generate filtered decoded video data.

EFFECT: reduced number of bits required to transmit filter coefficients for multiple filters.

37 cl, 5 dwg

 

I. TECHNICAL FIELD TO WHICH THE INVENTION RELATES

The present disclosure is generally directed to a system and method for filtering video data using a variety of filters.

II. BACKGROUND

Advances in technology have resulted in smaller and more powerful computing devices. For example, there currently exist a variety of portable personal computing devices, including wireless computing devices such as portable wireless telephones, personal digital assistant (PDA) devices and paging devices, that are small, lightweight and easily carried by users. More specifically, portable wireless telephones, such as cellular telephones and telephones that support Internet Protocol (IP), can transmit voice and data packets over wireless networks. In addition, many of these wireless telephones include other types of devices built into them. For example, a wireless telephone can also include a digital still camera, a digital video camera, a digital recorder and an audio player. Such wireless telephones can also process executable instructions, including software applications such as a web browser that may be used to access the Internet. As such, these wireless telephones may include significant computing capabilities.

Digital signal processors (DSP), image processors and other processing devices are often used in portable personal computing devices that include a digital camera or which display image data or video data captured by a digital camera. Such processing devices may be used to provide video and audio, to process the received data, such as image data, or to perform other functions.

One type of video processing is filtering, which can be used to improve the quality of a decoded video signal. The filter can be applied as a post-processing filter, where the filtered frame is not used for prediction of future frames, or as an in-loop filter, where the filtered frame is used to predict future frames. A filter can be designed by minimising the error between the original signal and the decoded, filtered signal. Similarly to transform coefficients, the coefficients of the resulting filter can be quantised, encoded and sent to the video decoder. More precisely quantised filter coefficients can yield better filter performance. However, as the precision of the quantised filter coefficients increases, the number of bits required to transmit the coefficients also increases, with a corresponding impact on network resources, data delivery rates, or both.
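As a rough illustration of this precision/bit-rate trade-off (a sketch only; the tap values and bit depths below are hypothetical, not taken from the disclosure), filter coefficients can be quantised to fixed point before transmission:

```python
def quantize_coefficients(coeffs, precision_bits):
    """Quantize filter coefficients to fixed point at the given precision.

    Returns the integer levels (what would be entropy-coded and transmitted)
    and the values the decoder reconstructs from them. Higher precision
    lowers reconstruction error but costs more bits per coefficient.
    """
    scale = 1 << precision_bits                     # fixed-point scale factor
    levels = [round(c * scale) for c in coeffs]
    reconstructed = [lv / scale for lv in levels]
    return levels, reconstructed

taps = [0.3, 0.4, 0.3]                              # hypothetical 3-tap filter
for bits in (2, 4, 8):
    levels, recon = quantize_coefficients(taps, bits)
    err = max(abs(c - r) for c, r in zip(taps, recon))
    print(bits, levels, round(err, 6))
```

Running the loop shows the reconstruction error shrinking as `precision_bits` grows, while each level needs more bits to represent.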

III. SUMMARY OF THE INVENTION

In the video encoder, multiple filters can be defined and provided to the receiver in the video stream. The receiver can extract information from the data stream to identify which of the set of filters to apply to a particular frame, a particular macroblock, a particular pixel, or any combination thereof. The multiple filters can be used for post-processing filtering or for in-loop filtering in the decoder.
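The selection mechanism described above can be sketched as simple decoder-side bookkeeping (illustrative only; the filter taps, macroblock coordinates and identifiers are invented for the example, not taken from the disclosure):

```python
# Hypothetical decoder-side state: the bitstream carries a set of decoded
# filters plus, for each macroblock, an index into that set.
decoded_filters = {0: [0.25, 0.5, 0.25], 1: [0.1, 0.8, 0.1]}   # illustrative taps
selection_info = {(0, 0): 0, (0, 1): 1, (1, 0): 0, (1, 1): 1}  # (row, col) -> id

def filter_for_macroblock(position):
    """Look up which decoded filter applies to the macroblock at `position`."""
    return decoded_filters[selection_info[position]]

print(filter_for_macroblock((0, 1)))
```

The same lookup could be keyed by frame number or pixel position to realise the other granularities mentioned above.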

In a particular disclosed embodiment, a method includes receiving and decoding, in a decoder, a set of filters embedded in a bit stream of video data. The method includes selecting, based on information included in the bit stream of video data, a particular filter from the set of filters. The method further includes applying the particular filter to at least part of the decoded video data from the bit stream of video data to produce filtered decoded video data.

In another disclosed embodiment, a device includes a video decoder configured to receive and decode a set of filters embedded in a bit stream of video data. The device also includes a processor configured to select, based on information included in the bit stream of video data, a particular filter from the set of filters and to apply the particular filter to at least part of the decoded video data from the bit stream of video data to produce filtered decoded video data.

In another embodiment, an integrated circuit is disclosed that includes video decoding circuitry configured to receive and decode a signal that includes a set of filters embedded in a bit stream of video data. The integrated circuit also includes processing circuitry configured to process the decoded signal so as to select, based on information included in the bit stream of video data, a particular filter from the set of filters and to apply the particular filter to at least part of the decoded video data from the bit stream of video data to produce filtered decoded video data.

In another embodiment, a disclosed device includes means for decoding the plurality of filters embedded in the bit stream of video data. The device includes means for selecting, on the basis of the information included in the bit stream of video data, a particular filter from the set of filters. The device additionally includes means for applying the particular filter to at least part of the decoded video data from the bit stream of video data to produce filtered decoded video data.

In another embodiment, a computer-readable medium storing computer-executable code is disclosed. The computer-readable medium includes code for receiving and decoding, in a decoder, a set of filters embedded in a bit stream of video data. The computer-readable medium includes code for selecting, based on information included in the bit stream of video data, a particular filter from the set of filters. The computer-readable medium additionally includes code for applying the particular filter to at least part of the decoded video data from the bit stream of video data to produce filtered decoded video data.

One particular advantage provided by the disclosed embodiments is improved filter performance, especially post-filter performance, to improve the quality of the decoded video signal. Another particular advantage provided by the disclosed embodiments is a reduction in the number of bits required to transmit the filter coefficients for the set of filters.

Other aspects, advantages and features of the present disclosure will become apparent after review of the complete application, including the following sections: Brief Description of the Drawings, Detailed Description, and Claims.

IV. BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 is a block diagram of a particular illustrative embodiment of a system for processing video data, which includes a bit stream of video data and a multimedia receiver;

Figure 2 is a block diagram of a particular illustrative embodiment of a device for processing video data that includes a video decoder and a processor;

Figure 3 is a block diagram of a particular illustrative embodiment of an integrated circuit that includes video decoding circuitry and processing circuitry;

Figure 4 is a flow diagram of a particular illustrative embodiment of a method for filtering video data using multiple filters; and

Figure 5 is a block diagram of a particular embodiment of a portable communication device that includes a decoding and filtering module using multiple filters.

V. DETAILED DESCRIPTION

Referring to Figure 1, a particular illustrative embodiment of a system 100 for processing video data is shown. The system 100 includes a bit stream 102 of video data received by a multimedia receiver 108. The bit stream 102 includes encoded video data 106, multiple filters 104 and filter selection information 122. The multimedia receiver 108 includes a video decoder 110, a filter module 112, a filter selector 118 and a display 116. The system 100 enables the multimedia receiver 108 to select a filter from the bit stream 102 on the basis of the filter selection information 122.

The video decoder 110 is configured to decode the encoded video data 106. For example, the video decoder 110 may be configured to decode entropy-encoded data and to perform an inverse discrete cosine transform (DCT) on the resulting data. In a particular embodiment, the video decoder 110 includes a decoder compatible with H.264 or a Moving Picture Experts Group (MPEG) standard.
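For background, the inverse DCT step mentioned above can be sketched naively in one dimension (a reference formulation only; real decoders use fast integer 2-D transforms):

```python
import math

def idct_1d(coeffs):
    """Naive 1-D inverse DCT-II: reconstruct samples from transform
    coefficients, as a decoder does after entropy decoding."""
    n = len(coeffs)
    out = []
    for x in range(n):
        s = 0.0
        for u, cu in enumerate(coeffs):
            scale = math.sqrt(1 / n) if u == 0 else math.sqrt(2 / n)
            s += scale * cu * math.cos((2 * x + 1) * u * math.pi / (2 * n))
        out.append(s)
    return out

# A block carrying only a DC coefficient reconstructs to a flat signal.
print([round(v, 6) for v in idct_1d([2.0, 0.0, 0.0, 0.0])])
```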

The filter module 112 is configured to receive a filter from the filter selector 118, for example a second decoded filter 120. The filter module 112 is configured to apply the received filter 120 to the decoded video data received from the video decoder 110. The filter module 112 can be configured to apply the filter to the decoded video data at the granularity of a frame, a macroblock or a pixel, to produce filtered decoded video data 114, which is provided to the display 116. The filter module 112 can be implemented inside the decoding loop (not shown), as a post-processing filter, or any combination thereof.
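Applying a decoded filter at pixel granularity amounts to a small 2-D convolution over the decoded samples. A minimal sketch, assuming a zero-padded border and a hypothetical 3×3 averaging post-filter (neither the padding choice nor the taps come from the disclosure):

```python
def apply_filter_2d(block, kernel):
    """Apply a small 2-D filter kernel to a decoded block,
    zero-padding at the borders, one output pixel at a time."""
    kh, kw = len(kernel), len(kernel[0])
    ph, pw = kh // 2, kw // 2
    h, w = len(block), len(block[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc = 0.0
            for di in range(kh):
                for dj in range(kw):
                    y, x = i + di - ph, j + dj - pw
                    if 0 <= y < h and 0 <= x < w:   # zero outside the block
                        acc += block[y][x] * kernel[di][dj]
            out[i][j] = acc
    return out

block = [[10, 10, 10],
         [10, 100, 10],
         [10, 10, 10]]
smoothing = [[1 / 9] * 3 for _ in range(3)]        # averaging post-filter
print(apply_filter_2d(block, smoothing)[1][1])     # centre pixel smoothed
```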

The filter selector 118 is configured to receive the filter selection information 122 and to select the appropriate filters from the multiple filters 104. In a particular embodiment, the filter selector 118 is adapted to decode the multiple filters 104 and to provide a selected decoded filter, such as the second decoded filter 120, to the filter module 112. The filter selector 118 can choose which decoded filters to provide to the filter module 112 on the basis of the filter selection information 122. In a particular embodiment, the filter selector 118 compares one or more characteristics of the decoded video data generated by the video decoder 110 with the filter selection information 122 to select the proper filter for the particular video data provided to the filter module 112.

During operation, the encoded video data 106 is received and decoded by the video decoder 110 in the multimedia receiver 108. The multiple filters 104 and the filter selection information 122 are received and decoded by the filter selector 118 in the multimedia receiver 108. The filter selector 118 selects a particular decoded filter 120 from the multiple filters 104 on the basis of the filter selection information 122 included in the bit stream 102. The particular decoded filter 120 is applied to at least part of the decoded video data by the filter module 112 in the multimedia receiver 108 to produce filtered decoded video data 114. The filtered decoded video data 114 is displayed on the display 116 of the multimedia receiver 108.

By receiving multiple filters together with the encoded video data 106, the multimedia receiver 108 may select the particular filters that result in the lowest error for each unit of the decoded video data. For example, the filter that provides the smallest mean squared error for a particular frame of video data can be selected on a frame-by-frame basis. As another example, the filter that provides the lowest error for a particular macroblock can be selected on a block-by-block basis or on a pixel-by-pixel basis. The video data processing system 100 may therefore provide better filter performance, in particular post-filter performance, to improve the quality of the decoded image. In addition, by encoding the filter coefficients and, in some embodiments, using the coefficients of some filters to predict subsequent filter coefficients, the video data processing system 100 additionally provides a reduction in the number of bits required to transmit the filter coefficients for each filter of the set of filters 104.
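The encoder-side choice of the lowest-error filter per unit of video data can be sketched as follows (an illustrative 1-D version; the signals and candidate taps are hypothetical, and a real encoder would work on 2-D frames or macroblocks):

```python
def filter_1d(signal, taps):
    """Apply an odd-length 1-D filter with zero-padded borders."""
    half = len(taps) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, t in enumerate(taps):
            j = i + k - half
            if 0 <= j < len(signal):
                acc += t * signal[j]
        out.append(acc)
    return out

def select_best_filter(original, decoded, candidates):
    """Encoder-side sketch: return the id and MSE of the candidate filter
    whose output has the smallest mean squared error vs the original."""
    best_id, best_mse = None, float("inf")
    for fid, taps in candidates.items():
        filtered = filter_1d(decoded, taps)
        mse = sum((o - f) ** 2 for o, f in zip(original, filtered)) / len(original)
        if mse < best_mse:
            best_id, best_mse = fid, mse
    return best_id, best_mse

original = [0.0, 0.0, 1.0, 0.0, 0.0]
decoded = [0.1, -0.1, 1.2, -0.1, 0.1]            # original plus coding noise
candidates = {0: [1.0], 1: [0.25, 0.5, 0.25]}    # identity vs smoothing taps
print(select_best_filter(original, decoded, candidates))
```

For this sharp impulse the identity filter gives the lower error, so id 0 is chosen; a different unit of video data might select the smoothing filter instead, which is exactly why per-unit signalling can help.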

Referring to Figure 2, a particular embodiment of a video data processing device 200 is illustrated. The video data processing device 200 includes a video decoder 202 and a processor 206. The video decoder 202 is configured to receive and decode a set of filters 204 embedded in a bitstream of video data. In a particular embodiment, at least a portion of the video data in the bitstream is encoded using MPEG encoding. The processor 206 includes a frame determination module 208, a macroblock determination module 210, a pixel determination module 212, a filter selection module 230, and a filter module 232. In an illustrative embodiment, the video decoder 202 is the video decoder 110 of Figure 1, and the set of filters 204 is embedded in the bitstream of video data in the same way as the set of filters 104 of Figure 1 is embedded in the video bitstream 102.

In a particular embodiment, the filter selection module 230 is executed by the processor 206 to select a specific filter from the set of filters 204 on the basis of information included in the bitstream of video data. In a particular embodiment, the information included in the bitstream of video data is the filter selection information 122 of Figure 1 included in the video bitstream 102.

In a particular embodiment, the filter module 232 is executed by the processor 206 to apply the specific filter selected by the filter selection module 230 to at least a portion of the decoded video data from the bitstream of video data to produce filtered decoded video data. In a particular embodiment, the generated filtered decoded video data are similar to the filtered decoded video data 114 of Figure 1.

In a particular embodiment, the frame determination module 208 is executed by the processor 206 to determine the frames of video data to which each filter of the set of filters 204 is to be applied, and the information included in the bitstream of video data identifies the range of frames corresponding to each filter by at least one of a frame number or a frame type. In a particular embodiment, the frame types may include an intra-coded frame type (I-frame), a predicted frame type (P-frame), or a bi-directionally predicted frame type (B-frame). For example, the frame determination module 208 may determine the frame number of each frame and provide the frame number to the filter selection module 230. As an illustration, the frame determination module 208 may determine that the frame 222 being processed has the frame number "5", in response to which the filter selection module 230 selects the first decoded filter 216 to be applied to the decoded frame 222 with the number "5". Different ways of specifying which filters are used and how filters are combined may be employed. For example, the decoder may be signaled to use the filters f1, f2, and f3 for B-frame types.
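One possible decoded representation of such signaling is sketched below. The table contents, the filters_for_frame helper, and the range_table format are illustrative assumptions; the patent does not fix a syntax for the filter selection information.

```python
# Illustrative sketch (not the patent's bitstream syntax): filter
# selection information mapping frame types and, optionally, frame
# number ranges to indices into the decoded set of filters.
FILTERS_BY_FRAME_TYPE = {
    "I": [0],        # I-frames: intra-coded
    "P": [0, 1],     # P-frames: predicted
    "B": [1, 2, 3],  # B-frames: e.g. signaled to use f1, f2, and f3
}

def filters_for_frame(frame_type, frame_number, range_table=None):
    # range_table maps (first_frame, last_frame) -> filter index,
    # mirroring identification of a range of frames by frame number.
    if range_table:
        for (first, last), index in range_table.items():
            if first <= frame_number <= last:
                return [index]
    return FILTERS_BY_FRAME_TYPE.get(frame_type, [0])
```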

In a particular embodiment, the macroblock determination module 210 is executed by the processor 206 to determine the macroblocks to which each filter of the set of filters 204 is to be applied. The information included in the bitstream of video data can identify the macroblocks corresponding to each filter by at least one of a list of macroblock types (for example, intra-coded, inter-coded, and bi-directionally inter-coded) or a range of values of the quantization parameter used to reconstruct the macroblocks, as illustrative, non-limiting examples. For example, the macroblock determination module 210 may determine the type of each macroblock and provide the determined macroblock type to the filter selection module 230. As an illustration, the macroblock determination module 210 may determine that a specific processed macroblock 224 is of type "A" (for example, an intra-coded type), in response to which the filter selection module 230 selects the second decoded filter 218 to be applied to the specific macroblock 224.
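A minimal sketch of macroblock-level selection follows, under the assumption that the signaled information has already been decoded into a list of (macroblock types, quantization-parameter range, filter index) entries; this tuple format is hypothetical and only illustrates matching by type list or QP range.

```python
def filter_for_macroblock(mb_type, qp, selection_info):
    # selection_info: list of (set of macroblock types,
    # (qp_min, qp_max), filter index) entries -- a hypothetical
    # decoded form of the signaled selection information.
    # The first matching entry wins; None means no filter applies.
    for mb_types, (qp_min, qp_max), index in selection_info:
        if mb_type in mb_types and qp_min <= qp <= qp_max:
            return index
    return None
```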

In a particular embodiment, the pixel determination module 212 is executed by the processor 206 to determine the pixels to which each filter of the set of filters 204 is to be applied, based on a predetermined measure of local image characteristics 214. The pixel determination module 212 may generate a value of the predetermined measure 214 for a particular pixel (i,j) 226 being processed, located in row i and column j of a macroblock or frame of the decoded video signal, in response to which the filter selection module 230 selects the third decoded filter 220 to be applied to the pixel (i,j) 226.

In a particular embodiment, the predetermined measure of local image characteristics 214 includes a value of the variance of the reconstructed image from the average value of the reconstructed image. For example, for a reconstructed image R(i,j), where i=0,...,M and j=0,...,N, the average value \bar{R}(i,j) can be defined such that

\bar{R}(i,j) = \frac{\sum_{k=-K}^{K} \sum_{l=-L}^{L} R(i+k,j+l)}{(2K+1)(2L+1)}.

The variance value var(i,j) of the reconstructed image R(i,j) from the average value \bar{R}(i,j) can be defined such that

var(i,j) = \frac{\sum_{k=-K}^{K} \sum_{l=-L}^{L} \left( R(i+k,j+l) - \bar{R}(i,j) \right)^2}{(2K+1)(2L+1)}.
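The windowed mean and variance defined above can be computed directly, as in the sketch below. Border clamping is an assumption, since the patent does not specify how the (2K+1) x (2L+1) window behaves at image edges.

```python
def local_mean_and_variance(R, i, j, K, L):
    # Windowed mean and variance of the reconstructed image R at
    # pixel (i, j), over a (2K+1) x (2L+1) neighbourhood.
    # Indices are clamped at the image borders (an assumption).
    H, W = len(R), len(R[0])
    n = (2 * K + 1) * (2 * L + 1)
    samples = [R[min(max(i + k, 0), H - 1)][min(max(j + l, 0), W - 1)]
               for k in range(-K, K + 1)
               for l in range(-L, L + 1)]
    mean = sum(samples) / n
    var = sum((s - mean) ** 2 for s in samples) / n
    return mean, var
```

Replacing the squared term with an absolute value yields the abs(i,j) measure defined further below.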

In a particular embodiment, the predetermined measure of local image characteristics 214 includes an absolute difference value within the reconstructed image. For example, for the reconstructed image R(i,j), where i=0,...,M and j=0,...,N, the absolute difference value abs(i,j) can be defined such that

abs(i,j) = \frac{\sum_{k=-K}^{K} \sum_{l=-L}^{L} \left| R(i+k,j+l) - \bar{R}(i,j) \right|}{(2K+1)(2L+1)}.

In a particular embodiment, the predetermined measure of local image characteristics 214 includes gradient values within the reconstructed image. For example, the gradient values at the pixel of interest in the image can be used as the predetermined measure of local image characteristics 214. In another embodiment, the predetermined measure of local image characteristics 214 includes sharpness measures within the reconstructed image.

In a particular embodiment, a first filter of the set of filters 204 is applied to a first pixel having a first value of the predetermined measure of local image characteristics 214 within a first range of values, and a second filter of the set of filters 204 is applied to a second pixel having a second value of the predetermined measure of local image characteristics 214 within a second range of values. For example, filters f_m for m=0,...,n+1 can be applied so that the filter f_0 is applied to a pixel (i,j) whose variance value var(i,j) lies in the range 0 ≤ var(i,j) < var_0, the filter f_1 is applied to a pixel (i,j) whose variance value var(i,j) lies in the range var_0 ≤ var(i,j) < var_1, and, in general, the filter f_r for r=1,...,n is applied to a pixel (i,j) whose variance value var(i,j) lies in the range var_{r-1} ≤ var(i,j) < var_r, where the filter f_{n+1} is applied to a pixel (i,j) whose variance value var(i,j) lies in the range var_n ≤ var(i,j). In an alternative embodiment, filters f_1 and f_2 can be applied so that the filter f_1 is applied to a pixel (i,j) whose variance value var(i,j) lies in the range 0 ≤ var(i,j) < var_0 or in the range var_0 ≤ var(i,j) < var_1, and the filter f_2 is applied otherwise.
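Mapping a variance value to a filter index over the thresholds var_0,...,var_n described above reduces to a range lookup, sketched here as an illustration (the list-based threshold representation is an assumption):

```python
def filter_index_for_variance(var_value, thresholds):
    # thresholds = [var_0, var_1, ..., var_n], in increasing order.
    # Filter f_0 applies when var < var_0, f_r applies when
    # var_{r-1} <= var < var_r, and f_{n+1} applies when var >= var_n.
    for r, t in enumerate(thresholds):
        if var_value < t:
            return r
    return len(thresholds)
```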

In a particular embodiment, the quantized filter coefficients of each filter of the set of filters 204 are clipped so that they lie within a range from approximately 0 to approximately 2 to the n-th power. The range from approximately 0 to approximately 2 to the n-th power can be divided into m intervals, where the number m of intervals is determined, at least in part, on the basis of the indices of the quantized filter coefficients of each filter of the set of filters 204. For example, the quantized filter coefficients f_r(k,l), for r=0,...,s+1, k=-K,...,K, and l=-L,...,L, can be clipped so that they lie within the range 0 ≤ f_r(k,l) ≤ 2^n. The range 0 ≤ f_r(k,l) ≤ 2^n can be divided into m intervals, where the number m of intervals is determined, at least in part, on the basis of the indices (k,l) of the quantized filter coefficients f_r(k,l) for r=0,...,s+1, k=-K,...,K, and l=-L,...,L. In a particular embodiment, a specific quantized filter coefficient is determined by decoding a variable-length code word pointing to the specific interval, of the m intervals, that corresponds to the value of the specific quantized filter coefficient, and by decoding a fixed-length code word specifying the value of the specific quantized filter coefficient within that specific interval.
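The interval-plus-offset decoding described above can be sketched as follows. Uniform interval widths and the exact interpretation of the fixed-length offset are assumptions, since the patent only states that the range is divided into m intervals and that a variable-length code identifies the interval while a fixed-length code gives the value within it.

```python
def decode_quantized_coefficient(interval_index, offset_bits, m, n):
    # Reconstruct a quantized coefficient clipped to [0, 2**n] from:
    #  (a) interval_index -- the interval identified by the decoded
    #      variable-length code word (0 <= interval_index < m), and
    #  (b) offset_bits    -- the fixed-length code word, as a bit string,
    #      giving the position within that interval.
    # Uniform interval widths are an assumption.
    interval_width = (2 ** n) / m
    base = interval_index * interval_width
    bits = len(offset_bits)
    offset = (int(offset_bits, 2) / (2 ** bits)) * interval_width if bits else 0.0
    return base + offset
```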

In a particular embodiment, first filter coefficients of a first filter of the set of filters 204 are used to predict second filter coefficients of a second filter of the set of filters 204. For example, if the filters f_m for m=0,...,n+1 correspond to different variance values var_r for r=0,...,n, as described above, the filter f_1 can be predicted on the basis of the filter f_0, the filter f_2 can be predicted on the basis of the filter f_1, and, in general, the filter f_{s+1} can be predicted on the basis of the filter f_s for s=0,...,n.
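One simple form of the chained prediction described above is differential coding: only the first filter's coefficients are sent in full, and each subsequent filter is reconstructed from the previous one plus a decoded residual. The additive prediction rule below is an assumption; the patent does not fix the prediction function.

```python
def decode_predicted_filters(base_filter, residuals):
    # base_filter: coefficients of f_0, decoded directly.
    # residuals:   one residual list per subsequent filter; filter
    #              f_{s+1} = f_s + residual_s (additive prediction,
    #              an assumed rule for illustration).
    filters = [list(base_filter)]
    for residual in residuals:
        previous = filters[-1]
        filters.append([p + r for p, r in zip(previous, residual)])
    return filters
```

Because neighbouring variance classes tend to have similar filters, the residuals are small and cheap to encode, which is the bit-rate saving the passage describes.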

One or more of the modules 208, 210, 212, 230, and 232 may be implemented as computer-executable code comprising program instructions executing on the processor 206, as dedicated hardware circuits, as finite state machines, as field-programmable gate arrays (FPGAs), or any combination thereof. The processor 206 may execute one or more of the frame determination module 208, the macroblock determination module 210, and the pixel determination module 212 to determine the filters to be applied to the decoded video data. In a particular embodiment, the video data processing device 200 may include other components not shown, such as a display device configured to display the filtered decoded video data, similar to the display 116 shown in Figure 1.

Referring to Figure 3, a video data processing integrated circuit 300 is illustrated. The video data processing integrated circuit 300 includes a video decoding circuit 302 and a processing circuit 306. The video decoding circuit 302 is configured to receive and decode a signal 328 that includes a set of filters 304 embedded in a bitstream of video data. In a particular embodiment, the set of filters 304 is embedded in the bitstream of video data in a manner similar to the set of filters 104 of Figure 1 embedded in the video bitstream 102.

The processing circuit 306 is configured to process the decoded signal 328 to select a specific filter from the set of filters 304 on the basis of information included in the bitstream of video data. In a particular embodiment, the information included in the bitstream of video data is the filter selection information 122 of Figure 1 included in the video bitstream 102. The processing circuit 306 includes a frame determination circuit 308, a macroblock determination circuit 310, a pixel determination circuit 312, a filter selection circuit 330, and a filter circuit 332. The processing circuit 306 is configured to process the decoded signal from the video decoding circuit 302 to apply a specific filter, such as the second decoded filter 316, the third decoded filter 318, or the fourth decoded filter 320, to at least a portion of the decoded video data from the bitstream of video data to produce filtered decoded video data. In a particular embodiment, the resulting filtered decoded video data are similar to the filtered decoded video data 114 of Figure 1.

In a particular embodiment, the frame determination circuit 308 is configured to determine the frames to which each filter of the set of filters 304 is to be applied, and the information included in the bitstream of video data identifies the frames corresponding to each filter by at least one of a frame number or a frame type. For example, the frame determination circuit 308 may determine that a particular frame 322 has the frame number "6" and may provide the frame number to the filter selection circuit 330. The filter selection circuit 330 may select the second decoded filter 316 for the frame 322 on the basis of the frame number, in accordance with the information received via the bitstream of video data. The filter circuit 332 can apply the second decoded filter 316 to the frame 322 with the frame number "6".

In a particular embodiment, the macroblock determination circuit 310 is configured to determine the macroblocks to which each filter of the set of filters 304 is to be applied, and the information included in the bitstream of video data identifies the macroblocks corresponding to each filter by at least one of a list of macroblock types or a range of values of the quantization parameters used to reconstruct the macroblocks. For example, the macroblock determination circuit 310 may determine that a particular macroblock 324 is of type "B" (for example, a bi-directionally inter-predicted type) and may provide the macroblock type to the filter selection circuit 330. The filter selection circuit 330 can select the third decoded filter 318 for the particular macroblock 324 on the basis of the macroblock type, in accordance with the information received via the bitstream of video data. The filter circuit 332 can apply the third decoded filter 318 to the particular macroblock 324 of type "B".

In a particular embodiment, the pixel determination circuit 312 is configured to process the decoded signal to determine the pixels to which each filter of the set of filters 304 is to be applied, based on a predetermined measure of local image characteristics 314. For example, the pixel determination circuit 312 can determine the value of the predetermined measure of local image characteristics 314 corresponding to a particular pixel (m,n) 326 in row m and column n, and may provide the value of the predetermined measure of local image characteristics 314 to the filter selection circuit 330. The filter selection circuit 330 may select the fourth decoded filter 320 for the pixel (m,n) 326 on the basis of the value of the predetermined measure of local image characteristics 314, in accordance with the information received via the bitstream of video data. The filter circuit 332 can apply the fourth decoded filter 320 to the pixel (m,n) 326. In a particular embodiment, the predetermined measure of local image characteristics 314 is determined in substantially the same manner as the predetermined measure of local image characteristics 214 of Figure 2, for example, using a variance or a gradient, as illustrative, non-limiting examples.

In a particular embodiment, a device includes means for decoding a set of filters embedded in a bitstream of video data. The means for decoding the set of filters embedded in the bitstream of video data may include a video decoder, such as the video decoder 202 shown in Figure 2, a video decoding circuit, such as the video decoding circuit 302 shown in Figure 3, corresponding hardware, software, firmware, or any combination thereof. The device includes means for selecting, on the basis of information included in the bitstream of video data, a specific filter from the set of filters. The means for selecting a specific filter from the set of filters may include a processor, such as the processor 206 shown in Figure 2, a processing circuit, such as the processing circuit 306 shown in Figure 3, corresponding hardware, software, firmware, or any combination thereof. The device additionally includes means for applying the specific filter to at least a portion of the decoded video data from the bitstream of video data to produce filtered decoded video data. The means for applying the specific filter may include a processor, such as the processor 206 shown in Figure 2, a processing circuit, such as the processing circuit 306 shown in Figure 3, corresponding hardware, software, firmware, or any combination thereof.

In a particular embodiment, the device includes means for determining the frames to which each filter of the set of filters is to be applied, and the information included in the bitstream of video data identifies the frames corresponding to each filter by at least one of a frame number or a frame type. The means for determining the frames may include a processor, such as the processor 206 shown in Figure 2, a processing circuit, such as the processing circuit 306 shown in Figure 3, corresponding hardware, software, firmware, or any combination thereof.

In a particular embodiment, the device includes means for determining the macroblocks to which each filter of the set of filters is to be applied, and the information included in the bitstream of video data identifies the macroblocks corresponding to each filter by at least one of a list of macroblock types or a range of values of the quantization parameter used to reconstruct the macroblocks. The means for determining the macroblocks may include a processor, such as the processor 206 shown in Figure 2, a processing circuit, such as the processing circuit 306 shown in Figure 3, corresponding hardware, software, firmware, or any combination thereof.

In a particular embodiment, the device includes means for determining the pixels to which each filter of the set of filters is to be applied, based on a predetermined measure of local image characteristics. The means for determining the pixels may include a processor, such as the processor 206 shown in Figure 2, a processing circuit, such as the processing circuit 306 shown in Figure 3, corresponding hardware, software, firmware, or any combination thereof.

In a particular embodiment, the device includes means for receiving the bitstream of video data via wireless transmission. The means for receiving the bitstream of video data via wireless transmission may include a wireless receiver, a wireless receive circuit, a wireless transceiver, or a portable communication device such as that shown in Figure 5 and described more fully below, corresponding hardware, software, firmware, or any combination thereof.

Referring to Figure 4, a method 400 of filtering video data using a set of filters is illustrated. The method 400 includes, at step 402, receiving and decoding, at a video decoder, a set of filters embedded in a bitstream of video data. For example, the set of filters 204 of Figure 2 can be embedded in a bitstream of video data, such as the video bitstream 102 of Figure 1. The set of filters 204 can be received and decoded at the video decoder 202 of Figure 2.

The method 400 includes, at step 404, selecting, on the basis of information included in the bitstream of video data, a specific filter from the set of filters. For example, the processor 206 of Figure 2 can select a specific filter from the set of filters 204, such as the first decoded filter 216, on the basis of the information included in the bitstream of video data, such as the filter selection information 122 of Figure 1 included in the video bitstream 102.

The method 400 additionally includes, at step 406, applying the specific filter to at least a portion of the decoded video data from the bitstream of video data to produce filtered decoded video data. For example, the processor 206 of Figure 2 can apply the first decoded filter 216 to at least a portion of the decoded video data of the bitstream, for example, a particular frame 222, to produce filtered decoded video data, such as the filtered decoded video data 114 of Figure 1.

Figure 5 shows a block diagram of a particular embodiment of a system including a module for decoding and filtering using multiple filters. The system 500 may be implemented in a portable electronic device and includes a processor 510, such as a digital signal processor (DSP), coupled to a memory 532. The system 500 includes a module 564 for decoding and filtering using multiple filters. In an illustrative example, the module 564 for decoding and filtering using multiple filters includes any of the systems of Figures 1-3, operates in accordance with the method of Figure 4, or any combination thereof. The module 564 for decoding and filtering using multiple filters may be included in the processor 510, or may be a separate device or circuit along with a hardware image processing pipeline (not shown), or a combination thereof.

A camera interface 568 is coupled to the processor 510 and is also coupled to a camera, such as a camera 570. The camera interface 568 can be controlled by the processor 510, for example, for AF and AE control. A display controller 526 is coupled to the processor 510 and to a display 528. A coder/decoder (CODEC) 534 may also be coupled to the processor 510. A loudspeaker 536 and a microphone 538 can be coupled to the CODEC 534. A wireless interface 540 may be coupled to the processor 510 and to a wireless antenna 542.

The processor 510 may also be adapted to generate processed image data. The display controller 526 is configured to receive the processed image data and to provide the processed image data to the display 528. In addition, the memory 532 may be configured to receive and store the processed image data, and the wireless interface 540 may be configured to receive the processed image data for transmission via the antenna 542.

In a particular embodiment, the module 564 for decoding and filtering using multiple filters is implemented as computer code running on the processor 510, for example, computer-executable instructions stored on a computer-readable medium, illustrated as computer code 590 stored in the memory 532. For example, the computer code 590 may include code for receiving and decoding, at a decoder, a set of filters embedded in a bitstream of video data, code for selecting a specific filter from the set of filters on the basis of information included in the bitstream of video data, and code for applying the specific filter to at least a portion of the decoded video data from the bitstream of video data to produce filtered decoded video data.

For example, the computer code 590 may also include code for determining the frames to which each filter of the set of filters is to be applied, where the information included in the bitstream of video data identifies the frames corresponding to each filter by at least one of a frame number or a frame type. As another example, the computer code 590 may also include code for determining the macroblocks to which each filter of the set of filters is to be applied, where the information included in the bitstream of video data identifies the macroblocks corresponding to each filter by at least one of a list of macroblock types or a range of values of the quantization parameter used to reconstruct the macroblocks. Alternatively or in addition, the computer code 590 may include code for determining the pixels to which each filter of the set of filters is to be applied, based on a predetermined measure of local image characteristics. In a particular embodiment, a first filter of the set of filters can be applied to a first pixel having a first value of the predetermined measure of local image characteristics within a first range of values, and a second filter of the set of filters is applied to a second pixel having a second value of the predetermined measure of local image characteristics within a second range of values.

In a particular embodiment, the processor 510, the display controller 526, the memory 532, the CODEC 534, the wireless interface 540, and the camera interface 568 are included in a system-in-package or system-on-chip device 522. In a particular embodiment, an input device 530 and a power supply 544 are coupled to the system-on-chip device 522. In addition, in a particular embodiment, as illustrated in Figure 5, the display 528, the input device 530, the loudspeaker 536, the microphone 538, the wireless antenna 542, the camera 570, and the power supply 544 are external to the system-on-chip device 522. However, each of the display 528, the input device 530, the loudspeaker 536, the microphone 538, the wireless antenna 542, the camera 570, and the power supply 544 can be coupled to a component of the system-on-chip device 522, such as an interface or a controller.

Those skilled in the art will further appreciate that the various illustrative logical blocks, configurations, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, configurations, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the overall system. Those skilled in the art may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.

The steps of a method or algorithm described in connection with the embodiments disclosed herein may be implemented directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), registers, a hard disk, a removable disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. An illustrative storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Alternatively, the storage medium may be integral to the processor. The processor and the storage medium may reside in an application-specific integrated circuit (ASIC). The ASIC may reside in a computing device or a user terminal. Alternatively, the processor and the storage medium may reside as discrete components in a computing device or user terminal.

The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present disclosure. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features as defined by the following claims.

1. A method of filtering decoded video data, comprising the steps of:
decoding video data of at least one frame and a set of filters embedded in a bitstream of video data at a video decoder;
selecting, on the basis of information included in the bitstream of video data and associated with a characteristic of at least a portion of the decoded video data of the at least one frame, a specific filter from the set of filters; and
applying the specific filter to one or more pixels of at least the said portion of the decoded video data to produce filtered decoded video data.

2. The method according to claim 1, wherein the at least one frame comprises a plurality of frames, and the method further comprises the step of:
determining a second filter from the set of filters to apply to a frame of the plurality of frames, wherein the information included in the bitstream of video data identifies the frame corresponding to the second filter by at least one of a frame number and a frame type.

3. The method according to claim 1, further comprising the step of:
determining a third filter from the set of filters to apply to one or more macroblocks, wherein the information included in the bitstream of video data identifies the macroblocks corresponding to the third filter by at least one of a list of macroblock types and a range of values of the quantization parameter used to reconstruct the macroblocks.

4. The method according to claim 1, wherein the characteristic includes a predetermined measure of local characteristics of the said one or more pixels, such that selecting the specific filter includes the step of selecting the specific filter based on the predetermined measure of local characteristics of the said one or more pixels.

5. The method according to claim 4, wherein the predetermined measure of local characteristics includes a variance value of the said one or more pixels from an average value of the one or more pixels.

6. The method according to claim 4, wherein the predetermined measure of local characteristics includes absolute values of differences of the said one or more pixels.

7. The method according to claim 4, wherein the predetermined measure of local characteristics includes gradient values of the said one or more pixels.

8. The method according to claim 4, wherein the predetermined measure of local characteristics includes sharpness measures of the said one or more pixels.

9. The method according to claim 4, wherein a first filter from the set of filters is applied to a first pixel of the said one or more pixels having a first value of the predetermined measure of local characteristics within a first range of values, and a second filter from the set of filters is applied to a second pixel of the said one or more pixels having a second value of the predetermined measure of local characteristics within a second range of values.

10. The method according to claim 1, wherein the quantized filter coefficients of each filter of the set of filters are clipped so that they lie within a range from approximately 0 to approximately 2 to the n-th power, the range from approximately 0 to approximately 2 to the n-th power being divided into a number m of intervals, and the number m of intervals being determined, at least in part, on the basis of the indices of the quantized filter coefficients of each filter of the set of filters.

11. The method according to claim 10, wherein a specific quantized filter coefficient is determined by decoding a variable-length code word pointing to a specific interval of the m intervals that corresponds to the value of the specific quantized filter coefficient, and by decoding a fixed-length code word specifying the value of the specific quantized filter coefficient within the specific interval.

12. The method according to claim 1, wherein first filter coefficients of a first filter from the set of filters are used to predict second filter coefficients of a second filter from the set of filters.

13. The method according to claim 1, wherein decoding the set of filters includes the step of decoding one or more filter coefficients of the set of filters embedded in the bitstream of video data.

14. The method according to claim 13, wherein decoding the set of filters comprises the step of decoding one of the filter coefficients and predicting another of the filter coefficients based on the decoded one of the filter coefficients.

15. The method according to claim 13, wherein selecting the specific filter includes the step of determining the pixels, of the said one or more pixels, to which each filter of the set of filters is to be applied, on the basis of an estimated variance of those pixels from an average value of those pixels.

16. The method according to claim 1, further comprising the step of outputting the filtered decoded video data to a display.

17. A device for filtering decoded video data, comprising:
a video processor configured to decode video data of at least one frame and a set of filters embedded in a bitstream of video data; and
a memory configured to store the set of filters;
wherein the video processor is further configured to:
select, on the basis of information included in the bitstream of video data and associated with a characteristic of at least a portion of the decoded video data of the at least one frame, a specific filter from the set of filters; and
apply the specific filter to one or more pixels of at least the said portion of the decoded video data to produce filtered decoded video data.

18. The device according to claim 17, wherein the at least one frame contains multiple frames, and the processor is further configured:
to determine a second filter from the set of filters to apply to a frame of the multiple frames, wherein the information included in the bitstream of video data identifies the frame corresponding to the second filter through at least one of: a frame number and a frame type;
to determine a third filter from the set of filters to apply to one or more macroblocks, wherein the information included in the bitstream of video data identifies the macroblocks corresponding to the third filter through at least one of: a list of macroblock types and a range of values of a quantization parameter used for reconstruction of the macroblocks; and
wherein the characteristic contains a predetermined measure of local characteristics of said one or more pixels, such that, to select the specific filter, the processor is configured to select the specific filter on the basis of the predetermined measure of local characteristics of said one or more pixels.
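The macroblock-level mapping of claim 18 can be pictured as a lookup over signalled (macroblock-type list, quantization-parameter range) entries; the map layout, names, and type strings below are illustrative assumptions, not the signalled bitstream syntax:

```python
def filter_for_macroblock(mb_type, qp, mb_filter_map):
    """Claim 18 sketch: find the filter whose signalled macroblock-type
    list contains this macroblock's type and whose quantization-parameter
    range covers this macroblock's QP (hypothetical data layout)."""
    for types, (qp_lo, qp_hi), filter_id in mb_filter_map:
        if mb_type in types and qp_lo <= qp <= qp_hi:
            return filter_id
    return None  # fall back to a default or frame-level filter
```

A usage example: with entries `({"I16x16", "I4x4"}, (0, 25), 2)` and `({"P16x16"}, (26, 51), 3)`, an I4x4 macroblock at QP 20 maps to filter 2.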

19. The device according to claim 17, further comprising a display device configured to display the filtered decoded video data.

20. The device according to claim 17, wherein, to decode the set of filters, the video processor is configured to decode one or more filter coefficients of the set of filters.

21. The device according to claim 20, wherein, to decode the set of filters, the video processor is configured to decode one of the filter coefficients and to predict another of the filter coefficients on the basis of the decoded one of the filter coefficients.

22. The device according to claim 20, wherein, to select the specific filter, the video processor is configured to determine pixels of said one or more pixels to which to apply each filter of the set of filters on the basis of an estimated variance of those pixels from a mean value of those pixels.

23. An integrated circuit for filtering decoded video data, comprising:
a video decoding circuit configured to decode a signal that includes video data of at least one frame and a set of filters embedded in a bitstream of video data; and
a processing circuit configured to process the decoded signal to:
select, on the basis of information included in the bitstream of video data and associated with a characteristic of at least part of the decoded video data of said at least one frame, a specific filter from the set of filters; and
apply the specific filter to one or more pixels of at least said part of the decoded video data to produce filtered decoded video data.

24. The integrated circuit according to claim 23, wherein the at least one frame contains multiple frames, and the processing circuit is further configured to process the decoded signal to:
determine a second filter from the set of filters to apply to a frame of the multiple frames, wherein the information included in the bitstream of video data identifies the frame corresponding to the second filter through at least one of: a frame number and a frame type;
determine a third filter from the set of filters to apply to one or more macroblocks, wherein the information included in the bitstream of video data identifies the macroblocks corresponding to the third filter through at least one of: a list of macroblock types and a range of values of a quantization parameter used for reconstruction of the macroblocks; and
wherein the characteristic contains a predetermined measure of local characteristics of said one or more pixels, such that selecting the specific filter includes selecting the specific filter on the basis of the predetermined measure of local characteristics of said one or more pixels.

25. The integrated circuit according to claim 23, wherein a first filter from the set of filters is applied to first pixels of said one or more pixels having a first value of the predetermined measure of local characteristics in a first range of values, and a second filter from the set of filters is applied to second pixels of said one or more pixels having a second value of the predetermined measure of local characteristics in a second range of values.

26. The integrated circuit according to claim 23, wherein, to decode the signal that includes the set of filters, the video decoding circuit is configured to decode one or more filter coefficients of the set of filters.

27. The integrated circuit according to claim 26, wherein, to decode the signal that includes the set of filters, the video decoding circuit is configured to decode one of the filter coefficients and to predict another of the filter coefficients on the basis of the decoded one of the filter coefficients.

28. The integrated circuit according to claim 26, wherein, to select the specific filter, the processing circuit is configured to determine pixels of said one or more pixels to which to apply each filter of the set of filters on the basis of an estimated variance of those pixels from a mean value of those pixels.

29. A device for filtering decoded video data, comprising:
means for decoding video data of at least one frame and a set of filters embedded in a bitstream of video data;
means for selecting, on the basis of information included in the bitstream of video data and associated with a characteristic of at least part of the decoded video data of said at least one frame, a specific filter from the set of filters; and
means for applying the specific filter to one or more pixels of at least said part of the decoded video data to produce filtered decoded video data.

30. The device according to claim 29, wherein the at least one frame contains multiple frames, the device further comprising at least one of:
means for determining a second filter from the set of filters to apply to a frame of the multiple frames, wherein the information included in the bitstream of video data identifies the frame corresponding to the second filter through at least one of: a frame number and a frame type;
means for determining a third filter from the set of filters to apply to one or more macroblocks, wherein the information included in the bitstream of video data identifies the macroblocks corresponding to the third filter through at least one of: a list of macroblock types and a range of values of a quantization parameter used for reconstruction of the macroblocks; and
wherein the characteristic contains a predetermined measure of local characteristics of said one or more pixels, such that the means for selecting the specific filter includes means for selecting the specific filter on the basis of the predetermined measure of local characteristics of said one or more pixels.

31. The device according to claim 29, further comprising means for receiving the bitstream of video data via wireless transmission.

32. A computer-readable medium containing stored instructions which, when executed, cause one or more processors:
to decode, in a video decoder, video data of at least one frame and a set of filters embedded in a bitstream of video data;
to select, on the basis of information included in the bitstream of video data and associated with a characteristic of at least part of the decoded video data of said at least one frame, a specific filter from the set of filters; and
to apply the specific filter to one or more pixels of at least said part of the decoded video data to produce filtered decoded video data.

33. The computer-readable medium according to claim 32, wherein the at least one frame contains multiple frames, the medium further containing instructions which, when executed, cause said one or more processors:
to determine a second filter from the set of filters to apply to a frame of the multiple frames, wherein the information included in the bitstream of video data identifies the frame corresponding to the second filter through at least one of: a frame number and a frame type;
to determine a third filter from the set of filters to apply to one or more macroblocks, wherein the information included in the bitstream of video data identifies the macroblocks corresponding to the third filter through at least one of: a list of macroblock types and a range of values of a quantization parameter used for reconstruction of the macroblocks; and
wherein the characteristic contains a predetermined measure of local characteristics of said one or more pixels, such that, to select the specific filter, said one or more processors select the specific filter on the basis of the predetermined measure of local characteristics of said one or more pixels.

34. The computer-readable medium according to claim 32, wherein a first filter from the set of filters is applied to first pixels of said one or more pixels having a first value of the predetermined measure of local characteristics in a first range of values, and a second filter from the set of filters is applied to second pixels of said one or more pixels having a second value of the predetermined measure of local characteristics in a second range of values.

35. The computer-readable medium according to claim 32, wherein, to decode the set of filters, the instructions cause said one or more processors to decode one or more filter coefficients of the set of filters.

36. The computer-readable medium according to claim 35, wherein, to decode the set of filters, the instructions cause said one or more processors to decode one of the filter coefficients and to predict another of the filter coefficients on the basis of the decoded one of the filter coefficients.

37. The computer-readable medium according to claim 35, wherein, to select the specific filter, the instructions cause said one or more processors to determine pixels of said one or more pixels to which to apply each filter of the set of filters on the basis of an estimated variance of those pixels from a mean value of those pixels.

 
