Data recording device, data recording method, data processing device, data processing method, program, program recording medium, data recording medium and data structure

FIELD: information technology.

SUBSTANCE: playback at variable speed is performed without deterioration of picture quality. A controller 425 creates EP_map() in a clip information file from the RAPI addresses extracted by a RAPI information selection module 423, the PTS of the intra-coded picture immediately following each RAPI, and the end positions of that intra-coded picture and of the second, third and fourth reference pictures following it. The controller stores EP_map() in an output server 426; that is, the controller 425 copies, from among the end positions of the four reference pictures (1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture), the value closest to a given number of sectors (the number of sectors that can be read in one read operation) to N-th_Ref_picture_copy, determines the value of index_N_minus1 on the basis of N-th_Ref_picture_copy and records it to the disc.
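For illustration, the selection described above can be sketched as follows. This is a minimal sketch, not the actual implementation: only the names 1stRef_picture to 4thRef_picture, N-th_Ref_picture_copy and index_N_minus1 come from the text, while the function and the data layout are assumptions.

```python
# Hypothetical sketch of copying the end position closest to the target read
# size into N-th_Ref_picture_copy and deriving index_N_minus1 from it.

def choose_ref_picture(ref_end_positions, target_sectors):
    """ref_end_positions: end positions (in sectors) of the 1st to 4th
    reference pictures; target_sectors: the number of sectors that can be
    read in one read operation."""
    # Pick the end position whose value is closest to the target read size.
    index_n_minus1 = min(range(len(ref_end_positions)),
                         key=lambda i: abs(ref_end_positions[i] - target_sectors))
    nth_ref_picture_copy = ref_end_positions[index_n_minus1]
    return nth_ref_picture_copy, index_n_minus1

# Example: with end positions [5, 9, 13, 20] and a read size of 10 sectors,
# the 2nd reference picture is chosen, so index_N_minus1 = 1.
```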

EFFECT: processing is performed effectively with a constant data reading time.

8 cl, 68 dwg

 

The technical field to which the invention relates.

The present invention relates to a data recording device, a data recording method, a data processing device, a data processing method, a program, a program recording medium, a data recording medium and a data structure, and in particular to those that allow data to be processed with great convenience.

Prior art

In recent years, for example, the digital versatile disc (DVD) has become widespread as a recording medium of large capacity, and DVD devices that perform various kinds of processing on DVDs have also become widespread.

DVD devices include DVD recorders, which record data of broadcast television programs, etc. onto a DVD and reproduce them from it; car navigation systems, which use a DVD on which map information, etc. is recorded, reproduce the map information from the DVD and display it; game devices, which read a game program recorded on a DVD and execute it; and other devices.

DVD is described in detail, for example, in non-patent document 1, "DVD Specifications for Read-Only Disc Part 3; Version 1.1 December 1997".

A device such as a DVD device, which can handle a large amount of data, should allow the data to be processed with great convenience.

Summary of the invention

The present invention has been made in view of the above circumstances, and an object of the present invention is to allow data to be processed with great convenience.

The present invention is directed to a data recording device that includes a detection unit that selects an intra-coded picture that is contained in video data and becomes a random access point and detects position information of a plurality of pictures that follow the selected intra-coded picture; a designation unit that designates the position information of the picture located at a specified distance from the intra-coded picture, among the position information of the plurality of pictures detected by the detection unit; and a recording unit that records the position information of the plurality of pictures detected by the detection unit and the picture information designated by the designation unit on a data recording medium together with the video data.

The data recording device may further include a selection unit that selects, from the position information of the plurality of pictures detected by the detection unit, the position information of the picture corresponding to a specified number of sectors. The designation unit designates the position information of the picture located at the specified distance from the intra-coded picture, among the position information of the plurality of pictures detected by the detection unit, in accordance with the position information of the picture corresponding to the specified number of sectors among the position information of the plurality of pictures selected by the selection unit. The recording unit records the position information of the picture corresponding to the specified number of sectors among the position information of the plurality of pictures selected by the selection unit.

The specified number of sectors is set in accordance with the capacity that a device reproducing the video data from the data recording medium uses as a buffer for the video data.

The present invention is directed to a data recording method comprising the steps of: selecting an intra-coded picture that is contained in video data and becomes a random access point, and detecting position information of a plurality of pictures that follow the selected intra-coded picture; designating the position information of the picture located at a specified distance from the intra-coded picture, among the position information of the plurality of pictures detected at the detection step; and recording the position information of the plurality of pictures detected at the detection step and the picture information designated at the designation step on a data recording medium together with the video data.

The present invention is directed to a first program that causes a computer to perform processing comprising the steps of: selecting an intra-coded picture that is contained in video data and becomes a random access point, and detecting position information of a plurality of pictures that follow the selected intra-coded picture; designating the position information of the picture located at a specified distance from the intra-coded picture, among the position information of the plurality of pictures detected at the detection step; and recording the position information of the plurality of pictures detected at the detection step and the picture information designated at the designation step on a data recording medium together with the video data.

The present invention is directed to a program recording medium on which the first program is recorded, the execution of the program by a computer leading to the performance of the following steps: selecting an intra-coded picture that is contained in video data and becomes a random access point, and detecting position information of a plurality of pictures that follow the selected intra-coded picture; designating the position information of the picture located at a specified distance from the intra-coded picture, among the position information of the plurality of pictures detected at the detection step; and recording the position information of the plurality of pictures detected at the detection step and the picture information designated at the designation step on the data recording medium together with the video data.

The present invention is directed to a data processing device that includes a read unit that reads, from a data recording medium on which they are recorded together with video data, position information of a plurality of intra-coded pictures, which are contained in the video data and become random access points, and of the pluralities of pictures that follow them, as well as designation information that indicates the picture located at a specified distance from each intra-coded picture among the plurality of pictures following it; a selection unit that selects an intra-coded picture and the plurality of pictures that follow it, in accordance with the designation information, from the video data read by the read unit and recorded on the data recording medium; and a playback unit that performs fast-forward playback of the video data using all pictures from the intra-coded picture up to the picture located at the specified distance from it, among the plurality of pictures that follow the intra-coded picture and that have been selected by the selection unit.

The picture located at the specified distance from the intra-coded picture may be a picture that is distant from the intra-coded picture by approximately the specified number of sectors.

The specified number of sectors may be set in accordance with the capacity that is used as a buffer for the video data when the video data are reproduced from the data recording medium.

The present invention is directed to a data processing method comprising the steps of: reading, from a data recording medium on which they are recorded together with video data, position information of a plurality of intra-coded pictures, which are contained in the video data and become random access points, and of the pluralities of pictures that follow them, as well as designation information that indicates the picture located at a specified distance from each intra-coded picture among the plurality of pictures following it; selecting an intra-coded picture and the plurality of pictures that follow it, in accordance with the designation information, from the video data read at the reading step and recorded on the data recording medium; and performing fast-forward playback of the video data using all pictures from the intra-coded picture up to the picture located at the specified distance from it, among the plurality of pictures that follow the intra-coded picture and that have been selected at the selection step.
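As an illustration of these steps, the following minimal sketch selects, for each random access point, the intra-coded picture and the following pictures up to the one designated by the index; the entry layout is an assumption for illustration, not the actual data structure.

```python
# Hypothetical EP_map-like entries: each holds the position of an intra-coded
# picture, the positions of the pictures that follow it, and an index
# (cf. index_N_minus1) designating the picture at the specified distance.

def pictures_for_fast_forward(entries):
    for entry in entries:
        last = entry["index_n_minus1"]  # designates the last picture to use
        # All pictures from the intra-coded picture up to the designated one.
        yield [entry["intra_position"]] + entry["following_positions"][:last + 1]

# Fast-forward playback then decodes and displays only these pictures,
# jumping from one random access point to the next.
```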

The present invention is directed to a second program, the execution of which by a computer leads to the performance of the following steps: reading, from a data recording medium on which they are recorded together with video data, position information of a plurality of intra-coded pictures, which are contained in the video data and become random access points, and of the pluralities of pictures that follow them, as well as designation information that indicates the picture located at a specified distance from each intra-coded picture among the plurality of pictures following it; selecting an intra-coded picture and the plurality of pictures that follow it, in accordance with the designation information, from the video data read at the reading step and recorded on the data recording medium; and performing fast-forward playback of the video data using all pictures from the intra-coded picture up to the picture located at the specified distance from it, among the plurality of pictures that follow the intra-coded picture and that have been selected at the selection step.

The present invention is directed to a program recording medium on which the second program is recorded, the execution of the program by a computer leading to the performance of the following steps: reading, from a data recording medium on which they are recorded together with video data, position information of a plurality of intra-coded pictures, which are contained in the video data and become random access points, and of the pluralities of pictures that follow them, as well as designation information that indicates the picture located at a specified distance from each intra-coded picture among the plurality of pictures following it; selecting an intra-coded picture and the plurality of pictures that follow it, in accordance with the designation information, from the video data read at the reading step and recorded on the data recording medium; and performing fast-forward playback of the video data using all pictures from the intra-coded picture up to the picture located at the specified distance from it, among the plurality of pictures that follow the intra-coded picture and that have been selected at the selection step.

The present invention is directed to a data recording medium on which data are recorded, the data including position information of a plurality of pictures that follow an intra-coded picture selected from among the intra-coded pictures that are contained in video data and become random access points, and information that designates, among the position information of the plurality of pictures, the picture located at a specified distance from the intra-coded picture.

The present invention is directed to a data structure that includes position information of a plurality of pictures that follow an intra-coded picture selected from among the intra-coded pictures that are contained in video data and become random access points, and information that designates, among the position information of the plurality of pictures, the picture located at a specified distance from the intra-coded picture.

In the data recording device, the data recording method and the first program in accordance with the present invention, an intra-coded picture that is contained in video data and becomes a random access point is selected, and position information of a plurality of pictures that follow the selected intra-coded picture is detected. Among the position information of the plurality of detected pictures, the position information of the picture located at a specified distance from the intra-coded picture is designated. The position information of the plurality of detected pictures and the designated picture information are recorded on a data recording medium together with the video data.

In the data processing device, the data processing method and the second program in accordance with the present invention, position information of a plurality of intra-coded pictures, which are contained in video data and become random access points, and of the pluralities of pictures that follow them, as well as designation information that indicates the picture located at a specified distance from each intra-coded picture among the plurality of pictures following it, is read from the data recording medium on which it is recorded together with the video data. An intra-coded picture and the plurality of pictures that follow it are selected in accordance with the designation information. Fast-forward playback of the video data is performed using all pictures from the intra-coded picture up to the picture located at the specified distance from it, among the selected plurality of pictures that follow the intra-coded picture.

The data recording medium and the data structure in accordance with the present invention include position information of a plurality of pictures that follow an intra-coded picture selected from among the intra-coded pictures that are contained in video data and become random access points, and information that designates, among the position information of the plurality of pictures, the picture located at a specified distance from the intra-coded picture.

The data recording device and the data processing device in accordance with the present invention may each be implemented as an independent device or as a block that performs data processing.

In accordance with the present invention, data can be processed with great convenience. In particular, processing can be performed effectively with a constant data reading time.

Brief description of drawings

Fig. 1 is a block diagram showing an example of the hardware structure of a disc playback device in accordance with an embodiment of the present invention; Figs. 2A and 2B are block diagrams showing an example of the structure of a group of software modules executed by the CPU 112; Fig. 3 is a graph showing the relationship between real elapsed time and the number of clock pulses at a clock frequency of 90 kHz; Fig. 4 is a graph showing the relationship between real elapsed time and the time counted by a clock that updates the time on the basis of the video output of the video decoder; Fig. 5 is a block diagram showing an example of the structure of the buffer control module 215; Fig. 6 is a diagram showing an example of the directory structure of the disc 101; Fig. 7 is a diagram showing the syntax of the file "PLAYLIST.DAT"; Fig. 8 is a diagram showing the syntax of PlayItem(); Fig. 9 is a diagram showing the syntax of PlayListMark(); Fig. 10 is a diagram showing the relationship between the values of mark_type and the types of Mark(); Fig. 11 is a diagram showing the relationship of PlayList(), PlayItem(), clips and program streams stored in clip stream files; Fig. 12 is a diagram showing the syntax of the clip information file Clip(); Fig. 13 is a diagram showing the relationship between stream_id, private_stream_id and the elementary streams identified by them; Fig. 14 is a diagram showing the syntax of StaticInfo(); Fig. 15 is a diagram showing the syntax of DynamicInfo(); Fig. 16 is a diagram showing the syntax of EP_map(); Fig. 17 is a diagram showing the relationship between the value of index_N_minus1 and 1stRef_Picture to 4thRef_Picture shown in Fig. 16; Figs. 18A and 18B are diagrams showing the syntax of the program stream, the program stream pack and the program stream pack header of the MPEG-2 System; Figs. 19A and 19B are diagrams showing the syntax of a PES packet of the MPEG-2 System; Figs. 20A, 20B and 20C are diagrams showing the syntax of a PES packet of the MPEG-2 System; Figs. 21A and 21B are diagrams showing the syntax of a PES packet of the MPEG-2 System; Figs. 22A and 22B are diagrams showing the relationship between the values of stream_id of PES_packet() and the attributes of elementary streams in the MPEG-2 System; Fig. 23 is a diagram showing the stream_id values used by the disc device; Fig. 24 is a diagram showing the syntax of private_stream1_PES_payload(); Fig. 25 is a diagram showing the relationship between the values of private_stream_id and the attributes of elementary streams stored in private_payload(); Fig. 26 is a diagram showing the syntax of private_stream2_PES_payload(); Fig. 27 is a diagram showing the syntax of au_information(); Fig. 28 is a diagram showing pic_struct; Fig. 29 is a diagram showing a specific example of the file "PLAYLIST.DAT"; Figs. 30A and 30B are diagrams showing examples of the clip information files "00001.CLP", "00002.CLP" and "00003.CLP"; Fig. 31 is a diagram showing a specific example of EP_map() of the clip information file "00001.CLP"; Fig. 32 is a diagram showing specific examples of PlayListMark() of PlayList#0 and PlayList#1; Fig. 33 is a flowchart describing pre-playback processing; Fig. 34 is a flowchart describing playback processing; Fig. 35 is a flowchart describing the relationship between decoding order and output order; Fig. 36 is a diagram showing the structure of the decoder; Fig. 37 is a diagram showing the structure of the DPB shown in Fig. 36; Fig. 38 is a flowchart describing time update processing; Fig. 39 is a diagram showing time update processing corresponding to the value of pic_struct; Fig. 40 is a flowchart describing PlayItem change processing; Fig. 41 is a flowchart describing time code display processing; Fig. 42 is a flowchart describing stream change processing; Fig. 43 is a flowchart describing processing performed by the buffer control module 215; Fig. 44 is a flowchart describing processing performed by the buffer control module 215; Fig. 45 is a flowchart describing video stream read processing; Fig. 46 is a flowchart describing audio stream read processing; Fig. 47 is a flowchart describing subtitle stream read processing; Fig. 48 is a flowchart describing re-synchronization processing; Fig. 49 is a flowchart describing mark processing; Fig. 50 is a diagram describing the determination of a match in mark processing; Fig. 51 is a diagram showing playListEnd timing; Fig. 52 is a diagram showing playListEnd timing; Fig. 53 is a diagram showing event intervals; Fig. 54 is a flowchart describing output attribute control processing; Fig. 55 is a diagram showing a specific example of a set of pts_change_point and DynamicInfo() described in the clip information file "00003.CLP"; Fig. 56 is a flowchart describing subtitle display control processing; Fig. 57 is a flowchart describing capture control processing and background/screensaver processing; Fig. 58 is a diagram showing another syntax of private_stream2_PES_payload(); Fig. 59 is a diagram showing another syntax of au_information(); Fig. 60 is a block diagram showing an example of the hardware structure of a disc recording device; Fig. 61 is a diagram showing the functions of the disc recording device shown in Fig. 60; Fig. 62 is a flowchart describing EP_map() creation processing; Fig. 63 is a flowchart describing fast-forward playback processing; Fig. 64 is a flowchart describing entry point selection processing; Fig. 65 is a diagram showing another function performed by the disc recording device shown in Fig. 60; Fig. 66 is a diagram showing another syntax of EP_map(); Fig. 67 is a diagram showing priority_flag shown in Fig. 66; and Fig. 68 is a flowchart describing priority setting processing.

Detailed description of the invention

Next, embodiments of the present invention will be described. The relationship between the elements specified in the claims and the embodiments is as follows: this relationship means that specific examples supporting the invention set forth in the claims are described in the embodiments. Thus, even if some specific examples are not described here as corresponding to elements of the claims, this does not imply that those specific examples do not correspond to the elements of the claims. Conversely, even if specific examples are described in this section as counterparts of elements of the claims, this does not imply that these specific examples do not correspond to elements other than those of the claims.

This section does not imply that all aspects of the invention corresponding to the specific examples described in the embodiments of the present invention are presented in the claims. In other words, the descriptions in this section correspond to specific examples described in the embodiments of the present invention. Thus, the descriptions in this section do not deny that there are aspects of the present invention that are not specified in the claims of the present patent application, that such aspects may be claimed in a divisional patent application, and/or that additional aspects of the present invention may be added by way of amendment.

In other words, the data recording device in accordance with the present invention includes a detection unit (for example, the multiplexing module 421 shown in Fig. 61) that selects an intra-coded picture that is contained in video data and becomes a random access point and detects position information of a plurality of pictures that follow the selected intra-coded picture; a designation unit (for example, the controller 425 shown in Fig. 61) that designates the position information of the picture located at a specified distance from the intra-coded picture, among the position information of the plurality of pictures detected by the detection unit; and a recording unit (for example, the disk drive 409 shown in Fig. 60) that records the position information of the plurality of pictures detected by the detection unit and the picture information designated by the designation unit on a data recording medium together with the video data.

The data recording method in accordance with the present invention includes the steps of: selecting an intra-coded picture that is contained in video data and becomes a random access point, and detecting position information of a plurality of pictures that follow the selected intra-coded picture (for example, at step S383 of the flowchart shown in Fig. 62); designating the position information of the picture located at a specified distance from the intra-coded picture, among the position information of the plurality of pictures detected at the detection step (for example, at step S386 of the flowchart shown in Fig. 62); and recording the position information of the plurality of pictures detected at the detection step and the picture information designated at the designation step on a data recording medium together with the video data (for example, at step S386 of the flowchart shown in Fig. 62).

The data processing device in accordance with the present invention includes a read unit (for example, the disk drive 102 shown in figure 1) that reads, from a data recording medium on which they are recorded together with video data, position information of a plurality of intra-coded pictures, which are contained in the video data and become random access points, and of the pluralities of pictures that follow them, as well as designation information that indicates the picture located at a specified distance from each intra-coded picture among the plurality of pictures following it; a selection unit (for example, the player control module 212 shown in Figs. 2A and 2B) that selects an intra-coded picture and the plurality of pictures that follow it, in accordance with the designation information, from the video data read by the read unit and recorded on the data recording medium; and a playback unit (for example, the decode control module 214 shown in Figs. 2A and 2B) that performs fast-forward playback of the video data using all pictures from the intra-coded picture up to the picture located at the specified distance from it, among the plurality of pictures that follow the intra-coded picture and that have been selected by the selection unit.

The data processing method in accordance with the present invention includes the steps of: reading, from a data recording medium on which they are recorded together with video data, position information of a plurality of intra-coded pictures, which are contained in the video data and become random access points, and of the pluralities of pictures that follow them, as well as designation information that indicates the picture located at a specified distance from each intra-coded picture among the plurality of pictures following it (for example, at step S391 of the flowchart shown in Fig. 63); selecting an intra-coded picture and the plurality of pictures that follow it, in accordance with the designation information, from the video data read at the reading step and recorded on the data recording medium (for example, at step S395 of the flowchart shown in Fig. 63); and performing fast-forward playback of the video data using all pictures from the intra-coded picture up to the picture located at the specified distance from it, among the plurality of pictures that follow the intra-coded picture and that have been selected at the selection step (for example, at step S396 of the flowchart shown in Fig. 63).

Since the program recording medium and the program in accordance with the present invention correspond to the data recording method described above, their description is omitted here.

Next, embodiments of the present invention will be described with reference to the accompanying drawings.

[Hardware structure]

Figure 1 is a block diagram showing an example of the hardware structure of a disc playback device in accordance with an embodiment of the present invention.

The disc playback device shown in figure 1 can be used, for example, as a disc player, a game device, a car navigation system, and so forth.

In the disc playback device shown in figure 1, the disc 101 is an optical disc such as a DVD, a magneto-optical disk, a magnetic disk or the like. Content data such as video data, audio data and subtitle data, as well as additional data necessary to reproduce them, are recorded on the disc 101.

If necessary, a program that can be executed by a computer is also recorded on the disc 101. In accordance with this embodiment of the present invention, the disc 101, which is a disc-shaped recording medium, is used as the recording medium. Alternatively, the recording medium may be, for example, a semiconductor memory or a tape-shaped recording medium. Data read from a disc at a remote location can be transmitted to and input into the disc playback device shown in figure 1. In other words, data can be read from the disc 101 by another device connected to the disc playback device, and the data read by that other device can be received and processed by the disc playback device. In addition, the disc playback device can receive data from a server or the like that stores data similar to the data recorded on the disc 101, via a network such as the Internet, and process the received data. Moreover, the disc playback device can also receive data from another device such as a server or the like, record the received data onto the disc 101 and then process the data recorded on the disc 101.

The disc 101 can be loaded into the disk drive 102 and unloaded from it. The disk drive 102 has a built-in interface (not shown) and is connected to the drive interface 114 through it. The disk drive 102 drives the disc 101, reads data from the disc 101 in accordance with, for example, a read command, and transmits the data to the drive interface 114.

Connected to the bus 111 are a CPU (central processing unit) 112, a storage device 113, a drive interface 114, an input interface 115, a video decoder 116, an audio decoder 117, a video output interface 118 and an audio output interface 119.

The CPU 112 and the storage device 113 constitute a computer system. In other words, the CPU 112 executes a group of software modules, which is a program stored in the storage device 113, to control the entire disc playback device and to perform the various processes described below. The storage device 113 stores the group of software modules executed by the CPU 112. In addition, the storage device 113 temporarily stores data necessary for the operation of the CPU 112. The storage device 113 may be composed only of non-volatile memory or of a combination of volatile and non-volatile memory. When the disc playback device shown in figure 1 contains a hard disk on which the group of software modules executed by the CPU 112 is recorded (installed), the storage device 113 may be composed only of volatile memory.

The program (the group of software modules) executed by the CPU 112 may be stored in advance in the storage device 113, which is used as a recording medium built into the disc playback device.

Alternatively, the program may be temporarily or permanently stored on the disc 101 or on a removable recording medium such as a flexible disk, a CD-ROM (compact disc read-only memory), an MO (magneto-optical) disk, a magnetic disk or a memory card. Such removable recording media can be provided as so-called packaged software.

The program may be stored in the storage device 113 in advance or may be installed in the disc playback device from the removable recording medium described above. Alternatively, the program may be transferred to the disc playback device wirelessly from a download site via a satellite for digital satellite broadcasting, or by cable through a local area network (LAN) or a network such as the Internet. The disc playback device receives the program via the input interface 115 and installs it in the internal storage device 113.

The program may be executed by a single CPU or may be executed in a distributed manner by a plurality of CPUs.

The drive interface 114 controls the disk drive 102 under the control of the CPU 112. The disk drive 102 transmits the data read from the disc 101 to the CPU 112, the storage device 113, the video decoder 116, the audio decoder 117, etc. via the bus 111.

The input interface 115 receives signals corresponding to user operations of keys (buttons) and commands transmitted from a remote control (not shown), and transmits these signals to the CPU 112 via the bus 111. The input interface 115 also functions as a data communication interface such as a modem (including an ADSL (asymmetric digital subscriber line) modem) or an NIC (network interface card).

The video decoder 116 decodes encoded video data that have been read from the disc 101 by the disk drive 102 and transmitted to the video decoder 116 through the drive interface 114 and the bus 111, and transmits the decoded video data to the CPU 112 and the video output interface 118 via the bus 111.

The audio decoder 117 decodes encoded audio data that have been read from the disc 101 by the disk drive 102 and transmitted to the audio decoder 117 through the drive interface 114 and the bus 111, and transmits the decoded audio data to the CPU 112 and the audio output interface 119 via the bus 111.

The video output interface 118 performs specified processing on the video data received via the bus 111 and outputs the processed video data to the video output terminal 120. The audio output interface 119 performs specified processing on the audio data received via the bus 111 and outputs the processed audio data to the audio output terminal 121.

The video output terminal 120 is connected to a video output device such as a CRT (cathode-ray tube) or an LCD panel (not shown). Thus, the video data output from the video output terminal 120 are supplied to the video output device and displayed by it. The audio output terminal 121 is connected to audio output devices such as a loudspeaker and an amplifier (not shown). Thus, the audio data output from the audio output terminal 121 are supplied to the audio output devices and output by them.

The video data and the audio data can be transmitted from the disc playback device to the video output device and the audio output devices wirelessly or by cable.

[Structure of the group of software modules]

Figs. 2A and 2B show an example of the structure of the group of software modules executed by the CPU 112 shown in figure 1.

The group of software modules executed by the CPU 112 mainly consists of an operating system (OS) 201 and a video content playback program 210 that is used as an application program.

[Operating system 201]

When the disc playback device is turned on, the operating system 201 starts, performs predetermined processing such as initialization, and calls the video content playback program 210, which is an application program.

The operating system 201 provides infrastructure services, such as a file reading service, to the video content playback program 210. In other words, in response to a file read request received from the video content playback program 210, the operating system 201 operates the disk drive 102 through the drive interface 114, reads data from the disc 101 and transmits them to the video content playback program 210. The operating system 201 also interprets the file system.

The operating system 201 has a multitask processing function. In other words, the operating system 201 can operate a plurality of software modules on a time-sharing basis. Thus, although the video content playback program 210 consists of several software modules, these modules can run in parallel.

[Video content playback program 210]

The video content playback program 210 consists of a script control module 211, a player control module 212, a content data supply module 213, a decode control module 214, a buffer control module 215, a video decoder control module 216, an audio decoder control module 217, a subtitle decoder control module 218, a graphics processing module 219, a video output module 220 and an audio output module 221.

The video content playback program 210 is software that performs a key role in reproducing data from the disc 101. When the disc 101 is loaded (inserted) into the disk drive 102, the video content playback program 210 checks whether the disc 101 is a disc on which content has been recorded in a specified format (described below). The video content playback program 210 reads a script file (described below) from the disc 101, executes the script, reads a metadata file required to play the content from the disc 101, and controls the playback of the content in accordance with the metadata.

Next, the software modules constituting the video content playback program 210 shown in Figs. 2A and 2B will be described. In Figs. 2A and 2B, in general, solid-line arrows denote content data, while dotted-line arrows denote control data.

[Script control module 211]

The script control module 211 interprets and executes a script program (script) recorded on the disc 101. A script can describe operations such as "instruct the graphics processing module 219 to create an image such as a menu and display it", "change the menu display in accordance with a signal transmitted from a user interface (UI) such as a remote control (for example, move a cursor on the menu)" and "control the player control module 212".

[Player control module 212]

The player control module 212 accesses metadata, etc. recorded on the disc 101 to control the playback of the content recorded on the disc 101. In other words, the player control module 212 parses PlayList() and Clip() recorded on the disc 101 and controls the content data supply module 213, the decode control module 214 and the buffer control module 215 in accordance with the results of the analysis. In addition, the player control module 212 performs stream change control, which changes the stream being played, in accordance with commands received from the script control module 211 and from the input interface 115, as described below. Moreover, the player control module 212 receives a time value from the decode control module 214, displays the time, and performs mark (Mark()) processing (described below).

[Content data supply module 213]

The content data supply module 213 requests the operating system 201 to read content data, metadata, etc. from the disc 101 under the control of the player control module 212 or in accordance with the amount of data stored in the buffer control module 215.

The metadata, etc. that the operating system 201 has read from the disc 101 in accordance with a request received from the content data supply module 213 are transmitted to the designated modules. On the other hand, the content data that the operating system 201 has read from the disc 101 in accordance with a request received from the content data supply module 213 are supplied to the buffer control module 215.

[Decode control module 214]

The decode control module 214 controls the operation of the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 under the control of the player control module 212. The decode control module 214 has a time counting unit 214A, which counts time. The decode control module 214 controls the timing of the output of the video data that are output under the control of the video decoder control module 216 and of the output data that must be synchronized with the video data. In this case, the output data that must be synchronized with the output video data are the audio data that are output under the control of the audio decoder control module 217.

The time counting unit 214A can count time autonomously by counting reference clock pulses supplied from outside or by an internal clock synchronized with the decoder or the like.

However, in a software-based decoder implemented by executing the various modules shown in Figs. 2A and 2B, if the time counting process were also performed by software in addition to these processes, the processing load on the CPU 112 would increase. For this reason, the time counting unit 214A updates the time value on the basis of the output of video data from the decoder.

Figure 3 shows the relationship between the autonomous clock of the time counting unit 214A and real elapsed time. In figure 3, since the time counting unit 214A counts clock pulses at a frequency of 90 kHz, the time increases linearly to the right from time 0. At a time of 33.3 ms, the clock pulse count value is 3003.

Figure 4 shows an example of the time that the time counting unit 214A updates on the basis of the output of video data by the video decoder. In the case shown in figure 4, when 33.3 ms have elapsed, the time value is updated to 3003; when 83.3 ms have elapsed, the time value is updated to 7507; and when 116.7 ms have elapsed, the time value is updated to 10510. In this example, the unit of output time is 16.66 ms.

Although in figure 3 time actually advances stepwise with a resolution of 1/90 kHz, it is represented by a straight line for comparison with figure 4. In the following description, it is assumed that the time counting unit 214A updates the time on the basis of the output of video data, as described with reference to figure 4.
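The timing relationships of figures 3 and 4 can be checked with a short worked example; the class below is a sketch for illustration only, not part of the embodiment.

```python
# 90 kHz clock arithmetic: 1/29.97 s (one two-field NTSC picture) corresponds
# to round(90000 / 29.97) = 3003 clock pulses, matching figure 3.

TICKS_PER_SECOND = 90_000

def seconds_to_ticks(seconds):
    return round(seconds * TICKS_PER_SECOND)

class OutputDrivenClock:
    """Figure 4 scheme: the time value advances only when the video decoder
    outputs a picture, by that picture's duration."""
    def __init__(self):
        self.ticks = 0

    def on_picture_output(self, duration_ticks):
        self.ticks += duration_ticks

clock = OutputDrivenClock()
clock.on_picture_output(seconds_to_ticks(1 / 29.97))
assert clock.ticks == 3003
```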

[Buffer control module 215]

The buffer control module 215 has a buffer 215A occupying part of the space of the storage device 113 shown in figure 1. The content data supply module 213 temporarily stores the content data read from the disc 101 in the buffer 215A by means of the read requests it issues to the operating system 201.

In addition, the buffer control module 215 transmits the data stored in the buffer 215A to the video decoder control module 216, the audio decoder control module 217 or the subtitle decoder control module 218 in accordance with a request received from the video decoder control module 216, the audio decoder control module 217 or the subtitle decoder control module 218, respectively.

In other words, the buffer control module 215 has a video data read unit 233, an audio data read unit 234 and a subtitle data read unit 235, which will be described below with reference to figure 5. The video data read unit 233 of the buffer control module 215 processes a data request received from the video decoder control module 216 and transmits the requested data stored in the buffer 215A to the video decoder control module 216. Similarly, the audio data read unit 234 of the buffer control module 215 processes a request received from the audio decoder control module 217 and transmits the data stored in the buffer 215A to the audio decoder control module 217. The subtitle data read unit 235 of the buffer control module 215 processes a request received from the subtitle decoder control module 218 and transmits the data stored in the buffer 215A to the subtitle decoder control module 218.

[Video decoder control module 216]

The video decoder control module 216 operates the video data read unit 233 (figure 5) of the buffer control module 215 so as to read encoded video data one video access unit at a time from the buffer 215A of the buffer control module 215 and transmits the video data to the video decoder 116 shown in figure 1. In addition, the video decoder control module 216 controls the video decoder 116 so as to decode the data one video access unit at a time. Moreover, the video decoder control module 216 transmits the video data decoded by the video decoder 116 to the graphics processing module 219.

A video access unit is, for example, one picture (one frame or one field) of video data.

[Audio decoder control module 217]

The audio decoder control module 217 operates the audio data read unit 234 (figure 5) of the buffer control module 215 so as to read encoded audio data one audio access unit at a time from the buffer 215A of the buffer control module 215 and transmits the encoded audio data to the audio decoder 117 shown in figure 1. The audio decoder control module 217 controls the audio decoder 117 so as to decode the encoded audio data one audio access unit at a time. In addition, the audio decoder control module 217 transmits the audio data decoded by the audio decoder 117 to the audio output module 221.

An audio access unit is a specified amount of audio data (for example, the amount of data output synchronously with one picture). In accordance with this embodiment, it is assumed that one audio access unit has a predetermined fixed length.

[Subtitle decoder control module 218]

The subtitle decoder control module 218 operates the subtitle data read unit 235 (figure 5) of the buffer control module 215 so as to read encoded subtitle data one subtitle access unit at a time from the buffer 215A of the buffer control module 215. In addition, the subtitle decoder control module 218 has subtitle decoding software (not shown). The subtitle decoding software decodes the data read from the buffer 215A. The subtitle decoder control module 218 transmits the decoded subtitle data (subtitle image data) to the graphics processing module 219.

A subtitle access unit is a specified amount of subtitle data (for example, the amount of data output synchronously with one picture). In accordance with this embodiment, it is assumed that the size of one subtitle access unit is described at its beginning.
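The two access-unit conventions described above (fixed-length audio units and subtitle units whose size is described at the beginning) can be sketched as follows; the 4-byte big-endian size field is an assumption for illustration, not the actual on-disc format.

```python
import struct

def split_audio_units(buf, unit_length):
    # Fixed-length audio access units: simple slicing.
    return [buf[i:i + unit_length] for i in range(0, len(buf), unit_length)]

def read_subtitle_access_unit(buf, offset):
    # The size of the unit is described at its beginning (assumed 4-byte field).
    (size,) = struct.unpack_from(">I", buf, offset)
    payload = buf[offset + 4:offset + 4 + size]
    return payload, offset + 4 + size  # payload and offset of the next unit
```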

[Graphics processing module 219]

The graphics processing module 219 enlarges or reduces the subtitle data received from the subtitle decoder control module 218 in accordance with a command from the player control module 212 and overlays the enlarged or reduced subtitle data on the video data received from the video decoder control module 216. In addition, the graphics processing module 219 enlarges or reduces the frame size of the video data on which the subtitle data have been overlaid so that the frame size of the resulting video data matches the screen of the video output device connected to the video output terminal 120 shown in figure 1. The resulting video data are output to the video output module 220.

In addition, the graphics processing module 219 generates menus, messages, etc. in accordance with commands from the script control module 211 and the player control module 212 and overlays them on the output video data.

Moreover, the graphics processing module 219 converts the aspect ratio of the video data output to the video output module 220 in accordance with the aspect ratio of the video output device connected to the video output terminal 120 shown in figure 1 and with the information representing the aspect ratio of the video data recorded on the disc 101.

In other words, when the aspect ratio of the video output device is 16:9 and the information representing the aspect ratio of the video data indicates 4:3, the graphics processing module 219 performs squeeze processing, which compresses the video data output to the video output module 220 in the transverse (horizontal) direction, displays the left and right edges of the video data in black, and outputs the resulting video data. When the aspect ratio of the video output device is 4:3 and the information representing the aspect ratio of the video data indicates 16:9, the graphics processing module 219 performs squeeze processing, which compresses the video data output to the video output module 220 in the longitudinal (vertical) direction, displays the upper and lower edges of the video data in black, and outputs the resulting video data.

When the aspect ratio of the video output device and the aspect ratio represented by the information for the video data are the same, for example both 4:3 or both 16:9, the graphics processing module 219 outputs the video data to the video output module 220 without compression.
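The aspect-ratio handling described in the last three paragraphs reduces to a small selection rule; the sketch below is illustrative only, with hypothetical names.

```python
def squeeze_mode(device_ratio, video_ratio):
    if device_ratio == video_ratio:              # e.g. both 4:3 or both 16:9
        return "none"        # output without compression
    if device_ratio == (16, 9) and video_ratio == (4, 3):
        return "horizontal"  # compress sideways; black left and right edges
    if device_ratio == (4, 3) and video_ratio == (16, 9):
        return "vertical"    # compress vertically; black top and bottom edges
    return "none"

assert squeeze_mode((16, 9), (4, 3)) == "horizontal"
```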

In addition, the graphics processing module 219 captures the video data currently being processed, in accordance with a request received, for example, from the player control module 212. Moreover, the graphics processing module 219 stores the captured video data or transmits them to the player control module 212.

[Video output module 220]

The video output module 220 exclusively occupies part of the storage device 113 shown in figure 1 as a FIFO (first in, first out) buffer 220A and temporarily stores the video data received from the graphics processing module 219 in it. In addition, the video output module 220 frequently reads the video data from the FIFO 220A and outputs them to the video output terminal 120 (figure 1).

[Audio output module 221]

The audio output module 221 exclusively occupies part of the storage device 113 shown in figure 1 as a FIFO buffer 221A and temporarily stores the audio data received from the audio decoder control module 217 (the audio decoder 117) in it. In addition, the audio output module 221 frequently reads the audio data from the buffer 221A and outputs them to the audio output terminal 121 (figure 1).

In addition, when the audio data received from the audio decoder control module 217 are audio data in dual (bilingual) mode, in which the left channel carries "main audio" data and the right channel carries "sub audio" data, the audio output module 221 outputs the audio data received from the audio decoder control module 217 to the audio output terminal 121 in accordance with a pre-designated audio output mode.

In other words, if "main audio" has been designated as the audio output mode, the audio output module 221 copies the left channel of the audio data received from the audio decoder control module 217 to the right channel and outputs the left and right channels (the "main audio" data) to the audio output terminal 121. If "sub audio" has been designated as the audio output mode, the audio output module 221 copies the right channel of the audio data received from the audio decoder control module 217 to the left channel and outputs the left and right channels (the "sub audio" data) to the audio output terminal 121. If both "main and sub audios" have been designated as the audio output mode, the audio output module 221 outputs the audio data received from the audio decoder control module 217 to the audio output terminal 121 directly.

If the audio data received from the audio decoder control module 217 are audio data recorded in stereo mode, the audio output module 221 outputs them to the audio output terminal 121 directly, regardless of which audio output mode has been designated.
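The channel handling of the last three paragraphs can be summarized in a short sketch, with samples modeled as (left, right) pairs; the mode names are illustrative.

```python
def apply_output_mode(samples, mode, dual_mono):
    if not dual_mono:                    # stereo material: output directly
        return samples
    if mode == "main":                   # copy left ("main audio") to right
        return [(left, left) for (left, right) in samples]
    if mode == "sub":                    # copy right ("sub audio") to left
        return [(right, right) for (left, right) in samples]
    return samples                       # "main and sub audios": output directly
```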

The user can interactively designate the audio output mode on a menu screen generated by the video content playback program 210, using the remote control.

[Structure of the buffer control module 215]

Figure 5 shows an example of the structure of the buffer control module 215 shown in Figs. 2A and 2B.

The buffer control module 215 exclusively uses part of the storage device 113 shown in figure 1 as the buffer 215A and temporarily stores the data read from the disc 101 in the buffer 215A. In addition, the buffer control module 215 reads data from the buffer 215A and transmits them to the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 shown in Figs. 2A and 2B.

Besides the buffer 215A, the buffer control module 215 has a data start pointer storage unit 231 and a data write pointer storage unit 232, which are part of the storage device 113. Moreover, the buffer control module 215 has a video data read unit 233, an audio data read unit 234 and a subtitle data read unit 235 as internal modules.

The buffer 215A is, for example, a ring buffer that sequentially stores the data read from the disc 101. After storing an amount of data equal to its capacity, the buffer 215A continues to store data in a so-called endless-loop mode, overwriting the newest data over the oldest data.

The data start pointer storage unit 231 stores a data start pointer that represents the position (address), in the buffer 215A, of the oldest data that have not yet been read from the buffer 215A, among the data stored in the buffer 215A.

The data write pointer storage unit 232 stores a data write pointer that represents the position (address), in the buffer 215A, of the newest data read from the disc 101.

Whenever data read from the disc 101 are stored in the buffer 215A, the position represented by the data write pointer is updated in the clockwise direction shown in figure 5. Whenever data are read from the buffer 215A, the position represented by the data start pointer is updated in the clockwise direction shown in figure 5. Thus, the valid data stored in the buffer 215A extend, in the clockwise direction shown in figure 5, from the position represented by the data start pointer to the position represented by the data write pointer.
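A minimal ring-buffer sketch with the two pointers described above (a data start pointer for the oldest unread data and a data write pointer for newly stored data) may clarify the update rules; it is illustrative only, not the embodiment's implementation.

```python
class RingBuffer:
    def __init__(self, capacity):
        self.buf = bytearray(capacity)
        self.start = 0  # data start pointer (cf. storage unit 231)
        self.write = 0  # data write pointer (cf. storage unit 232)
        self.size = 0   # amount of valid data between the two pointers

    def store(self, data):
        # Data read from the disc; overwrites the oldest data when full
        # (the endless-loop mode described above).
        for b in data:
            self.buf[self.write] = b
            self.write = (self.write + 1) % len(self.buf)
            if self.size == len(self.buf):
                self.start = (self.start + 1) % len(self.buf)
            else:
                self.size += 1

    def read(self, n):
        # Data passed to a decoder control module; advances the start pointer.
        n = min(n, self.size)
        out = bytes(self.buf[(self.start + i) % len(self.buf)] for i in range(n))
        self.start = (self.start + n) % len(self.buf)
        self.size -= n
        return out
```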

The video data read unit 233 reads a video stream (an elementary stream of video data) from the buffer 215A in accordance with a request received from the video decoder control module 216 shown in Figs. 2A and 2B and transmits it to the video decoder control module 216. The audio data read unit 234 reads an audio stream (an elementary stream of audio data) from the buffer 215A in accordance with a request received from the audio decoder control module 217 shown in Figs. 2A and 2B and transmits it to the audio decoder control module 217. Similarly, the subtitle data read unit 235 reads a subtitle stream (an elementary stream of subtitle data) from the buffer 215A in accordance with a request received from the subtitle decoder control module 218 shown in Figs. 2A and 2B and transmits it to the subtitle decoder control module 218.

In other words, the program stream in accordance, for example, with the MPEG 2 (experts Group moving pictures) recorded on the disc 101, the software flow is called program stream MPEG2-system. In this thread, at least one elementary stream of the video stream, audio stream and subtitle stream was multiplexed on a time sharing basis. Block 233 reading the video data has the function of demuxing software thread. Block 233 reading further demultiplexes the video data stream from the program stream stored in the buffer A, and reads the video stream.

Similarly, the block 234 reading sound has the function of demuxing software thread. Block 234 reading sound demuxes the audio stream from the program stream stored in the buffer A, and reads the audio stream. Similarly, block 235 reading subtitles is a function demuxing software Potomac 235 reading subtitles demuxes the subtitle stream from the program stream, stored in the buffer A, and reads the subtitle stream.

The video read function block 233 has a video read pointer storage block 241, a stream_id register 242 and an au_information() register 243, which are part of the storage device 113 shown in Fig.1.

The video read pointer storage block 241 holds the video read pointer, which represents the position (address) of the video stream in buffer A. The video read function block 233 reads data as a video stream from the position in buffer A indicated by the video read pointer. The stream_id register 242 stores the stream_id used to analyze the program stream stored in buffer A and to identify the video stream to be read from the program stream. The au_information() register 243 stores au_information(), which is data necessary for reading the video stream from buffer A.

The audio read function block 234 has an audio read pointer storage block 251, a stream_id register 252 and a private_stream_id register 253, which are part of the storage device 113 shown in Fig.1.

The audio read pointer storage block 251 holds the audio read pointer, which represents the position (address) of the audio stream stored in buffer A. The audio read function block 234 reads data as an audio stream from the position in buffer A indicated by the audio read pointer. The stream_id register 252 and the private_stream_id register 253 store the stream_id and private_stream_id (described later), respectively, used to analyze the program stream stored in buffer A and to identify the audio stream to be read from the program stream.

The subtitle read function block 235 has a subtitle read function flag storage block 261, a subtitle read pointer storage block 262, a stream_id register 263 and a private_stream_id register 264, which are part of the storage device 113 shown in Fig.1.

The subtitle read function flag storage block 261 holds the subtitle read function flag. When the subtitle read function flag stored in the subtitle read function flag storage block 261 is, for example, "0", the subtitle read function block 235 does not operate; when it is, for example, "1", the subtitle read function block 235 operates.

The subtitle read pointer storage block 262 holds the subtitle read pointer, which represents the position (address) of the subtitle stream stored in buffer A. The subtitle read function block 235 reads data as a subtitle stream from the position in buffer A indicated by the subtitle read pointer. The stream_id register 263 and the private_stream_id register 264 store the stream_id and private_stream_id (described later), respectively, used to analyze the program stream stored in buffer A and to identify the subtitle stream to be read from the program stream.

[Description of the format of data recorded on the disc 101]

Next, the format of the data recorded on the disc 101 will be described.

Figure 6 is a schematic representation of the directory structure of the disc 101.

The file system used for the disc 101 is, for example, one of the systems defined by ISO 9660 (ISO, International Organization for Standardization) and UDF (Universal Disk Format, http://www.osta.org/specs/). The data files recorded on the disc 101 are managed hierarchically in a directory structure. However, the file system that can be used for the disc 101 is not limited to these file systems.

Figure 6 shows the directory "VIDEO", located in the root directory, which is the base of the file system. The directory "VIDEO" contains two directories: the directory "CLIP" and the directory "STREAM".

In addition to these two directories, the directory "CLIP" and the directory "STREAM", the directory "VIDEO" contains two data files: the file "SCRIPT.DAT" and the file "PLAYLIST.DAT".

File "SCRIPT.DAT" is a script file that describes the program script. In other words, the file "SCRIPT.DAT" describes the script program is which allows you to interactively play data, recorded on the disc 101. The script program stored in the file "SCRIPT.DAT", is interpreted and executed by the module 211 controls the script shown in figa and figv.

In the file "PLAYLIST.DAT" saved at least one playlist (PlayList(), which will be described below with reference to Fig.7). The playlist describes the procedure for playing back content, such as video data recorded on the disc 101.

In the directory "CLIP" has at least one information file of the clip. In the directory "STREAM" includes at least one stream file of the clip. In other words, figure 6 shows three information file clip "00001.CLP", "00002.CLP" and "00003.CLP" in the directory "CLIP". In the directory "STREAM" there are three files flux clip "00001.PS", "00002.PS" and "00003.PS".

File stream clip saved the flow of the program, which represents at least one stream of video data, audio data and subtitle data that has been compressed and encoded, multiplexed on the basis of time sharing.

In the information file of the clip stored metadata about the stream of the clip, such as its characteristic.

In other words, the file stream of the clip information file clip correlated in the relationship 1 : 1. Figure 6 file stream clip is called in accordance with rule names, consisting of a five-digit number + period + "PS", when the title is their information file of a clip in accordance with rule names consists of the same five-digit number, which corresponds to the flow of the clip + point + "CLP".

Thus, the file stream clip and an information file of a clip can be identified by the extension of the file name (on the right side after the point). In addition, you can determine whether a file is a stream of the clip and an information file of a clip correlated by their file names, in addition to their extensions (on the left side of the dot).
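This naming rule lends itself to a simple check. The following sketch (illustrative Python; the function name is ours) verifies that a clip stream file and a clip information file form a correlated pair:

    import os

    def is_correlated_pair(stream_file, info_file):
        # Same five-digit base name; extensions ".PS" and ".CLP" respectively.
        s_base, s_ext = os.path.splitext(stream_file)
        i_base, i_ext = os.path.splitext(info_file)
        return (s_ext.upper() == ".PS" and i_ext.upper() == ".CLP"
                and s_base == i_base and len(s_base) == 5 and s_base.isdigit())

    assert is_correlated_pair("00001.PS", "00001.CLP")
    assert not is_correlated_pair("00001.PS", "00002.CLP")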

Next, the files recorded on the disc 101 will be described in detail.

[PLAYLIST.DAT]

Figure 7 shows the internal structure of the file "PLAYLIST.DAT" in the directory "VIDEO" shown in Fig.6.

In Fig.7 the file "PLAYLIST.DAT" has a "Syntax" field, which describes the data structure of the file "PLAYLIST.DAT"; a "No. of bits" field, which describes the bit length of each entry in the "Syntax" field; and a "Mnemonic" field, in which "bslbf" (bit string, left bit first) and "uimsbf" (unsigned integer, most significant bit first) denote that the corresponding entry in the "Syntax" field is read starting from the left bit, and that the entry is an unsigned integer read starting from the most significant bit, respectively. These conventions apply to the other similar listings as well.

File "PLAYLIST.DAT" begins with name_length (8 bits) and name_string (255 bytes)that describe information such as the name of the file).

In other words, name_length represents the size name_string, which should not be orestano after her, in bytes. name_string is the name (file name) for the file "PLAYLIST.DAT".

Bytes for name_length, from name_string, are used as the actual name. When the value name_length is 10, 10 bytes from the beginning name_string interpreted as the actual name.

After name_string should number_of_PlayLists (16 bits). number_of_PlayLists represents the number of PlayList(), before which follow name_string. After number_of_PlayLists should PlayList(), presents number_of_Play Lists.
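Since these fields have fixed sizes and "uimsbf" fields are read most significant bit first (i.e. big-endian), the start of "PLAYLIST.DAT" can be parsed as in the following illustrative sketch (the function name is ours):

    import struct

    def parse_playlist_dat_header(data):
        # name_length (8 bits), name_string (255 bytes),
        # number_of_PlayLists (16 bits, big-endian)
        name_length = data[0]
        name = data[1:1 + name_length].decode("ascii", errors="replace")
        (number_of_PlayLists,) = struct.unpack_from(">H", data, 256)
        return name, number_of_PlayLists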

PlayList() is a playlist that describes how a clip stream file recorded on the disc 101 is played back. It has the following internal structure.

PlayList() starts with PlayList_data_length (32 bits). PlayList_data_length represents the size of the PlayList().

After PlayList_data_length follow reserved_for_word_alignment (15 bits) and capture_enable_flag_PlayList (1 bit) in sequence. The 1-bit capture_enable_flag_PlayList is placed after the 15-bit reserved_for_word_alignment in order to align capture_enable_flag_PlayList to a 16-bit position. capture_enable_flag_PlayList is a 1-bit flag that indicates whether secondary use of the video data belonging to the PlayList() (the video data corresponding to the video stream reproduced in accordance with the PlayList()) is permitted within the disc playback device that reproduces data from the disc 101. When the value of capture_enable_flag_PlayList is 1 (of 0 and 1), secondary use of the video data belonging to the PlayList() is permitted; when it is 0 (of 0 and 1), secondary use is not permitted.

Fig.7 shows a capture_enable_flag_PlayList consisting of one bit. Alternatively, capture_enable_flag_PlayList may consist of multiple bits, so that secondary use of the video data belonging to the PlayList() can be permitted in graduated steps. For example, capture_enable_flag_PlayList may consist of two bits. When its value is 00B (where B indicates that the preceding number is binary), secondary use of the video data is prohibited. When its value is 01B, video data reduced to a size of 64×64 pixels or less may be used for secondary use. When its value is 10B, the video data may be used for secondary use without any size reduction.

Secondary use of the video data may also be restricted with respect to application programs rather than size. In other words, when the value of capture_enable_flag_PlayList is 01B, only the video content playback program 210 may be permitted secondary use of the video data. When the value is 10B, any application program, including the video content playback program 210, in the disc playback device shown in Fig.1 may be permitted secondary use of the video data. In this example, an application program other than the video content playback program 210 in the disc playback device shown in Fig.1 is, for example, an application program that displays a background image or a screen saver.

When capture_enable_flag_PlayList consists of 2 bits, the reserved_for_word_alignment preceding it accordingly consists of 14 bits and is used for word alignment.

Alternatively, capture_enable_flag_PlayList may indicate whether secondary use of the video data is permitted outside the disc playback device. In that case the video data is recorded, for example, on a recording medium that can be loaded into the disc playback device or that can be connected to it, or it is transferred to another device over a network such as the Internet. Information representing the number of times the video data may be recorded on the recording medium, or the number of times it may be transferred, can then be added to the video data.

capture_enable_flag_PlayList is followed sequentially by PlayList_name_length (8 bits) and PlayList_name_string (255 bytes). PlayList_name_length represents the size of PlayList_name_string in bytes, and PlayList_name_string is the name of the PlayList().

PlayList_name_string is followed by number_of_PlayItems (16 bits). number_of_PlayItems indicates the number of PlayItem() entries.

number_of_PlayItems is followed by that number of PlayItem() entries.

One PlayList() can describe the playback procedure of the content in units of PlayItem().

Identification codes (IDs) that are unique within the PlayList() are assigned to the PlayItem() entries represented by number_of_PlayItems. In other words, the first PlayItem() in the PlayList() is identified by the number 0, and the other PlayItem() entries are identified sequentially by the numbers 1, 2, and so on.

The PlayItem() entries represented by number_of_PlayItems are followed by one PlayListMark(). PlayListMark() represents a set of Mark() entries serving as marks on the time axis of playback performed in accordance with the PlayList(). It will be described in detail below with reference to Fig.9.

[Description of PlayItem()]

Figure 8 shows the internal structure of the PlayItem() contained in the PlayList() shown in Fig.7.

PlayItem() starts with length (16 bits). The length field represents the size of the PlayItem(), including the size of the length field itself.

The length field is followed sequentially by Clip_Information_file_name_length (16 bits) and Clip_Information_file_name (variable length). Clip_Information_file_name_length represents the size of Clip_Information_file_name in bytes. Clip_Information_file_name represents the file name of the clip information file (a file with the extension CLP shown in Fig.6) corresponding to the clip stream file (a file with the extension PS shown in Fig.6) reproduced according to the PlayItem(). In accordance with the naming rules described above, the file name of the clip information file of the clip being played in accordance with the PlayItem() can be recognized from Clip_Information_file_name, and the clip stream file can thereby be identified.

Clip_Information_file_name is followed sequentially by IN_time (32 bits) and OUT_time (32 bits).

IN_time and OUT_time are time information representing the playback start position and the playback end position, respectively, of the clip stream file identified by Clip_Information_file_name.

IN_time can designate a middle position (including the beginning) of the clip stream file as the playback start position. OUT_time can designate a middle position (including the end) of the clip stream file as the playback end position.

PlayItem() plays back the content from IN_time to OUT_time of the clip stream file identified by Clip_Information_file_name. The content reproduced according to a PlayItem() is sometimes called a clip.

[Description of PlayListMark()]

Figure 9 shows the internal structure of the PlayListMark() contained in the PlayList() shown in Fig.7.

As described above, PlayListMark() represents a set of Mark() entries that serve as marks on the time axis of playback performed in accordance with the PlayList() (Fig.7). The number of Mark() entries is 0 or more. One Mark() has at least time information representing one point in time on the time axis of playback performed in accordance with the PlayList(), type information representing the type of the Mark(), and, when the type information represents the type of an event to be generated, an argument serving as the argument of that event.

In other words, PlayListMark() begins with a length field (32 bits). The length field represents the size of the PlayListMark(), including the size of the length field itself.

The length field is followed by number_of_PlayList_marks (16 bits). number_of_PlayList_marks represents the number of Mark() entries that follow it. number_of_PlayList_marks is followed by that number of Mark() entries.

Mark() starts with mark_type (8 bits). mark_type is the type information mentioned above and represents the type of the Mark() to which it belongs.

In accordance with this embodiment, Mark() has three types: chapter, index and event.

When the type of a Mark() is chapter (such a Mark() is sometimes called a chapter mark), it marks the start position of a chapter, which is a search unit into which the PlayList() is divided. When the type of a Mark() is index (sometimes called an index mark), it marks the start position of an index, which is a unit into which a chapter is subdivided. When the type of a Mark() is event (sometimes called an event mark), the Mark() marks a position at which an event is generated during playback of the content according to the PlayList(). The script control module 211 is informed when an event has occurred in accordance with an event mark.

Figure 10 shows the relationship between the value of mark_type and the type of Mark(). In Fig.10 the mark_type of a chapter mark is 1, the mark_type of an index mark is 2, and the mark_type of an event mark is 3. In Fig.10 the other values representable by the 8 bits of mark_type, namely 0 and 4-255, are reserved for future extension.

Returning to Fig.9, mark_type is followed by mark_name_length (8 bits), and Mark() ends with mark_name_string. mark_name_length and mark_name_string are used to describe the name of the Mark(): mark_name_length represents the effective size of mark_name_string, and mark_name_string represents the name of the Mark(). Thus, the first mark_name_length bytes of mark_name_string represent the actual name of the Mark().

mark_name_length is followed by four elements that correlate a Mark() defined in the PlayList() with a clip stream file: ref_to_PlayItem_id (16 bits), mark_time_stamp (32 bits), entry_ES_stream_id (8 bits) and entry_ES_private_stream_id (8 bits).

ref_to_PlayItem_id describes the ID, as a serial number, assigned to the PlayItem() to which the Mark() belongs. ref_to_PlayItem_id identifies the PlayItem() (Fig.8) to which the Mark() belongs and thereby, as described with reference to Fig.8, the clip information file and the clip stream file.

mark_time_stamp represents the position designated by the Mark() within the clip stream file identified by ref_to_PlayItem_id.

Figure 11 shows the relationship between PlayList(), PlayItem(), clips and the program streams stored in clip stream files.

In Fig.11 the PlayList() consists of three PlayItem() entries, numbered sequentially as ID#0, ID#1 and ID#2. In the following description, a PlayItem() numbered ID#i is denoted PlayItem#i.

In Fig.11 the clips reproduced according to PlayItem#0, PlayItem#1 and PlayItem#2 are denoted clip A, clip B and clip C, respectively.

The substance of a clip is the span from IN_time to OUT_time of the program stream stored in the clip stream file identified by Clip_Information_file_name in the PlayItem() shown in Fig.8. In Fig.11 the program streams that are the substance of clip A, clip B and clip C are denoted program stream A, program stream B and program stream C, respectively.

In Fig.11, for a Mark() at time t0 on the time axis of playback performed in accordance with the PlayList(), ref_to_PlayItem_id and mark_time_stamp are described as follows.

In Fig.11, since time t0 is a time at which PlayItem#1 is reproduced, ref_to_PlayItem_id describes 1, the ID of PlayItem#1. Since at time t0 program stream B, the substance of clip B, is reproduced, mark_time_stamp describes the time of the clip stream file storing program stream B that corresponds to time t0.

Returning to Fig.9, when a Mark() is correlated with a particular elementary stream, entry_ES_stream_id and entry_ES_private_stream_id are used to identify that elementary stream. In other words, entry_ES_stream_id describes the stream_id of the elementary stream correlated with the Mark() and, when required, entry_ES_private_stream_id describes the private_stream_id of that elementary stream.

For example, suppose that a clip in which video stream#1 and video stream#2 are multiplexed is reproduced and that the chapter time is to change depending on whether video stream#1 or video stream#2 is being played. In that case, the stream_id and private_stream_id of video stream#1 are described in the entry_ES_stream_id and entry_ES_private_stream_id of the Mark() for the chapter mark time used while video stream#1 is reproduced, and the stream_id and private_stream_id of video stream#2 are described in the entry_ES_stream_id and entry_ES_private_stream_id of the Mark() for the chapter mark time used while video stream#2 is reproduced.

The entry_ES_stream_id and entry_ES_private_stream_id of a Mark() that is not correlated with a particular elementary stream are set to 0.

entry_ES_private_stream_id is followed by mark_data (32 bits). When the Mark() is an event mark, mark_data is the argument information used as the argument of the event generated by the event mark. When the Mark() is a chapter mark or an index mark, mark_data can be used as the chapter number or index number represented by the chapter mark or index mark.
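Gathering the fields described above, one Mark() can be represented, for illustration, by the following container (a sketch; only the field names and widths come from the text, and the container itself is ours):

    from dataclasses import dataclass

    @dataclass
    class Mark:                          # illustrative container, names ours
        mark_type: int                   # 1 = chapter, 2 = index, 3 = event
        mark_name_length: int            # effective size of mark_name_string
        ref_to_PlayItem_id: int          # PlayItem() the mark belongs to
        mark_time_stamp: int             # position in the clip stream file
        entry_ES_stream_id: int          # 0 unless tied to an elementary stream
        entry_ES_private_stream_id: int  # 0 unless tied to an elementary stream
        mark_data: int                   # event argument / chapter or index number
        mark_name_string: str            # name of the Mark()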

[Description of Clip()]

Next, the internal structure of the clip information file, which has the extension CLP and is stored in the directory "CLIP" shown in Fig.6, will be described.

Figure 6 shows three clip information files, "00001.CLP", "00002.CLP" and "00003.CLP", in the directory "CLIP". They contain metadata representing the characteristics of the clip stream files "00001.PS", "00002.PS" and "00003.PS" stored in the directory "STREAM".

The next figure shows the internal structure of the clip information file Clip().

The clip information file Clip() begins with presentation_start_time and presentation_end_time (32 bits each). presentation_start_time and presentation_end_time represent the start time and end time of the clip stream file corresponding to the clip information file Clip(). Time in the clip stream file is described as a multiple of a 90 kHz clock, the time base used in the MPEG2 system.

presentation_end_time is followed by reserved_for_word_alignment (7 bits) and capture_enable_flag_Clip (1 bit). The 7-bit reserved_for_word_alignment is used for word alignment. capture_enable_flag_Clip is a flag that indicates whether secondary use of video data is permitted, like capture_enable_flag_PlayList shown in Fig.7.

However, capture_enable_flag_PlayList shown in Fig.7 indicates whether secondary use is permitted for the video data that belongs to the PlayList(), corresponding to the video stream reproduced in accordance with the PlayList(). In contrast, capture_enable_flag_Clip indicates whether secondary use is permitted for the video data corresponding to the elementary stream of video data stored in the clip stream file corresponding to the clip information file Clip(). Thus, capture_enable_flag_PlayList shown in Fig.7 differs from capture_enable_flag_Clip in the unit of video data for which secondary use may be permitted.

Like capture_enable_flag_PlayList described with reference to Fig.7, capture_enable_flag_Clip may consist of multiple bits rather than one bit.

capture_enable_flag_Clip is followed by number_of_streams (8 bits). number_of_streams describes the number of StreamInfo() entries. Thus, number_of_streams is followed by that number of StreamInfo() entries.

StreamInfo() begins with a length field (16 bits); length represents the size of the StreamInfo(), including the size of the length field itself. The length field is followed by stream_id (8 bits) and private_stream_id (8 bits). stream_id and private_stream_id identify the elementary stream correlated with the StreamInfo().

The next figure shows the relationship between stream_id, private_stream_id and the elementary streams identified by them.

stream_id is the same field as the one defined in the MPEG2-System standard. The MPEG2-System standard defines a stream_id value for each attribute (type) of elementary stream. Thus, an attribute of an elementary stream that is defined in the MPEG2-System standard can be identified by stream_id alone.

This embodiment can also handle attributes of elementary streams that are not defined in the MPEG2-System standard. private_stream_id is information that identifies the attribute of an elementary stream that is not defined in the MPEG2-System standard.

The figure shows the relationship between stream_id and private_stream_id for elementary streams with four attributes: an elementary video stream encoded in accordance with the encoding system defined in MPEG; an elementary audio stream encoded in accordance with the ATRAC (Adaptive Transform Acoustic Coding) system (hereinafter sometimes called an ATRAC audio stream); an elementary audio stream encoded in accordance with the LPCM (Linear Pulse-Code Modulation) system (hereinafter sometimes called an LPCM audio stream); and an elementary subtitle stream (hereinafter sometimes called a subtitle stream).

The MPEG2-System standard specifies that elementary video streams encoded in accordance with the encoding system defined in MPEG are multiplexed using stream_id values in the range from 0xE0 to 0xEF (where 0x indicates that the character string following it is in hexadecimal notation). Thus, 16 elementary video streams encoded in accordance with the encoding system defined in MPEG, identified by stream_id values from 0xE0 to 0xEF, can be multiplexed into a program stream.

Since elementary video streams encoded in accordance with the encoding system defined in MPEG can be identified by stream_id values from 0xE0 to 0xEF, no private_stream_id is required for them.

On the other hand, the MPEG2-System does not define a stream_id for ATRAC audio streams, LPCM audio streams or subtitle streams.

Thus, in accordance with this embodiment, elementary streams whose stream_id is not defined in the MPEG2-System use 0xBD, the value that represents the attribute private_stream_1 in the MPEG2-System, as their stream_id. In addition, as shown in the figure, such elementary streams are identified by private_stream_id.

In other words, ATRAC audio streams are identified by private_stream_id values in the range from 0x00 to 0x0F, so 16 ATRAC audio streams can be multiplexed into a program stream. LPCM audio streams are identified by private_stream_id values in the range from 0x10 to 0x1F, so 16 LPCM audio streams can be multiplexed into a program stream. Subtitle streams are identified by private_stream_id values in the range from 0x80 to 0x9F, so 32 subtitle streams can be multiplexed into a program stream.
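These ranges can be expressed compactly. The following illustrative sketch classifies an elementary stream carried in private_stream_1 by its private_stream_id (the function name is ours; the range boundaries follow the text):

    def classify_private_stream_1(private_stream_id):
        # 0x00-0x0F: ATRAC audio, 0x10-0x1F: LPCM audio,
        # 0x80-0x9F: subtitle; everything else is treated as reserved here.
        if 0x00 <= private_stream_id <= 0x0F:
            return ("ATRAC audio", private_stream_id - 0x00)  # stream number
        if 0x10 <= private_stream_id <= 0x1F:
            return ("LPCM audio", private_stream_id - 0x10)
        if 0x80 <= private_stream_id <= 0x9F:
            return ("subtitle", private_stream_id - 0x80)
        return ("reserved", None)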

stream_id and private_stream_id will be described in more detail below.

Returning to the structure of Clip(), private_stream_id is followed sequentially by StaticInfo() and reserved_for_word_alignment (8 bits). StaticInfo() describes information that does not change during playback of the elementary stream identified by the stream_id and private_stream_id described in the StreamInfo() that includes this StaticInfo(). StaticInfo() will be described below.

reserved_for_word_alignment is used for word alignment.

reserved_for_word_alignment is followed by number_of_DynamicInfo (8 bits). number_of_DynamicInfo represents the number of pairs of pts_change_point (32 bits each) and DynamicInfo() that follow it.

Thus, number_of_DynamicInfo is followed by that number of pts_change_point and DynamicInfo() pairs.

pts_change_point represents the time at which the information of the DynamicInfo() paired with it becomes valid. The pts_change_point representing the start time of the elementary stream equals the presentation_start_time described at the beginning of the clip information file Clip() corresponding to the clip stream file in which the elementary stream is stored.

DynamicInfo() describes so-called dynamic information, which changes during playback of the elementary stream identified by stream_id and private_stream_id. The information described in a DynamicInfo() becomes valid at the playback time represented by the pts_change_point paired with it. DynamicInfo() will be described below.

The pts_change_point and DynamicInfo() pairs represented by number_of_DynamicInfo are followed by EP_map(). EP_map() will be described below.

[Description of StaticInfo()]

Next, StaticInfo(), mentioned above, will be described in detail.

The next figure shows the syntax of StaticInfo().

The content of StaticInfo() changes depending on the attribute of the corresponding elementary stream. The attribute of the elementary stream corresponding to a StaticInfo() is determined by the stream_id and private_stream_id contained in the StreamInfo() that includes the StaticInfo().

When the elementary stream corresponding to the StaticInfo() is a video stream (stream == VIDEO), StaticInfo() consists of picture_size (4 bits), frame_rate (4 bits), cc_flag (1 bit) and reserved_for_word_alignment for word alignment.

picture_size represents the size of the image displayed from the video data corresponding to the video stream. frame_rate represents the frame frequency of the video data corresponding to the video stream. cc_flag indicates whether the video stream contains closed caption (subtitle) data. When the video stream contains closed caption data, cc_flag is set to 1; when it does not, cc_flag is set to 0.

When the elementary stream corresponding to the StaticInfo() is an audio stream (stream == AUDIO), StaticInfo() consists of audio_language_code (16 bits), channel_configuration (8 bits), lfe_existence (1 bit), sampling_frequency (4 bits) and reserved_for_word_alignment for word alignment.

audio_language_code describes a code representing the language of the audio data contained in the audio stream. channel_configuration represents an attribute of the audio data contained in the audio stream, such as monaural (mono), stereo or multichannel. lfe_existence indicates whether the audio stream contains a low-frequency effect channel: it is set to 1 when the channel is present and to 0 when it is not. sampling_frequency is information representing the sampling rate of the audio data contained in the audio stream.

When the elementary stream corresponding to the StaticInfo() is a subtitle stream (stream == SUBTITLE), StaticInfo() consists of subtitle_language_code (16 bits), configurable_flag (1 bit) and reserved_for_word_alignment for word alignment.

subtitle_language_code describes a code representing the language of the subtitle data contained in the subtitle stream. configurable_flag is information indicating whether changing the display mode of the subtitle data from the default display mode is permitted: it is set to 1 when the change is permitted and to 0 when it is not. The display mode of subtitle data includes the display size of the subtitle data, the display position, the display color, the display pattern (for example, blinking) and the display direction (vertical or horizontal).

[Description of DynamicInfo()]

Next, DynamicInfo(), mentioned above, will be described in detail.

The next figure shows the syntax of DynamicInfo().

DynamicInfo() starts with reserved_for_word_alignment (8 bits) for word alignment. The elements that follow reserved_for_word_alignment depend on the attribute of the elementary stream corresponding to the DynamicInfo(). As with StaticInfo() described above, that attribute is determined by the stream_id and private_stream_id contained in the StreamInfo() that includes the DynamicInfo().

As described above, DynamicInfo() describes dynamic information that changes during playback of the elementary stream. The dynamic information is not restricted to anything specific. However, in the embodiment shown here, the output attribute of the data of the elementary stream corresponding to the DynamicInfo(), namely the output attribute of the data obtained by processing the elementary stream, is described in the DynamicInfo().

In particular, when the elementary stream corresponding to the DynamicInfo() is a video stream (stream == VIDEO), DynamicInfo() consists of display_aspect_ratio (4 bits) and reserved_for_word_alignment for word alignment. display_aspect_ratio describes an output mode of the video data of the video stream, for example the aspect ratio. In other words, display_aspect_ratio describes information representing an aspect ratio of 16:9 or 4:3. The DynamicInfo() of a video stream may also describe, for example, the image size of the video data (X pixels × Y pixels) in addition to the aspect ratio.

When the elementary stream corresponding to the DynamicInfo() is an audio stream (stream == AUDIO), DynamicInfo() consists of channel_assignment (4 bits) and reserved_for_word_alignment for word alignment. When the audio stream contains two channels of audio data, channel_assignment describes the output mode of those two channels. In other words, channel_assignment describes information representing a stereo channel assignment or dual (bilingual) output.

When the elementary stream corresponding to the DynamicInfo() is a subtitle stream (stream == SUBTITLE), DynamicInfo() consists of reserved_for_word_alignment for word alignment only. In other words, in the embodiment shown here no output attribute is defined as dynamic information for subtitle streams.

[Description of EP_map()]

Next, EP_map(), mentioned above, will be described in detail.

The next figure shows the syntax of EP_map().

EP_map() describes, for each of the elementary streams multiplexed into the program stream stored in the clip stream file corresponding to the clip information file Clip() that includes the EP_map(), information about the points from which decoding can begin (entry points) in that elementary stream.

For a stream with a fixed rate, the decode-start points can be obtained by calculation. However, for a stream whose size varies from one video access unit to another, such as a video stream encoded in accordance with the MPEG standard, the decode-start points cannot be obtained by calculation; they cannot be found without analyzing the stream. Quickly identifying decode-start points is essential for prompt random access to the data. With EP_map(), decode-start points can be identified quickly.

In accordance with the MPEG2-Video standard, the start of an intra-coded picture that includes a Sequence_header() and the like can serve as a decode-start point.

EP_map() starts with reserved_for_word_alignment (8 bits) for word alignment, followed by number_of_stream_id_entries (8 bits). number_of_stream_id_entries represents the number of elementary streams for which decode-start point information is described in the EP_map().

number_of_stream_id_entries is followed by sets consisting of information that identifies an elementary stream and information about the decode-start points of that elementary stream, repeated the number of times represented by number_of_stream_id_entries.

In other words, number_of_stream_id_entries is followed by stream_id (8 bits) and private_stream_id (8 bits) as information identifying an elementary stream, followed in turn by number_of_EP_entries (32 bits). number_of_EP_entries represents the number of decode-start points of the elementary stream identified by the preceding stream_id and private_stream_id.

number_of_EP_entries is followed by the decode-start point information of the elementary stream identified by stream_id and private_stream_id, repeated the number of times represented by number_of_EP_entries.

In other words, when the current elementary stream is a video stream, index_N_minus1 and N-th_Ref_picture_copy are placed first, followed by PTS_EP_start (32 bits) and RPN_EP_start (32 bits) as decode-start point information, and this sequence is repeated. index_N_minus1 and N-th_Ref_picture_copy are described below.

When the current elementary stream is a stream other than a video stream, reserved_for_future_use (16 bits) is followed by PTS_EP_start (32 bits) and RPN_EP_start (32 bits) as decode-start point information, and this sequence is repeated.

One of 1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture recorded in the private_stream_2 packet indicated by RPN_EP_start, which will be described below, is copied into N-th_Ref_picture_copy. In addition, information indicating which field was copied is recorded in index_N_minus1 with the values shown in the corresponding table: when 1stRef_picture is copied, 0 is written into index_N_minus1; when 2ndRef_picture is copied, 1 is written; when 3rdRef_picture is copied, 2 is written; and when 4thRef_picture is copied, 3 is written.
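For illustration, the selection of N-th_Ref_picture_copy can be sketched as follows (Python; the names are ours): among the four recorded end positions, the one closest to a given number of sectors is copied, and index_N_minus1 records which one was taken.

    def choose_nth_ref_picture_copy(ref_positions, target_sectors):
        # ref_positions: end positions (in sectors) of 1stRef_picture,
        # 2ndRef_picture, 3rdRef_picture and 4thRef_picture, in that order.
        index_N_minus1, nth_ref_picture_copy = min(
            enumerate(ref_positions),
            key=lambda pair: abs(pair[1] - target_sectors))
        return index_N_minus1, nth_ref_picture_copy

    # Example: with end positions [6, 14, 22, 35] and a read unit of 25
    # sectors, the third reference picture is copied (index_N_minus1 = 2).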

PTS_EP_start, one item of the decode-start point information, represents the time (playback time) of the decode-start point in the clip stream file storing the program stream into which the elementary stream identified by stream_id and private_stream_id is multiplexed.

RPN_EP_start, the other item of the decode-start point information, describes the position of the decode-start point in the clip stream file storing the program stream into which the elementary stream identified by stream_id and private_stream_id is multiplexed, as a value counted in units of the pack() of the program stream. In accordance with this embodiment, the size of a pack() is fixed at 2048 bytes, and one sector of the disc 101 (Fig.1) is likewise 2048 bytes.

Immediately before the decode-start point (entry point) of a video stream there is a private_stream_2 packet (a PES_packet() with the attribute private_stream_2). A private_stream_2 packet stores information used to decode the video data stored between it and the next private_stream_2 packet. Thus, for a video stream, RPN_EP_start as decode-start point information describes the start position of the private_stream_2 packet that is immediately followed by the actual decode-start point.

The sets of PTS_EP_start and RPN_EP_start as decode-start point information are pre-sorted in ascending order for each elementary stream identified by stream_id and private_stream_id in the EP_map(). Thus, the sets of PTS_EP_start and RPN_EP_start can be searched by binary search.
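For illustration, a binary search over the sorted PTS_EP_start values, returning the entry point at or immediately before a requested playback time, might look as follows (a Python sketch with names of our choosing); the conversion of RPN_EP_start to a byte offset uses the fixed 2048-byte pack() size stated above:

    import bisect

    def find_entry_point(pts_list, rpn_list, target_pts):
        # pts_list is sorted in ascending order, so bisect applies directly.
        i = bisect.bisect_right(pts_list, target_pts) - 1
        if i < 0:
            raise ValueError("no entry point at or before the requested time")
        return pts_list[i], rpn_list[i]

    def rpn_to_byte_offset(rpn_ep_start):
        # RPN_EP_start counts pack()s of 2048 bytes (= one disc sector).
        return rpn_ep_start * 2048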

A method of random access for streams with variable rates and streams whose size differs between video access units is described, for example, in Japanese laid-open patent publication No. 2000-341640 (Japanese patent application No. HEI 11-317738).

[Description of the clip stream file]

Next, the internal structure of the clip stream files that have the extension PS and are stored in the directory "STREAM" shown in Fig.6 ("00001.PS", "00002.PS" and "00003.PS" in Fig.6) will be described.

A clip stream file is based on MPEG2_Program_Stream() as defined in the MPEG2-System standard (ISO/IEC 13818-1).

The accompanying figures show Table 2-31, Table 2-32 and Table 2-33 of the MPEG2-System standard (ISO/IEC 13818-1:2000).

The program stream stored in a clip stream file is an MPEG2_Program_Stream() as defined in Table 2-31 of the MPEG2-System standard. It consists of at least one pack() and one MPEG_program_end_code. MPEG2_Program_Stream() is also described in Japanese patent No. 2785220.

One pack() consists of one Pack_header() and any number of PES_packet() entries, as defined in Table 2-32 of the MPEG2-System standard. Pack_header() is described in detail in Table 2-33 of the MPEG2-System standard.

In the MPEG2-System standard the size of a pack() is variable. However, as stated above, the size of a pack() is assumed here to be fixed at 2048 bytes. In this example, the number of PES_packet() entries in one pack() is 1, 2 or 3. When a Pack() begins with a private_stream_2 packet, the PES_packet() of the corresponding video stream usually follows immediately after it. In addition, a padding_packet may be present as the third PES_packet(). A private_stream_2 packet, when present, is located at the beginning of a Pack().

When a Pack() does not begin with a private_stream_2 packet, it begins with a PES_packet() containing content data such as video data, audio data or subtitle data. The second PES_packet() may be a padding_packet.

The accompanying figures show PES_packet() as defined in Table 2-17 of the MPEG2-System standard.

A PES_packet() basically consists of packet_start_code_prefix, stream_id and PES_packet_length, header blocks (including stuffing_byte) that vary depending on stream_id and the like, and PES_packet_data_byte. When the PES_packet() is a padding_packet (stream_id == padding_stream), the required number of padding_byte values (0xFF) is repeated in place of PES_packet_data_byte.

The header blocks of a PES_packet() can describe information representing display timing, called PTS (presentation time stamp), and information representing decode timing, called DTS (decoding time stamp). In accordance with this embodiment, a PTS is added to each of the access units (the decoding units constituting an elementary stream, as defined in the MPEG2-System); a DTS is added when the MPEG2-System so specifies.

An elementary stream multiplexed into the program stream is stored in the PES_packet_data_byte of the PES_packet(). The stream_id of the PES_packet() describes a value in accordance with the attribute of the elementary stream, so that the elementary stream stored in the PES_packet_data_byte can be identified.

The relationship between the values described in the stream_id of a PES_packet() and the attributes (types) of elementary streams is defined in Table 2-18 of the MPEG2-System standard, also shown in the accompanying figures.

In accordance with an embodiment of the present invention, the values shown in the next table, from among the stream_id values defined in the MPEG2-System standard, are used.

In other words, in accordance with this embodiment, five patterns of stream_id values are used: 10111101B, 10111110B, 10111111B, 110xxxxxB and 1110xxxxB, where "x" represents either 0 or 1.

In accordance with this table, the stream_id of the PES_packet() of an elementary stream with the attribute private_stream_1 is 10111101B; the stream_id of the PES_packet() of a padding_packet is 10111110B; and the stream_id of the PES_packet() of an elementary stream with the attribute private_stream_2 is 10111111B.

The stream_id of the PES_packet() of an audio stream (elementary audio stream) defined in MPEG is 110xxxxxB. The five low-order bits xxxxx of 110xxxxxB form an audio stream number that identifies the audio stream. 32 (=2⁵) audio streams (audio streams defined in MPEG), distinguishable by this audio stream number, can be multiplexed into a program stream.

The stream_id of the PES_packet() of a video stream (elementary video stream) defined in MPEG is 1110xxxxB. The four low-order bits xxxx of 1110xxxxB form a video stream number that identifies the video stream. 16 (=2⁴) video streams (video streams defined in MPEG) can be multiplexed into a program stream.

A PES_packet() whose stream_id is 1110xxxxB is used to store a video stream defined in MPEG, and a PES_packet() whose stream_id is 110xxxxxB is used to store an audio stream defined in MPEG. On the other hand, no stream_id is defined in MPEG for the PES_packet() of an elementary stream in accordance with an encoding system (for example, the ATRAC system) that is not defined in MPEG. Thus, unlike the video streams and audio streams defined in MPEG, elementary streams in accordance with encoding systems not defined in MPEG cannot be stored in a PES_packet() with a dedicated stream_id.

Thus, in accordance with this embodiment, the PES_packet_data_byte of the PES_packet() of private_stream_1 is extended so that it can store elementary streams in accordance with encoding systems not defined in MPEG.

The extended PES_packet_data_byte of the PES_packet() of private_stream_1 is described as private_stream1_PES_payload().

[Description of private_stream1_PES_payload()]

The next figure shows the syntax of private_stream1_PES_payload().

private_stream1_PES_payload() consists of private_header() and private_payload(). private_payload() stores an elementary stream, such as an ATRAC audio stream, an LPCM audio stream or a subtitle stream, encoded in accordance with an encoding system not defined in the MPEG system.

private_header() starts with private_stream_id (8 bits). private_stream_id is identification information that identifies the elementary stream stored in private_payload(). It takes the following values in accordance with the attribute of the elementary stream.

The next figure shows the relationship between the values of private_stream_id and the attributes of the elementary streams stored in private_payload().

The figure shows three patterns, 0000xxxxB, 0001xxxxB and 100xxxxxB, as values of private_stream_id, where "x" represents either 0 or 1, as in the table of stream_id values described above.

In accordance with this table, the private_stream_id of the private_stream1_PES_payload() whose private_payload() stores an ATRAC audio stream is 0000xxxxB. The four low-order bits xxxx of 0000xxxxB form an audio stream number that identifies the ATRAC audio stream. 16 (=2⁴) ATRAC audio streams, distinguishable by this audio stream number, can be multiplexed into a program stream (MPEG2_Program_Stream()).

In accordance with this table, the private_stream_id of the private_stream1_PES_payload() whose private_payload() stores an LPCM audio stream is 0001xxxxB. The four low-order bits xxxx of 0001xxxxB form an audio stream number that identifies the LPCM audio stream. 16 (=2⁴) LPCM audio streams, distinguishable by this audio stream number, can be multiplexed into a program stream.

In accordance with this table, the private_stream_id of the private_stream1_PES_payload() whose private_payload() stores a subtitle stream is 100xxxxxB. The five low-order bits xxxxx of 100xxxxxB form a subtitle stream number that identifies the subtitle stream. 32 (=2⁵) subtitle streams can be multiplexed into a program stream.

A further figure summarizes the relationship between the stream_id and private_stream_id assignments described above.

Returning to the syntax of private_stream1_PES_payload(), the elements that follow private_stream_id change depending on the attribute of the elementary stream stored in private_payload(). That attribute is determined by the private_stream_id at the beginning of private_header().

When the elementary stream stored in private_payload() is an ATRAC audio stream (private_stream_id == ATRAC), reserved_for_future_use (8 bits) is described for future extension, followed by AU_locator (16 bits). AU_locator represents the start position of an audio access unit of the ATRAC audio stream stored in private_payload(), relative to the position immediately following the AU_locator. When private_payload() contains no audio access unit, for example 0xFFFF is described in AU_locator.

When the elementary stream stored in private_payload() is an LPCM audio stream (private_stream_id == LPCM), fs_flag (1 bit), reserved_for_future_use (3 bits), ch_flag (4 bits) and AU_locator (16 bits) are described in that order.

fs_flag represents the sampling frequency of the LPCM audio stream stored in private_payload(). When the sampling frequency is 48 kHz, fs_flag is set to 0; when it is 44.1 kHz, fs_flag is set to 1.

ch_flag represents the number of channels of the LPCM audio stream stored in private_payload(). When the LPCM audio stream is monaural, ch_flag is set to 1; when it is stereo, ch_flag is set to 2.

AU_locator represents the start position of an audio access unit of the LPCM audio stream stored in private_payload(), relative to the position immediately following the AU_locator. When private_payload() contains no audio access unit, for example 0xFFFF is described in AU_locator.

When the elementary stream stored in private_payload() is a subtitle stream (private_stream_id == SUBTITLE), reserved_for_future_use (8 bits) is described for future extension, immediately followed by AU_locator (16 bits). AU_locator represents the start position of a subtitle access unit of the subtitle stream stored in private_payload(), relative to the position immediately following the AU_locator. When private_payload() contains no subtitle access unit, for example 0xFFFF is described in AU_locator.
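The three branches of private_header() just described can be parsed as in the following illustrative sketch (Python; the names are ours). The attribute test uses the private_stream_id ranges given earlier, fields are big-endian, and AU_locator == 0xFFFF marks the absence of an access unit:

    def parse_private_header(data):
        psid = data[0]
        au_locator = int.from_bytes(data[2:4], "big")   # 16 bits in all branches
        if 0x00 <= psid <= 0x0F:                        # ATRAC audio stream
            return {"type": "ATRAC", "AU_locator": au_locator}
        if 0x10 <= psid <= 0x1F:                        # LPCM audio stream
            fs_flag = data[1] >> 7                      # 0: 48 kHz, 1: 44.1 kHz
            ch_flag = data[1] & 0x0F                    # 1: mono, 2: stereo
            return {"type": "LPCM", "fs_flag": fs_flag,
                    "ch_flag": ch_flag, "AU_locator": au_locator}
        if 0x80 <= psid <= 0x9F:                        # subtitle stream
            return {"type": "SUBTITLE", "AU_locator": au_locator}
        raise ValueError("unexpected private_stream_id 0x%02X" % psid)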

[Description of private_stream2_PES_payload()]

The next figure shows the syntax of private_stream2_PES_payload().

private_stream2_PES_payload() is an extension of the PES_packet_data_byte of the PES_packet() of private_stream_2. private_stream2_PES_payload() describes information used for decoding the video stream.

In accordance with this embodiment, the decode-start point of a video stream is located immediately after a PES_packet() of private_stream_2. Thus, in accordance with this embodiment, when a PES_packet() of private_stream_2 is detected in the program stream, the video stream immediately following it can be decoded.

RPN_EP_start in the EP_map() described above represents the start position of the PES_packet() of private_stream_2 for a video stream.

private_stream2_PES_payload() starts with reserved_for_future_use (8 bits) for future extension, followed sequentially by video_stream_id (8 bits), 1stRef_picture (16 bits), 2ndRef_picture (16 bits), 3rdRef_picture (16 bits), 4thRef_picture (16 bits), au_information() and VBI().

video_stream_id describes the same value as the stream_id of the PES_packet() of the video stream immediately following the PES_packet() of private_stream_2. video_stream_id identifies the PES_packet() storing the video stream that is decoded using the information stored in the private_stream2_PES_payload() of the PES_packet() of private_stream_2.

1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture represent the relative positions of the last pack() that includes the first, second, third and fourth reference picture, respectively, between this PES_packet() of private_stream_2 and the next PES_packet() of private_stream_2 of the video stream identified by video_stream_id. 1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture are described in detail, as bytes_to_first_P_pic and bytes_to_second_P_pic, in Japanese laid-open patent publication No. HEI 09-46712 (Japanese patent application No. HEI 07-47211420).

au_information() describes information about the video access units in the video stream between this PES_packet() of private_stream_2 and the next PES_packet() of private_stream_2. au_information() will be described in detail below.

VBI() is used to describe closed caption information.

A PES_packet() of private_stream_2 containing private_stream2_PES_payload() is described for each decode-start point of each video stream.

The next figure shows the syntax of au_information(), mentioned above.

au_information() begins with a length field (16 bits); length represents the size of the au_information(), including the size of the length field itself. length is followed by reserved_for_word_alignment (8 bits) and number_of_access_unit (8 bits) in that order. reserved_for_word_alignment is used for word alignment.

number_of_access_unit represents the number of video access units stored between this PES_packet() of private_stream_2 and the next PES_packet() of private_stream_2.

In other words, among the PES_packet() of private_stream_2 entries whose private_stream2_PES_payload() has the same video_stream_id, number_of_access_unit represents the number of access units (pictures) contained in the video stream represented by that video_stream_id between this au_information() and the next au_information() or, when this au_information() is the last one in the clip stream file, up to the end of the clip stream file.

number_of_access_unit is followed by the contents of a loop repeated number_of_access_unit times. In other words, information is described about each of the video access units between this PES_packet() of private_stream_2 and the next PES_packet() of private_stream_2.

The information described in the loop (the information about the video access units) is as follows.

The loop contains pic_struct_copy (4 bits), au_ref_flag (1 bit), AU_length (21 bits) and reserved bits.

pic_struct_copy describes a copy of the pic_struct defined in ISO/IEC 14496-10, D.2.2, for the corresponding video access unit, for video conforming to MPEG4-AVC (ISO/IEC 14496-10). pic_struct is information indicating, for example, that a picture is displayed as a frame, or that the top field of a picture is displayed first and its bottom field afterwards.

The next figure shows the pic_struct table.

pic_struct is used as display mode designation information, indicating how a picture is displayed.

In the pic_struct table, when a picture is designated to be displayed as one frame, the pic_struct of that picture is set to 0. Similarly, when a picture is designated to display only its top field or only its bottom field, its pic_struct is set to 1 or 2, respectively. When a picture is designated to display its top field and then its bottom field, pic_struct is set to 3; when it is designated to display its bottom field and then its top field, pic_struct is set to 4. When a picture is designated to sequentially display its top field, bottom field and top field again, pic_struct is set to 5; when it is designated to sequentially display its bottom field, top field and bottom field again, pic_struct is set to 6. When a picture is designated to be displayed as one frame repeated twice or three times, pic_struct is set to 7 or 8, respectively.
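For reference, the nine pic_struct values just listed can be tabulated as follows (an illustrative Python mapping):

    PIC_STRUCT = {
        0: "display one frame",
        1: "display top field only",
        2: "display bottom field only",
        3: "display top field, then bottom field",
        4: "display bottom field, then top field",
        5: "display top field, bottom field, top field",
        6: "display bottom field, top field, bottom field",
        7: "repeat one frame twice",
        8: "repeat one frame three times",
    }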

au_ref_flag indicates whether the corresponding access unit is a reference picture, i.e. a picture that is referenced when another access unit is decoded. When the corresponding access unit is a reference picture, au_ref_flag is set to 1; otherwise it is set to 0.

AU_length represents the size of the corresponding access unit in bytes.
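For illustration, the au_information() loop can be decoded as in the following Python sketch (names ours). Only pic_struct_copy (4 bits), au_ref_flag (1 bit) and AU_length (21 bits) come from the text; packing each loop entry into one 32-bit word with 6 reserved bits is our assumption:

    def parse_au_information(data):
        length = int.from_bytes(data[0:2], "big")        # includes length field
        number_of_access_unit = data[3]                  # data[2] is reserved
        units, off = [], 4
        for _ in range(number_of_access_unit):
            word = int.from_bytes(data[off:off + 4], "big")
            units.append({
                "pic_struct_copy": (word >> 28) & 0xF,
                "au_ref_flag": (word >> 27) & 0x1,       # 1 = reference picture
                "AU_length": (word >> 6) & 0x1FFFFF,     # access unit size, bytes
            })
            off += 4
        return length, units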

[Specific example of data recorded on the disc 101]

The following figures show specific examples of data in the format described above recorded on the disc 101 shown in Fig.1.

In these examples a video stream in accordance with MPEG2-Video and an audio stream in accordance with ATRAC are used. However, the video stream and audio stream used in the present invention are not limited to these. A video stream in accordance with MPEG4-Visual, a video stream in accordance with MPEG4-AVC, or the like may be used; likewise, an audio stream in accordance with MPEG1/2/4 audio, an audio stream in accordance with LPCM audio, or the like may be used.

Unlike a video stream and an audio stream, a subtitle stream need not be decoded sequentially nor displayed at regular intervals. In other words, the subtitle stream is supplied intermittently from the buffer control module 215 to the subtitle decoder control module 218, and the subtitle decoder control module 218 decodes it.

The figures show specific examples of the file "PLAYLIST.DAT", the three clip information files "00001.CLP", "00002.CLP" and "00003.CLP", and so on, for the case in which the three clip information files "00001.CLP", "00002.CLP" and "00003.CLP" are stored in the directory "CLIP" and the three clip stream files "00001.PS", "00002.PS" and "00003.PS" corresponding to them are stored in the directory "STREAM" on the disc 101, as shown in Fig.6. However, part of the data, such as part of the file "PLAYLIST.DAT", is omitted from the figures.

In other words, Fig. shows a specific example of the file "PLAYLIST.DAT" described with reference to Fig. 7.

In Fig., number_of_PlayLists is set to 2. Thus, the number of PlayList()'s stored in the file "PLAYLIST.DAT" is 2. In Fig., the first and second PlayList()'s are PlayList#0 and PlayList#1, respectively.

capture_enable_flag_PlayList of the first PlayList(), namely PlayList#0, is 1. Thus, secondary use of video data reproduced in accordance with PlayList#0 is permitted. On the other hand, number_of_PlayItems of PlayList#0 is 2. Thus, the number of PlayItem()'s contained in PlayList#0 is 2. In Fig., specific examples of PlayItem#0 and PlayItem#1 as the two PlayItem()'s are described below the field "PlayList#0".

In PlayItem#0 as the first PlayItem() contained in PlayList#0, Clip_Information_file_name described with reference to Fig. is "00001.CLP", IN_time is 180090, and OUT_time is 27180090. Thus, the clip reproduced according to PlayItem#0 of PlayList#0 spans from time 180090 to time 27180090 of the clip stream file "00001.PS", which corresponds to the clip information file "00001.CLP".

In PlayItem#1 as the second PlayItem() contained in PlayList#0, Clip_Information_file_name described with reference to Fig. is "00002.CLP", IN_time is 90000, and OUT_time is 27090000. Thus, the clip reproduced according to PlayItem#1 of PlayList#0 spans from time 90000 to time 27090000 of the clip stream file "00002.PS", which corresponds to the clip information file "00002.CLP".

In Fig., in PlayList#1 as the second PlayList(), capture_enable_flag_PlayList is set to 0. Thus, secondary use of video data reproduced in accordance with PlayList#1 is not permitted. In PlayList#1, number_of_PlayItems is 1. Thus, the number of PlayItem()'s contained in PlayList#1 is 1. In Fig., a specific example of PlayItem#0 as the one PlayItem() is described below the field "PlayList#1".

In PlayItem#0 as the one PlayItem() contained in PlayList#1, Clip_Information_file_name described with reference to Fig. is "00003.CLP", IN_time is 90000, and OUT_time is 81090000. Thus, the clip reproduced according to PlayItem#0 of PlayList#1 spans from time 90000 to time 81090000 of the clip stream file "00003.PS", which corresponds to the clip information file "00003.CLP".

Next, Figs. A and B show a specific example of the clip information file Clip() described with reference to Fig. In other words, Figs. A and B show specific examples of the clip information files "00001.CLP", "00002.CLP" and "00003.CLP" shown in Fig. 6.

In the clip information file "00001.CLP", presentation_start_time is 90000 and presentation_end_time is 27990000. Thus, the program stream stored in the clip stream file "00001.PS", which corresponds to the clip information file "00001.CLP", can use content for 310 seconds ((27990000-90000)/90 kHz).
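
As a check of this arithmetic, the duration follows directly from the two 90 kHz timestamps; a minimal sketch:

    #include <stdio.h>
    #include <stdint.h>

    /* Presentation times are expressed in 90 kHz clock ticks, so the
       duration in seconds is (end - start) / 90000. */
    int main(void) {
        uint64_t presentation_start_time = 90000;    /* from "00001.CLP" */
        uint64_t presentation_end_time   = 27990000;
        double seconds =
            (double)(presentation_end_time - presentation_start_time) / 90000.0;
        printf("duration: %.0f seconds\n", seconds);  /* prints 310 */
        return 0;
    }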

In the clip information file "00001.CLP", capture_enable_flag_Clip is set to 1. Thus, secondary use of the video stream multiplexed with the program stream stored in the clip stream file "00001.PS" corresponding to the clip information file "00001.CLP" is permitted.

In addition, in Figs. A and B, in the clip information file "00001.CLP", number_of_streams is 4. Thus, four elementary streams are multiplexed with the program stream stored in the clip stream file "00001.PS".

Assuming that the four elementary streams are represented as stream#0, stream#1, stream#2 and stream#3, in Figs. A and B specific examples of StreamInfo() (Fig.) of the four elementary streams stream#0, stream#1, stream#2 and stream#3 are described below the field "00001.CLP".

In the first elementary stream stream#0 of the clip stream file "00001.PS", stream_id is 0xE0. Thus, as described with reference to Figs., the elementary stream stream#0 is a video stream. In accordance with this embodiment, private_stream_id is not correlated with a video stream; in Figs. A and B it is 0x00.

In the video stream stream#0 as the first elementary stream of the clip stream file "00001.PS", picture_size of StaticInfo() (Fig.) contained in StreamInfo() is "720×480", frame_rate is "29.97 Hz", and cc_flag is "Yes". Thus, the video stream stream#0 is video data having 720×480 pixels and a frame period of 29.97 Hz. In addition, the video stream stream#0 contains closed caption data.

In the video stream stream#0 as the first elementary stream of the clip stream file "00001.PS", number_of_DynamicInfo of StreamInfo() (Fig.) is 0. Thus, there is no pair of pts_change_point and DynamicInfo().

In the second elementary stream stream#1 of the clip stream file "00001.PS", stream_id is 0xBD and private_stream_id is 0x00. Thus, as described with reference to Figs., the elementary stream stream#1 is an ATRAC audio stream.

In the ATRAC audio stream stream#1 as the second elementary stream of the clip stream file "00001.PS", audio_language_code of StaticInfo() (Fig.) contained in StreamInfo() is "Japanese", channel_configuration is "STEREO", lfe_existence is "NO", and sampling_frequency is "48 kHz". Thus, the ATRAC audio stream stream#1 is stereo audio data in Japanese. In addition, the ATRAC audio stream stream#1 does not contain a low-frequency effect channel, and its sampling frequency is 48 kHz.

In addition, since in the ATRAC audio stream stream#1 as the second elementary stream of the clip stream file "00001.PS" number_of_DynamicInfo of StreamInfo() (Fig.) is 0, there is no pair of pts_change_point and DynamicInfo().

In the third elementary stream stream#2 of the clip stream file "00001.PS", stream_id is 0xBD and private_stream_id is 0x80. Thus, as described with reference to Figs., the elementary stream stream#2 is a subtitle stream.

In the subtitle stream stream#2 as the third elementary stream of the clip stream file "00001.PS", subtitle_language_code of StaticInfo() (Fig.) contained in StreamInfo() is "Japanese" and configurable_flag is set to 0. Thus, the subtitle stream stream#2 is subtitle data in Japanese, and changing its display mode is not permitted.

In the subtitle stream stream#2 as the third elementary stream of the clip stream file "00001.PS", since number_of_DynamicInfo of StreamInfo() (Fig.) is set to 0, there is no pair of pts_change_point and DynamicInfo().

In the fourth elementary stream stream#3 of the clip stream file "00001.PS", stream_id is set to 0xBD and private_stream_id is 0x81. Thus, as described with reference to Figs., the elementary stream stream#3 is a subtitle stream.

In order to distinguish the subtitle stream stream#2 as the third elementary stream of the clip stream file "00001.PS" from the subtitle stream stream#3 as the fourth elementary stream, their private_stream_id's are set to 0x80 and 0x81, respectively.

In the subtitle stream stream#3 as the fourth elementary stream of the clip stream file "00001.PS", subtitle_language_code of StaticInfo() (Fig.) contained in StreamInfo() is "Japanese", while configurable_flag is set to 1. Thus, the subtitle stream stream#3 is subtitle data in Japanese, and changing the display mode of the subtitle stream stream#3 is permitted.

In the subtitle stream stream#3 as the fourth elementary stream of the clip stream file "00001.PS", since number_of_DynamicInfo of StreamInfo() (Fig.) is 0, there is no pair of pts_change_point and DynamicInfo().

In Figs. A and B, in the clip information file "00002.CLP", presentation_start_time is 90000 and presentation_end_time is 27090000. Thus, the program stream stored in the clip stream file "00002.PS" corresponding to the clip information file "00002.CLP" can use content for 300 seconds ((27090000-90000)/90 kHz).

In the clip information file "00002.CLP", capture_enable_flag_Clip is 0. Thus, secondary use of the video stream multiplexed with the program stream stored in the clip stream file "00002.PS" corresponding to the clip information file "00002.CLP" is not permitted.

In Figs. A and B, in the clip information file "00002.CLP", number_of_streams is 4. Thus, as with the foregoing clip stream file "00001.PS", four elementary streams are multiplexed with the program stream stored in the corresponding clip stream file "00002.PS".

Assuming that the four elementary streams are represented as stream#0, stream#1, stream#2 and stream#3, in Figs. A and B specific examples of StreamInfo() (Fig.) of the four elementary streams stream#0 through stream#3 are described below the field "00002.CLP".

In Figs. A and B, the contents of StreamInfo() of the first through fourth elementary streams stream#0 through stream#3 of the clip stream file "00002.PS" are the same as those of the first through fourth elementary streams stream#0 through stream#3 of the clip stream file "00001.PS". Thus, their description is omitted here.

As described above, the contents of StreamInfo() of the first through fourth elementary streams stream#0 through stream#3 of the clip stream file "00002.PS" are the same as those of the clip stream file "00001.PS". Thus, the first elementary stream stream#0 of the clip stream file "00002.PS" is a video stream, the second elementary stream stream#1 is an ATRAC audio stream, and the third and fourth elementary streams stream#2 and stream#3 are subtitle streams.

Next, in Figs. A and B, in the clip information file "00003.CLP", presentation_start_time is 90000 and presentation_end_time is 81090000. Thus, the program stream stored in the clip stream file "00003.PS" corresponding to the clip information file "00003.CLP" can use content for 900 seconds ((81090000-90000)/90 kHz).

In the clip information file "00003.CLP", capture_enable_flag_Clip is 1. Thus, secondary use of the video stream multiplexed with the program stream stored in the clip stream file "00003.PS" corresponding to the clip information file "00003.CLP" is permitted.

In addition, in Figs. A and B, in the clip information file "00003.CLP", number_of_streams is 3. Thus, three elementary streams are multiplexed with the program stream stored in the clip stream file "00003.PS".

Assuming that the three elementary streams are represented as stream#0, stream#1 and stream#2, in Figs. A and B specific examples of StreamInfo() (Fig.) of the three streams stream#0, stream#1 and stream#2 are described below the field "00003.CLP".

In the first elementary stream stream#0 of the clip stream file "00003.PS", stream_id is 0xE0. Thus, as described with reference to Figs., the elementary stream stream#0 is a video stream. As with the first elementary stream stream#0 of the clip stream file "00001.PS", private_stream_id is 0x00.

In the video stream stream#0 as the first elementary stream of the clip stream file "00003.PS", picture_size of StaticInfo() (Fig.) contained in StreamInfo() is "720×480", frame_rate is "29.97 Hz", and cc_flag is "No". Thus, the video stream stream#0 is video data having 720×480 pixels and a frame period of 29.97 Hz, and does not contain closed caption data.

In the video stream stream#0 as the first elementary stream of the clip stream file "00003.PS", number_of_DynamicInfo of StreamInfo() (Fig.) is 2. Thus, two sets of pts_change_point and DynamicInfo() are described in StreamInfo().

In the second elementary stream stream#1 of the clip stream file "00003.PS", stream_id is 0xE1. Thus, as described with reference to Figs., the elementary stream stream#1 is a video stream. To distinguish the video stream stream#0 as the first elementary stream of the clip stream file "00003.PS" from the video stream stream#1 as the second elementary stream, their stream_id's are set to 0xE0 and 0xE1, respectively. As with the first elementary stream stream#0 of the clip stream file "00001.PS", private_stream_id is 0x00.

In the video stream stream#1 as the second elementary stream of the clip stream file "00003.PS", picture_size, frame_rate and cc_flag of StaticInfo() (Fig.) contained in StreamInfo() are the same as those of the video stream stream#0 as the first elementary stream. Thus, the video stream stream#1 as the second elementary stream of the clip stream file "00003.PS" is video data having 720×480 pixels and a frame period of 29.97 Hz, and does not contain closed caption data.

In the video stream stream#1 as the second elementary stream of the clip stream file "00003.PS", since number_of_DynamicInfo of StreamInfo() (Fig.) is 0, there is no pair of pts_change_point and DynamicInfo().

In the third elementary stream stream#2 of the clip stream file "00003.PS", stream_id is 0xBD and private_stream_id is 0x00. Thus, as described with reference to Figs., the elementary stream stream#2 is an ATRAC audio stream.

In the ATRAC audio stream stream#2 as the third elementary stream of the clip stream file "00003.PS", audio_language_code, channel_configuration, lfe_existence and sampling_frequency of StaticInfo() (Fig.) contained in StreamInfo() are the same as the corresponding parameters of the ATRAC audio stream stream#1 as the second elementary stream of the clip stream file "00001.PS". Thus, the ATRAC audio stream stream#2 as the third elementary stream of the clip stream file "00003.PS" is stereo audio data in Japanese, does not contain a low-frequency effect channel, and has a sampling frequency of 48 kHz.

In the ATRAC audio stream stream#2 as the third elementary stream of the clip stream file "00003.PS", number_of_DynamicInfo of StreamInfo() (Fig.) is 3. Thus, StreamInfo() describes three sets of pts_change_point and DynamicInfo().

Fig. shows a specific example of EP_map() of the clip information file Clip() described with reference to Fig. In other words, Fig. shows a specific example of the EP_map(), shown in Fig., of the clip information file "00001.CLP" shown in Fig. 6.

In the EP_map() of Fig., number_of_stream_id_entries is 1. Thus, this EP_map() describes decodable start point information of one elementary stream.

In the EP_map() shown in Fig., stream_id is 0xE0. Thus, as described with reference to Figs., EP_map() describes PTS_EP_start and RPN_EP_start (Fig.) as RAPI (random access point information) representing decodable start points of the video stream identified by the stream_id, which is 0xE0. In other words, in Fig., EP_map() is that of the clip information file "00001.CLP"; as described with reference to Figs. A and B, in the clip stream file "00001.PS", which corresponds to the clip information file "00001.CLP", the elementary stream whose stream_id is 0xE0 is the first video stream, stream#0, of the clip stream file "00001.PS". Thus, the information described in the EP_map() shown in Fig. consists of PTS_EP_start and RPN_EP_start of the decodable start points of the video stream stream#0.

In Fig., the first five PTS_EP_start's and RPN_EP_start's of the decodable start points of the first video stream stream#0 of the clip stream file "00001.PS" are described; the sixth and later ones are not presented.

RPN_EP_start, PTS_EP_start, 1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture represent the start positions of all the RAPIs multiplexed with the multiplexed stream and, for each RAPI, the end positions of the intra-coded picture that immediately follows it and of the second, third and fourth reference pictures following the intra-coded picture.

The position of the first RAPI is 0 (sectors). The PTS of the intra-coded picture immediately following the first RAPI is 90000. The end positions of the intra-coded picture and of the second, third and fourth reference pictures are 28, 37, 48 and 58, respectively, as relative sector counts from the beginning of the RAPI.

The position of the second RAPI is 244 (sectors). The PTS of the intra-coded picture immediately following the second RAPI is 135045. The end positions of the intra-coded picture and of the second, third and fourth reference pictures are 10, 18, 25 and 31, respectively, as relative sector counts from the beginning of the RAPI.

The position of the third RAPI is 305 (sectors). The PTS of the intra-coded picture immediately following the third RAPI is 180090. The end positions of the intra-coded picture and of the second, third and fourth reference pictures are 25, 44, 50 and 54, respectively, as relative sector counts from the beginning of the RAPI.

The position of the fourth RAPI is 427 (sectors). The PTS of the intra-coded picture immediately following the fourth RAPI is 225135. The end positions of the intra-coded picture and of the second, third and fourth reference pictures are 8, 15, 22 and 29, respectively, as relative sector counts from the beginning of the RAPI.

The position of the fifth RAPI is 701 (sectors). The PTS of the intra-coded picture immediately following the fifth RAPI is 270180. The end positions of the intra-coded picture and of the second, third and fourth reference pictures are 26, 32, 41 and 48, respectively, as relative sector counts from the beginning of the RAPI.

A value close to a predetermined sector count (the number of sectors that can be read at one time during the encoding process) is selected from the end positions of the four reference pictures (1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture) and stored in N-th_Ref_picture_copy. In Fig., the value closest to the sector count "30" is selected.

For example, for the first entry point, PTS_EP_start=90000 and RPN_EP_start=0. For N-th_Ref_picture_copy, 28, which is the value closest to "30", is selected from 28, 37, 48 and 58. Thus, "0", which denotes 1stRef_picture, is stored in index_N_minus1.

Further, for the second entry point, PTS_EP_start=135045 and RPN_EP_start=244. For N-th_Ref_picture_copy, 31, which is the value closest to "30", is selected from 10, 18, 25 and 31. Thus, "3", which denotes 4thRef_picture, is stored in index_N_minus1.

Thus, the values (0, 28), (3, 31), (0, 25), (3, 29) and (1, 32) are stored as index_N_minus1 and N-th_Ref_picture_copy of the five entry points presented as examples in Fig.

This selection algorithm is determined solely with regard to the playback quality of the playback device. Thus, in this embodiment, the value closest to the relatively small sector count "30" is selected; a different sector count may be selected instead. When the value of index_N_minus1 is small, reading data from the RAPI for the size of N-th_Ref_picture_copy yields a small number of contained reference pictures. In contrast, when the value of index_N_minus1 is large, the number of contained reference pictures is large.
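
A minimal sketch of this selection (the helper name and the plain-int interface are ours; the end positions and the preferred sector count follow the example above):

    #include <stdlib.h>

    /* Given the end positions of up to four reference pictures (relative
       sector counts from the RAPI), pick the one closest to the preferred
       read size, store it in N-th_Ref_picture_copy, and record its index
       in index_N_minus1 (0 => 1stRef_picture, ..., 3 => 4thRef_picture). */
    static void select_nth_ref_picture(const int ref_end[], int n_refs,
                                       int preferred_sectors, /* e.g. 30 */
                                       int *nth_ref_picture_copy,
                                       int *index_n_minus1)
    {
        int best = 0;
        for (int i = 1; i < n_refs; i++) {
            if (abs(ref_end[i] - preferred_sectors) <
                abs(ref_end[best] - preferred_sectors))
                best = i;
        }
        *nth_ref_picture_copy = ref_end[best];
        *index_n_minus1 = best;
    }

For the second entry point above, calling this with the end positions {10, 18, 25, 31} and a preferred sector count of 30 yields N-th_Ref_picture_copy = 31 and index_N_minus1 = 3, matching the table.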

In this example, although the end positions of the reference pictures are described for five entry points, the number of reference pictures may be four or fewer, depending on the video encoding method or the intervals between intra-coded pictures. In such a case, the end positions of up to the maximum number of reference pictures, at most four, may be described.

In the EP_map() shown in Fig., private_stream_id is 0x00. When stream_id represents a video stream, as described above, private_stream_id is ignored.

Fig. shows specific examples of PlayListMark() of PlayList#0 and PlayList#1 described with reference to Fig. (the PlayList() shown in Fig. 7).

The upper table of Fig. is PlayListMark() (Fig. 9) of PlayList#0.

In the upper table of Fig., number_of_PlayList_marks of PlayListMark() of PlayList#0 is 7. Thus, the number of Mark()'s contained in PlayListMark() of PlayList#0 is 7.

In the upper table of Fig., mark_type (Fig. 9) of Mark#0 as the first Mark() of the seven Mark()'s contained in PlayList#0 is "Chapter". Thus, Mark#0 is a chapter mark. In addition, since ref_to_PlayItem_id (Fig. 9) is 0, Mark#0 belongs to PlayItem#0 of the two PlayItem#0 and #1 shown in Fig. In addition, mark_time_stamp of Mark#0 is 180090. Thus, Mark#0 is a mark of time (playback time) 180090 of the clip stream file reproduced according to PlayItem#0 contained in PlayList#0. Both entry_ES_stream_id and entry_ES_private_stream_id of Mark#0 are 0. Thus, Mark#0 is not correlated with any elementary stream. In addition, mark_data of Mark#0 is 1. Thus, Mark#0 represents a chapter whose number is 1.

The clip stream file reproduced according to PlayItem#0 contained in PlayList#0 is the clip stream file "00001.PS" identified by "00001.CLP" described in Clip_Information_file_name of PlayItem#0 (Fig.). Thus, the time 180090 represented by mark_time_stamp of Mark#0 is a time of the clip stream file "00001.PS".

In the upper table of Fig., Mark#4 as the fifth Mark() of the seven Mark()'s contained in PlayList#0 is a chapter mark, like the first Mark#0.

In other words, mark_type (Fig. 9) of Mark#4 as the fifth Mark() is "Chapter". Thus, Mark#4 is a chapter mark. In addition, ref_to_PlayItem_id (Fig. 9) of Mark#4 is 1. Thus, Mark#4 belongs to PlayItem#1 of the two PlayItem#0 and #1, shown in Fig., contained in PlayList#0. mark_time_stamp of Mark#4 is 90000. Thus, Mark#4 is a mark of time 90000 of the clip stream file reproduced according to PlayItem#1 contained in PlayList#0. In addition, both entry_ES_stream_id and entry_ES_private_stream_id of Mark#4 are 0. Thus, Mark#4 is not correlated with any elementary stream. In addition, mark_data of Mark#4 is 2. Thus, Mark#4 represents a chapter whose number is 2.

In this example, the clip stream file reproduced according to PlayItem#1 contained in PlayList#0 is the clip stream file "00002.PS" identified by "00002.CLP" described in Clip_Information_file_name of PlayItem#1, described with reference to Fig. Thus, the time 90000 represented by mark_time_stamp of Mark#4 is a time of the clip stream file "00002.PS".

In the upper table of Fig., mark_type (Fig. 9) of Mark#1 as the second Mark() of the seven Mark()'s contained in PlayList#0 is "Index". Thus, Mark#1 is an index mark. In addition, ref_to_PlayItem_id (Fig. 9) of Mark#1 is 0. Thus, Mark#1 belongs to PlayItem#0 of the two PlayItem#0 and #1, shown in Fig., contained in PlayList#0. In addition, mark_time_stamp of Mark#1 is 5580090. Thus, Mark#1 is a mark of time 5580090 of the clip stream file reproduced according to PlayItem#0 contained in PlayList#0. In addition, both entry_ES_stream_id and entry_ES_private_stream_id of Mark#1 are 0. Thus, Mark#1 is not correlated with any elementary stream. In addition, mark_data of Mark#1 is 1. Thus, Mark#1 represents an index whose number is 1.

In this example, the clip stream file reproduced according to PlayItem#0 contained in PlayList#0 is the clip stream file "00001.PS", as described above. Thus, the time 5580090 represented by mark_time_stamp of Mark#1 is a time of the clip stream file "00001.PS".

In the upper table of Fig., Mark#2, Mark#5 and Mark#6 as the third, sixth and seventh Mark()'s of the seven Mark()'s contained in PlayList#0 are index marks, like the second Mark#1.

In the upper table of Fig., mark_type (Fig. 9) of Mark#3 as the fourth Mark() of the seven Mark()'s contained in PlayList#0 is "Event". Thus, Mark#3 is an event mark. In addition, ref_to_PlayItem_id (Fig. 9) of Mark#3 is 0. Thus, Mark#3 belongs to PlayItem#0 of the two PlayItem#0 and #1, shown in Fig., contained in PlayList#0. In addition, mark_time_stamp of Mark#3 is 16380090. Thus, Mark#3 is a mark of time 16380090 of the clip stream file reproduced according to PlayItem#0 contained in PlayList#0. Both entry_ES_stream_id and entry_ES_private_stream_id of Mark#3 are 0. Thus, Mark#3 is not correlated with any elementary stream. In addition, mark_data of Mark#3 is 0. Thus, Mark#3 causes an event whose argument is 0 to occur.

As described above, the clip stream file reproduced according to PlayItem#0 contained in PlayList#0 is the clip stream file "00001.PS". The time 16380090 represented by mark_time_stamp of Mark#3 is a time of the clip stream file "00001.PS".

In the upper table of Fig., the times of the PlayItem() to which each Mark() belongs are described in the left column on the right side of the PlayListMark() table of PlayList#0. The times of PlayList#0 are described in the right column of the table.

The lower table of Fig. is PlayListMark() of PlayList#1 (Fig. 9).

In the lower table of Fig., number_of_PlayList_marks of PlayListMark() of PlayList#1 is 3. Thus, the number of Mark()'s contained in PlayListMark() of PlayList#1 is 3.

In the lower table of Fig., mark_type (Fig. 9) of Mark#0 as the first Mark() of the three Mark()'s contained in PlayList#1 is "Chapter". Thus, Mark#0 is a chapter mark. In addition, ref_to_PlayItem_id (Fig. 9) of Mark#0 is 0. Thus, Mark#0 belongs to the one PlayItem#0, shown in Fig., contained in PlayList#1. mark_time_stamp of Mark#0 is 90000. Thus, Mark#0 is a mark of time 90000 of the clip stream file reproduced according to PlayItem#0 contained in PlayList#1. Both entry_ES_stream_id and entry_ES_private_stream_id of Mark#0 are 0. Thus, Mark#0 is not correlated with any elementary stream. In addition, mark_data of Mark#0 is 0. Thus, Mark#0 represents a chapter whose number is 0.

The clip stream file reproduced according to PlayItem#0 contained in PlayList#1 is the clip stream file "00003.PS" identified by "00003.CLP" described in Clip_Information_file_name of PlayItem#0, described with reference to Fig. Thus, the time 90000 represented by mark_time_stamp of Mark#0 is a time of the clip stream file "00003.PS".

In the lower table of Fig., mark_type (Fig. 9) of Mark#1 as the second Mark() of the three Mark()'s contained in PlayList#1 is "Event". Thus, Mark#1 is an event mark. In addition, ref_to_PlayItem_id (Fig. 9) of Mark#1 is 0. Thus, Mark#1 belongs to PlayItem#0, shown in Fig., contained in PlayList#1. In addition, mark_time_stamp of Mark#1 is 27090000. Thus, Mark#1 is a mark of time 27090000 of the clip stream file reproduced according to PlayItem#0 contained in PlayList#1. In addition, in Mark#1, entry_ES_stream_id is 0xE0 and entry_ES_private_stream_id is 0. Thus, Mark#1 is correlated with the elementary stream whose stream_id is 0xE0, namely with the video stream described with reference to Figs. In addition, mark_data of Mark#1 is 1. Thus, Mark#1 causes an event whose argument is 1 to occur.

As described above, the clip stream file reproduced according to PlayItem#0 contained in PlayList#1 is "00003.PS". Thus, the time 27090000 represented by mark_time_stamp of Mark#1 is a time of the clip stream file "00003.PS".

The video stream whose stream_id is 0xE0, correlated with Mark#1, is the video stream whose stream_id of 0xE0 is described in "00003.CLP", described in Clip_Information_file_name of PlayItem#0 contained in PlayList#1 (Fig.), to which Mark#1 belongs, namely the first elementary stream (video stream) stream#0 of the three elementary streams stream#0 through stream#2 multiplexed with the clip stream file "00003.PS" identified by the clip information file "00003.CLP" shown in Figs. A and B.

In the lower table of Fig., mark_type (Fig. 9) of Mark#2 as the third Mark() of the three Mark()'s contained in PlayList#1 is "Event". Thus, Mark#2 is an event mark. In addition, ref_to_PlayItem_id (Fig. 9) of Mark#2 is 0. Thus, Mark#2 belongs to PlayItem#0, which is the one PlayItem(), shown in Fig., contained in PlayList#1. In addition, mark_time_stamp of Mark#2 is 27540000. Thus, Mark#2 is a mark of time 27540000 of the clip stream file reproduced according to PlayItem#0 contained in PlayList#1. In addition, in Mark#2, entry_ES_stream_id is 0xE1 and entry_ES_private_stream_id is 0. Thus, Mark#2 is correlated with the elementary stream whose stream_id is 0xE1, namely with the video stream described with reference to Figs. In addition, mark_data of Mark#2 is 2. Thus, Mark#2 causes an event whose argument is 2 to occur.

In this example, as described above, the clip stream file reproduced according to PlayItem#0 contained in PlayList#1 is the clip stream file "00003.PS". Thus, the time 27540000 represented by mark_time_stamp of Mark#2 is a time of the clip stream file "00003.PS".

The video stream whose stream_id is 0xE1, correlated with Mark#2, is the video stream whose stream_id of 0xE1 is described in "00003.CLP", described in Clip_Information_file_name of PlayItem#0 contained in PlayList#1 shown in Fig., namely the second elementary stream (video stream) stream#1 of the three elementary streams stream#0 through stream#2 multiplexed with the clip stream file "00003.PS" identified by the clip information file "00003.CLP" shown in Figs. A and B.

In the lower table of Fig., the times of the PlayItem() to which each Mark() belongs are described on the right side of the PlayListMark() table of PlayList#1.

In Fig., although mark_data describes the chapter and index numbers that the chapter and index marks represent, these numbers do not necessarily need to be described in mark_data. Instead, the chapter and index numbers can be recognized by counting the chapter and index marks of PlayListMark().
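
For illustration, a chapter number can be derived by counting rather than stored; a sketch under assumed types (the Mark layout and the type codes here are ours):

    /* Simplified view of a Mark() entry; the mark_type codes are illustrative. */
    enum { MARK_TYPE_CHAPTER = 1, MARK_TYPE_INDEX = 2, MARK_TYPE_EVENT = 3 };

    typedef struct {
        int  mark_type;
        long mark_time_stamp;
        int  mark_data;
    } Mark;

    /* Number of chapter marks at or before marks[target]; when
       marks[target] itself is a chapter mark, this is its chapter number. */
    static int chapter_number_of(const Mark marks[], int n_marks, int target)
    {
        int count = 0;
        for (int i = 0; i <= target && i < n_marks; i++)
            if (marks[i].mark_type == MARK_TYPE_CHAPTER)
                count++;
        return count;
    }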

[Description of the operation of the disc device]

Next, the operation of the disc device shown in Fig. 1 will be described, assuming that the data described with reference to Figs. have been recorded on the disc 101 shown in Fig. 1.

In accordance with the MPEG2-System definition, the multiplexing system does not need to add a timestamp to every access unit. Instead, the definition specifies that timestamps may be added at intervals of 0.7 seconds or less. In other words, there are access units that have a timestamp and access units that do not.

This example assumes that an access unit at the decoding start position of the video stream always has a timestamp. In other words, as will be described in the section "[Playback preparation processing]", the maximum PTS_EP_start that satisfies PTS_EP_start ≤ IN_time is designated as the decoding start position with EP_map(), using the binary search method. The access unit immediately following a video playback start position recorded in EP_map() always has a timestamp.

In addition, the definition assumes that there are no unpaired fields. In other words, immediately after an access unit with pic_struct=1, an access unit with pic_struct=2 is placed. Likewise, immediately after an access unit with pic_struct=2, an access unit with pic_struct=1 is placed.

In this example, it is assumed that access units with pic_struct=7 and 8 are not used.

When the disc 101 is loaded into the disc drive 102, a corresponding message is passed through the drive interface 114 and the operating system 201, shown in Figs. A and B, to the video content playback program 210. When the video content playback program 210 receives from the operating system 201 the message indicating that the disc 101 has been loaded into the disc drive 102, it starts the pre-playback process shown in Fig.

[Pre-playback process]

Fig. shows a flowchart describing the pre-playback process performed by the video content playback program 210.

It should be noted that the disc device does not necessarily need to perform operations or processing in the time sequence of the flowchart. Alternatively, the disc device may perform the operations or processing in parallel or individually. However, in this description, for convenience, the operations or processing of the disc device will be described in accordance with the flowchart.

In the pre-playback process, at step S101, the video content playback program 210 checks the disc 101 using the file system of the operating system 201 and determines whether the disc 101 is a normal disc for the video content playback program 210.

As indicated above, although the disc 101 is accessed and its files are read here using the file system of the operating system 201, the description of these accesses will be omitted.

When the result determined at step S101 indicates that the disc 101 is not a normal disc, namely that the file system used on the disc 101 does not match the operating system 201 or that the root directory of the disc 101 does not contain the directory "VIDEO", the video content playback program 210 determines that it does not match the disc 101, and the processing proceeds to step S102. At step S102, the graphics processing module 219 performs error handling and completes the pre-playback process.

In other words, as the error handling, the graphics processing module 219 generates an error message indicating that the disc 101 is not a normal disc and causes the video output module 220 to output the error message, so that the error message is displayed. The error handling may also be performed, for example, by outputting an audio signal through the audio output module 221 or by unloading the disc 101 from the disc drive 102.

When the result determined at step S101 indicates that the disc 101 is a normal disc, the processing proceeds to step S103. At step S103, the video content playback program 210 causes the content data supply module 213 to request the operating system 201 to read the two data files "SCRIPT.DAT" and "PLAYLIST.DAT" stored in the directory "VIDEO" of the disc 101 (Fig. 6). Thereafter, the processing proceeds to step S104. At step S104, the file "SCRIPT.DAT" is supplied to the script control module 211, and the file "PLAYLIST.DAT" is supplied to the player control module 212.

Thereafter, the flow proceeds from step S104 to steps S105 through S107. At steps S105 through S107, the player control module 212 performs its initialization process. The script control module 211 waits until the player control module 212 finishes the initialization process.

[Initialization process of the player control module 212]

In the initialization process, at step S105, the player control module 212 parses the file "PLAYLIST.DAT" and checks the number of clip information files described in the file "PLAYLIST.DAT" and their file names.

In other words, since the file "PLAYLIST.DAT" is as shown in Fig. and number_of_PlayLists in the file "PLAYLIST.DAT" shown in Fig. is 2, the player control module 212 recognizes that there are two PlayList()'s, which are PlayList#0 and PlayList#1. In addition, since number_of_PlayItems of the first PlayList#0 in the file "PLAYLIST.DAT" shown in Fig. is 2, the player control module 212 recognizes that PlayList#0 contains two PlayItem()'s, which are PlayItem#0 and PlayItem#1. Thereafter, the player control module 212 refers to Clip_Information_file_name of the first PlayItem#0 and the second PlayItem#1 contained in PlayList#0 of the file "PLAYLIST.DAT" shown in Fig. and recognizes that the clip information file of the first PlayItem#0 contained in PlayList#0 is "00001.CLP" and that the clip information file of the second PlayItem#1 is "00002.CLP".

Similarly, the player control module 212 recognizes that the second PlayList#1 contains one PlayItem() (PlayItem#0), because its number_of_PlayItems is 1, and, from Clip_Information_file_name of that PlayItem#0, that the clip information file of PlayItem#0 is "00003.CLP".

Thereafter, the flow proceeds from step S105 to step S106. At step S106, the player control module 212 reads the clip information files detected at step S105, namely the three clip information files "00001.CLP", "00002.CLP" and "00003.CLP", from the directory "CLIP" in the directory "VIDEO" on the disc 101.

At step S106, strictly speaking, only the clip information files of the PlayItem()'s of the PlayList() that is reproduced first need to be read. In accordance with this embodiment, however, as described above, the clip information files of all PlayItem()'s of all PlayList()'s are pre-read.

After step S106, the processing flow proceeds to step S107. At step S107, the player control module 212 determines whether the clip information files detected at step S105 have been successfully read. In addition, the player control module 212 determines whether the clip stream files corresponding to the clip information files are present on the disc 101. In other words, at step S107 the player control module 212 determines whether the clip information files "00001.CLP", "00002.CLP" and "00003.CLP" have been successfully read and whether the clip stream files "00001.PS", "00002.PS" and "00003.PS" corresponding to them are present in the directory "STREAM" in the directory "VIDEO" on the disc 101.

When the result determined at step S107 indicates that the clip information files detected at step S105 have not been successfully read, or that the clip stream files corresponding to the clip information files are not present on the disc 101, namely that clip information files and clip stream files matching the file "PLAYLIST.DAT" have not been recorded on the disc 101, it is determined that the video content playback program 210 does not match the disc 101. Thereafter, the processing flow proceeds to step S102. At step S102, the above-described error handling is performed, and then the pre-playback process ends.

In contrast, when the result determined at step S107 indicates that the clip information files detected at step S105 have been successfully read and that the clip stream files corresponding to the clip information files are present on the disc 101, the player control module 212 finishes the initialization process. Thereafter, the processing flow proceeds to step S108.

At step S108, the script control module 211 parses and executes the file "SCRIPT.DAT".

When the script control module 211 executes the file "SCRIPT.DAT", assume that it causes the player control module 212 to reproduce the first PlayList() (PlayList#0). At this point, the playback process shown in Fig. is performed.

[Playback]

Fig. shows a flowchart of the playback process performed by the video content playback program 210.

[Playback preparation processing]

At steps S121 and S122, the player control module 212 performs playback preparation processing for the PlayList() whose reproduction was commanded by the script control module 211, namely the first PlayList() (PlayList#0).

In other words, at step S121 the player control module 212 checks IN_time (Fig.) of the first PlayItem#0 contained in the first PlayList#0. Thereafter, the processing flow proceeds to step S122. At step S122, the player control module 212 checks the playback start position, corresponding to IN_time of PlayItem#0, in the clip stream file "00001.PS" reproduced according to the first PlayItem#0 contained in the first PlayList#0.

When IN_time (Fig.) of a PlayItem() designates the beginning of a clip stream file, the program stream is read from the beginning of the clip stream file. However, when IN_time designates a position other than the beginning of the clip stream file, the player control module 212 needs to detect the position corresponding to IN_time and read the clip stream file from that position.

In particular, in Fig., IN_time of the first PlayItem#0 contained in the first PlayList#0 is 180090. The player control module 212 searches the EP_map(), shown in Fig., of the clip information file "00001.CLP" corresponding to the clip stream file "00001.PS" reproduced according to the first PlayItem#0 contained in the first PlayList#0 for the playback start position where IN_time of PlayItem#0 is 180090.

In other words, the player control module 212 searches for the maximum PTS_EP_start that satisfies PTS_EP_start ≤ IN_time, where PTS_EP_start represents a decodable start point described in EP_map(), using the binary search method or the like. This is because the position represented by IN_time is not necessarily a decodable start point.

In this case, as described above, IN_time is 180090. In addition, in the EP_map(), shown in Fig., of the clip information file "00001.CLP" corresponding to the clip stream file "00001.PS" reproduced according to the first PlayItem#0 contained in the first PlayList#0, the maximum PTS_EP_start that satisfies PTS_EP_start ≤ IN_time is 180090. Thus, the player control module 212 finds in the EP_map() shown in Fig. the PTS_EP_start whose value is 180090.

In addition, the player control module 212 reads 305 (sectors) as the correspondingly found RPN_EP_start and determines the position represented by this RPN_EP_start in the clip stream file "00001.PS" as the playback start position.
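
A minimal sketch of this search, assuming the EP_map() entries are available as an array sorted by PTS_EP_start in ascending order (the type and function names are ours):

    #include <stdint.h>

    typedef struct {
        uint64_t PTS_EP_start; /* decodable start point, in 90 kHz ticks */
        uint32_t RPN_EP_start; /* corresponding sector number            */
    } ep_map_entry;

    /* Binary search for the entry with the maximum PTS_EP_start that
       satisfies PTS_EP_start <= in_time. Returns the entry index, or -1
       if no entry qualifies. */
    static int find_ep_entry(const ep_map_entry ep[], int n, uint64_t in_time)
    {
        int lo = 0, hi = n - 1, found = -1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            if (ep[mid].PTS_EP_start <= in_time) {
                found = mid;   /* candidate; keep looking for a later one */
                lo = mid + 1;
            } else {
                hi = mid - 1;
            }
        }
        return found;
    }

With IN_time = 180090 and the five entries listed above, this returns the third entry, whose PTS_EP_start is 180090 and whose RPN_EP_start is 305.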

After the player control module 212 determines the playback start position, the processing flow proceeds from step S122 to step S123. At step S123, the player control module 212 controls the graphics processing module 219 to display a time code. The graphics processing module 219 generates the time code under the control of the player control module 212 and outputs it to the video output module 220. The time code is thereby displayed.

The time code displayed at step S123 is, for example, a value obtained by converting the beginning of the PlayList() to 00:00:00 (hours:minutes:seconds). In addition to or instead of the time code, a chapter number and an index number may be displayed.
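
For illustration, converting a 90 kHz time value measured from the beginning of the PlayList() into such a time code might look as follows (a sketch; the function name is ours):

    #include <stdio.h>
    #include <stdint.h>

    /* Convert a 90 kHz timestamp, measured from the beginning of the
       PlayList(), into an hours:minutes:seconds time code string. */
    static void format_time_code(uint64_t pts_ticks, char out[16])
    {
        uint64_t total_seconds = pts_ticks / 90000;  /* 90 kHz clock */
        unsigned h = (unsigned)(total_seconds / 3600);
        unsigned m = (unsigned)((total_seconds / 60) % 60);
        unsigned s = (unsigned)(total_seconds % 60);
        snprintf(out, 16, "%02u:%02u:%02u", h, m, s);
    }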

[PlayListMark() analysis processing]

After the time code is displayed at step S123, the processing flow proceeds to step S124. At step S124, the player control module 212 performs analysis processing that analyzes PlayListMark() (Fig. 9) described in the PlayList() whose reproduction was commanded by the script control module 211, namely in the first PlayList() (PlayList#0).

In particular, in the upper table of Fig., number_of_PlayList_marks of PlayListMark() of the first PlayList#0 of the previously read file "PLAYLIST.DAT" is 7. Thus, the player control module 212 recognizes that the number of Mark()'s contained in PlayList#0 is 7.

In addition, the player control module 212 analyzes the seven Mark()'s of the upper table of Fig. and recognizes that the first through fourth Mark()'s of the seven Mark()'s belong to the first PlayItem() (PlayItem#0) of PlayList#0.

Thereafter, the player control module 212 obtains the mark_time_stamp's of the four Mark()'s that belong to the first PlayItem#0 of PlayList#0 and passes them as a four-element matrix to the decoding control module 214. Thus, the four time values {180090}, {5580090}, {10980090} and {16380090}, which are the mark_time_stamp's of the first through fourth Mark()'s of the seven Mark()'s in the upper table of Fig., are passed from the player control module 212 to the decoding control module 214. At this point, the processing attribute "mark processing" of these time values is also passed from the player control module 212 to the decoding control module 214. When the time counted by the time counting portion 214A matches a time having the attribute "mark processing", the decoding control module 214 passes a message representing this situation, the matched time having the attribute "mark processing", and the attribute "mark processing" to the player control module 212.

[Decision processing for elementary streams to be reproduced]

Thereafter, the processing flow proceeds from step S124 to step S125. At step S125, the player control module 212 decides which elementary streams are to be reproduced.

In other words, in the clip information file "00001.CLP", shown in Figs. A and B, whose file name is described in Clip_Information_file_name of the first PlayItem#0 (Fig.) of the first PlayList#0 as the PlayList() whose reproduction was commanded by the script control module 211, number_of_streams is 4. Thus, the player control module 212 recognizes that four elementary streams have been multiplexed with the corresponding clip stream file "00001.PS". In addition, the player control module 212 checks stream_id and private_stream_id of StaticInfo() of the clip information file "00001.CLP", shown in Figs. A and B, for the four elementary streams and recognizes that the four elementary streams are one video stream, one ATRAC audio stream and two subtitle streams. In other words, the player control module 212 recognizes the numbers of elementary streams having the individual attributes multiplexed with the clip stream file "00001.PS".

The information on the numbers of elementary streams having the individual attributes multiplexed with a clip stream file is used to change one elementary stream to be reproduced to another (from one audio mode to another, or from one subtitle mode to another). When a clip stream file does not contain a subtitle stream (namely, the content does not include subtitle data), whether a subtitle stream exists is determined from the information on the number of elementary streams having the attribute "subtitle stream".

The player control module 212 selects and decides the elementary streams to be reproduced in accordance with the check result of StaticInfo(). In this case, the four elementary streams multiplexed with the clip stream file "00001.PS" contain one elementary stream having the attribute "video stream" and one elementary stream having the attribute "audio stream". Thus, the elementary stream having the attribute "video stream" and the elementary stream having the attribute "audio stream" (the ATRAC audio stream) are unconditionally decided as elementary streams to be reproduced.

On the other hand, the four elementary streams multiplexed with the clip stream file "00001.PS" contain two elementary streams having the attribute "subtitle stream". Thus, one of these two subtitle streams is selected and decided as an elementary stream to be reproduced. In this example, the subtitle stream that appears first of the two in the clip information file "00001.CLP" is selected.

When the attributes and the numbers of the elementary streams multiplexed with the clip stream file "00001.PS" are recognized, the four elementary streams need to be identified. The player control module 212 identifies these four elementary streams multiplexed with the clip stream file "00001.PS" using stream_id and private_stream_id.

In other words, the player control module 212 identifies the elementary stream having the attribute "video stream" of the four elementary streams multiplexed with the clip stream file "00001.PS" by the stream_id, which is 0xE0, as described in the clip information file "00001.CLP" shown in Figs. A and B.

In addition, the player control module 212 identifies the ATRAC audio stream, which is the elementary stream having the attribute "audio stream", of the four elementary streams multiplexed with the clip stream file "00001.PS" by the stream_id, which is 0xBD, and the private_stream_id, which is 0x00, as described in the clip information file "00001.CLP" shown in Figs. A and B.

In addition, the player control module 212 identifies the two subtitle streams, which are the elementary streams having the attribute "subtitle stream", of the four elementary streams multiplexed with the clip stream file "00001.PS" by the stream_id, which is 0xBD, together with the private_stream_id, which is 0x80, and by the stream_id, which is 0xBD, together with the private_stream_id, which is 0x81, as described in the clip information file "00001.CLP" shown in Figs. A and B, respectively.

As described above, an elementary stream multiplexed with a clip stream file can be identified by the combination of stream_id and private_stream_id described as metadata of the clip information file corresponding to the clip stream file.

The combination of stream_id and private_stream_id is a mechanism provided to extend the multiplexing of the MPEG2-System. When the combination of stream_id and private_stream_id is used as metadata, an elementary stream can be reliably identified. In addition, when private_stream_id is extended so that the number of attributes of the corresponding elementary streams increases, the current mechanism can be used without any change. Thus, the combination of stream_id and private_stream_id is highly extensible.
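
For illustration, classifying an elementary stream by this pair might look as follows (a sketch; the stream_id range 0xE0-0xEF is the MPEG video range, while the private_stream_id values are the ones used in the running example and are otherwise assumptions):

    #include <stdint.h>

    typedef enum { ES_VIDEO, ES_ATRAC_AUDIO, ES_SUBTITLE, ES_UNKNOWN } es_kind;

    /* Classify an elementary stream by (stream_id, private_stream_id). */
    static es_kind classify_stream(uint8_t stream_id, uint8_t private_stream_id)
    {
        if (stream_id >= 0xE0 && stream_id <= 0xEF) /* MPEG video stream_id;  */
            return ES_VIDEO;                        /* private_stream_id is
                                                       ignored for video      */
        if (stream_id == 0xBD) {                    /* private_stream_1       */
            if (private_stream_id == 0x00)          /* ATRAC audio (example)  */
                return ES_ATRAC_AUDIO;
            if (private_stream_id == 0x80 ||        /* subtitle streams       */
                private_stream_id == 0x81)          /* (example values)       */
                return ES_SUBTITLE;
        }
        return ES_UNKNOWN;
    }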

In other words, for example, the Blu-ray disc (BD) standard uses the PID (packet ID) of the MPEG2 standard transport stream to identify data; thus, the BD standard is restricted by the MPEG2 standard. On the other hand, the DVD-Video standard defines sub_stream_id, which is similar to private_stream_id. However, sub_stream_id cannot be described in a database so as to identify streams. sub_stream_id is described in a fixed region holding information for only eight or 32 streams (see VI4-49, table 4.2.1-2 (VTS_AST_ATRT), and VI4-52, table 4.2.1-3 (VTS_SPST_ATRT)). Thus, sub_stream_id is not highly extensible.

On the other hand, the combination of stream_id and private_stream_id can be described using metadata. For example, in the clip information file Clip() shown in Fig., the combination of stream_id and private_stream_id can be described the number of times represented by number_of_streams. Thus, the elementary streams multiplexed with a clip stream file can be identified by the combination of stream_id and private_stream_id as metadata described in the clip information file Clip(), regardless of their number (in the range representable by number_of_streams).

In accordance with this embodiment, the combination of stream_id and private_stream_id is used to identify an elementary stream multiplexed with the clip stream file corresponding to the clip information file shown in Fig. In addition, this combination can be used to identify an elementary stream correlated with a Mark(), as the combination of entry_ES_stream_id and entry_ES_private_stream_id of PlayListMark() shown in Fig. 9. Furthermore, the combination of stream_id and private_stream_id is used to identify an elementary stream whose decodable start point information is described in EP_map() shown in Fig.

[Output attribute control processing]

Next, the processing flow proceeds from step S125 to step S126. At step S126, the player control module 212 performs output attribute control processing for the elementary streams decided to be reproduced at step S125.

In particular, the player control module 212 checks number_of_DynamicInfo (Fig.), which represents the number of DynamicInfo()'s (Fig.) describing the output attributes of the video stream, the ATRAC audio stream and the subtitle stream decided to be reproduced at step S125.

In this case, the video stream, the ATRAC audio stream and the subtitle stream to be reproduced are elementary streams multiplexed with the clip stream file "00001.PS". In the clip information file "00001.CLP", shown in Figs. A and B, their number_of_DynamicInfo's are all 0. When the number_of_DynamicInfo's are all 0, the player control module 212 does not perform output attribute control processing for the output attributes of the elementary streams to be reproduced.

When number_of_DynamicInfo of an elementary stream to be reproduced is not 0, output attribute control processing is performed for that elementary stream. This processing will be described below.

[Playback start processing]

After step S126, the processing flow proceeds to step S127. At step S127, the player control module 212 performs playback start processing for the elementary streams to be reproduced.

In other words, the player control module 212 passes the file name of the clip stream file "00001.PS", with which the elementary streams to be reproduced have been multiplexed, and RPN_EP_start (=305), described in EP_map() as the playback start position decided at step S122, to the content data supply module 213.

In addition, the player control module 212 initializes the buffer control module 215 before the program stream stored in the clip stream file "00001.PS", with which the elementary streams to be reproduced have been multiplexed, is supplied to the buffer control module 215.

In particular, the buffer control module 215 (Fig. 5) sets the same value to the data start pointer stored in the data start pointer storage block 231, the data write pointer stored in the data write pointer storage block 232, the video read pointer stored in the video read pointer storage block 241, the audio read pointer stored in the audio read pointer storage block 251, and the subtitle read pointer stored in the subtitle read pointer storage block 262.

Thus, the data start pointer stored in the data start pointer storage block 231 and the data write pointer stored in the data write pointer storage block 232 hold the same position in the buffer 215A of the buffer control module 215. This means that no valid data are stored in the buffer 215A.
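
The pointer arrangement described above amounts to an empty ring buffer; a sketch under assumed names (the struct and its fields are ours):

    #include <stddef.h>
    #include <stdint.h>

    /* Ring-buffer state of buffer 215A: one start/write pointer pair plus
       one read pointer per consumer. */
    typedef struct {
        uint8_t *base;          /* start of buffer 215A                  */
        size_t   size;          /* capacity of buffer 215A               */
        size_t   data_start;    /* oldest valid byte (data start ptr)    */
        size_t   data_write;    /* next write position (data write ptr)  */
        size_t   video_read;    /* video read pointer                    */
        size_t   audio_read;    /* audio read pointer                    */
        size_t   subtitle_read; /* subtitle read pointer                 */
    } buffer_215A_state;

    static void buffer_init(buffer_215A_state *b)
    {
        /* Setting every pointer to the same position marks the buffer
           as containing no valid data. */
        b->data_start = b->data_write = 0;
        b->video_read = b->audio_read = b->subtitle_read = 0;
    }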

In addition, the player control module 212 passes stream_id and, where necessary, private_stream_id, as identification information of the elementary streams to be reproduced, to the buffer control module 215.

In other words, as described above, of the elementary streams to be reproduced, the video stream having the attribute "video stream" is identified by the stream_id, which is 0xE0; the ATRAC audio stream having the attribute "audio stream" is identified by the stream_id, which is 0xBD, and the private_stream_id, which is 0x00; and the subtitle stream having the attribute "subtitle stream" is identified by the stream_id, which is 0xBD, and the private_stream_id, which is 0x80. The player control module 212 passes these stream_id's and private_stream_id's to the buffer control module 215.

In the buffer control module 215 (Fig. 5), the video reading block 233 stores the stream_id, which is 0xE0, for the video stream, received from the player control module 212, in the stream_id register 242. In addition, the audio reading block 234 stores the stream_id, which is 0xBD, and the private_stream_id, which is 0x00, received from the player control module 212, in the stream_id register 252 and the private_stream_id register 253, respectively. In addition, the subtitle reading block 235 stores the stream_id, which is 0xBD, and the private_stream_id, which is 0x80, received from the player control module 212, in the stream_id register 263 and the private_stream_id register 264, respectively.

The player control module 212 stores, for further processing, the stream_id's and private_stream_id's of the elementary streams to be reproduced that were passed to the buffer control module 215. The player control module 212 uses these stream_id's and private_stream_id's when a stream change request message occurs, or to identify the stream being reproduced in the mark processing, which will be described below.

To initialize the buffer control module 215 (Fig. 5), the player control module 212 also sets a subtitle read function flag, having a value in accordance with the clip stream file multiplexed with the elementary streams to be reproduced, to the subtitle read function flag storage block 261.

In other words, in this case, because the clip stream file "00001.PS", with which the elementary streams to be reproduced have been multiplexed, contains a subtitle stream, the subtitle read function flag whose value is 1 is set to the subtitle read function flag storage block 261 to activate the subtitle reading block 235. When the clip stream file with which the elementary streams to be reproduced have been multiplexed does not contain a subtitle stream, the subtitle read function flag whose value is 0 is set to the subtitle read function flag storage block 261. In this case, the subtitle reading block 235 does not perform any processing.

In addition, the player control module 212 passes IN_time, whose value is 180090, and OUT_time, whose value is 27180090, of the first PlayItem#0 (Fig.) contained in the first PlayList#0, whose reproduction was commanded by the script control module 211, to the decoding control module 214. The decoding control module 214 uses IN_time to start decoding the clip reproduced in accordance with the PlayItem(), and uses OUT_time to stop decoding the clip and to control the PlayItem change processing, which will be described below.

The player control module 212 initializes the display mode in which the graphics processing module 219 displays the subtitle stream. In other words, the player control module 212 controls the graphics processing module 219 so that the subtitle stream is displayed in the default display mode.

[Started the data is read]

After that, the processing flow proceeds to step S127 to step S128. Module 212 controls player controls module 213 of the filing data of the content to read the file stream of the clip that contains the program stream, which was multiplexed elementary stream that is intended for playback using the operating system 201. In other words, the module 213 feed data content refers to the file stream clip "00001.PS" in the directory "STREAM", which is located in the directory "VIDEO" disk 101 (6), denotes the sector 305, which represents the position of the beginning of the play, which was determined at step S122, and provide reading this file, the operating system 201. Module 213 feed data content provides the data flow through the operating system 201 to supply the data that has been read from the disk 101, the module 215 controls the buffer.

Thus, the program stream file stream clip "00001.PS" is read from the disk 101. Program stream is served in module 215 controls the buffer.

Module 215 control buffer (figure 5) writes the program stream that has been read from the disc 101, in the position represented by the index data recording block 232 of the memory pointer write data buffer A, and performs a sequential increase the record pointer data size recorded the data.

Unless specified otherwise, when the buffer A module 215 controls the buffer has free space, the module 213 feed data content reads data from the disc 101, transmits and stores these data in the buffer A module 215 buffer management. Thus, in the buffer A usually saved enough data.
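
This keep-the-buffer-full behavior can be sketched as follows in Python (a deque stands in for the buffer 215A; the sector size and capacity are illustrative assumptions):

    # Minimal sketch: keep reading sectors from the disc while the buffer
    # has free space, so that buffer 215A normally stays sufficiently full.
    from collections import deque

    SECTOR_SIZE = 2048        # assumed sector size
    CAPACITY_SECTORS = 1024   # assumed buffer capacity

    buffer_215a = deque()     # stands in for buffer 215A

    def supply_step(read_sector):
        """Read one sector if there is room; False signals 'buffer full'."""
        if len(buffer_215a) < CAPACITY_SECTORS:
            buffer_215a.append(read_sector())
            return True
        return False

    supply_step(lambda: b"\x00" * SECTOR_SIZE)  # example call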

[Start of decoder control]

When data begin to be read from the disc 101 and stored in the buffer 215A of the buffer control module 215, the flow proceeds from step S128 to step S129. At step S129 the decode control module 214 controls the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 so that they start reading data from the buffer 215A, as an operation preceding decoding.

Thus the video decoder control module 216 requests data from the video data read block 233 of the buffer control module 215 (Fig. 5). In accordance with the request, the video decoder control module 216 receives from the buffer control module 215 one video access unit stored in the buffer 215A, the PTS and DTS (sometimes called the time stamp) added to that video access unit, and pic_struct_copy, au_ref_flag and AU_length, which are information (sometimes called additional information) described in the PES_packet() of private_stream_2 that immediately precedes each decode start possible point, and so on. The time stamp is passed from the video decoder control module 216 to the decode control module 214 whenever the video decoder control module 216 receives a video access unit.

pic_struct_copy, passed from the video data read block 233, is used to update the time. Instead, the pic_struct contained in the bit stream, obtained as a result of parsing, may be used.

On the other hand, the audio decoder control module 217 requests data from the audio read block 234 of the buffer control module 215 (Fig. 5). In accordance with the request, the audio decoder control module 217 receives from the buffer control module 215 one (ATRAC) audio access unit stored in the buffer 215A and the time stamp (PTS, DTS) added to that audio access unit. The time stamp is passed from the audio decoder control module 217 to the decode control module 214 whenever the audio decoder control module 217 receives an audio access unit.

In addition, the subtitle decoder control module 218 requests data from the subtitle read block 235 of the buffer control module 215 (Fig. 5). In accordance with the request, the subtitle decoder control module 218 receives from the buffer control module 215 one subtitle access unit stored in the buffer 215A and the time stamp added to that subtitle access unit. The time stamp is passed from the subtitle decoder control module 218 to the decode control module 214 whenever the subtitle decoder control module 218 receives a subtitle access unit. When the elementary streams to be played back contain no subtitle stream, or when no subtitle access unit is stored in the buffer 215A, no data are passed from the buffer control module 215 to the subtitle decoder control module 218.

Whenever the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 request data from the buffer control module 215, they pass the results of their data requests on to the decode control module 214.

Details of the reading of data from the buffer 215A when the buffer control module 215 passes those data to the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 will be described later.

[Start of data decoding]

When the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 begin reading data from the buffer 215A of the buffer control module 215, the flow proceeds from step S129 to step S130. At step S130 these modules begin to decode the data that have been read.

In other words, the decode control module 214 causes the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 to start decoding in accordance with 180,090, the IN_time of the first PlayItem#0 contained in PlayList#0, passed from the player control module 212 at step S127, and in accordance with the time stamps passed from the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 at step S129, with the timing shifted, when necessary, to secure synchronization.

A method of starting data decoding with the timing shifted to secure synchronization is described, for example, in Japanese Patent No. 3496725. Briefly, the minimum of the time stamps passed from the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 is set as the initial value of the time counted by the time count block 214A. The time count block 214A starts counting time from that set value. When the time counted by the time count block 214A matches a time stamp, the decode control module 214 causes these modules to decode the data.

The video decoder control module 216 receives the first decode command from the decode control module 214, passes one video access unit received from the video data read block 233 of the buffer control module 215 (Fig. 5) to the video decoder 116 (Fig. 1), and has the video decoder 116 decode that video access unit. In addition, the video decoder control module 216 passes the video data decoded by the video decoder 116 to the graphics processing module 219.

Thereafter, the video decoder control module 216 has the video decoder 116 successively decode the video access units received one at a time from the video data read block 233 of the buffer control module 215, and passes the decoded access units as video data to the graphics processing module 219.

At this point, the video decoder 116 may change the order in which video data are decoded relative to the order in which they are output. For example, as shown in the figure, video data may be decoded in the order I1, B0, P3, B2, P5, B4, while being output in the order B0, I1, B2, P3, B4, P5. For this purpose the video decoder 116 contains a decoded picture buffer (DPB) that holds the decoded pictures. In the figure, In denotes the n-th I picture, Bn the n-th B picture, and Pn the n-th P picture.
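
The reordering can be illustrated with the following sketch, which replays the example above using a small decoded picture buffer keyed by presentation order (the integer presentation indices are illustrative, not actual PTS values):

    # Sketch: pictures arrive in decode order and are released in display
    # order once every earlier picture is available in the DPB.
    decode_order = [("I1", 1), ("B0", 0), ("P3", 3),
                    ("B2", 2), ("P5", 5), ("B4", 4)]

    dpb, displayed, next_idx = [], [], 0
    for name, idx in decode_order:
        dpb.append((idx, name))
        # output every buffered picture whose display turn has arrived
        while any(i == next_idx for i, _ in dpb):
            pos = next(p for p, (i, _) in enumerate(dpb) if i == next_idx)
            displayed.append(dpb.pop(pos)[1])
            next_idx += 1

    displayed += [n for _, n in sorted(dpb)]  # flush at end of stream
    print(displayed)  # ['B0', 'I1', 'B2', 'P3', 'B4', 'P5']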

The audio decoder control module 217 receives the first decode command from the decode control module 214, passes one audio access unit obtained from the audio read block 234 of the buffer control module 215 (Fig. 5) to the audio decoder 117 (Fig. 1), and has the audio decoder 117 decode that audio access unit. The audio decoder control module 217 passes the audio data decoded by the audio decoder 117 to the audio output module 221.

Thereafter, the audio decoder control module 217 has the audio decoder 117 successively decode the audio access units obtained one at a time from the audio read block 234 of the buffer control module 215, and passes the decoded access units as audio data to the audio output module 221.

In addition, the subtitle decoder control module 218 receives the first decode command from the decode control module 214, decodes, in accordance with that command, one subtitle access unit obtained from the subtitle read block 235 of the buffer control module 215 (Fig. 5) using its internal subtitle decoding software, and passes the decoded subtitle access unit as subtitle data (subtitle image data) to the graphics processing module 219.

Thereafter, the subtitle decoder control module 218 successively decodes, using the internal subtitle decoding software, the subtitle access units obtained one at a time from the subtitle read block 235 of the buffer control module 215, and passes the decoded subtitle access units as subtitle data to the graphics processing module 219.

[Graphics processing]

Thereafter, the flow proceeds from step S130 to step S131. At step S131 the graphics processing module 219 performs graphics processing on the video data passed from the video decoder control module 216 and, when necessary, on the subtitle data passed from the subtitle decoder control module 218.

In other words, the graphics processing module 219 may perform subtitle processing that, for example, enlarges or reduces the subtitle data passed from the subtitle decoder control module 218, in accordance with a display mode command received from the player control module 212. When the graphics processing module 219 has received no display mode command from the player control module 212, or has received the default display mode command from it, the graphics processing module 219 keeps the subtitle data received from the subtitle decoder control module 218 unchanged.

In addition, the graphics processing module 219 adds together the video data received from the video decoder control module 216 and the subtitle data received from the subtitle decoder control module 218, or the subtitle data after processing, obtains output video data on which the subtitle data have been overlaid, and passes the overlaid video data to the video output module 220.

When the graphics processing module 219 receives, from the script control module 211 or the player control module 212, a command to display information such as a menu, a message, the time code, a chapter number or an index number, the graphics processing module 219 generates that information, overlays it on the output video data, and passes the overlaid data to the video output module 220.
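
The overlay itself can be sketched as follows (the pixel representation and the all-or-nothing alpha test are simplified stand-ins for the actual graphics processing):

    # Sketch: copy opaque subtitle pixels onto the video frame at (x, y).
    def overlay(video_frame, subtitle, x, y):
        # Frames are lists of rows of (r, g, b, a) tuples.
        for sy, row in enumerate(subtitle):
            for sx, (r, g, b, a) in enumerate(row):
                if a:  # skip transparent subtitle pixels
                    video_frame[y + sy][x + sx] = (r, g, b, 255)
        return video_frame

    frame = [[(0, 0, 0, 255)] * 4 for _ in range(2)]
    sub = [[(255, 255, 255, 255), (0, 0, 0, 0)]]
    overlay(frame, sub, 1, 0)
    print(frame[0][1], frame[0][2])  # white pixel copied, black kept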

[Output processing]

After step S131, the flow proceeds to step S132. At step S132 the video output module 220 successively stores the output video data passed from the graphics processing module 219 in the FIFO 220A, and successively outputs the video data stored in the FIFO 220A at a predetermined output rate.

As long as the FIFO 220A has sufficient storage capacity (free space), the video output module 220 accepts output video data from the graphics processing module 219. When the FIFO 220A does not have sufficient capacity, the video output module 220 causes the graphics processing module 219 to stop supplying output video data. As a result, the graphics processing module 219 stops its processing. In addition, the graphics processing module 219 causes the video decoder control module 216 and the subtitle decoder control module 218 to stop their processing. Thus the video decoder control module 216 and the subtitle decoder control module 218 stop their processing.

After the video output module 220 has caused the graphics processing module 219 to stop supplying output video data, and the FIFO 220A has output video data so that it again has sufficient storage capacity, the video output module 220 causes the graphics processing module 219 to resume supplying output video data. As with the request to stop supplying output video data, the graphics processing module 219 informs the video decoder control module 216 and the subtitle decoder control module 218 of this request. Thus the graphics processing module 219, the video decoder control module 216 and the subtitle decoder control module 218 resume the stopped processing.
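
A toy sketch of this flow control (capacity and names are illustrative):

    # Sketch: the producer chain (modules 219, 216, 218) is paused while
    # FIFO 220A is full and resumed as soon as output frees space.
    from collections import deque

    FIFO_CAPACITY = 3
    fifo_220a = deque()
    producing = True  # stands for the running/stopped state of the producers

    def push_frame(frame):
        global producing
        if len(fifo_220a) < FIFO_CAPACITY:
            fifo_220a.append(frame)
        else:
            producing = False            # stop request propagated upstream

    def output_one_frame():
        global producing
        if fifo_220a:
            frame = fifo_220a.popleft()  # output at the given rate
            producing = True             # room again: resume the producers
            return frame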

On the other hand, the audio output module 221 likewise successively stores in the FIFO 221A the audio data passed from the audio decoder control module 217, as described at step S130, and successively outputs the audio data at a predetermined output rate (sampling frequency).

As long as the FIFO 221A has sufficient storage capacity (free space), the audio output module 221 accepts audio data from the audio decoder control module 217. However, when the FIFO 221A does not have sufficient capacity, the audio output module 221 causes the audio decoder control module 217 to stop supplying audio data. Thus the audio decoder control module 217 stops its processing.

After the audio output module 221 has caused the audio decoder control module 217 to stop supplying audio data, and the FIFO 221A has output audio data so that it again has sufficient storage capacity, the audio output module 221 causes the audio decoder control module 217 to resume supplying audio data. Thus the audio decoder control module 217 resumes the stopped processing.

In the manner described above, elementary streams are decoded as the video output module 220 and the audio output module 221 output data.

[Internal structure of the video decoder 116]

The figure shows the internal structure of the video decoder 116. In this example, the video decoder 116 consists of a video decoding engine 116A and a DPB (decoded picture buffer) 116B. The DPB 116B in turn comprises DPB 116B-1 through DPB 116B-n (hereinafter collectively called DPB 116B unless otherwise specified). In addition, as shown in the figure, each DPB 116B consists of a video buffer 301 and an additional information buffer 302.

The video decoding engine 116A uses the video buffer 301 of the DPB 116B to temporarily hold decoded video data and to use that video data as future reference pictures. At this point, the additional information obtained from the video data read block 233 and the parameters (for example, pic_struct) obtained by analyzing the access unit are written into the additional information buffer 302 corresponding to the video data held in the video buffer 301.

The foregoing has described the overall processing, and its flow, by which the disc playback device shown in Fig. 1 reproduces data from the disc 101. Next, other processes or operations that the disc playback device performs while it reproduces data from the disc 101 will be described.

[Transfer of time information by the decode control module 214]

Next, the updating of the clock (time count block 214A) will be described. The video decoder control module 216 decodes, using the video decoder 116, an input video access unit. After the video decoder 116 performs decoding and reordering, video data for one frame (two fields) are output to the graphics processing module 219 for display. In addition, the time stamp (PTS/DTS) and the pic_struct information of the video data are passed from the video decoder control module 216 to the decode control module 214.

If the pic_struct of an access unit is 1 or 2, the access unit is one field. Thus, when two access units are received, the pic_struct of the earlier field and, if the earlier field has a time stamp, its time stamp are passed from the video decoder control module 216 to the decode control module 214, so that the two access units are treated as one. If the earlier field has no time stamp, information indicating that the earlier field has no time stamp is passed to the decode control module 214. Because an isolated field is not permitted, a field whose pic_struct is 1 or 2 is immediately followed by a field whose pic_struct is 2 or 1, respectively. When these two fields are treated as one, the time stamp of the earlier field is used as the representative value.

If the pic_struct of an access unit is 0, 3, 4, 5 or 6, then when one access unit is output, its pic_struct and, if the access unit has a time stamp, its time stamp are passed from the video decoder control module 216 to the decode control module 214. When the access unit has no time stamp, information indicating that it has no time stamp is passed to the decode control module 214.

The decode control module 214 updates the time count block 214A using the received time stamps and pic_struct information.

Next, how the time count block 214A is updated will be described with reference to the flow chart shown in the figure.

The decode control module 214 determines whether the received access unit has a time stamp (step S141). If the access unit has a time stamp, the decode control module 214 sets the value of that time stamp (PTS) in the time count block 214A (step S142). As described above, because an access unit normally has a time stamp immediately after the decoding process begins, no abnormality occurs at the starting position. If the access unit has no time stamp, a value corresponding to the previous pic_struct is added to the current time (step S144). Thereafter, the current value of pic_struct is retained for the next round of processing, and the current processing ends (step S143).

As shown in the figure, if the retained pic_struct is 0, 3 or 4, the time count block 214A adds the time value of two fields. If the retained pic_struct is 5 or 6, the time count block 214A adds the time value of three fields. If the retained pic_struct is 1 or 2, the time count block 214A likewise adds the time value of two fields, because paired fields are treated as one access unit.
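
These rules can be sketched as follows, assuming a 90 kHz clock and 59.94 fields per second (both illustrative assumptions; the field counts per pic_struct follow the table just described):

    # Sketch of steps S141-S144: set the clock from the PTS when present,
    # otherwise advance it by the display duration of the previous unit.
    FIELD_90KHZ = 90000 / 59.94            # one field in 90 kHz ticks

    FIELDS_BY_PIC_STRUCT = {0: 2, 3: 2, 4: 2,  # frame: two fields
                            5: 3, 6: 3,        # field repeat: three fields
                            1: 2, 2: 2}        # paired fields count as two

    clock = 0.0            # time count block 214A
    prev_pic_struct = 0    # retained at step S143

    def on_access_unit(pts, pic_struct):
        global clock, prev_pic_struct
        if pts is not None:                # step S142
            clock = pts
        else:                              # step S144
            clock += FIELDS_BY_PIC_STRUCT[prev_pic_struct] * FIELD_90KHZ
        prev_pic_struct = pic_struct       # step S143

    on_access_unit(180090, 0)   # time-stamped frame: clock = 180090
    on_access_unit(None, 0)     # no time stamp: clock += two fields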

By this processing, the time indicated by the time count block 214A becomes the display start time of the access unit (for one frame) that was output from the video decoder control module 216 to the graphics processing module 219. In other words, if the video data have a time stamp, the PTS replaces the time of the time count block 214A; if the video data have no time stamp, the display interval of the immediately preceding video data in display order is added.

In this example, AVC is used as the video encoding system. In the MPEG-2 Video system, for example, the display duration of an access unit can be obtained using repeat_first_field.

As described above, if the storage capacity of the FIFO 220A becomes insufficient, the video output of the video decoder control module 216 is stopped. In this case, the updating of the time count block 214A automatically stops. When the updating of video data in the FIFO 220A resumes, the updating of the time count block 214A automatically resumes.

In other words, when the playback mode is changed to the pause state in response to a command issued by the user, the updating of the video output module 220 is stopped, the video decoder control module 216 stops, and thus the counting of time (by the time count block 214A) stops. When the pause state is switched back to normal playback, the updating of the video output module is permitted again. As a result, the video decoder control module 216 and the output of video data resume, and the updating of the clock (time count block 214A) also resumes.

The same behavior is expected during slow playback. In other words, slow playback is a state in which the pause state and normal playback alternate. At this point the clock (time count block 214A) is updated in synchronization with the output of video data.

In this example, the time count block 214A is updated synchronously with the video data output of the video decoder control module 216. However, if the delay occurring in the video decoder control module 216 and downstream of it, in this case in the graphics processing module 219 and the video output module 220, is large, there is a possibility that the relationship between the video data presented to the user and the clock (time count block 214A) will deviate. In this case, by updating the clock (time count block 214A) synchronously with the video data output from the video output module 220, such deviation can be eliminated.

Specifically, the additional information buffer 302 described with reference to the figure is added to each of the blocks that handle video data, namely the graphics processing module 219, the video output module 220 and the FIFO 220A, so that the video data and the additional information are processed in pairs until the video data are displayed. In addition, when the video data are output from the video output module 220, the corresponding additional information is passed to the decode control module 214. The decode control module 214 updates the clock (time count block 214A) using the algorithm described above.

In this way, regardless of whether the delay occurring in the decoder and downstream of it is large or small, the displayed video data and the clock (time count block 214A) can be synchronized.

As a result, even if the device reproducing the stream data does not include a clock performing independent timekeeping, the stream data can be reproduced accurately. Thus the processing load imposed on the CPU 112 can be reduced.

[Change of PlayItem]

As described with reference to the figures, the first PlayItem#0 of the first PlayList#0 shown in the figure is played. In accordance with PlayList#0, after playback of the first PlayItem#0, the second PlayItem#1 is played. In other words, PlayItem change processing is performed that changes the PlayItem from PlayItem#0 to PlayItem#1.

Next, the PlayItem change processing will be described with reference to the flow chart shown in the figure.

As described with reference to the figures, after the clip of the first PlayItem#0 of PlayList#0 begins playing, while the first PlayItem#0 is being played, the decode control module 214 (Figs. 2A and 2B) checks the time counted by the time count block 214A.

[End of playback of PlayItem#0]

When the time counted by the time count block 214A becomes equal to 27,180,090 (Fig.), which is the OUT_time of the first PlayItem#0 passed from the player control module 212 at step S127, the decode control module 214 performs decode cancellation control to finish playback of PlayItem#0 at step S151.

If the time count block 214A does not run at exactly 90 kHz, that is, if the time is updated in accordance with the output of video data, the time that it counts may not exactly match the OUT_time of PlayItem#0. In this case, playback of PlayItem#0 is finished by cancelling the decoding process at the moment the OUT_time of PlayItem#0 becomes close to the time counted by the time count block 214A. This processing will be described later with reference to the figures.
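
The proximity test can be sketched as follows, under the illustrative assumption that the clock advances in whole-frame steps of a 29.97 Hz stream:

    # Sketch: end the PlayItem once the next clock update would step past
    # OUT_time, since exact equality cannot be relied on.
    FRAME_90KHZ = 90000 / 29.97    # one-frame step, assumed frame rate

    def should_end_playitem(clock, out_time, step=FRAME_90KHZ):
        return clock + step > out_time

    print(should_end_playitem(27_180_000, 27_180_090))  # True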

In other words, the decode control module 214 causes the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 to stop their decoding operation. In addition, the decode control module 214 controls the video output module 220 so that it continues to output the video data successively.

In addition, the decode control module 214 passes a message indicating that the first PlayItem#0 has finished playing to the player control module 212.

[Start of playback of PlayItem#1]

As described above, the player control module 212 recognized at step S105 that the first PlayList#0 contains the first PlayItem#0 and the second PlayItem#1. When the player control module 212 receives from the decode control module 214 the message indicating that the first PlayItem#0 has finished playing, the flow proceeds from step S151 to step S152. At step S152 the player control module 212 starts playing the second PlayItem#1 in the same manner as the first PlayItem#0.

As at step S122, in the playback processing of the second PlayItem#1, the player control module 212 determines one of the RPN_EP_start values described in EP_map() as the playback start position of the second PlayItem#1.

In addition, the player control module 212 recognizes the Mark()'s belonging to the second PlayItem#1, as described at step S124, and the number of elementary streams of each attribute multiplexed in the clip stream file "00002.PS" played by PlayItem#1, as described at step S125, and identifies the elementary streams to be played back.

The player control module 212 then performs the same processing as at step S127.

In other words, the player control module 212 passes the RPN_EP_start of EP_map() determined as the playback start position, together with the name of the clip stream file in which the elementary streams to be played back are multiplexed, namely the clip stream file "00002.PS" corresponding to "00002.CLP" described in the Clip_Information_file_name of the second PlayItem#1 (Fig.), to the content data supply module 213.

In addition, before the program stream stored in the clip stream file "00002.PS", in which the elementary streams to be played back are multiplexed, is supplied to the buffer control module 215, the player control module 212 initializes the buffer control module 215.

In other words, the buffer control module 215 (Fig. 5) sets the same value to the data start pointer stored in the data start pointer storage block 231, the data write pointer stored in the data write pointer storage block 232, the video read pointer stored in the video read pointer storage block 241, the audio read pointer stored in the audio read pointer storage block 251, and the subtitle read pointer stored in the subtitle read pointer storage block 262.

In addition, the player control module 212 passes the stream_id and, when necessary, the private_stream_id as identification information for the elementary streams to be played back to the buffer control module 215.

The video data read block 233 of the buffer control module 215 (Fig. 5) receives the stream_id of the video stream among the elementary streams to be played back from the player control module 212 and stores it in the stream_id register 242. In addition, the audio read block 234 receives the stream_id and private_stream_id of the audio stream among the elementary streams to be played back from the player control module 212 and stores them in the stream_id register 252 and the private_stream_id register 253, respectively.

Because the clip stream file "00002.PS", in which the elementary streams to be played back are multiplexed, contains subtitle streams, the stream_id and private_stream_id of the subtitle stream among the elementary streams to be played back are passed from the player control module 212 to the subtitle read block 235. The subtitle read block 235 stores that stream_id and private_stream_id in the stream_id register 263 and the private_stream_id register 264, respectively.

To initialize the buffer control module 215 (Fig. 5), the player control module 212 sets a subtitle read function flag, with a value corresponding to the clip stream file in which the elementary streams to be played back are multiplexed, in the subtitle read function flag storage block 261.

In this case, because the clip stream file "00002.PS", in which the elementary streams to be played back are multiplexed, contains subtitle streams, the subtitle read function flag whose value is 1 is set in the subtitle read function flag storage block 261 to activate the subtitle read block 235.

The player control module 212 passes 90,000 as the IN_time and 27,090,000 as the OUT_time of the second PlayItem#1 to be played (Fig.) to the decode control module 214.

In addition, the player control module 212 initializes the subtitle stream display mode command for the graphics processing module 219. In other words, the player control module 212 controls the graphics processing module 219 so that it displays the subtitle stream in the default display mode.

When the configurable_flag (Fig.) of the subtitle stream to be played back is 1, indicating that the display mode is permitted to be changed, the subtitle stream display mode command that the player control module 212 passes to the graphics processing module 219 may keep the current display mode.

The second PlayItem#1 is played in the same manner as the first PlayItem#0. While the second PlayItem#1 is being played, the decode control module 214 checks the time counted by the time count block 214A. When the counted time becomes equal to 27,090,000 (Fig.), which is the OUT_time of the second PlayItem#1 passed from the player control module 212 at step S152 (Fig.), the decode control module 214 performs the same decode cancellation control as at step S151 to finish playback of PlayItem#1. As described above, the time counted by the time count block 214A may not exactly match the OUT_time. In that case, at the moment the OUT_time becomes close to the time counted by the time count block 214A, decode cancellation control is performed and playback of the PlayItem is finished. This processing will be described later with reference to the figures.

[Display of the time code]

Next, as described above, the time code is displayed at step S123, and the display of the time code is successively updated.

Next, the time code display processing will be described with reference to the flow chart shown in the figure.

When the time count block 214A built into the decode control module 214 (Figs. 2A and 2B) counts off one second, the flow proceeds to step S171. At step S171 the decode control module 214 passes a message indicating that one second has elapsed, together with the current time counted by the time count block 214A, to the player control module 212. Thereafter, the flow proceeds to step S172. At step S172 the player control module 212 receives the message and the current time from the decode control module 214 and converts the current time into a time code. Thereafter, the flow proceeds to step S173.

At step S173 the player control module 212 controls the graphics processing module 219 so that it displays the time code obtained at step S172. Thereafter, the flow returns to step S171.

Thus the time code is updated at one-second intervals. The update interval of the time code is not limited to one second.
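
The conversion at step S172 can be sketched as follows, assuming the current time is held in 90 kHz ticks and the time code is rendered as HH:MM:SS:

    # Sketch: convert a 90 kHz tick count into a displayable time code.
    def to_time_code(ticks_90khz):
        total_seconds = ticks_90khz // 90_000
        h, rem = divmod(total_seconds, 3600)
        m, s = divmod(rem, 60)
        return f"{h:02d}:{m:02d}:{s:02d}"

    print(to_time_code(27_090_000))  # '00:05:01'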

[Change of streams]

The clip stream file "00001.PS", played by the first PlayItem#0 of the first PlayList#0 described with reference to the figure, and the clip stream file "00002.PS", played by the second PlayItem#1, each have two subtitle streams multiplexed in them, as described with reference to Figs. A and B.

When a plurality of elementary streams having the same attribute are multiplexed in a clip stream file, the elementary stream to be played back can be changed from one elementary stream to another elementary stream of that attribute.

Next, the stream change processing will be described with reference to the flow chart shown in the figure.

A stream change request occurs when, for example, a stream change command is described as a script program in the "SCRIPT.DAT" file (Fig. 6) and the script control module 211 executes that script program, or when the user operates the remote controller to change streams; in either case a stream change command is input to the player control module 212.

When the script control module 211 executes a script program that describes a stream change request, it passes a stream change request message to the player control module 212. When the user inputs a stream change command with the remote controller, the input interface 115 receives the stream change command signal from the remote controller and passes a stream change request message to the player control module 212.

When the subtitle stream change request message, which requests the changing of subtitle streams, is input to the player control module 212, the player control module 212 checks the number of subtitle streams among the elementary streams to be played back, which was detected at step S125.

When the number of subtitle streams checked by the player control module 212 is 1 or less, the player control module 212 ignores the subtitle stream change request message, and accordingly does not perform the processing from step S192 through step S194.

In contrast, when the number of subtitle streams is two or more, the flow proceeds to steps S192 through S194, in which the player control module 212 changes the subtitle stream being played back to another subtitle stream.

In other words, at step S192 the player control module 212 identifies, in the clip information file, the subtitle stream currently being played back. Specifically, assuming that the subtitle stream whose stream_id is 0xBD and whose private_stream_id is 0x80, multiplexed in the clip stream file "00002.PS", is being played back in accordance with the second PlayItem#1 of the first PlayList#0 described with reference to the figure, the player control module 212 identifies the subtitle stream being played back, of the two subtitle streams multiplexed in the clip stream file "00002.PS", as stream#2, the third stream in the clip information file "00002.CLP" shown in Figs. A and B, at step S192.

Thereafter, the flow proceeds to step S193. At step S193 the player control module 212 identifies the next subtitle stream in the clip information file identified at step S192 as the subtitle stream to be played next. In Figs. A and B, the subtitle stream next to the subtitle stream stream#2 (the third stream) is the subtitle stream stream#3 (the fourth stream) in the clip information file "00002.CLP". Thus, at step S193 the player control module 212 recognizes the subtitle stream stream#3 as the subtitle stream to be played next.

When the subtitle stream being played back is identified as stream#3, the fourth stream in the clip information file "00002.CLP" shown in Figs. A and B, of the two subtitle streams multiplexed in the clip stream file "00002.PS", the player control module 212 recognizes, for example, the subtitle stream stream#2 (the third stream) as the subtitle stream to be played next.

Thereafter, the flow proceeds to step S194. At step S194 the player control module 212 passes the stream_id and private_stream_id of the subtitle stream recognized at step S193 as the stream to be played next to the subtitle read block 235 of the buffer control module 215 (Fig. 5), so that the subtitle read block 235 uses that stream_id and private_stream_id to read the next subtitle access unit from the buffer 215A.

The subtitle read block 235 of the buffer control module 215 (Fig. 5) newly sets the stream_id and private_stream_id passed from the player control module 212 at step S194 in the stream_id register 263 and the private_stream_id register 264, respectively. The subtitle read block 235 then reads the next subtitle access unit identified by the stream_id and private_stream_id newly set in the stream_id register 263 and the private_stream_id register 264.

In the manner described above, the subtitle stream being played back is changed to another subtitle stream to be played next.
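
Steps S192 through S194 amount to cycling through the subtitle streams multiplexed in the clip; a sketch follows (the stream table, including the 0x81 identifier assumed for the second subtitle stream, is illustrative):

    # Sketch: pick the subtitle stream following the current one, wrapping
    # around; a single subtitle stream means the request is ignored.
    subtitle_streams = [
        {"name": "stream#2", "stream_id": 0xBD, "private_stream_id": 0x80},
        {"name": "stream#3", "stream_id": 0xBD, "private_stream_id": 0x81},
    ]

    def next_subtitle_stream(current_private_stream_id):
        if len(subtitle_streams) <= 1:
            return None
        idx = next(i for i, s in enumerate(subtitle_streams)
                   if s["private_stream_id"] == current_private_stream_id)
        return subtitle_streams[(idx + 1) % len(subtitle_streams)]

    print(next_subtitle_stream(0x80)["name"])  # stream#3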

[Processing of the buffer control module 215]

Next, the processing performed by the buffer control module 215 (Fig. 5), namely the writing of data into the buffer 215A and the reading of data from it, will be described with reference to the figures.

As described with reference to Fig. 5, the buffer control module 215 has five pointers used to read data from and write data into the buffer 215A.

In other words, as shown in the figures, the buffer control module 215 has the data start pointer stored in the data start pointer storage block 231, the data write pointer stored in the data write pointer storage block 232, the video read pointer stored in the video read pointer storage block 241, the audio read pointer stored in the audio read pointer storage block 251, and the subtitle read pointer stored in the subtitle read pointer storage block 262.

In the figures, the stream_id register 242 and the au_information() register 243 of the video data read block 233 shown in Fig. 5, the stream_id register 252 and the private_stream_id register 253 of the audio read block 234, and the subtitle read function flag storage block 261, the stream_id register 263 and the private_stream_id register 264 of the subtitle read block 235 are omitted.

The data start pointer stored in the data start pointer storage block 231 represents the position of the oldest data remaining in the buffer 215A (data that need to be read and have not yet been read). The data write pointer stored in the data write pointer storage block 232 represents the write position of data in the buffer 215A; this is the position at which the newest data are written.

The video read pointer stored in the video read pointer storage block 241 represents the position of the video stream to be read from the buffer 215A. The audio read pointer stored in the audio read pointer storage block 251 represents the position of the audio stream to be read from the buffer 215A. The subtitle read pointer stored in the subtitle read pointer storage block 262 represents the position of the subtitle stream to be read from the buffer 215A.

As described with reference to Fig. 5, the data start pointer, the data write pointer, the video read pointer, the audio read pointer and the subtitle read pointer all move clockwise around the buffer 215A.

As shown in the figure, in this embodiment the data start pointer is normally updated so that it represents the same position as the oldest data among the positions represented by the video read pointer, the audio read pointer and the subtitle read pointer. In the figure, the audio read pointer represents the position of the oldest data among the video read pointer, the audio read pointer and the subtitle read pointer, and the data start pointer coincides with the audio read pointer.

In the buffer control module 215, which has the data start pointer, the data write pointer, the video read pointer, the audio read pointer and the subtitle read pointer, when new data are read from the disc 101 and written into the buffer 215A, the data write pointer is updated clockwise so that it represents the position immediately after the newly written data.

When a video stream, an audio stream or a subtitle stream is read from the buffer 215A, the video read pointer, the audio read pointer or the subtitle read pointer is updated clockwise by the amount of data read. The amount of data read is the sum of the video data, audio data or subtitle data actually read and the data of other streams interleaved among them that are skipped over when they are read.

When the video read pointer, the audio read pointer or the subtitle read pointer is updated, the data start pointer is updated so that it represents the position of the oldest data among those represented by the video read pointer, the audio read pointer and the subtitle read pointer.

The buffer control module 215 controls the writing of data into the buffer 215A so that the data write pointer does not overtake the data start pointer.

Unless the data write pointer overtakes the data start pointer, the buffer control module 215 writes the data read from the disc 101 at the position in the buffer 215A represented by the data write pointer and updates the data write pointer. On the other hand, if the data write pointer is about to overtake the data start pointer, the buffer control module 215 causes the content data supply module 213 to stop reading data from the disc 101 and stops writing data into the buffer 215A. This prevents the buffer 215A from overflowing.

As described above, data read from the disc 101 are written into the buffer 215A solely according to the relationship between the positions of the two pointers, the data start pointer and the data write pointer.
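
A minimal ring-buffer sketch of this pointer discipline (capacity and names are illustrative):

    # Sketch: writes stop rather than let the data write pointer overtake
    # the data start pointer; consuming data frees space again.
    class Buffer215A:
        def __init__(self, capacity=16):
            self.buf = bytearray(capacity)
            self.capacity = capacity
            self.data_start = 0   # data start pointer (block 231)
            self.write = 0        # data write pointer (block 232)
            self.used = 0         # bytes currently stored

        def write_data(self, data):
            """Write if it fits; False asks the supply module to pause."""
            if self.used + len(data) > self.capacity:
                return False
            for b in data:
                self.buf[self.write] = b
                self.write = (self.write + 1) % self.capacity  # clockwise
            self.used += len(data)
            return True

        def consume(self, n):
            """Advance the data start pointer past data already read."""
            n = min(n, self.used)
            self.data_start = (self.data_start + n) % self.capacity
            self.used -= n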

On the other hand, the buffer control module 215 controls the reading of data from the buffer 215A so that the video read pointer, the audio read pointer and the subtitle read pointer, as well as the data start pointer, do not overtake the data write pointer.

In other words, unless the video read pointer, the audio read pointer or the subtitle read pointer overtakes the data write pointer, the buffer control module 215 reads data from the position in the buffer 215A represented by the video read pointer, the audio read pointer or the subtitle read pointer, in accordance with a request received from the video decoder control module 216, the audio decoder control module 217 or the subtitle decoder control module 218, and updates the video read pointer, the audio read pointer or the subtitle read pointer and, when necessary, the data start pointer. On the other hand, if the video read pointer, the audio read pointer or the subtitle read pointer is about to overtake the data write pointer, the buffer control module 215 causes the video decoder control module 216, the audio decoder control module 217 or the subtitle decoder control module 218 to suspend its requests until sufficient data have accumulated in the buffer 215A. This prevents the buffer 215A from underflowing.

Thus the buffer 215A holds the data to be supplied to the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 in the region (shaded in the figures) extending clockwise from the position represented by the data start pointer to the position represented by the data write pointer. In addition, the video read pointer, the audio read pointer and the subtitle read pointer lie within this region.

In the case described above, the data start pointer is updated so that it represents the position of the oldest data among those represented by the video read pointer, the audio read pointer and the subtitle read pointer. Alternatively, the data start pointer may be updated so that it represents the position of data a predetermined time (for example, one second) older than the position of the oldest data.

It can be expected that, in most cases, the video read pointer or the audio read pointer, among the video read pointer, the audio read pointer and the subtitle read pointer, will represent the position of the oldest data.

Thus, when the data start pointer is updated so that it represents the position of data, for example, one second older than the position of the oldest data represented by the video read pointer or the audio read pointer, as shown in the figure, data one second older than the oldest data represented by the video read pointer or the audio read pointer can be kept in the buffer 215A. In the figure, the audio read pointer represents the position of the oldest data, while the data start pointer represents the position of data one second older.

Updating the data start pointer so that it represents the position of data one second older than the position of the oldest data can improve the responsiveness of the disc playback device.

In other words, as shown in the figure, when the data start pointer is updated so that it represents the position of the oldest data represented by the audio read pointer, if a special playback command, for example a reverse playback command, is issued, data that have already been read from the buffer 215A must be read again from the disc 101. Thus, after the special playback command is issued, some time elapses before the special playback operation can be performed.

In contrast, as shown in the figure, when the data start pointer is updated so that it represents the position of data one second older than the position of the oldest data represented by the audio read pointer, if a special playback command such as a reverse playback command is issued and the data needed to begin the special playback operation are those one second older, still stored in the buffer 215A, the special playback operation can be started promptly without re-reading data from the disc 101.

Even when the data start pointer is updated so that it represents the position of data one second older than the oldest data represented by the audio read pointer, the data needed to begin the special playback operation may not be stored in the buffer 215A. In that case, the data needed to begin the special playback operation are re-read from the disc 101.
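
The variant can be sketched as follows, assuming an illustrative multiplex rate to convert the one-second margin into bytes (positions are absolute byte offsets):

    # Sketch: hold the data start pointer one second behind the oldest read
    # pointer so recently consumed data stay available for special playback.
    STREAM_BYTES_PER_SECOND = 1_000_000   # assumed multiplex rate
    MARGIN = STREAM_BYTES_PER_SECOND      # one second of history

    def update_data_start(video_read, audio_read, subtitle_read):
        oldest = min(video_read, audio_read, subtitle_read)
        return max(oldest - MARGIN, 0)    # lag behind, never negative

    print(update_data_start(5_000_000, 4_200_000, 4_800_000))  # 3200000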

Next, the reading of the video stream, the audio stream and the subtitle stream from the buffer 215A will be described in detail.

As described at step S127, when the playback operation of a clip stream file begins, the buffer control module 215 initializes the data start pointer, the data write pointer, the video read pointer, the audio read pointer and the subtitle read pointer so that they represent the same position in the buffer 215A.

When the program stream (an MPEG2-System program stream) stored in the clip stream file is read from the disc 101 and supplied to the buffer control module 215, it is stored at the position in the buffer 215A represented by the data write pointer, and the data write pointer is updated clockwise.

In addition, the video data read block 233 of the buffer control module 215 (Fig. 5) parses the program stream stored in the buffer 215A, extracts video data (video access units) from the program stream stored in the buffer 215A in accordance with a request received from the video decoder control module 216, and passes the extracted video access units to the video decoder control module 216.

Likewise, the audio read block 234 parses the program stream stored in the buffer 215A, extracts audio access units from it in accordance with a request received from the audio decoder control module 217, and passes them to the audio decoder control module 217. The subtitle read block 235 parses the program stream stored in the buffer 215A, extracts subtitle access units from it in accordance with a request received from the subtitle decoder control module 218, and passes them to the subtitle decoder control module 218.

[Reading the video stream]

Next, the processing of reading the video stream from the buffer 215A by the video data read block 233 (Fig. 5) will be described in detail with reference to the flow chart shown in the figure.

At step S211 the video data read block 233 searches the program stream stored in the buffer 215A for a PES_packet() of private_stream_2. As described with reference to the figure, the stream_id of a PES_packet() of private_stream_2 is 0xBF. The video data read block 233 therefore searches for a PES_packet() whose stream_id is 0xBF.

Assuming that the elementary streams multiplexed in the program stream stored in the clip stream file "00001.PS" are the elementary streams to be played back, when that program stream is read from the disc 101 and stored in the buffer 215A, sector 305 was determined at step S122 as the playback start position from the decode start possible point information described in the EP_map() (Fig.) of the clip stream file "00001.PS". At step S128, sector 305, representing the playback start position, was specified, and the program stream was read from the clip stream file "00001.PS" through the operating system 201.

The decode start possible point information described in the EP_map() of the video stream represents the position of the PES_packet() of private_stream_2 that is immediately followed by the real decode start possible point.

Thus, almost immediately after the program stream stored in the clip stream file "00001.PS" is read from the disc 101 and stored in the buffer 215A, a PES_packet() of private_stream_2 is stored at the position represented by the data start pointer and the video read pointer in the buffer 215A.

When the video data read block 233 finds a PES_packet() of private_stream_2 at step S211, the flow proceeds to step S212. At step S212 the video data read block 233 extracts the video_stream_id from the private_stream2_PES_payload() (Fig.), which is the PES_packet_data_byte of the PES_packet() of private_stream_2, and determines whether that video_stream_id matches the stream_id of the video stream to be played back that was stored in the stream_id register 242 (Fig. 5) at step S127.

When the determination at step S212 indicates that the video_stream_id described in the private_stream2_PES_payload() does not match the stream_id stored in the stream_id register 242, that is, the PES_packet() of private_stream_2 found at step S211 does not lie at a decode start possible point of the video stream to be played back, the flow returns to step S211. At step S211 the video data read block 233 searches the program stream stored in the buffer 215A for another PES_packet() of private_stream_2 and repeats the same processing.

In contrast, when the determination at step S212 indicates that the video_stream_id described in the private_stream2_PES_payload() matches the stream_id stored in the stream_id register 242, that is, the PES_packet() of private_stream_2 found at step S211 lies at a decode start possible point of the video stream to be played back, the flow proceeds to step S213. At step S213 the video data read block 233 reads the au_information() described in the private_stream2_PES_payload() of that PES_packet() of private_stream_2 from the buffer 215A and stores it in the au_information() register 243 (Fig. 5). Thereafter, the flow proceeds to step S214.

At step S214 the video data read block 233 updates the video read pointer stored in the video read pointer storage block 241 by the size of the PES_packet() of private_stream_2 found at step S211 (the PES_packet() of private_stream_2 whose video_stream_id (Fig.) matches the stream_id stored in the stream_id register 242 (Fig. 5)).

In other words, in the clip stream file, a PES_packet() of private_stream_2 is immediately followed by the video stream (PES_packet()) whose stream_id matches that video_stream_id. Thus, at step S214 the video data read block 233 updates the video read pointer so that it represents the position of the real decode start possible point of the video stream.

Thereafter, the flow proceeds from step S214 to step S215. The video data read block 233 determines whether the video decoder control module 216 has issued a data request. When the determination at step S215 indicates that no data request has been issued, the flow returns to step S215, and the same determination is repeated.

In contrast, when the determination at step S215 indicates that the video decoder control module 216 has requested data, the flow proceeds to step S216. At step S216 the video data read block 233 parses the program stream from the position represented by the video read pointer in the buffer 215A, reads from the buffer 215A the number of bytes described in the AU_length of the au_information() stored in the au_information() register 243, that is, one video access unit, passes it to the video decoder control module 216, and updates the video read pointer by the size of the one video access unit read from the buffer 215A.

In other words, as described with reference to the figure, au_information() describes number_of_access_unit, which represents the number of video access units (pictures) contained between the PES_packet() of private_stream_2 containing that au_information() and the next PES_packet() of private_stream_2.

In addition, as described with reference to the figure, au_information() describes pic_struct_copy, au_ref_flag and AU_length as information about each of the video access units represented by number_of_access_unit.

As described with reference to the figure, because each AU_length described in au_information() in accordance with number_of_access_unit represents the size of each of the video access units, represented by number_of_access_unit, between the PES_packet() of private_stream_2 containing that au_information() and the next PES_packet() of private_stream_2, the video data read block 233 can extract access units using the AU_length values, without parsing the video stream.

In other words, to extract MPEG2-Video or MPEG4-AVC access units conventionally, it is necessary to know the structure of the video stream and parse it accordingly. However, the program stream stored in a clip stream file recorded on the disc 101 contains, immediately before each of at least one decode start possible point of the video stream, a PES_packet() of private_stream_2 describing AU_length, which represents the size of each video access unit. Thus the video data read block 233 can read video access units (the video stream in units of access units) from the buffer 215A and pass them to the video decoder control module 216 in accordance with the AU_length values described in the PES_packet() of private_stream_2, without parsing the video stream.
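
A sketch of this size-only extraction follows (the buffered bytes and the au_information() values are invented for illustration):

    # Sketch of step S216: slice access units out of the buffered stream by
    # AU_length alone, with no video-layer parsing.
    buffer_215a = b"AU0-bytes" + b"AU1--bytes" + b"AU2---bytes"
    au_information = {"number_of_access_unit": 3, "AU_length": [9, 10, 11]}

    def read_access_units(buf, au_info):
        video_read_pointer = 0
        for n in range(au_info["number_of_access_unit"]):
            size = au_info["AU_length"][n]
            au = buf[video_read_pointer:video_read_pointer + size]
            video_read_pointer += size   # advance by the size just read
            yield au                     # hand the unit to module 216

    print([au.decode() for au in read_access_units(buffer_215a, au_information)])
    # ['AU0-bytes', 'AU1--bytes', 'AU2---bytes']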

At step S216, when the video data read block 233 passes the video access unit to the video decoder control module 216, it also passes the pic_struct_copy, au_ref_flag and AU_length described in the au_information(), together with the time stamp (PTS, DTS) added to the access unit, as information about that video access unit, to the video decoder control module 216.

After the video data read block 233 has read one video access unit from the buffer 215A and passed it to the video decoder control module 216 at step S216, the flow proceeds to step S217. At step S217 the video data read block 233 determines whether it has processed the number of access units represented by the number_of_access_unit of the au_information() (Fig.) stored in the au_information() register 243.

When the determination at step S217 indicates that the video data read block 233 has not yet processed the number of access units represented by number_of_access_unit, that is, it has not yet read that many access units from the buffer 215A and passed them to the video decoder control module 216, the flow returns to step S215. At step S215 the video data read block 233 repeats the same processing.

In contrast, when the determination at step S217 indicates that the video data read block 233 has already processed the number of access units represented by number_of_access_unit, that is, it has read that many access units from the buffer 215A and passed them to the video decoder control module 216, the flow returns to step S211. At step S211 the video data read block 233 searches for the next PES_packet() of private_stream_2 and repeats the same processing.

[Reading the audio stream]

Next, the processing of reading the audio stream from the buffer 215A by the audio read block 234 (Fig. 5) will be described with reference to the flow chart shown in the figure.

At step S230 the audio read block 234 determines whether the stream_id of the audio stream to be played back, which was stored in the stream_id register 252 (Fig. 5) at step S127, represents a PES_packet() of private_stream_1.

When the determination at step S230 indicates that the stream_id stored in the stream_id register 252 does not represent a PES_packet() of private_stream_1 but, as described with reference to the figure, is 110xxxxxB, assigned to an audio stream encoded according to the MPEG standard, the flow proceeds to step S231. At step S231 the audio read block 234 searches the program stream stored in the buffer 215A for the synchronization code that marks the start of an audio frame defined in MPEG Audio. Because the position of the synchronization code is the start of an audio frame, the audio read block 234 updates the audio read pointer so that it represents the position of the start of an audio frame. Thereafter, the flow proceeds from step S231 to step S232. At step S232 the audio read block 234 searches the program stream stored in the buffer 215A, from the position represented by the audio read pointer, for the PES_packet() that matches the stream_id stored in the stream_id register 252, and obtains that PES_packet(). Thereafter, the flow proceeds to step S233.
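
The synchronization code search at step S231 can be sketched as follows (an MPEG Audio frame header begins with a 12-bit sync word of all ones; the stream bytes are invented for illustration):

    # Sketch: scan for 0xFF followed by a byte whose top nibble is 0xF,
    # i.e. the 12-bit MPEG Audio sync word, and return its position.
    def find_audio_frame_start(buf, start=0):
        for i in range(start, len(buf) - 1):
            if buf[i] == 0xFF and (buf[i + 1] & 0xF0) == 0xF0:
                return i      # new value for the audio read pointer
        return None

    stream = bytes([0x00, 0x12, 0xFF, 0xFB, 0x90, 0x64])
    print(find_audio_frame_start(stream))  # 2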

At step S233 the audio reading block 234 updates the audio read pointer stored in the audio read pointer storage block 251 so that the pointer represents the beginning of the PES_packet_data_byte of the PES_packet() (Fig) found at step S232. After that, the processing flow proceeds to step S237.

At step S237 the audio reading block 234 determines whether the audio decoder control module 217 has issued a data request. When the result determined at step S237 indicates that the audio decoder control module 217 has not issued a data request, the processing flow returns to step S237. At step S237 the audio reading block 234 repeats the same processing.

In contrast, when the result determined at step S237 indicates that the audio decoder control module 217 has issued a data request, the processing flow proceeds to step S238. At step S238 the audio reading block 234 parses the program stream from the position represented by the audio read pointer in buffer A, reads one audio access unit of a predetermined fixed length from buffer A, and passes the audio access unit, together with the time stamp (PTS, DTS) added to it, to the audio decoder control module 217.

The audio reading block 234 advances the audio read pointer by the size of the one audio access unit read from buffer A. After that, the processing flow returns to step S237. At step S237 the audio reading block 234 repeats the same processing.

In contrast, when the result determined at step S230 indicates that the stream_id stored in the stream_id register 252 represents a PES_packet() of private_stream_1, namely the stream_id stored in the stream_id register 252 is 10111101B (=0xBD) and, as described with reference to Fig, represents a PES_packet() of private_stream_1, the processing flow proceeds to step S234. At step S234 the audio reading block 234 searches the program stream stored in buffer A for a PES_packet() of private_stream_1 and obtains it. In other words, the audio reading block 234 searches for a PES_packet() whose stream_id is 10111101B (=0xBD) and obtains that PES_packet().

When the audio reading block 234 finds a PES_packet() of private_stream_1 at step S234, the processing flow proceeds to step S235. At step S235 the audio reading block 234 extracts the private_stream_id from the private_stream1_PES_payload() (Fig), which is the PES_packet_data_byte of the PES_packet() of private_stream_1, and determines whether this private_stream_id matches the private_stream_id of the audio stream intended for playback, which was stored in the private_stream_id register 253 (Fig. 5) at step S127 shown in Fig.

When the result determined at step S235 indicates that the private_stream_id described in private_stream1_PES_payload() does not match the private_stream_id stored in the private_stream_id register 253, namely the PES_packet() of private_stream_1 found at step S234 is not the audio stream intended for playback, the processing flow returns to step S234. At step S234 the audio reading block 234 searches the program stream stored in buffer A for a PES_packet() of another private_stream_1. After that, the audio reading block 234 repeats the same processing.

In contrast, when the result determined at step S235 indicates that the private_stream_id described in private_stream1_PES_payload() matches the private_stream_id stored in the private_stream_id register 253, namely the PES_packet() of private_stream_1 found at step S234 is the audio stream intended for playback, the processing flow proceeds to step S236. At step S236 the audio reading block 234 reads the AU_locator described in the private_stream1_PES_payload() (Fig) of that PES_packet() of private_stream_1 from buffer A, sums the position immediately following the AU_locator and the value the AU_locator represents, and obtains the start position of the audio access unit.

In other words, as described with reference to Fig, AU_locator represents the start position of the audio access unit (or subtitle access unit) stored in private_payload() of private_stream1_PES_payload(), relative to the position immediately following the AU_locator. Thus, by summing the value the AU_locator represents and the position immediately following the AU_locator, the start position of the audio access unit can be obtained.
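
The AU_locator arithmetic can be illustrated with the following sketch; the 2-byte width assumed for the AU_locator field is for illustration only:

AU_LOCATOR_SIZE = 2  # assumed width of the AU_locator field, in bytes

def access_unit_start(au_locator_pos: int, au_locator_value: int) -> int:
    """Position in buffer A of the first byte of the audio (or subtitle)
    access unit: the offset just past AU_locator plus its value."""
    return (au_locator_pos + AU_LOCATOR_SIZE) + au_locator_value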

At step S236 the audio reading block 234 updates the audio read pointer stored in the audio read pointer storage block 251 so that the audio read pointer represents the obtained start position of the audio access unit. After that, the processing flow proceeds to step S237.

At step S237 the audio reading block 234 determines whether the audio decoder control module 217 has issued a data request. When the result determined at step S237 indicates that the audio decoder control module 217 has not issued a data request, the processing flow returns to step S237. At step S237 the audio reading block 234 repeats the same processing.

In contrast, when the result determined at step S237 indicates that the audio decoder control module 217 has issued a data request, the processing flow proceeds to step S238. At step S238 the audio reading block 234 parses the program stream from the position represented by the audio read pointer in buffer A, reads one audio access unit of the specified length from buffer A, and passes it, together with the time stamp added to it, to the audio decoder control module 217.

The audio reading block 234 advances the audio read pointer by the size of the one audio access unit read from buffer A. After that, the processing flow returns to step S237. At step S237 the audio reading block 234 repeats the same processing.

[Reading the subtitle stream]

Next, with reference to the flowchart shown in Fig, the processing of reading the subtitle stream from buffer A by the subtitle reading block 235 (Fig. 5) will be described.

At step S251 the subtitle reading block 235 checks the subtitle read function flag that was stored at step S127 shown in Fig. When the result determined at step S251 indicates that the subtitle read function flag is 0, namely the clip stream file multiplexed with the elementary streams intended for playback does not contain a subtitle stream and 0 was stored in the subtitle read function flag storage block 261 at step S127 shown in Fig, the subtitle reading block 235 does not perform any processing.

In contrast, when the result determined at step S251 indicates that the subtitle read function flag is 1, namely the clip stream file multiplexed with the elementary streams intended for playback contains a subtitle stream and 1 was stored in the subtitle read function flag storage block 261 at step S127 shown in Fig, the processing flow proceeds to step S252. At step S252 the subtitle reading block 235 searches the program stream stored in buffer A for the PES_packet() that corresponds to the stream_id of the subtitle stream intended for playback, which was stored in the stream_id register 263 (Fig. 5).

As described at step S127 shown in Fig, the stream_id of the subtitle stream intended for playback is stored in the stream_id register 263 (Fig. 5). On the other hand, as described with reference to Fig, the stream_id of a subtitle stream is 10111101B (=0xBD), which represents a PES_packet() of private_stream_1.

Thus, at step S252 the subtitle reading block 235 searches the program stream stored in buffer A for a PES_packet() of private_stream_1.

When the subtitle reading block 235 has searched for and obtained a PES_packet() of private_stream_1, the processing flow proceeds to step S253. At step S253 the subtitle reading block 235 extracts the private_stream_id from the private_stream1_PES_payload() (Fig), which is the PES_packet_data_byte of the PES_packet() of private_stream_1, and determines whether this private_stream_id matches the private_stream_id of the subtitle stream intended for playback, which was stored in the private_stream_id register 264 (Fig. 5) at step S127 shown in Fig.

When the result determined at step S253 indicates that the private_stream_id described in private_stream1_PES_payload() does not match the private_stream_id stored in the private_stream_id register 264, namely the PES_packet() of private_stream_1 found at step S252 is not the subtitle stream intended for playback, the processing flow returns to step S252. At step S252 the subtitle reading block 235 searches the program stream stored in buffer A for a PES_packet() of another private_stream_1. After that, the subtitle reading block 235 repeats the same processing.

In contrast, when the result determined at step S253 indicates that the private_stream_id described in private_stream1_PES_payload() matches the private_stream_id stored in the private_stream_id register 264, namely the PES_packet() of private_stream_1 found at step S252 is the subtitle stream intended for playback, the processing flow proceeds to step S254. At step S254 the subtitle reading block 235 reads the AU_locator described in the private_stream1_PES_payload() (Fig) of that PES_packet() of private_stream_1 from buffer A, sums the position immediately following the AU_locator and the value the AU_locator represents, and obtains the start position of the subtitle access unit.

As described with reference to Fig, AU_locator represents the start position of the subtitle access unit (or audio access unit) stored in private_payload() of private_stream1_PES_payload(), relative to the position immediately following the AU_locator. Thus, by summing the value the AU_locator represents and the position immediately following the AU_locator, the start position of the subtitle access unit can be obtained.

In addition, at step S254 the subtitle reading block 235 updates the subtitle read pointer stored in the subtitle read pointer storage block 262 so that the subtitle read pointer represents the obtained start position of the subtitle access unit. After that, the processing flow proceeds to step S255.

At step S255 the subtitle reading block 235 determines whether the subtitle decoder control module 218 has issued a data request. When the result determined at step S255 indicates that no data request has been issued, the processing flow returns to step S255. At step S255 the subtitle reading block 235 repeats the same processing. In contrast, when the result determined at step S255 indicates that the subtitle decoder control module 218 has issued a data request, the processing flow proceeds to step S256. At step S256 the subtitle reading block 235 parses the program stream from the position represented by the subtitle read pointer in buffer A, reads from buffer A one subtitle access unit of the size described at the beginning of the subtitle access unit, and passes the subtitle access unit, together with the time stamp added to it, to the subtitle decoder control module 218. As described with reference to Fig, the size of a subtitle access unit is described at its beginning; the subtitle reading block 235 reads that amount of data from the position represented by the subtitle read pointer in buffer A and passes the resulting subtitle access unit, together with its time stamp, to the subtitle decoder control module 218.
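
A minimal sketch of step S256, assuming the size at the head of the subtitle access unit is a 2-byte big-endian field counting the bytes of the whole access unit; both assumptions are for illustration only:

import struct

def read_subtitle_access_unit(buf: bytes, read_ptr: int) -> tuple[bytes, int]:
    """Return (access_unit, new_read_ptr)."""
    (size,) = struct.unpack_from(">H", buf, read_ptr)  # size described at the head
    unit = buf[read_ptr : read_ptr + size]
    return unit, read_ptr + size  # the pointer advances by one access unit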

The subtitle reading block 235 advances the subtitle read pointer by the size of the one subtitle access unit read from buffer A. After that, the processing flow returns to step S255. At step S255 the subtitle reading block 235 repeats the same processing.

[Resynchronization processing]

Next, the synchronization control of video data and audio data performed by the decode control module 214 shown in Fig will be described.

As described at step S130 shown in Fig, the decode control module 214 causes the video decoder control module 216, the audio decoder control module 217 and the subtitle decoder control module 218 to start decoding their data. If necessary, the decode control module 214 causes them to start decoding at different times so that the outputs become synchronized. For example, when the video decoder 116 and the audio decoder 117 perform their decoding processing, depending on their internal states they may output video data and audio data at different times.

Thus, the decode control module 214 performs resynchronization processing, which compensates for the difference between the output times of the video data and the audio data and causes the video decoder 116 and the audio decoder 117 to output video data and audio data synchronously.

Next, with reference to the flowchart shown in Fig, the resynchronization processing will be described.

In the resynchronization processing, at step S271 the decode control module 214 determines whether the difference between the time stamp of the video access unit output from the video decoder control module 216 and the time stamp of the audio access unit output from the audio decoder control module 217 is large.

In other words, as described at step S129 shown in Fig, whenever the video decoder control module 216 receives a video access unit from the buffer control module 215, it passes the time stamp of that video access unit to the decode control module 214. Similarly, whenever the audio decoder control module 217 receives an audio access unit from the buffer control module 215, it passes the time stamp of that audio access unit to the decode control module 214.

At step S271 the decode control module 214 compares the time stamps received from the video decoder control module 216 and from the audio decoder control module 217 within a period of time short enough that they can be regarded as referring to the same moment, and determines whether the difference between the time stamps is large.

When the result determined at step S271 indicates that the difference between the time stamp of the video access unit received from the video decoder control module 216 and the time stamp of the audio access unit received from the audio decoder control module 217 is small, namely the difference falls within a predetermined range within which the access units can be considered synchronized, for example two video frames (approximately 66 milliseconds), the processing flow returns to step S271. At step S271 the decode control module 214 again determines the difference between the time stamps.

In contrast, when the result determined at step S271 indicates that the difference between the time stamp of the video access unit received from the video decoder control module 216 and the time stamp of the audio access unit received from the audio decoder control module 217 is large, namely the difference falls outside the predetermined range so that the access units cannot be considered synchronized, the processing flow proceeds to step S272. At step S272 the decode control module 214 compares the time stamp of the video access unit received from the video decoder control module 216 with the time stamp of the audio access unit received from the audio decoder control module 217 to determine which of the video data output and the audio data output lags behind the other.
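
The decision at steps S271 and S272 can be sketched as follows, with time stamps in 90 kHz ticks and the two-frame tolerance (2 x 3003 ticks, about 66 ms) taken from the example above:

SYNC_TOLERANCE = 2 * 3003  # two frames of 3003 ticks each, ~66 ms

def resync_action(video_pts: int, audio_pts: int) -> str:
    if abs(video_pts - audio_pts) <= SYNC_TOLERANCE:
        return "in_sync"      # keep monitoring (back to S271)
    if video_pts < audio_pts:
        return "video_lags"   # S273: skip a non-reference video access unit
    return "audio_lags"       # S278: repeat output of the current picture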

When the result determined at step S272 indicates that the video data output lags behind the audio data output, the processing flow proceeds to step S273. At step S273 the decode control module 214 instructs the video decoder control module 216 to neither decode nor display one video access unit, namely to skip the processing of one video access unit so as to advance the video processing by one video access unit. After that, the processing flow proceeds to step S274.

At step S274 the video decoder control module 216 receives the skip request from the decode control module 214 and checks the au_ref_flag (Fig) supplied together with the video access unit from the buffer control module 215.

In other words, the au_information() (Fig) stored in the private_stream2_PES_payload() (Fig) of a PES_packet() of private_stream_2 contains au_ref_flag as information about the access unit. As described at step S129 shown in Fig and at step S216 shown in Fig, the buffer control module 215 passes the au_ref_flag of a video access unit to the video decoder control module 216 together with that access unit.

At step S274 the video decoder control module 216 checks the au_ref_flag supplied together with the access unit.

After that, the processing flow proceeds from step S274 to step S275. At step S275 the video decoder control module 216 determines whether the video access unit is a non-reference image, namely an image that is not referenced in the decoding of another image, in accordance with the result of checking the au_ref_flag of the video access unit passed from the buffer control module 215.

As described with reference to Fig, the au_ref_flag of a video access unit indicates whether that access unit is a reference image. When the access unit is a reference image, au_ref_flag is 1. In contrast, when the access unit is not a reference image, au_ref_flag is 0.
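
A minimal sketch of the skip decision at steps S275 to S277, using the au_ref_flag semantics just described:

def may_skip(au_ref_flag: int) -> bool:
    return au_ref_flag == 0  # 1 = reference picture, must be decoded

def process_unit(au_ref_flag: int, skip_requested: bool) -> str:
    if skip_requested and may_skip(au_ref_flag):
        return "skip_decode"    # S277: pass over this access unit
    return "decode_normally"    # S276: decode as usual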

When the result determined at step S275 indicates that the video access unit passed from the buffer control module 215 is not a non-reference image, namely it is a reference image, the processing flow proceeds to step S276. At step S276 the video decoder control module 216 causes the video decoder 116 to process the video access unit normally. After the video decoder control module 216 receives the next video access unit from the buffer control module 215, the processing flow returns to step S274.

In contrast, when the result determined at step S275 indicates that the video access unit passed from the buffer control module 215 is not a reference image, the processing flow proceeds to step S277. At step S277 the video decoder control module 216 causes the video decoder 116 to skip the processing of the video access unit. After the buffer control module 215 supplies the next video access unit, the processing flow returns to step S271.

Since the processing of one video access unit has been skipped, the video processing advances by approximately one video access unit. As a result, the video data output, which lagged behind the audio data output, is accelerated.

In contrast, when the result determined at step S272 indicates that the video data output does not lag behind the audio data output, namely the audio data output lags behind the video data output, the processing flow proceeds to step S278. At step S278 the decode control module 214 issues a continuous-output command to the video decoder control module 216 to continuously output the video data of the video access unit currently being decoded, so as to keep the video decoder control module 216 waiting to process the next video access unit. After that, the processing flow proceeds to step S279.

At step S279 the video decoder control module 216 receives the continuous-output request from the decode control module 214 and, in accordance with it, continuously outputs the video data of the video access unit being decoded by the video decoder 116 to the graphics processing module 219. After the buffer control module 215 supplies the next video access unit, the processing flow returns to step S271.

As described above, the decode control module 214 determines whether the video data output lags behind the audio data output. When the video data output lags behind the audio data output, the decode control module 214 causes the video decoder control module 216 to skip the processing of one access unit. The video decoder control module 216 determines whether the access unit to be skipped is a reference image or a non-reference image in accordance with its au_ref_flag. When the access unit is a non-reference image, the decode control module 214 causes the video decoder 116 to skip its processing. Thus the video data output and the audio data output can easily be synchronized.

In other words, when the access unit to be skipped is a reference image, its video data must be decoded so that they can be referenced when other access units are decoded. Thus, in the synchronization control that synchronizes the video data output with the audio data output, if the processing of an access unit that is a reference image were skipped, other access units that reference that reference image could not be decoded. As a result, the video data displayed in synchronization with the audio data would show noise.

Thus, it is preferable to skip an access unit that is not a reference image, namely a non-reference image.

On the other hand, to search a conventional elementary stream for an access unit that is not a reference image, the elementary stream must be parsed. An elementary stream encoded in accordance with, for example, the MPEG4-AVC system is very complicated. Thus parsing such an elementary stream would require a significant cost.

In contrast, the program stream stored in the clip stream file recorded on the disc 101 is multiplexed with PES_packet()'s of private_stream_2, which contain private_stream2_PES_payload() (Fig), an extension of PES_packet_data_byte, in addition to the PES_packet()'s (Fig) whose PES_packet_data_byte contains the video access units. The au_information() (Fig) of private_stream2_PES_payload() describes the au_ref_flag, which indicates whether a video access unit is a reference image or a non-reference image. The au_ref_flag is supplied, together with the corresponding video access unit, from the buffer control module 215 to the video decoder control module 216. Thus the video decoder control module 216 can determine whether a video access unit is a reference image or a non-reference image by checking its au_ref_flag, at almost no additional cost.

[Mark processing]

Next, with reference to the flowchart shown in Fig, the mark processing based on the Mark()'s described in PlayListMark() (Fig. 9) will be described.

The decode control module 214 continually checks the current time counted by the built-in time count block A. At step S301 the decode control module 214 determines whether the current time matches the mark_time_stamp of any Mark() described in PlayListMark() (Fig. 9).

As described at step S124 shown in Fig, when the player control module 212 plays the first PlayItem#0 of the first PlayList#0 shown in Fig, it recognizes that the four Mark()'s that are the first to fourth Mark()'s of the seven Mark()'s contained in the PlayListMark() in the upper table shown in Fig belong to the first PlayItem#0 of PlayList#0, and passes their mark_time_stamp values {180090}, {5580090}, {10980090} and {16380090}, together with information indicating that the attribute of the times represented by these mark_time_stamp's is "mark processing", to the decode control module 214.

At step S301 the decode control module 214 determines which, if any, of these four time values having the "mark processing" attribute, passed from the player control module 212, matches the current time.

When the result determined at step S301 indicates that the current time does not match any of the time values having the "mark processing" attribute, the processing flow returns to step S301. At step S301 the decode control module 214 repeats the same processing.

[Match determination in mark processing]

In the mark processing, at step S301 the decode control module 214 determines whether the current time matches one of the mark_time_stamp's. However, in this embodiment, since the time count block A counts discrete values, a problem may occur if the match is determined by a simple comparison.

This problem will be described with a simple example with reference to Fig. In the upper part of Fig, I0, P1, P2 and P3 represent video access units. It is assumed that the pic_struct of each of these video access units is 3, namely the display duration is one frame (3003 at 90 kHz). In this example it is also assumed that the decoding order is the same as the display order, namely no reordering occurs. I0 is an access unit registered in EP_map(), as described in "Processing in preparation for playback". The access unit I0 has a time stamp PTS=180090. In contrast, the access units P1, P2 and P3 have no time stamp.

When such video data are processed, the clock (the time count block A) is updated as shown in the lower part of Fig. When I0 is displayed, the PTS and pic_struct of I0 are supplied. Since I0 has a PTS, that PTS is set on the time count block A, whose time thus becomes 180090. When P1 is output, since it has no PTS, only its pic_struct is supplied. Since the pic_struct of I0 is 3, the time for one frame (3003 at 90 kHz) is added to the time count block A. Thus the value of the time count block A becomes 183093. Similarly, when P2 is output, since the pic_struct of P1 is 3, 3003 is added to the time count block A and its value becomes 186096. When P3 is output, 3003 is likewise added, and the value of the time count block A becomes 189099.
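
A minimal sketch of this clock update; only the pic_struct = 3 case (one frame, 3003 ticks at 90 kHz) from the example is modelled:

from typing import Optional

def update_clock(clock: int, pts: Optional[int], pic_struct: int) -> int:
    if pts is not None:
        return pts            # e.g. I0: the clock becomes 180090
    if pic_struct == 3:
        return clock + 3003   # one frame at 90 kHz: 183093, 186096, 189099...
    raise NotImplementedError("other pic_struct durations not shown here")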

Now consider the processing performed when the mark_time_stamp of one of the marks registered in PlayListMark() (Fig. 9 and Fig) is 186000. As described above, the values output by the clock (the time count block A) are 180090, 183093, 186096 and 189099; the value 186000, which corresponds to the time of the mark, is never output. Thus, when mark_time_stamp and the time value are compared in the simple way, namely by determining whether their difference is 0 or not, a problem occurs.

Thus, a rule is introduced for the time match determination. In other words, in this example, when the mark_time_stamp of a specific event is contained within the display duration of a specific image, it is determined that the event occurs at the display start time of that image. In the preceding example, mark_time_stamp=186000 is contained within the display duration of the image P1. Thus it is determined that this event occurs at the display start time of P1, namely 183093.
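
The rule can be stated compactly as a half-open interval test; the assertions reproduce the example values above:

def event_matches_picture(mark_time_stamp: int, display_start: int,
                          display_duration: int) -> bool:
    return display_start <= mark_time_stamp < display_start + display_duration

assert event_matches_picture(186000, 183093, 3003)      # matched to P1
assert not event_matches_picture(186000, 180090, 3003)  # not matched to I0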

Next, the determination algorithm of the decode control module 214, which performs the determination in accordance with the above definition, will be described.

In this example, the time (the time count block A) is updated only when the video data are updated. In other words, the processing at step S301 shown in Fig is performed only when the time is updated. In a playback device implemented in software, since the number of processing steps can thereby be reduced considerably, this structure is preferable.

When the time is updated, the processing shown in Fig is invoked. At step S301 the decode control module 214 checks whether there is an event defined as an event that should correspond to the current time. In other words, the decode control module 214 checks whether an event is contained within the display duration of the image being displayed, based on the current time and the display duration of that image. When there is an event defined as an event that should correspond to the current time, the processing flow proceeds to step S302. When there is no such event, the processing flow returns to step S301, or the processing may simply be finished.

Specifically, while, for example, I0 is displayed, since the time is 180090 and the pic_struct of I0 is 3, it is clear that the display duration of I0 is 3003. Thus a mark_time_stamp that satisfies 180090 ≤ mark_time_stamp < 180090 + 3003 is searched for. Since the event time 186000 used as the example here does not satisfy this expression, no time match is determined.

Similarly, while P1 is displayed, since the time is 183093 and the pic_struct of P1 is 3, it is clear that the display duration of P1 is 3003. Thus a mark_time_stamp that satisfies 183093 ≤ mark_time_stamp < 183093 + 3003 is searched for. Since the event time 186000, considered as the example, satisfies this expression, it is determined that the times match, and the processing from step S302 onward is performed.

In the above description, one example of the time match definition was presented. Instead, a different definition can be applied. For example, when the mark_time_stamp of a specific event is equal to or larger than "the display start time of a specific image - α" and smaller than "the display start time of the next displayed image - α", it can be defined that the event occurs at the display start time of the corresponding image. Alternatively, under the same criterion, the time at which the event occurs can be defined as, for example, "the display start time of the corresponding image - α".

When such a definition is introduced, it is not necessary to know the times of the video data stream when the mark times, namely the mark_time_stamp's, are set. Thus, when a multimedia product is authored, since the encoding of the video data becomes strictly independent of the creation of the database, they can be performed separately from each other.

In contrast, when the result determined at step S301 indicates that the current time matches one of the four time values having the "mark processing" attribute, the decode control module 214 passes a message indicating that the current time has become a time having the "mark processing" attribute, together with the matched time having the "mark processing" attribute, to the player control module 212. After that, the processing flow proceeds to step S302.

At step S302 the player control module 212 receives the message indicating that the current time has become a time having the "mark processing" attribute, together with the matched time, from the decode control module 214, and recognizes the Mark() whose mark_time_stamp matches the current time as the Mark() to be processed in the mark processing (hereinafter this Mark() is sometimes called the target mark).

In other words, the player control module 212 recognizes the PlayItem() of the PlayList() currently being played. By referring to the PlayListMark() (Fig. 9) of the "PLAYLIST.DAT" file (Fig. 7) with the PlayList(), the PlayItem() and the time having the "mark processing" attribute (hereinafter sometimes called the mark time), which matches the current time and was passed from the decode control module 214, the player control module 212 recognizes the target mark.

Specifically, assuming that the first PlayItem#0 of the first PlayList#0 shown in Fig is being played, the player control module 212 recognizes that the mark time is the mark_time_stamp of one of the four Mark()'s that are the first to fourth Mark()'s of the seven Mark()'s contained in the PlayListMark() in the upper table shown in Fig.

When the mark time passed from the decode control module 214 to the player control module 212 is, for example, 16380090, the player control module 212 recognizes as the target mark the fourth Mark(), whose mark_time_stamp matches 16380090, of the four Mark()'s that are the first to fourth Mark()'s contained in the PlayListMark() in the upper table shown in Fig.

When the player control module 212 has recognized the target mark, the processing flow proceeds from step S302 to step S303. At step S303 the player control module 212 determines whether the target mark describes entry_ES_stream_id and entry_ES_private_stream_id (Fig. 9), which identify an elementary stream.

When the result determined at step S303 indicates that the target mark does not describe entry_ES_stream_id and entry_ES_private_stream_id (Fig. 9) identifying an elementary stream, namely both entry_ES_stream_id and entry_ES_private_stream_id are 0x00, the processing flow proceeds to step S305, skipping step S304. At step S305 the player control module 212 performs the processing for the target mark.

In contrast, when the result determined at step S303 indicates that the target mark describes entry_ES_stream_id and entry_ES_private_stream_id (Fig. 9) identifying an elementary stream, the processing flow proceeds to step S304. At step S304 the player control module 212 determines whether the elementary streams being played include the elementary stream identified by entry_ES_stream_id and, if necessary, entry_ES_private_stream_id.

When the result determined at step S304 indicates that the elementary streams being played do not include the elementary stream identified by the entry_ES_stream_id and entry_ES_private_stream_id of the target mark, the processing flow returns to step S301. In other words, when the elementary stream identified by the entry_ES_stream_id and entry_ES_private_stream_id of the target mark is not being played, the target mark is ignored.

In contrast, when the result determined at step S304 indicates that the elementary streams being played include the elementary stream identified by the entry_ES_stream_id and entry_ES_private_stream_id of the target mark, namely that elementary stream is being played, the target mark is determined to be valid. After that, the processing flow proceeds to step S305. At step S305 the player control module 212 performs the processing for the target mark.

In other words, at step S305, by referring to the mark_type of the target mark (Fig. 9), the player control module 212 determines the type of the target mark.

When the result determined at step S305 indicates that the target mark is a section mark or an index mark, namely the mark_type of the target mark is "Section" or "Index", the processing flow proceeds to step S306. At step S306 the player control module 212 causes the graphics processing module 219 to update the section number or index number with that of the target mark. After that, the processing flow returns to step S301.

When the result determined at step S305 indicates that the target mark is an event mark, namely the mark_type of the target mark is "Event", the processing flow proceeds to step S307. At step S307 the player control module 212 passes an event message indicating that the event has occurred, together with the mark_data of the target mark, to the script control module 211. After that, the processing flow proceeds to step S308.

At step S308 the script control module 211 receives the event message and the mark_data from the player control module 212 and, treating the event message as an interrupt request, performs the sequence of processing described in the "SCRIPT.DAT" file with mark_data as an argument. After that, the processing flow returns to step S301.

In other words, the script control module 211 performs the processing corresponding to the mark_data.

Specifically, in the PlayListMark() of PlayList#1 in the lower table shown in Fig, the mark_type of each of the second Mark() (Mark#1) and the third Mark() (Mark#2) is "Event". However, the mark_data of Mark#1 is 1, while the mark_data of Mark#2 is 2.

When the script control module 211 receives the event message corresponding to the second Mark() and the event message corresponding to the third Mark(), it handles both event messages with the same event handler (interrupt processing routine). The script control module 211 checks the mark_data supplied together with the event message and performs the processing corresponding to that mark_data with the event handler.

Specifically, when the mark_data is, for example, 1, the script control module 211 causes the graphics processing module 219 to display an icon of the first type. When the mark_data is, for example, 2, the script control module 211 causes the graphics processing module 219 to display an icon of the second type.

The mark_data is not limited to 1 and 2. In addition, the processing corresponding to the mark_data is not limited to the display of simple icons.

In other words, when the mark_data is in the range from 3 to 18, the script control module 211 causes the graphics processing module 219 to display an icon of the first type with an intensity corresponding to the value obtained by subtracting 2 from the mark_data (a numeric value in the range from 1 to 16). On the other hand, when the mark_data is in the range from 19 to 34, the script control module 211 causes the graphics processing module 219 to display an icon of the second type with an intensity corresponding to the value obtained by subtracting 18 from the mark_data (a numeric value in the range from 1 to 16).

When a controller operated by the user is connected to the input interface 115 (Fig. 1) and the controller has a vibration mechanism, namely a direct-current (DC) motor with an eccentric weight mounted on the motor shaft that vibrates when the motor is turned on, then, if the value of the mark_data is in the range from 35 to 42, the vibration motor can be driven for an operation time corresponding to the value obtained by subtracting 34 from the mark_data (a numeric value in the range from 1 to 8).
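
A minimal sketch of the mark_data dispatch described above; the action strings are illustrative, only the numeric ranges come from the text:

def handle_mark_data(mark_data: int) -> str:
    if mark_data == 1:
        return "display icon of the first type"
    if mark_data == 2:
        return "display icon of the second type"
    if 3 <= mark_data <= 18:
        return f"icon of the first type, intensity {mark_data - 2}"    # 1..16
    if 19 <= mark_data <= 34:
        return f"icon of the second type, intensity {mark_data - 18}"  # 1..16
    if 35 <= mark_data <= 42:
        return f"vibrate controller, level {mark_data - 34}"           # 1..8
    return "no predefined action"  # left to the disc or content provider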

mark_data is a numeric value, and its use and algorithm can be described by the script program that the script control module 211 executes. Thus mark_data can be used in accordance with a predetermined rule, or with an original rule specified by the manufacturer of the disc 101 or by the content provider that provided the data recorded on the disc 101.

When the current time matches a time having the "mark processing" attribute, the target mark is identified from the mark time, which is the time having the "mark processing" attribute. When the target mark does not describe entry_ES_stream_id and entry_ES_private_stream_id identifying an elementary stream, the processing corresponding to the mark_type of the target mark is performed. Even when the target mark describes entry_ES_stream_id and entry_ES_private_stream_id identifying an elementary stream, as long as that elementary stream is being played, the processing corresponding to the mark_type of the target mark is performed.

While the second PlayList#1 shown in Fig is played, the following mark processing is performed.

In other words, as shown in the lower table presented in Fig, the PlayListMark() of the second PlayList#1 describes the first Mark() (Mark#0), the second Mark() (Mark#1) and the third Mark() (Mark#2), whose mark_time_stamp values are 90000, 27090000 and 27540000 respectively.

In addition, since the entry_ES_stream_id's of the second Mark() and the third Mark() of the PlayListMark() in the lower table shown in Fig describe 0xE0 and 0xE1, the second Mark() and the third Mark() are correlated with the elementary streams identified by the stream_id's 0xE0 and 0xE1 respectively.

As described with reference to Fig, the second PlayList#1 describes only one PlayItem() (PlayItem#0), according to which the clip stream file "00003.PS" is played. As described in the clip information file "00003.CLP" shown in Fig, which corresponds to the clip stream file "00003.PS", the clip stream file "00003.PS" is multiplexed with three elementary streams: the video stream stream#0, identified by the stream_id 0xE0, the video stream stream#1, identified by the stream_id 0xE1, and the audio stream stream#2, identified by the private_stream_id 0x00.

Thus, the second Mark() of the PlayListMark() in the lower table shown in Fig is correlated with the video stream stream#0, whose stream_id is 0xE0, multiplexed with the clip stream file "00003.PS", and the third Mark() is correlated with the video stream stream#1, whose stream_id is 0xE1, multiplexed with the clip stream file "00003.PS".

When PlayItem#0 of the second PlayList#1 shown in Fig is played, as described at step S124 shown in Fig, the player control module 212 recognizes that the three Mark()'s contained in the PlayListMark() in the lower table shown in Fig belong to PlayItem#0 of PlayList#1, and passes their mark_time_stamp values {90000}, {27090000} and {27540000}, together with information indicating that these time values have the "mark processing" attribute, to the decode control module 214.

In the mark processing, while PlayItem#0 of PlayList#1 is played, the decode control module 214 continually determines which of the time values {90000}, {27090000} and {27540000} matches the current time counted by the time count block A (at step S301). When the current time matches a time having the "mark processing" attribute, the decode control module 214 passes the mark time, which is the time having the "mark processing" attribute, together with a message indicating that the current time has become a time having the "mark processing" attribute, to the player control module 212.

When the current time matches 27090000 among the time values {90000}, {27090000} and {27540000} having the "mark processing" attribute, the decode control module 214 passes the mark time having the "mark processing" attribute, 27090000, together with a message indicating that the current time has become a time having the "mark processing" attribute, to the player control module 212.

The player control module 212 recognizes that PlayItem#0 of PlayList#1 is being played. The player control module 212 compares 90000, 27090000 and 27540000, the mark_time_stamp values of the three Mark()'s belonging to PlayItem#0 among the Mark()'s described in the PlayListMark() in the lower table shown in Fig, with 27090000, the mark time passed from the decode control module 214, and recognizes the Mark() whose mark_time_stamp matches 27090000, namely the second Mark() (Mark#1) described in the PlayListMark() in the lower table shown in Fig, as the target mark (at step S302).

The second Mark(), which is the target mark, described in the PlayListMark() in the lower table shown in Fig, has an entry_ES_stream_id equal to 0xE0. As described above, the entry_ES_stream_id equal to 0xE0 represents the video stream stream#0 (Fig), whose stream_id is 0xE0, multiplexed with the clip stream file "00003.PS". The player control module 212 determines whether the elementary streams being played include the video stream stream#0 (steps S303 and S304).

When the elementary streams being played do not include the video stream stream#0, the player control module 212 ignores the target mark (at step S304).

In contrast, when the elementary streams being played include the video stream stream#0, the player control module 212 treats the target mark as valid and performs the processing corresponding to the target mark (steps S305 to S308).

In this case, the mark_type of the second Mark(), which is the target mark, described in the PlayListMark() in the lower table shown in Fig, is "Event". Thus the second Mark() is an event mark. The player control module 212 passes an event message indicating that the event has occurred, together with the mark_data of the target mark, to the script control module 211 (steps S305 and S307). The script control module 211 performs the sequence of processing described in "SCRIPT.DAT" with mark_data as an argument, treating the event message received from the player control module 212 as an interrupt request (at step S308).

As described above, in the mark processing, in accordance with a PlayList() (Fig. 7) containing a PlayListMark() (Fig. 9) that has zero or more Mark()'s, each with a mark_time_stamp representing one play time on the time axis of the PlayList(), a mark_type representing the type of the Mark(), and a mark_data serving as the argument of an event mark, the player control module determines whether the current play time of the clip stream file being played matches a mark_time_stamp. When the current time matches a mark_time_stamp, the player control module 212 recognizes the Mark() whose mark_time_stamp equals the mark time, namely the current time, as the target mark. When the mark_type of the target mark represents a type for which an event is generated, namely the target mark is an event mark, the mark_data of the target mark and the event message are passed, and the processing corresponding to the mark_data is performed. Thus processing corresponding to mark_data can be performed in accordance with the play time of the clip stream file.

[Match determination for OUT_time]

As described above, when the time counted by the time count block A becomes equal to the OUT_time of a PlayItem passed from the player control module 212, the decode control module 214 cancels the decoding processing and ends the playback of the PlayItem. In this embodiment, the end of PlayItem#0 is described at step S151 of the flowchart shown in Fig.

In this case as well, if the time and OUT_time were simply compared for equality, a problem might occur. Thus, when the time and OUT_time are compared, the definition corresponding to the above-described match determination is used.

In other words, as shown in Fig, when the OUT_time of the PlayItem, which corresponds to playListEnd, is smaller than the display end time PET_FoCFP of the FoCFP (a frame or a complementary pair of fields of the video stream being played at a given time) displayed at the end of the PlayList, the playListEnd event occurs at the display start time (PTS) of the FoCFP whose display duration contains the OUT_time corresponding to the playListEnd time. Namely, when PTS_FoCFP[3] ≤ OUT_time < PET_FoCFP[3], the playListEnd event occurs at the display start time PTS_FoCFP[3] of FoCFP[3]. Here PET_FoCFP[k] represents the display end time obtained by adding the duration based on pic_struct to PTS_FoCFP[k].
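
The OUT_time test is the same half-open interval test used in the mark processing; a minimal sketch:

def playlist_end_here(out_time: int, pts: int, duration: int) -> bool:
    """True when OUT_time falls within [PTS, PET), PET = PTS + duration."""
    return pts <= out_time < pts + duration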

Thus, since the time match determination is performed only when a frame is output, the processing load is reduced. In addition, as described above, the preparation of the data stream becomes strictly independent of the preparation of the database.

In addition, the decode control module 214 informs the player control module 212 that the playback of the PlayItem has been completed. When the player control module 212 determines that the PlayItem is the last PlayItem of the PlayList, it causes the script control module 211 to generate a playListEnd event.

When the script control module 211 receives the playListEnd event, it learns that the playback of the PlayList it had commanded has been completed, and continues the programmed operation. In other words, the script control module 211, for example, plays another PlayList, displays a menu, or exits.

In the case shown in Fig, when OUT_time is equal to the display end time of the last image of the PlayItem, that case may not be covered by the above-described match determination. In Fig, for example, when FoCFP[2] is displayed and the pause mode is set by playStop(), FoCFP[3] is displayed in the pause mode. After that, when playStop() is called again, the displayed image does not change, but playListEnd must occur.

In other words, since "display start time of the last image + duration based on pic_struct" = OUT_time, the relation OUT_time < "display start time of the last image + duration based on pic_struct" is not satisfied.

In this case, after the video decoder control module 216 has displayed the last image and the display duration of that image has elapsed, the video decoder control module 216 passes information representing the end of the display to the decode control module 214. Thus the clock is advanced to "display start time of the last image + duration based on pic_struct", and the match condition can be satisfied.

[Subtitle decoding]

Whenever the subtitle decoder control module 218 receives one subtitle access unit stored in buffer A, together with the time stamp added to it, from the subtitle reading block 235 of the buffer control module 215 (Fig. 5), the subtitle decoder control module 218 causes the internal subtitle decoding software to decode the subtitle access unit. In addition, the subtitle decoder control module 218 passes the time stamp and the display duration to the decode control module 214.

When the decode control module 214 has updated the time of the clock (the time count block A) using the information passed from the video decoder control module 216, it checks the PTS of the subtitle access unit passed from the subtitle decoder control module 218. In other words, when the decode control module 214 determines, based on the match determination criterion, that the PTS of the subtitle access unit matches the current time, it causes the graphics processing module 219 to input the subtitle and the subtitle decoder control module 218 to output the subtitle.

When the decode control module 214 causes the subtitle decoder control module 218 to output the subtitle, the subtitle decoder control module 218 passes the decoded subtitle image data to the graphics processing module 219. The graphics processing module 219 stores the input subtitle data and combines them with video data input thereafter.

The decode control module 214 also checks the display duration of the subtitle. In other words, when it determines, based on the match determination criterion, that the value of "subtitle display start time + display duration" matches the current time, the decode control module 214 causes the graphics processing module 219 to erase the subtitle. As a result, the graphics processing module 219 erases the stored and input subtitle data and stops combining the subtitle data with video data input thereafter.

[The necessity of mark intervals]

In the above-described time match determination criterion, times within a specified range are rounded to one time. In other words, a time t that satisfies "display start time of a specific video image ≤ t < display end time" is rounded to the display start time of that video image.

Thus, depending on the relative positions of two adjacent events, their time values may be rounded to one and the same time. For example, in the example shown in Fig, if the mark_time_stamp of the event immediately preceding the event whose time is 186000 were 184000, it would be determined that both events occur at the display start time of P1.

To avoid this situation, it must be guaranteed that only one event is introduced per displayed video image. Thus, when the intervals between neighboring events are three fields or more (larger than the maximum display duration that pic_struct can indicate), this condition is guaranteed.

Fig shows examples of this condition. In other words, in Fig, case A denotes a frame rate of 5005/240000 (progressive, 23.976 Hz), for which the minimum event interval is 7507, while case B denotes a frame rate of 4004/240000 (interlaced, 59.94 Hz), for which the minimum event interval at 90 kHz is 6006.
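
The values 7507 and 6006 can be reproduced with the following arithmetic on the 90 kHz clock; note that the factor of four display periods used below is an inference from the quoted values, not a rule stated in the text:

CLOCK_HZ = 90_000

def period_ticks(num: int, den: int) -> float:
    """One display period of num/den seconds, in 90 kHz ticks."""
    return CLOCK_HZ * num / den

case_a = period_ticks(5005, 240_000)   # 1876.875 ticks (23.976 Hz case)
case_b = period_ticks(4004, 240_000)   # 1501.5 ticks (59.94 Hz case)
print(int(4 * case_a))  # 7507 -> minimum event interval, case A
print(int(4 * case_b))  # 6006 -> minimum event interval, case B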

In video encoding systems such as AVC and MPEG2 Video, a signal for one frame can be displayed for a time of up to three fields, so that 2-3 pull-down material can be encoded efficiently. Thus the maximum display duration of a signal for one frame is three fields. In other words, when neighboring events are separated by the time for three fields or more, it is excluded that the two neighboring events are determined to occur at the display start time of one and the same video image.

In addition, the interval of adjacent events can be defined using more than three fields. For example, the interval of adjacent events can be defined as two or more frames.

Alternatively, the above condition can be guaranteed by checking the video display times of all events and verifying that they do not overlap.

[Output attribute control processing]

Next, with reference to the flowchart shown in Fig, the output attribute control processing performed at step S126 shown in Fig and so forth will be described.

As described at step S126 shown in Fig, the player control module 212 checks, for each of the at least one elementary stream determined at step S125 shown in Fig to be played, the number_of_DynamicInfo (Fig), which represents the number of DynamicInfo()'s (Fig) describing the output attributes of that elementary stream.

When the number_of_DynamicInfo of each of the at least one elementary stream intended for playback is 0, the player control module 212 does not perform any processing.

In contrast, when the number_of_DynamicInfo of an elementary stream intended for playback is not 0, the player control module 212 performs the output attribute control processing in accordance with the flowchart shown in Fig.

Thus, when the three clip information files "00001.CLP", "00002.CLP" and "00003.CLP" recorded on the disc 101 are as shown in Fig, and the first PlayItem#0 of the first PlayList#0, which plays the clip stream file "00001.PS" corresponding to the clip information file "00001.CLP", is played, then, since in the clip information file "00001.CLP" (Fig) the number_of_DynamicInfo of all four elementary streams multiplexed with the clip stream file "00001.PS", namely stream#0 to stream#3, is 0, the player control module 212 does not perform the output attribute control processing.

Similarly, when the second PlayItem#1 of the first PlayList#0, which plays the clip stream file "00002.PS" corresponding to the clip information file "00002.CLP", is played, then, since in the clip information file "00002.CLP" (Fig) the number_of_DynamicInfo of the four elementary streams multiplexed with the clip stream file "00002.PS", namely stream#0 to stream#3, is 0, the player control module 212 does not perform the output attribute control processing.

In contrast, when PlayItem#0 of the second PlayList#1, which plays the clip stream file "00003.PS" corresponding to the clip information file "00003.CLP", is played, then, since in the clip information file "00003.CLP" (Fig) the number_of_DynamicInfo of the video stream stream#0, which is the first elementary stream, and of the audio stream stream#2, which is the third elementary stream, of the three elementary streams stream#0 to stream#2 multiplexed with the clip stream file "00003.PS", is 2 and 3 respectively, the player control module 212 performs the output attribute control processing.

In other words, in the output attribute control processing, at step S320, the player control module 212 passes the pts_change_point's described in the clip information file Clip() (Fig) corresponding to the clip stream file to be played, together with information representing that they are times having the "DynamicInfo() processing" attribute, to the decode control module 214. The decode control module 214 receives the pts_change_point's, which are times having the "DynamicInfo() processing" attribute, from the player control module 212. After that, the processing flow proceeds to step S321.

At step S321 module 214 controls the decoding determines whether the current time calculated by the block A estimating the time value of pts_change_point, which is a time having the attribute of the process of DynamicInfo()". When the result determined at step S321, indicates that the current time does not match the pts_change_point, the processing flow returns to step S321.

In contrast, when the result determined at step S321, indicates that the current time matches any one time, with the attribute "process DynamicInfo (); module 214 controls the decoding transmits a message, which hereafter is chaet, that the current time has become a time having the attribute of the process of DynamicInfo()", and the time that has the attribute "process DynamicInfo()" (below, sometimes referred to as the DynamicInfo time), the module 212 control the player. After that, the processing flow goes to step S322.

At step S332 module 212 control the player receives a message that indicates that the current time has become a time having the attribute of the process of DynamicInfo()", and DynamicInfo time module 214 controls the decoding, and recognizes the DynamicInfo() is paired with the pts_change_point (Fig), which corresponds to the time DynamicInfo, as the target value DynamicInfo(). Thereafter, the flow goes to step S323.

At step S323, player control module 212 transfers the output attribute described in the target DynamicInfo() (Fig.) to graphics processing module 219 or audio output module 221. After that, the flow proceeds to step S324.

At step S324, graphics processing module 219 or audio output module 221 starts controlling the output of the video data or audio data in accordance with the output attribute transferred from player control module 212 at step S323. After that, the flow returns to step S321.

Thus, video data are output in accordance with, for example, the aspect ratio described as the output attribute. Alternatively, audio data are output in accordance with, for example, the stereo mode or dual (bilingual) audio mode described as the output attribute.
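
Steps S320 to S324 thus form a simple event loop keyed on the pts_change_point values. A minimal Python sketch of that loop follows; the function names, the dictionary layout used for DynamicInfo(), and the list used as a clock are illustrative assumptions, not identifiers from the specification, and the example values anticipate the audio attribute changes described below.

    # Sketch of the output attribute control loop (steps S320-S324); times in 90 kHz units.
    def apply_output_attribute(dynamic_info):
        # Steps S323/S324: the graphics processing or audio output module starts output control.
        if "display_aspect_ratio" in dynamic_info:
            print("video output switches to aspect ratio", dynamic_info["display_aspect_ratio"])
        if "channel_assignment" in dynamic_info:
            print("audio output switches to channel assignment", dynamic_info["channel_assignment"])

    def run_output_attribute_control(sets, clock):
        # sets: (pts_change_point, DynamicInfo) pairs of one elementary stream.
        pending = sorted(sets, key=lambda s: s[0])   # step S320: times with the attribute
        for current_time in clock:                   # step S321: compare with the current time
            while pending and current_time >= pending[0][0]:
                _, dynamic_info = pending.pop(0)     # step S322: target DynamicInfo()
                apply_output_attribute(dynamic_info)

    run_output_attribute_control(
        [(90000, {"channel_assignment": "Dual"}),
         (27090000, {"channel_assignment": "Stereo"}),
         (32490000, {"channel_assignment": "Dual"})],
        [0, 90000, 13500000, 27090000, 32490000])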

Next, the output attribute control processing will be described in detail with reference to Fig.

Namely, Fig. shows the sets of pts_change_point and DynamicInfo() (Fig.) described in the clip information file "00003.CLP" shown in Figs. A and B.

As described above, in the clip information file "00003.CLP" shown in Figs. A and B, number_of_DynamicInfo of video stream stream#0 and audio stream stream#2, which are the first and third of the three elementary streams stream#0 to stream#2 multiplexed with the clip stream file "00003.PS", is 2 and 3, respectively. Thus, in the clip information file "00003.CLP", two sets of pts_change_point and DynamicInfo() are described for the first video stream stream#0 of the clip stream file "00003.PS", and three sets of pts_change_point and DynamicInfo() are described for the third audio stream stream#2 of the clip stream file "00003.PS".

The upper table shown in Fig. describes the two sets of pts_change_point and DynamicInfo() of the first video stream stream#0 of the clip stream file "00003.PS". The lower table shown in Fig. describes the three sets of pts_change_point and DynamicInfo() of the third audio stream stream#2 of the clip stream file "00003.PS".

In the upper table shown in Fig., in addition to the two sets of pts_change_point and DynamicInfo() of the first video stream stream#0, the stream_id (=0xE0), private_stream_id (=0x00), and number_of_DynamicInfo (=2) of the first video stream stream#0 of the clip information file "00003.CLP" shown in Figs. A and B are described. Similarly, in the lower table shown in Fig., in addition to the three sets of pts_change_point and DynamicInfo() of the third audio stream stream#2, the stream_id (=0xBD), private_stream_id (=0x00), and number_of_DynamicInfo (=3) of the audio stream stream#2 of the clip information file "00003.CLP" shown in Figs. A and B are described.

In the upper table shown in Fig., the pts_change_point of the first of the two sets of pts_change_point and DynamicInfo() of video stream stream#0 is 90000, and display_aspect_ratio (Fig.) of its DynamicInfo() is "4:3". The pts_change_point of the second set is 54090000, and display_aspect_ratio of its DynamicInfo() is "16:9".

In the lower table shown in Fig., the pts_change_point of the first of the three sets of pts_change_point and DynamicInfo() of audio stream stream#2 is 90000, and channel_assignment (Fig.) of its DynamicInfo() is "Dual". The pts_change_point of the second set is 27090000, and channel_assignment of its DynamicInfo() is "Stereo". The pts_change_point of the third set is 32490000, and channel_assignment of its DynamicInfo() is "Dual".
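
For reference, the contents of the two tables just described can be written down as plain data. A short Python sketch follows; the tuple-and-dictionary layout is an assumption made for illustration, while the values are taken from the tables.

    # The pts_change_point / DynamicInfo() sets of "00003.CLP" as plain data (90 kHz units).
    video_stream0_sets = [   # stream_id 0xE0, number_of_DynamicInfo = 2
        (90000,    {"display_aspect_ratio": "4:3"}),
        (54090000, {"display_aspect_ratio": "16:9"}),
    ]
    audio_stream2_sets = [   # stream_id 0xBD, number_of_DynamicInfo = 3
        (90000,    {"channel_assignment": "Dual"}),
        (27090000, {"channel_assignment": "Stereo"}),
        (32490000, {"channel_assignment": "Dual"}),
    ]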

Now suppose that at step S125 shown in Fig., the first video stream stream#0, identified by the stream_id of 0xE0, and the third audio stream stream#2, identified by the stream_id of 0xBD and the private_stream_id of 0x00, were determined as the streams to be played from the clip stream file "00003.PS".

In this case, player control module 212 checks the two sets of pts_change_point and DynamicInfo() in the upper table shown in Fig. for the video stream stream#0 identified by the stream_id of 0xE0, and the three sets of pts_change_point and DynamicInfo() in the lower table shown in Fig. for the audio stream stream#2 identified by the stream_id of 0xBD and the private_stream_id of 0x00, and recognizes the initial values.

In other words, the pts_change_point of the first of the two sets of pts_change_point and DynamicInfo() in the upper table shown in Fig. for the video stream stream#0 identified by the stream_id of 0xE0 is 90000. Time 90000 matches the time 90000 described in presentation_start_time, which represents the start time of the clip stream file "00003.PS", in the clip information file "00003.CLP" shown in Figs. A and B corresponding to the clip stream file "00003.PS" with which video stream stream#0 is multiplexed.

Similarly, the pts_change_point of the first of the three sets of pts_change_point and DynamicInfo() in the lower table shown in Fig. for the audio stream stream#2 identified by the stream_id of 0xBD and the private_stream_id of 0x00 is 90000. Time 90000 matches the time 90000 described in presentation_start_time, which represents the start time of the clip stream file "00003.PS", in the clip information file "00003.CLP" shown in Figs. A and B corresponding to the clip stream file "00003.PS" with which audio stream stream#2 is multiplexed.

Player control module 212 recognizes a pts_change_point that matches the time 90000 described in presentation_start_time, which represents the start time of the clip stream file "00003.PS", as an initial value. Thus, player control module 212 recognizes the pts_change_point of the first of the two sets of pts_change_point and DynamicInfo() in the upper table shown in Fig. and the pts_change_point of the first of the three sets of pts_change_point and DynamicInfo() in the lower table shown in Fig. as initial values.

Player control module 212 designates the output attributes of the elementary streams in accordance with the DynamicInfo()'s paired with the pts_change_point's recognized as initial values, at step S126 shown in Fig., before the clip stream file "00003.PS" is played.

For the video stream stream#0 identified by the stream_id of 0xE0, in the upper table shown in Fig., display_aspect_ratio of the DynamicInfo() paired with the pts_change_point of 90000 as the initial value is "4:3". In this case, player control module 212 controls graphics processing module 219 with information indicating that display_aspect_ratio is "4:3", namely output attribute information indicating that video stream stream#0 is video data having an aspect ratio of 4:3.

For the audio stream stream#2 identified by the stream_id of 0xBD and the private_stream_id of 0x00, in the lower table shown in Fig., channel_assignment of the DynamicInfo() paired with the pts_change_point of 90000 as the initial value is "Dual". In this case, player control module 212 transfers information indicating that channel_assignment is "Dual", namely output attribute information indicating that audio stream stream#2 is dual audio data, to audio output module 221.

At step S126 shown in Fig., player control module 212 performs the output attribute control processing for the pts_change_point's recognized as initial values in this manner.

After that, player control module 212 transfers 54090000, the one of the two pts_change_point's 90000 and 54090000 of video stream stream#0 in the upper table shown in Fig. other than the initial value 90000, and 27090000 and 32490000, the two of the three pts_change_point's 90000, 27090000, and 32490000 of audio stream stream#2 in the lower table shown in Fig. other than the initial value 90000, namely the times {27090000}, {32490000}, and {54090000}, together with information indicating that these three times have the "DynamicInfo() processing" attribute, to decode control module 214 (step S320).

Decode control module 214 receives the times {27090000}, {32490000}, and {54090000} having the "DynamicInfo() processing" attribute from player control module 212. After playback of video stream stream#0 and audio stream stream#2 starts, decode control module 214 starts monitoring the current time counted by time count block A.

When the current time matches one of the times {27090000}, {32490000}, and {54090000} having the "DynamicInfo() processing" attribute, decode control module 214 transfers the DynamicInfo time, namely the time having the "DynamicInfo() processing" attribute that matches the current time, to player control module 212 (step S321).

When the current time becomes equal to, for example, 27090000, decode control module 214 transfers 27090000, which matches the current time and is one of the times having the "DynamicInfo() processing" attribute, as the DynamicInfo time to player control module 212.

Player control module 212 receives 27090000 as the DynamicInfo time from decode control module 214, checks, among the two pts_change_point's of video stream stream#0 in the upper table shown in Fig. and the three pts_change_point's of audio stream stream#2 in the lower table shown in Fig., for the pts_change_point that matches 27090000 as the DynamicInfo time, and recognizes the DynamicInfo() paired with that pts_change_point, namely the second DynamicInfo() of audio stream stream#2 in the lower table shown in Fig., as the target DynamicInfo() (step S322).

When the target DynamicInfo() is a DynamicInfo() of a video stream, player control module 212 transfers the output attribute described in the target DynamicInfo() to graphics processing module 219 (step S323). When the target DynamicInfo() is a DynamicInfo() of an audio stream, player control module 212 transfers the output attribute described in the target DynamicInfo() to audio output module 221 (step S323).

When graphics processing module 219 receives the output attribute from player control module 212, it starts controlling the output of the video data in accordance with the output attribute (step S324).

In other words, graphics processing module 219 converts the aspect ratio of the video data output to video output module 220, for example, in accordance with the aspect ratio of the video data (display_aspect_ratio (Fig.)) represented by the output attribute received from player control module 212 and the aspect ratio of the video output device connected to video output terminal 120 shown in Fig. 1.

In particular, when the aspect ratio of the video output device is, for example, 16:9 and the aspect ratio of the video data represented by the output attribute is 4:3, graphics processing module 219 performs squeeze processing in the horizontal direction on the video data output to video output module 220, places data that make the left and right edges of the video data black, and outputs the resulting video data. When the aspect ratio of the video output device is 4:3 and the aspect ratio of the video data represented by the output attribute is 16:9, graphics processing module 219 performs squeeze processing in the vertical direction on the video data output to video output module 220, places data that make the top and bottom edges of the video data black, and outputs the resulting video data. When the aspect ratio of the video output device and the aspect ratio of the video data as the output attribute are the same, for example both 4:3 or both 16:9, graphics processing module 219 outputs the video data to video output module 220 without squeeze processing.

With the two sets of pts_change_point and DynamicInfo() for the video stream stream#0 identified by the stream_id of 0xE0, in the upper table shown in Fig., video data having an aspect ratio of 4:3 are obtained from video stream stream#0 from time 90000, the playback start time of video stream stream#0, until time 54090000. After time 54090000, video data having an aspect ratio of 16:9 are obtained from video stream stream#0.

Thus, assuming that the aspect ratio of the video output device connected to video output terminal 120 shown in Fig. 1 is, for example, 4:3, graphics processing module 219 supplies the video data having an aspect ratio of 4:3 obtained from video stream stream#0 as-is to the 4:3 video output device from time 90000 until time 54090000, and the video output device displays them.

After time 54090000, graphics processing module 219 performs squeeze processing in the vertical direction on the video data having an aspect ratio of 16:9 and converts them, using data that make the top and bottom edges of the video data black, into a video signal having an aspect ratio of 4:3. The converted signal is supplied to the video output device, which displays it.
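
The three conversion rules above amount to a small piece of geometry: the active picture keeps its own aspect ratio and the remainder of the output frame is filled with black. A Python sketch follows, assuming the output frame itself matches the aspect ratio of the video output device; the function name and frame sizes are illustrative.

    # Sketch of the letterbox/pillarbox geometry of the graphics processing module.
    from fractions import Fraction

    def active_picture_size(display_ar, device_ar, out_w, out_h):
        """Return (width, height) of the active picture; the rest is black."""
        src = Fraction(*display_ar)            # aspect ratio carried by DynamicInfo()
        dst = Fraction(*device_ar)             # aspect ratio of the video output device
        if src == dst:
            return out_w, out_h                # same ratio: no squeeze, no black bars
        if src < dst:                          # e.g. 4:3 video on a 16:9 device:
            return int(out_h * src), out_h     # horizontal squeeze, black left/right
        return out_w, int(out_w / src)         # e.g. 16:9 on 4:3: black top/bottom

    print(active_picture_size((4, 3), (16, 9), 1920, 1080))   # -> (1440, 1080)
    print(active_picture_size((16, 9), (4, 3), 720, 540))     # -> (720, 405)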

When audio output module 221 receives the output attribute from player control module 212, it starts controlling the output of the audio data in accordance with the output attribute (step S324).

In other words, audio output module 221 processes the audio data received from audio decoder control module 217 in accordance with the channel assignment for the audio data (channel_assignment (Fig.)) represented by the output attribute received from player control module 212 and in accordance with the audio output mode supplied from player control module 212 through input interface 115 (Fig. 1) as the user operates the remote controller, and outputs the processed audio data to audio output terminal 121 (Fig. 1).

In particular, when the channel assignment for the audio data represented by the output attribute is the dual (bilingual) mode, in which the left channel carries "main audio" data and the right channel carries "sub audio" data, audio output module 221 processes the audio data transferred from audio decoder control module 217 in accordance with the audio output mode transferred from player control module 212, and outputs the processed audio data to audio output terminal 121.

In other words, when "main audio" has been designated as the audio output mode, audio output module 221 copies the left channel of the audio data received from audio decoder control module 217 to the right channel, and outputs the left and right channels ("main audio" data) to audio output terminal 121. When "sub audio" has been designated as the audio output mode, audio output module 221 copies the right channel of the audio data received from audio decoder control module 217 to the left channel, and outputs the left and right channels ("sub audio" data) to audio output terminal 121. When both "main and sub audios" have been designated as the audio output mode, audio output module 221 outputs the audio data received from audio decoder control module 217 as-is to audio output terminal 121.

When the channel assignment for the audio data represented by the output attribute is, for example, the stereo mode, audio output module 221 outputs the audio data received from audio decoder control module 217 as-is to audio output terminal 121, regardless of which audio output mode has been designated.
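
The channel_assignment handling just described reduces to a small routing rule. A Python sketch follows; the sample representation and the mode strings are illustrative assumptions.

    # Sketch of channel_assignment routing in the audio output module.
    # samples: list of (left, right) pairs; "Dual" means left = main audio, right = sub audio.
    def route_audio(samples, channel_assignment, audio_output_mode):
        if channel_assignment == "Stereo":
            return samples                              # output as-is, whatever the mode
        if audio_output_mode == "main audio":
            return [(l, l) for (l, r) in samples]       # copy left channel to right
        if audio_output_mode == "sub audio":
            return [(r, r) for (l, r) in samples]       # copy right channel to left
        return samples                                  # "main and sub audios": as-is

    print(route_audio([(0.1, 0.9)], "Dual", "main audio"))   # -> [(0.1, 0.1)]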

With the three sets of pts_change_point and DynamicInfo() for the audio stream stream#2 identified by the stream_id of 0xBD and the private_stream_id of 0x00, in the lower table shown in Fig., dual audio data are obtained from audio stream stream#2 from time 90000, the playback start time, until time 27090000. Stereo audio data are obtained from audio stream stream#2 from time 27090000 until time 32490000, and dual audio data are again obtained from audio stream stream#2 after time 32490000.

Thus, when "main audio" has been designated as the audio output mode, audio output module 221 copies the left channel of the dual audio data obtained from audio stream stream#2 from time 90000 until time 27090000 to the right channel, and the left and right channels are output to audio output terminal 121.

The stereo audio data obtained from audio stream stream#2 from time 27090000 until time 32490000 are output as-is to audio output terminal 121.

The left channel of the dual audio data obtained from audio stream stream#2 after time 32490000 is copied to the right channel, and the left and right channels are output to audio output terminal 121.

As described above, in the output attribute control processing, it is determined whether the playback time of an elementary stream being played matches one of the pts_change_point's of the clip information file Clip() (Fig.), which contains n sets of pts_change_point, representing the times at which the output attribute of each elementary stream multiplexed with the clip stream file changes, and DynamicInfo(), representing the output attributes (where n is 0 or any larger integer). When the playback time of the elementary stream being played matches a pts_change_point, the DynamicInfo() paired with that pts_change_point is recognized, and the output of the elementary stream being played is controlled in accordance with the output attribute described in that DynamicInfo(). Thus, the output of an elementary stream can be controlled in accordance with its playback time and output attribute.

[Subtitle display control processing]

Next, the subtitle display control processing, which controls the display of subtitle data corresponding to a subtitle stream, will be described with reference to the flowchart shown in Fig.

When playback of a PlayList() starts, player control module 212 initializes the subtitle data display mode of graphics processing module 219 at step S341. In other words, player control module 212 controls graphics processing module 219 so that the subtitle data display mode becomes the default display mode. The display mode initialization performed at step S341 corresponds to the display mode initialization performed at step S127 shown in Fig.

After step S341, the flow proceeds to step S342. At step S342, player control module 212 determines whether the user has entered a new subtitle data display mode command via the remote controller and input interface 115.

When the result determined at step S342 indicates that a new display mode command has been entered, the flow proceeds to step S343. At step S343, player control module 212 determines whether a subtitle stream is being played.

When the result determined at step S343 indicates that a subtitle stream is not being played, the flow returns to step S342.

In contrast, when the result determined at step S343 indicates that a subtitle stream is being played, the flow proceeds to step S345. At step S345, player control module 212 determines whether the new display mode command is the default display mode command. When the result determined at step S345 indicates that the new display mode command is the default display mode command, the flow returns to step S341. At step S341, as described above, player control module 212 controls graphics processing module 219 so that the subtitle data display mode becomes the default display mode.

In contrast, when the result determined at step S345 indicates that the new display mode command is not the default display mode command, namely a non-default display mode command such as a subtitle data enlarge command, a subtitle data reduce command, or a brightness change command, the flow proceeds to step S346. At step S346, player control module 212 obtains the StaticInfo() of the subtitle stream being played from the StaticInfo()'s (Fig.) of the clip information file Clip() (Fig.) corresponding to the clip stream file with which the subtitle stream being played is multiplexed. After that, the flow proceeds to step S347.

At step S347, player control module 212 checks the configurable_flag of the StaticInfo() obtained at step S346.

When the result determined at step S347 indicates that configurable_flag is 0, meaning that the subtitle data display mode is not permitted to be changed, the flow proceeds to step S348. At step S348, player control module 212 controls graphics processing module 219 to overlay on the output video data a message indicating that the subtitle data display mode cannot be changed. After that, the flow returns to step S342. Thus, the error message is displayed.

In contrast, when the result determined at step S347 indicates that configurable_flag is 1, meaning that the subtitle data display mode is permitted to be changed, the flow proceeds to step S349. At step S349, player control module 212 transfers the new display mode command entered by the user via the remote controller and input interface 115 to graphics processing module 219. After that, the flow proceeds to step S350.

At step S350, graphics processing module 219 starts enlarging, reducing, or changing the brightness of the subtitle data supplied from subtitle decoder control module 218 in accordance with the display mode command transferred from player control module 212 at step S349. After that, the flow returns to step S342. Thus, the subtitle data are displayed with the display size, display position, or display colors according to the display mode command entered by the user via the remote controller.

In contrast, when the result determined at step S342 indicates that a new display mode command has not been entered, the flow proceeds to step S351. At step S351, player control module 212 determines whether the PlayItem() has been changed as described with reference to Fig. When the result determined at step S351 indicates that the PlayItem() has not been changed, the flow returns to step S342.

In contrast, when the result determined at step S351 indicates that the PlayItem() has been changed, the flow returns to step S341. At step S341, as described above, player control module 212 controls graphics processing module 219 so that the subtitle data display mode becomes the default display mode. In other words, when the PlayItem() has been changed, the subtitle data display mode is restored to the default display mode.

As described above, only when configurable_flag of the subtitle stream is 1, meaning that the display mode is permitted to be changed, can the subtitle data display mode for the subtitle stream be changed in accordance with a display mode command entered by the user via the remote controller.
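
The decisions of steps S341 to S350 can be summarized in a short sketch. The following Python code is illustrative only; the class, the mode representation, and the function names are assumptions, not identifiers from the specification.

    # Sketch of the display mode decisions (steps S341-S350).
    DEFAULT_MODE = {"scale": 1.0}

    class Graphics:                           # stand-in for the graphics processing module
        def set_subtitle_mode(self, mode): print("display mode:", mode)
        def show_error(self, msg): print("error:", msg)

    def handle_display_mode_command(command, subtitles_playing, configurable_flag, graphics):
        if not subtitles_playing:
            return                                            # S343: ignore the command
        if command == "default":
            graphics.set_subtitle_mode(DEFAULT_MODE)          # S345 -> S341: back to default
        elif configurable_flag == 0:
            graphics.show_error("subtitle display mode cannot be changed")   # S347 -> S348
        else:
            graphics.set_subtitle_mode(command)               # S349/S350: apply the new mode

    handle_display_mode_command({"scale": 2.0}, True, 1, Graphics())   # mode is changed
    handle_display_mode_command({"scale": 2.0}, True, 0, Graphics())   # error message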

Thus, for example, in the clip information file "00001.CLP" shown in Figs. A and B, since configurable_flag of subtitle stream stream#2, which is the third of the four elementary streams multiplexed with the clip stream file "00001.PS", is 0, indicating that the display mode is not permitted to be changed, the subtitle display mode is not changed while subtitle stream stream#2 is displayed, even if the user operates the remote controller to change it.

In contrast, since configurable_flag of subtitle stream stream#3, which is the fourth of the four elementary streams multiplexed with the clip stream file "00001.PS", is 1, indicating that the display mode is permitted to be changed, when the user operates the remote controller to change the subtitle display mode while subtitle stream stream#3 is displayed, the display size of the subtitles is changed.

Now suppose that the clip stream file "00001.PS" is played in accordance with the first PlayItem#0 of the first PlayList#0 shown in Fig. In addition, suppose that, of the four elementary streams multiplexed with the clip stream file "00001.PS" in the clip information file "00001.CLP" shown in Figs. A and B, the third and fourth elementary streams are subtitle streams, and that the third subtitle stream stream#2 of the third and fourth subtitle streams stream#2 and stream#3 is being played.

When the user operates the remote controller to enter a subtitle display mode command (step S342), the display mode command is supplied from input interface 115 (Fig. 1) to player control module 212. When player control module 212 receives the display mode command, it searches for the StaticInfo() (Fig.) of the subtitle stream being played in the clip information file (step S346).

In other words, the subtitle stream being played is the third subtitle stream stream#2 multiplexed with the clip stream file "00001.PS". Player control module 212 searches the corresponding clip information file "00001.CLP" for the StaticInfo() of the third subtitle stream stream#2.

In addition, player control module 212 checks configurable_flag, which is 0, described in the StaticInfo() of the third subtitle stream stream#2 shown in Figs. A and B (step S347). Thus, player control module 212 recognizes that the display mode of the third subtitle stream stream#2 is not permitted to be changed.

In this case, player control module 212 determines that the subtitle stream being played does not support enlargement or reduction, controls graphics processing module 219 to generate a corresponding error message (step S348), overlays the error message on the video data, and outputs the video data with the overlaid message.

While the fourth subtitle stream stream#3 of the third and fourth subtitle streams stream#2 and stream#3 of the four elementary streams multiplexed with the clip stream file "00001.PS" is being played, when player control module 212 receives a display mode command entered by the user via the remote controller, it searches the corresponding clip information file "00001.CLP" for the StaticInfo() of the fourth subtitle stream stream#3.

Player control module 212 checks configurable_flag, which is 1, described in the StaticInfo() of the fourth subtitle stream stream#3 shown in Figs. A and B (step S347). Thus, player control module 212 recognizes that the display mode of the fourth subtitle stream stream#3 is permitted to be changed.

In this case, player control module 212 determines that the subtitle stream being played supports enlargement or reduction, and transfers the display mode command entered by the user via the remote controller to graphics processing module 219 (step S349).

Thus, graphics processing module 219, for example, enlarges or reduces the subtitle data received from subtitle decoder control module 218 in accordance with the display mode command received from player control module 212, overlays the resulting subtitle data on the video data supplied from video decoder control module 216, and outputs the video data with the overlaid subtitles.

When player control module 212 starts playing the first PlayItem() of a PlayList(), it initializes the subtitle data display mode of graphics processing module 219 (step S341). In other words, player control module 212 controls graphics processing module 219 so that the subtitle data display mode becomes the default display mode.

Likewise, when the PlayItem() is changed, player control module 212 initializes the subtitle data display mode of graphics processing module 219 (steps S341 and S351).

Alternatively, when the PlayItem() is changed, player control module 212 may check the configurable_flag of the new subtitle stream to be played in accordance with the new PlayItem(). When configurable_flag is 0, player control module 212 initializes the subtitle data display mode of graphics processing module 219; when configurable_flag is 1, player control module 212 causes graphics processing module 219 to keep the display mode that was in effect before the PlayItem() was changed.

In the subtitle display control processing shown in Fig., when a new display mode command is entered by the user via the remote controller, it is transferred to graphics processing module 219 (step S349). The display mode command may be stored, for example, in a non-volatile memory constituting storage device 113 (Fig. 1), and the display mode command stored in the non-volatile memory may be transferred to graphics processing module 219.

When a display mode command set by the user is stored in the non-volatile memory as an initial setting of the disc device shown in Fig. 1, and he or she enters a new display mode command via the remote controller, the display mode command stored in the non-volatile memory may be replaced by the new display mode command, and the new display mode command stored in the non-volatile memory may be transferred to graphics processing module 219. In this case, since the non-volatile memory holds the display mode command that was set when the last playback ended, the subtitle data are displayed with that display mode when the next PlayList() is played, without the user having to enter the display mode command via the remote controller again.

In this case, it is assumed that the display mode command stored in the non-volatile memory includes, for example, the enlargement or reduction ratio by which the subtitle stream is enlarged or reduced.
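
A minimal sketch of such persistence follows, assuming a JSON file stands in for the non-volatile memory; the file name and format are illustrative, not part of the specification.

    # Sketch of keeping the last display mode command across playbacks.
    import json, os

    STATE_FILE = "subtitle_display_mode.json"   # stands in for the non-volatile memory

    def save_display_mode(mode):
        with open(STATE_FILE, "w") as f:
            json.dump(mode, f)                  # e.g. {"scale": 2.0}: an enlargement ratio

    def load_display_mode(default):
        if os.path.exists(STATE_FILE):
            with open(STATE_FILE) as f:
                return json.load(f)             # reused without a new remote control input
        return default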

As described above, in the subtitle display control processing, whether the subtitle data display mode may be changed from the default display mode is determined by configurable_flag, which indicates whether a change from the default display mode is permitted and is contained in the StaticInfo() (Fig.) of the clip information file Clip() for subtitle data, which do not change while the elementary stream is played. When a change from the default display mode is permitted for the subtitle data being played, a display process, for example an enlarging process, a reducing process, or a color change process, is performed on the subtitle data. Thus, the display of subtitle data can be controlled.

[Capture control processing]

Next, the capture control processing, which controls capturing of video data corresponding to a video stream, will be described with reference to the flowchart shown in Fig. Fig. also shows a flowchart describing the background/screen-saver processing, which makes secondary use of video data captured in the capture control processing.

When a video data capture command is entered by the user via the remote controller and supplied through input interface 115 (Fig. 1) to player control module 212, the capture control processing starts.

In other words, in the capture control processing, at step S371 player control module 212 determines whether a video stream is being played. When the result determined at step S371 indicates that a video stream is not being played, player control module 212 completes the capture control processing.

In contrast, when the result determined at step S371 indicates that a video stream is being played, the flow proceeds to step S372. Player control module 212 obtains capture_enable_flag_PlayList from the PlayList() (Fig. 7) corresponding to the video stream being played, and capture_enable_flag_Clip from the clip information file Clip() (Fig.) corresponding to the video stream being played.

As described with reference to Fig. 7, capture_enable_flag_PlayList of PlayList() indicates whether secondary use of video data corresponding to the video stream played in accordance with the PlayList() is permitted. On the other hand, as described with reference to Fig., capture_enable_flag_Clip of the clip information file Clip() indicates whether secondary use of video data corresponding to the video stream stored in the clip stream file corresponding to the clip information file Clip() is permitted.

After step S372, the flow proceeds to step S373. At step S373, player control module 212 determines whether capturing a picture of the video data being played when the capture command was received through input interface 115 (Fig. 1) is permitted, in accordance with capture_enable_flag_PlayList and capture_enable_flag_Clip obtained at step S372.

When the result determined at step S373 indicates that capturing a picture of the video data being played is not permitted, namely at least one of capture_enable_flag_PlayList and capture_enable_flag_Clip obtained at step S372 is 0, meaning that secondary use of the video data is not permitted, the flow proceeds to step S374. At step S374, player control module 212 controls graphics processing module 219 to overlay on the video data an error message indicating that capturing of video data is not permitted, and completes the capture control processing. Thus, the error message is displayed.

In contrast, when the result determined at step S373 indicates that capturing a picture of the video data being played is permitted, namely both capture_enable_flag_PlayList and capture_enable_flag_Clip obtained at step S372 are 1, meaning that secondary use of the video data is permitted, the flow proceeds to step S375. At step S375, player control module 212 transfers the capture command for the video data being played, supplied through input interface 115, to graphics processing module 219. After that, the flow proceeds to step S376.

At step S376, graphics processing module 219 captures a picture of the video data from video decoder control module 216 in accordance with the capture command received from player control module 212, stores it in storage device 113 (Fig. 1), and completes the capture control processing. When the capture_enable_flag consists of a set of bits and usage conditions are restricted, corresponding operations are performed at this point. In other words, when the size of the captured picture is restricted, a picture reduced in size is captured. When the application program to be used is restricted, a flag representing the restriction is also recorded.

As described above, in the capture control processing, capture_enable_flag_PlayList of the PlayList() (Fig. 7) and capture_enable_flag_Clip of the clip information file Clip() (Fig.) corresponding to the video data being played when the user enters the capture command are combined with a logical AND. When the AND result is 1, namely both capture_enable_flag_PlayList and capture_enable_flag_Clip are 1, meaning that secondary use of the video data is permitted, it is determined that the video data can be secondarily used, and the video data are captured.
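
The permission check is literally a logical AND of two one-bit flags. A minimal Python sketch follows; the function name is an assumption, and the three calls anticipate the three cases discussed below.

    # Sketch of the capture permission check: a logical AND of two one-bit flags.
    def capture_permitted(capture_enable_flag_PlayList, capture_enable_flag_Clip):
        return (capture_enable_flag_PlayList & capture_enable_flag_Clip) == 1

    print(capture_permitted(1, 1))   # "00001.PS": capture performed
    print(capture_permitted(1, 0))   # "00002.PS": capture refused
    print(capture_permitted(0, 1))   # "00003.PS": capture refused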

When the video stream played in accordance with the first PlayItem#0 of the first PlayList#0 shown in Fig., namely the video stream multiplexed with the clip stream file "00001.PS", is being played and the user enters a capture command, since capture_enable_flag_PlayList of the first PlayList#0 is 1 and capture_enable_flag_Clip of the clip information file "00001.CLP" shown in Figs. A and B, corresponding to the clip stream file "00001.PS" played in accordance with the first PlayItem#0, is 1, it is determined that secondary use of the video data being played (the video data corresponding to the video stream multiplexed with the clip stream file "00001.PS") is permitted, and the video data are captured.

While the video stream played in accordance with the second PlayItem#1 of the first PlayList#0 shown in Fig., namely the video stream multiplexed with the clip stream file "00002.PS", is being played, when the user enters a capture command, since capture_enable_flag_PlayList of the first PlayList#0 is 1 but capture_enable_flag_Clip of the clip information file "00002.CLP" shown in Figs. A and B, corresponding to the clip stream file "00002.PS" played in accordance with the second PlayItem#1, is 0, it is determined that secondary use of the video data being played (the video data corresponding to the video stream multiplexed with the clip stream file "00002.PS") is not permitted, and the video data are not captured.

While the video stream played in accordance with PlayItem#0 of the second PlayList#1 shown in Fig., namely the video stream multiplexed with the clip stream file "00003.PS", is being played, when the user enters a capture command, since capture_enable_flag_PlayList of the second PlayList#1 is 0, although capture_enable_flag_Clip of the clip information file "00003.CLP" shown in Figs. A and B, corresponding to the clip stream file "00003.PS" played in accordance with PlayItem#0 of the second PlayList#1, is 1, it is determined that secondary use of the video data being played (the video data corresponding to the video stream multiplexed with the clip stream file "00003.PS") is not permitted. Thus, the video data are not captured.

In this case, once it has been checked that capture_enable_flag_PlayList of the second PlayList#1 is 0, it can be determined that secondary use of the video data is not permitted. Thus, the check of capture_enable_flag_Clip of the clip information file "00003.CLP" shown in Figs. A and B, corresponding to the clip stream file "00003.PS" played in accordance with PlayItem#0 of the second PlayList#1, can be omitted.

A picture captured in the capture control processing and stored in storage device 113 can be secondarily used in the background/screen-saver processing.

The background/screen-saver processing is performed, for example, while player control module 212 is operating but an elementary stream is not being played, namely when disc 101 has not been inserted into disc drive 102 (Fig. 1) or when playback of an elementary stream has been completed.

In the background/screen-saver processing, at step S380 player control module 212 controls graphics processing module 219 to display the picture stored in storage device 113 in the capture control processing. Graphics processing module 219 displays the picture stored in storage device 113 in the capture control processing under the control of player control module 212.

When graphics processing module 219 displays the picture stored in storage device 113 as a still picture, a so-called wallpaper (background picture) is provided. When the picture is displayed while being enlarged, reduced, and moved at regular intervals, a screen saver is provided. The background/screen-saver processing that displays the picture stored in storage device 113 in the capture control processing may be performed by a separate, independent application program instead of player control module 212.

When a flag representing a restriction is added to the picture stored in storage device 113, use of the stored picture is restricted in accordance with the flag.

As described above, capture_enable_flag_PlayList and capture_enable_flag_Clip, which indicate whether secondary use of the video data being played is permitted, are provided in accordance with, for example, PlayList() or PlayItem(), namely in units larger than a video access unit. In accordance with capture_enable_flag_PlayList and capture_enable_flag_Clip, it is determined whether secondary use of the video data being played is permitted. When the determined result indicates that secondary use of the video data being played is permitted, the video data being played are captured and the background/screen-saver processing that uses the captured picture is performed. Thus, secondary use of video data can be controlled.

In the capture control processing shown in Fig., the PlayList() (Fig. 7) contains capture_enable_flag_PlayList, and the clip information file Clip() (Fig.) corresponding to the clip stream file played in accordance with the PlayItem() contains capture_enable_flag_Clip; whether secondary use of video data is permitted is determined using both capture_enable_flag_PlayList and capture_enable_flag_Clip together. Alternatively, whether secondary use of video data is permitted may be determined with only one of them, namely with capture_enable_flag_PlayList contained in the PlayList() (Fig. 7) or with capture_enable_flag_Clip contained in the clip information file Clip() (Fig.) corresponding to the clip stream file played in accordance with the PlayItem().

In the capture control processing shown in Fig., at step S376 graphics processing module 219 captures video data from video decoder control module 216 in accordance with the capture command received from player control module 212, but only one picture. Alternatively, graphics processing module 219 may capture a plurality of pictures; in other words, a plurality of pictures that video decoder control module 216 outputs in time series may be captured. In this case, the number of pictures captured at a time may be predetermined. Alternatively, the bits of capture_enable_flag_PlayList and capture_enable_flag_Clip may be extended with information representing the number of pictures that can be captured at a time.

In the case described above, the use permission information indicating whether secondary use of video data is permitted, namely capture_enable_flag_PlayList and capture_enable_flag_Clip, is described in PlayList() and in the clip information file Clip(). With this use permission information, it is determined whether secondary use is permitted for all the video data played in accordance with the PlayList() and for all the video data corresponding to the video stream multiplexed with the clip stream file corresponding to the clip information file Clip(). The use permission information can, however, describe video data of any unit, and with it, it can be determined whether secondary use of video data of any unit is permitted.

In other words, Fig. shows the syntax of private_stream2_PES_payload() containing the use permission information. Fig. shows the syntax of au_information() containing the use permission information.

private_stream2_PES_payload() shown in Fig. is the same as that shown in Fig., except that video_stream_id is immediately followed by capture_enable_flag_ps2 as the use permission information. Similarly, au_information() shown in Fig. is the same as that shown in Fig., except that pic_struct_copy is immediately followed by capture_enable_flag_AU as the use permission information.

capture_enable_flag_ps2 contained in private_stream2_PES_payload() shown in Fig. indicates whether secondary use is permitted for the video data of the video stream from the PES_packet() of private_stream_2 containing the private_stream2_PES_payload() to the PES_packet() of the next private_stream_2. Thus, with capture_enable_flag_ps2 contained in private_stream2_PES_payload() shown in Fig., it can be determined whether secondary use of the video data from a given decoding startable point to the next decoding startable point is permitted.

In addition, capture_enable_flag_AU contained in au_information() shown in Fig. indicates whether secondary use is permitted for the video data of the video access unit corresponding to the capture_enable_flag_AU. Thus, with capture_enable_flag_AU contained in au_information() shown in Fig., it can be determined whether secondary use of the video data of each video access unit, namely each picture, is permitted.

At least two of capture_enable_flag_PlayList as the use permission information of the PlayList() (Fig. 7), capture_enable_flag_Clip as the use permission information of the clip information file Clip() (Fig.), capture_enable_flag_ps2 as the use permission information of private_stream2_PES_payload() (Fig.), and capture_enable_flag_AU as the use permission information of au_information() (Fig.) may be used redundantly. In this case, whether secondary use of a picture of video data is permitted can be determined by the result of the logical AND of the two or more types of use permission information used redundantly.
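
A sketch of such a redundant check follows; treating an absent level as permitted is an assumption made only for illustration, and the argument names are not identifiers from the specification.

    # Sketch of ANDing two or more levels of use permission information.
    def secondary_use_permitted(flag_playlist=None, flag_clip=None, flag_ps2=None, flag_au=None):
        result = 1
        for flag in (flag_playlist, flag_clip, flag_ps2, flag_au):
            if flag is not None:
                result &= flag
        return result == 1

    print(secondary_use_permitted(flag_clip=1, flag_au=0))   # -> False: picture protected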

As described at step S211 shown in Fig., video reading block 233 of buffer control module 215 (Fig. 5) searches the program stream stored in buffer A for the PES_packet() of private_stream_2 containing the private_stream2_PES_payload() shown in Fig. or Fig., the latter containing the au_information() shown in Fig. Thus, when the private_stream2_PES_payload() shown in Fig., which contains capture_enable_flag_ps2, and the au_information() shown in Fig., which contains capture_enable_flag_AU, are used, player control module 212 needs to query video reading block 233 for capture_enable_flag_ps2 and capture_enable_flag_AU in order to determine whether secondary use of video data is permitted.

Next, the hardware configuration of the disc recording device will be described with reference to Fig.

The disc recording device shown in Fig. can be applied to, for example, a disc player, a game device, a car navigation system, and so forth.

In the disc recording device shown in Fig., disc 410 is, for example, an optical disc such as a DVD, a magneto-optical disc, or a magnetic disc. Content data such as video data, audio data, and subtitle data, as well as other data, can be recorded on disc 410. When the various kinds of data are recorded on disc 410, it can be used as disc 101 shown in Fig. 1.

Video input terminal 400A is connected to a video input unit such as an image capture device (not shown). Video input terminal 400A supplies the video data received from the video input unit to video input interface 401. Audio input terminal 400B is connected to an audio input unit such as a microphone and amplifier (not shown). Audio input terminal 400B supplies the received audio data to audio input interface 402.

Video input interface 401 performs required processing on the input video data and supplies the result to video encoder 403 via bus 411. Audio input interface 402 performs required processing on the input audio data and supplies the result to audio encoder 404 via bus 411.

Video encoder 403 encodes the video data supplied from CPU 405 and video input interface 401 and causes disc drive 409 to record the resulting compression-encoded data (encoded video data, for example an MPEG2 video stream) on disc 410 via bus 411.

Audio encoder 404 encodes the audio data supplied from CPU 405 and audio input interface 402 and causes disc drive 409 to record the resulting compression-encoded data (encoded audio data, for example an MPEG2 audio stream) on disc 410 via bus 411.

CPU 405 and storage device 406 constitute a computer system. In other words, CPU 405 executes a program stored in storage device 406, manages the entire disc recording device, and performs the various kinds of processing described below. Storage device 406 stores the program executed by CPU 405 and also temporarily stores data that CPU 405 needs for its operation. Storage device 406 may consist of a non-volatile memory alone or a combination of volatile and non-volatile memories. When the disc recording device shown in Fig. contains a hard disk on which the program executed by CPU 405 is recorded (installed), storage device 406 may consist of a volatile memory alone.

The program executed by CPU 405 may be recorded in advance in storage device 406 as a recording medium built into the disc recording device.

Instead, the program may be temporarily or permanently stored on disc 410 loaded into disc drive 409, or on a removable recording medium other than that disc, such as a flexible disc, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto-Optical) disc, a magnetic disc, or a memory card. Such removable recording media can be provided as so-called package software.

In addition, the program may be stored in storage device 406 in advance, or installed into the disc recording device from the above-described removable recording medium. Instead, the program may be transferred wirelessly to the disc recording device from a download site via a satellite used for digital satellite broadcasts, or transferred by wire through a network such as a LAN (Local Area Network) or the Internet. The disc recording device can receive the program through input interface 408 and install it in the built-in storage device 406.

The program may be processed by one CPU, or it may be processed in a distributed manner by a plurality of CPUs.

Drive interface 407 controls disc drive 409 under the control of CPU 405. Thus, drive interface 407 supplies data transferred from CPU 405, storage device 406, video encoder 403, and audio encoder 404 via bus 411 to disc drive 409 and causes disc drive 409 to record the data on disc 410. Instead, drive interface 407 reads data from disc 410 and transfers them to CPU 405 and storage device 406 via bus 411.

Input interface 408 receives signals corresponding to user operations of buttons (keys) and a remote controller (remote control commander) and supplies them to CPU 405 via bus 411. In addition, input interface 408 functions as a communication interface such as a modem (including an ADSL (Asymmetric Digital Subscriber Line) modem) or a NIC (Network Interface Card).

Video data and audio data may be supplied from the video input unit and the audio input unit, respectively, by wire or wirelessly.

Disc 410 can be loaded into and unloaded from disc drive 409. Disc drive 409 has a built-in interface (not shown) through which it is connected to drive interface 407. Disc drive 409 drives the loaded disc 410 and performs, for example, write processing of data on disc 410 in accordance with, for example, a write command received through drive interface 407.

As necessary, the data (write data) recorded on disc 410 include a program that can be executed by a computer. In this embodiment, disc 410, which is a disc-shaped recording medium, is used as the recording medium. Alternatively, a semiconductor memory or a tape-shaped recording medium may be used as the recording medium.

Bus 411 interconnects the CPU (Central Processing Unit) 405, storage device 406, drive interface 407, input interface 408, video encoder 403, audio encoder 404, video input interface 401, and audio input interface 402.

Next, the functional structure with which the disc recording device embodies the data encoding method according to the present invention will be described with reference to Fig. In the functional structure shown in the drawing, audio encoder 404 compression-encodes an audio signal that has been input through audio input terminal 400B and audio input interface 402 and outputs the result to multiplexing module 421.

Video encoder 403 compression-encodes a video signal that has been input through video input terminal 400A and video input interface 401 and outputs the result to multiplexing module 421.

Multiplexing module 421 packetizes the input MPEG2 video stream and the input MPEG2 audio stream and multiplexes them on a time-division basis, as described with reference to Figs. A and B through Fig. Multiplexing module 421 selects intra-coded pictures from the stream and inserts the PES_packet() of private_stream_2 shown in Fig. immediately before them at a frequency of approximately twice per second.

Multiplexing module 421 outputs the multiplexed stream to RAPI rewrite module 424 through FIFO 422, as well as to RAPI information extraction module 423. RAPI information extraction module 423 detects, from the video stream of the multiplexed stream, the start position of each PES_packet() of private_stream_2, the value of the time stamp (PTS) of the intra-coded picture immediately following the PES_packet() of private_stream_2, and the end positions of the intra-coded picture and of the second, third, and fourth reference pictures following it, and stores them.

In this case, a RAPI is the PES_packet() of private_stream_2.

RAPI information extraction module 423 outputs the detected end positions of the intra-coded picture and of the second, third, and fourth reference pictures following it to RAPI rewrite module 424. RAPI rewrite module 424 overwrites the fields 1stRef_picture, 2ndRef_picture, 3rdRef_picture, and 4thRef_picture of the RAPI information shown in Fig., recording the end positions of the first, second, third, and fourth reference pictures as numeric values in units of sectors, and stores the result in output server 426.
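
A Python sketch of this rewrite step follows; the 2048-byte sector size and the dictionary record layout are assumptions made for illustration only.

    # Sketch of the RAPI rewrite step: reference picture end positions in sector units.
    SECTOR_SIZE = 2048   # bytes per sector (assumed for illustration)

    def rewrite_rapi(rapi_record, end_positions_bytes):
        """end_positions_bytes: byte offsets of the ends of the intra-coded picture and
        of the 2nd, 3rd and 4th reference pictures that follow it."""
        fields = ("1stRef_picture", "2ndRef_picture", "3rdRef_picture", "4thRef_picture")
        for name, position in zip(fields, end_positions_bytes):
            rapi_record[name] = position // SECTOR_SIZE   # numeric value in sectors

    rapi = {}
    rewrite_rapi(rapi, [40960, 122880, 245760, 409600])
    print(rapi)   # {'1stRef_picture': 20, '2ndRef_picture': 60, ...}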

After the entire multiplexed stream has been processed, controller 425 receives the start positions of all the RAPIs multiplexed into the multiplexed stream, which RAPI information extraction module 423 extracted and stored, together with the end positions of the intra-coded picture immediately following each RAPI and of the second, third, and fourth reference pictures following the intra-coded picture.

Controller 425 creates the EP_map() described with reference to Fig. from the input information.

Controller 425 creates the EP_map() of the clip information file with the address of each RAPI, the PTS of the intra-coded picture immediately following each RAPI, and one of the end positions of the intra-coded picture and of the second, third, and fourth reference pictures following it, and stores the EP_map() in output server 426.

Next, with reference to the block diagram of the sequence of operations shown in Fig, will be described processing create Ermer().

In step S381, the encoder 403 encodes video data compressed signals is, which was introduced through the output 400A input video data and interface 401 of the input video data, and outputs the resulting signal to the module 421 multiplexing. The encoder 404 sound performs encoding with compression of the audio signal, which has been introduced through the conclusion of 400 In sound input and interface 402 a sound input, and outputs the resulting signal to the module 421 multiplexing. In this case, the thread that is derived from the encoder 404 audio, is an audio stream to MPEG2. Similarly, the stream that is output from the encoder 403 video is a MPEG2 video stream.

At step S382, the multiplexing module 421 packetizes the input MPEG2 video stream and the MPEG2 audio stream, multiplexes them on a time-sharing basis, as described with reference to Figs. 18A and 18B, selects intra-coded pictures from the stream, and inserts a PES_packet() of private_stream_2, shown in Fig, immediately before each of them at a frequency of approximately twice per second. In this example, the PES_packet() of private_stream_2 indicates that an intra-coded picture of video data, which can be decoded without reference to another picture, immediately follows it. At this point, the intra-coded picture usually has a time stamp (PTS/DTS).

At this point, no data has yet been written to the fields 1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture described with reference to Fig. In addition, a subtitle stream (not shown) may be supplied to the multiplexing module 421 so that it is multiplexed with the video stream and the audio stream.

At step S383, the multiplexing module 421 outputs the multiplexed stream to the RAPI rewriting module 424 through the PPA 422, as well as to the RAPI information extraction module 423. The RAPI information extraction module 423 detects, from the video stream of the multiplexed stream, the start position of each PES_packet() of private_stream_2, the time stamp (PTS) value of the intra-coded picture that immediately follows the PES_packet() of private_stream_2, and the end positions of that intra-coded picture and of the second, third and fourth reference pictures that follow it, and stores them.

In addition, at step S384, the RAPI information extraction module 423 outputs the detected end positions of the intra-coded picture and of the second, third and fourth reference pictures that follow it to the RAPI rewriting module 424. The RAPI rewriting module 424 overwrites the fields 1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture, shown in Fig, of the RAPI information, recording the end positions of the intra-coded picture and of the second, third and fourth reference pictures as values in units of sectors, and stores them in the output server 426.

At step S385, the start positions of all the RAPIs that were extracted and stored by the RAPI information extraction module 423 and that were multiplexed in the multiplexed stream, together with the end position of the intra-coded picture that immediately follows each RAPI and the end positions of the second, third and fourth reference pictures that follow the intra-coded picture, are supplied to the controller 425.

The controller 425 creates EP_map(), described with reference to Fig, from the input information. In this example, it is assumed that EP_map() contains only information about the video stream. EP_map() for the video data represents the positions of all the RAPIs in the stream, namely the positions of all the PES_packet()s of private_stream_2. This information is created from the information supplied from the RAPI information extraction module 423 to the controller 425.

More specifically, the controller 425 creates EP_map() of the clip information file with the address of each RAPI, the PTS of the intra-coded picture that immediately follows each RAPI, and one of the end positions of the intra-coded picture and of the second, third and fourth reference pictures that follow it, and stores EP_map() in the output server 426. In other words, the controller 425 copies the value closest to a given number of sectors (the number of sectors that can be read at one time during the encoding process) from among the end positions of the four reference pictures (1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture) to N-th_Ref_picture_copy.

At step S386, the controller 425 determines index_N_minus1 based on N-th_Ref_picture_copy and writes it to the disk 410. In this example, the stream data and the database file stored in the output server 426 are transferred to the disk drive 409 through the drive interface 407 and written to the disk 410.
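
A minimal sketch of steps S385 and S386, under the assumptions above: among the four stored end positions, the one closest to the given sector count (30 sectors in this example, per the flowchart legend) is kept as N-th_Ref_picture_copy, and index_N_minus1 is the 0-based index of the reference picture whose end position was kept, so that index_N_minus1 + 1 reference pictures are covered by one read. The function name is hypothetical.

    SECTORS_PER_READ = 30  # the given number of sectors readable at one time

    def choose_copy(ref_end_sectors, target=SECTORS_PER_READ):
        # ref_end_sectors: end positions, in sectors, of 1stRef_picture,
        # 2ndRef_picture, 3rdRef_picture and 4thRef_picture for one RAPI.
        # Keep the end position closest to the target read size; its 0-based
        # index is index_N_minus1.
        index_N_minus1 = min(range(4),
                             key=lambda i: abs(ref_end_sectors[i] - target))
        return ref_end_sectors[index_N_minus1], index_N_minus1

    # Example: with end positions (8, 16, 28, 38), the value 28 is kept
    # (closest to 30) and index_N_minus1 becomes 2 (three reference pictures).

With this rule, an entry whose copy equals the end of the first reference picture gets index_N_minus1 = 0 (one decodable reference picture), and an entry whose copy equals the end of the fourth gets index_N_minus1 = 3 (four reference pictures), matching the example values discussed below.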

Through the above-described processing, EP_map() is created as shown in Fig.

[Use of 1stRef_Picture, 2ndRef_Picture, 3rdRef_Picture and 4thRef_Picture]

Next, with reference to the flowchart shown in Fig, the fast-forward playback processing that uses EP_map(), shown in Fig, will be described.

It is assumed that a user (not shown) issues a fast-forward command to the video content playback program 210. The player control module 212 selects one of the possible playback start positions registered in EP_map() of the clip information file of the stream being played (step S391), and decides on the data to be read based on the size given by N-th_Ref_Picture_copy of the RPN_EP_start described in EP_map() (step S392). The player control module 212 informs the content data supply module 213 of this information and commands the decoding control module 214 to perform fast-forward playback.

Using the operating system 201, the content data supply module 213 reads the clip stream file that contains the program stream in which the elementary stream to be played back has been multiplexed, and transmits it to the buffer control module 215 (step S393). Since the file name, etc. have already been identified, they are not specified again. Unlike the case where playback is started from the beginning, the read command is given the read start address and the size of the data to be transferred.

The video reading function block 233 demultiplexes the multiplexed data that has been input to the buffer control module 215 (step S394) and supplies only the video stream to the video decoder control module 216. In this case, since fast-forward playback is being processed, the audio decoder control module 217, the subtitle decoder control module 218, the audio reading function block 234 and the subtitle reading function block 235 do not operate.

The read data contains one to four reference pictures. In the fast-forward playback processing, only the reference pictures are decoded and displayed, based on the entry point chosen by the entry point selection processing (step S395) described with reference to the flowchart shown in Fig. The value index_N_minus1 has also been transferred to the decoding control module 214. Thus, the decoding control module 214 decodes the indicated number of reference pictures, transfers the decoded pictures to the downstream block, and has them displayed (step S396).

After the reference pictures have been displayed, the player control module 212 selects the entry point of EP_map() to be displayed next and repeats the above processing, so that pictures are output for the fast-forward playback mode (step S397).
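
Putting steps S391 to S397 together, the fast-forward loop can be sketched as follows. This is a schematic of the described control flow only; read_sectors, decode_pictures, display and stop_requested are hypothetical stand-ins for the content data supply module 213, the decoding control module 214 and the downstream output, and select_entry_point is the routine sketched after the flowchart description below.

    def fast_forward_loop(ep_map, read_sectors, decode_pictures, display,
                          stop_requested, high_speed):
        # ep_map: list of entry objects with rpn_ep_start,
        # n_th_ref_picture_copy and index_n_minus1 attributes (hypothetical
        # names for RPN_EP_start, N-th_Ref_picture_copy and index_N_minus1).
        n = 0
        while not stop_requested():
            # Steps S391/S395: choose the next entry point.
            n = select_entry_point(ep_map, n, high_speed)
            entry = ep_map[n]
            # Steps S392/S393: read N-th_Ref_picture_copy sectors starting
            # at RPN_EP_start.
            data = read_sectors(entry.rpn_ep_start, entry.n_th_ref_picture_copy)
            # Step S396: decode index_N_minus1 + 1 reference pictures and
            # hand them to the display.
            for picture in decode_pictures(data, entry.index_n_minus1 + 1):
                display(picture)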

When an entry point is selected in EP_map() as the jump destination to be displayed, index_N_minus1 is used. Next, the method of using index_N_minus1 in this case will be described. As described above, index_N_minus1 represents the number of reference pictures contained in the data read according to N_th_Ref_Picture_copy. In the example data shown in Fig, since index_N_minus1 of each of the first and third entry points is 0, the number of reference pictures is one. Since index_N_minus1 of each of the second and fourth entry points is 3, four reference pictures are contained.

When two or more reference pictures are displayed, the subjective quality of the pictures reproduced in the fast-forward playback mode tends to improve. However, to display a larger number of reference pictures, the amount of data to be read has to be increased, so the refresh rate becomes lower. In other words, there is a trade-off between the two. Thus, when the player control module 212 selects the entry point of EP_map() to be displayed next, it evaluates the value of index_N_minus1.

In other words, at a high fast-forward playback speed, although the intervals between the thinned-out entry points of EP_map() become large, it is preferable to choose an entry point whose index_N_minus1 is large (namely, one with high subjective picture quality). In contrast, when the fast-forward playback speed is low, an entry point with a small index_N_minus1 is chosen.

In the above-described algorithm for determining index_N_minus1, the entry point whose N-th_Ref_picture_copy is closest to "30" is chosen. In other words, in an EP_map() established in accordance with this algorithm, when data are read according to N-th_Ref_picture_copy, the amount of read data becomes approximately 30 sectors. When the read speed is the dominant factor, it is important that the read time remain constant. Thus, this method of selecting entry points can be used effectively.

Next, with reference to the flowchart shown in Fig, the entry point selection processing in the fast-forward playback mode will be described.

At step S401, the player control module 212 determines whether the fast-forward playback mode is the high-speed mode or the low-speed mode. When the determined result indicates that the fast-forward playback is the high-speed mode, the processing flow proceeds to step S411. In contrast, when the determined result indicates that the fast-forward playback is the low-speed mode, the processing flow proceeds to step S402.

<Description of entry point selection in low-speed fast-forward playback>

At step S402, since the fast-forward playback mode is the low-speed mode, the player control module 212 increments the number of the entry point to be selected by 2 from the current entry point (N) (entry point number += 2). At step S403, the player control module 212 reads index_N_minus1 of the current entry point N, of the entry point (N-1) one point earlier, and of the entry point (N+1) one point later.

At step S404, the player control module 212 determines whether the value of index_N_minus1(N), namely the value of index_N_minus1 of the N-th entry point, is 0 or 1. When the value of index_N_minus1 at step S404 is 0 or 1, the processing flow proceeds to step S405. At step S405, the player control module 212 selects the N-th entry point and completes the processing. In contrast, when the value of index_N_minus1 at step S404 is neither 0 nor 1, the processing flow proceeds to step S406.

At step S406, the player control module 212 determines whether the value of index_N_minus1(N+1), namely the value of index_N_minus1 of the (N+1)-th entry point, is 0 or 1. When index_N_minus1(N+1) at step S406 is 0 or 1, the processing flow proceeds to step S407. At step S407, the player control module 212 selects the (N+1)-th entry point and completes the processing. In contrast, when index_N_minus1(N+1) at step S406 is neither 0 nor 1, the processing flow proceeds to step S408.

At step S408, the player control module 212 determines whether the value of index_N_minus1(N-1), namely the value of index_N_minus1 of the (N-1)-th entry point, is 0 or 1. When the value of index_N_minus1 at step S408 is 0 or 1, the processing flow proceeds to step S409. At step S409, the player control module 212 selects the (N-1)-th entry point and completes the processing. In contrast, when the value of index_N_minus1 at step S408 is neither 0 nor 1, the processing flow proceeds to step S410.

Since it is now clear that the values of index_N_minus1 of all the entry points N, (N+1) and (N-1) are neither 0 nor 1, at step S410 the player control module 212 selects the N-th entry point and completes the processing.

<Description of entry point selection in high-speed fast-forward playback>

Since the fast-forward playback is performed in the high-speed mode, at step S411 the player control module 212 increments the number of the entry point to be selected by 5 from the current entry point (N) (entry point number += 5). At step S412, the player control module 212 reads index_N_minus1 of the current entry point N, of the entry point (N-1) one point earlier, and of the entry point (N+1) one point later.

At step S413, the player control module 212 determines whether the value of index_N_minus1(N), namely the value of index_N_minus1 of the N-th entry point, is 3 or 2. When the value of index_N_minus1 of the N-th entry point at step S413 is 3 or 2, the processing flow proceeds to step S414. At step S414, the player control module 212 selects the N-th entry point and completes the processing. In contrast, when the value of index_N_minus1 of the N-th entry point at step S413 is neither 3 nor 2, the processing flow proceeds to step S415.

At step S415, the player control module 212 determines whether the value of index_N_minus1(N+1), namely the value of index_N_minus1 of the (N+1)-th entry point, is 3 or 2. When the value of index_N_minus1 of the (N+1)-th entry point at step S415 is 3 or 2, the processing flow proceeds to step S416. At step S416, the player control module 212 selects the (N+1)-th entry point and completes the processing. In contrast, when the value of index_N_minus1 of the (N+1)-th entry point at step S415 is neither 3 nor 2, the processing flow proceeds to step S417.

At step S417, the player control module 212 determines whether the value of index_N_minus1(N-1), namely the value of index_N_minus1 of the (N-1)-th entry point, is 3 or 2. When the value of index_N_minus1 of the (N-1)-th entry point at step S417 is 3 or 2, the processing flow proceeds to step S418. At step S418, the player control module 212 selects the (N-1)-th entry point and completes the processing. In contrast, when the value of index_N_minus1 of the (N-1)-th entry point at step S417 is neither 3 nor 2, the processing flow proceeds to step S419.

Since it is now clear that the values of index_N_minus1 of all the entry points N, (N+1) and (N-1) are neither 3 nor 2, at step S419 the player control module 212 selects the N-th entry point and completes the processing.
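
The two branches of the flowchart differ only in the step size and in the preferred index_N_minus1 values, so they can be folded into one routine. A minimal sketch under the assumptions already stated (ep_map is a list of entry objects with an index_n_minus1 attribute; handling of the end of the stream is omitted):

    def select_entry_point(ep_map, n, high_speed):
        # Steps S401-S419: advance by 5 entry points and prefer entries with
        # many reference pictures (index_N_minus1 of 3 or 2) at high speed;
        # advance by 2 and prefer few reference pictures (0 or 1) at low speed.
        if high_speed:
            n += 5                 # step S411
            preferred = (3, 2)
        else:
            n += 2                 # step S402
            preferred = (0, 1)
        # Steps S404-S409 / S413-S418: try N, then N+1, then N-1.
        for candidate in (n, n + 1, n - 1):
            if 0 <= candidate < len(ep_map) and \
                    ep_map[candidate].index_n_minus1 in preferred:
                return candidate
        return n                   # steps S410/S419: fall back to N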

In other words, when the fast-forward playback speed is high, although the intervals between the thinned-out entry points of EP_map() become large, it is preferable to choose an entry point whose index_N_minus1 is large (namely, one with high subjective picture quality). In contrast, when the fast-forward playback speed is low, an entry point with a small index_N_minus1 is chosen.

With the foregoing processing, fast-forward playback can be performed at high speed without degrading the subjective picture quality. When fast-forward playback is performed at low speed, since it is performed with a larger number of reference pictures than at high speed, a decrease in the quality of the reproduced pictures can be prevented.

In the above example, it is assumed that the information of all the entry points is normally read and kept during playback processing. In this case, when the disk playback device has high processing performance, reading that many entry points improves the quality of the reproduced pictures. However, when the disk playback device does not have high processing performance, reading a large amount of entry point data can lower the processing speed. Thus, priority levels are set for the entry point information. A disk playback device that has high processing performance can use the information of all the entry points, whereas a disk playback device that has low processing performance can read only the entry points having a high priority level.

Fig shows a functional block diagram describing the functions of a disk recording device that sets priority levels for entry points. In the disk recording device shown in Fig, functions similar to those of the disk recording device shown in Fig are denoted by similar reference numerals, and their description will be omitted.

The subtitle encoder 443 reads the subtitle material from the subtitle material server 442, compression-encodes it, and writes the resulting data to the subtitle data server 444. Unlike video data and audio data, subtitle data occur intermittently along the time axis. Thus, the display start times and display durations of the subtitle material recorded in the subtitle material server 442 and of the subtitle data recorded in the subtitle data server 444 are provided as subtitle timing information 445.

Although the multiplexing module 441 basically has the same function as the multiplexing module 421 shown in Fig, it also multiplexes the subtitle data and the subtitle presentation time information with the video data and the audio data. In other words, the multiplexing module 441 reads not only the input MPEG2 video stream and the MPEG2 audio stream, but also the subtitle data and the subtitle presentation time information 445 supplied from the subtitle data server 444, and multiplexes them on a time-sharing basis, as described with reference to the figures.

The multiplexing module 441 selects intra-coded pictures from the stream and inserts a PES_packet() of private_stream_2, shown in Fig, immediately before each of them at a frequency of approximately twice per second. In this example, the PES_packet() of private_stream_2 indicates that an intra-coded picture, consisting of video data that can be decoded without reference to another picture, immediately follows it. At this point, the intra-coded picture usually has a time stamp (PTS/DTS).

All subtitle access units usually have a time stamp.

At this point, no data has yet been written to the fields 1stRef_picture, 2ndRef_picture, 3rdRef_picture and 4thRef_picture described with reference to Fig. In addition, a subtitle stream (not shown) may be input to the multiplexing module 441 so that it is multiplexed with the video stream and the audio stream.

The multiplexing module 441 transmits the multiplexed stream to the RAPI rewriting module 424 through the PPA 422, as well as to the RAPI information extraction module 423. The RAPI information extraction module 423 extracts information about the video stream and the subtitle stream from the multiplexed stream and stores it. In other words, the RAPI information extraction module 423 detects, from the video stream, the start position of each PES_packet() of private_stream_2, the time stamp (PTS) value of the intra-coded picture that immediately follows the PES_packet() of private_stream_2, and the end positions of that intra-coded picture and of the second, third and fourth reference pictures that follow it, and stores them. In addition, the RAPI information extraction module 423 detects the start positions and time stamps of all the subtitle access units from the subtitle stream.

The controller 446 creates EP_map(), as shown in Fig, from the input information. It is assumed that EP_map() contains information about the video stream and the subtitle stream. The basic video information in EP_map() represents the positions of all the RAPIs, namely of the PES_packet()s of private_stream_2, and the time stamps of the intra-coded pictures that immediately follow the PES_packet()s of private_stream_2. This information can be created from the information supplied from the RAPI information extraction module 423 to the controller 446. The basic subtitle information in EP_map() represents the positions and time stamps of the subtitle access units. This information can also be created from the information supplied from the RAPI information extraction module 423 to the controller 446.

The controller 446 creates priority_flag, which is not defined in the information input from the RAPI information extraction module 423. In other words, the controller 446 evaluates the time stamps of all the entry points of the video stream (RAPI entry points) and of all the subtitle access units, and sets priority_flag (described below) for each of them. To set priority_flag, the section scene change information 447 is input to the controller 446.

[Description of EP_map()]

Next, with reference to Fig, the EP_map() in which priority levels are set will be described. As shown in Fig, number_of_EP_entries is followed, as information about a possible decoding start point of the elementary stream identified by the stream_id and private_stream_id that immediately precede number_of_EP_entries, by priority_flag (2 bits), reserved_for_future_use (14 bits), PTS_EP_start (32 bits) and RPN_EP_start (32 bits). The set of priority_flag, reserved_for_future_use, PTS_EP_start (32 bits) and RPN_EP_start (32 bits) is repeated as many times as indicated by number_of_EP_entries.
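
The four fields add up to 80 bits, so one entry occupies 10 bytes (a figure used in the memory estimates below). The following sketch packs one entry under the assumption that priority_flag occupies the two most significant bits and the fields are stored big-endian; the exact bit order on the disk is defined by the syntax in the figure, which is not reproduced here.

    import struct

    def pack_ep_entry(priority_flag, pts_ep_start, rpn_ep_start):
        # Layout: priority_flag (2 bits), reserved_for_future_use (14 bits,
        # zero), PTS_EP_start (32 bits), RPN_EP_start (32 bits).
        first16 = (priority_flag & 0x3) << 14
        return struct.pack(">HII", first16, pts_ep_start, rpn_ep_start)

    assert len(pack_ep_entry(3, 90000, 1234)) == 10  # one entry is 10 bytes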

priority_flag is set as shown in Fig. In other words, for an entry point of the video stream, when the value of priority_flag is 3, it indicates that the entry point corresponds to the beginning of a section. When the value of priority_flag is 2, it means that the entry point corresponds to an important scene change at intervals of one minute, other than the above entry points. When the value of priority_flag is 1, it means that the entry point corresponds to a scene change at intervals of three seconds, other than the above entry points. The value of priority_flag of the other entry points is 0.

For an entry point of the subtitle stream, when the value of priority_flag is 3, it means that the entry point corresponds to the beginning of a section. When the value of priority_flag is 2, it means that the entry point corresponds to another important scene change, other than the above entry points. When the value of priority_flag is 1, it means that the entry point corresponds to a scene change, other than the above entry points. The value of priority_flag of the other entry points is 0.

When the clip is a two-hour movie containing two random access points per second, the total number of entry points is 14400 (= 2 hours × 3600 seconds × 2 points). When the number of sections is approximately several tens, the number of entry points for which priority_flag=3 becomes several tens, equal to the number of sections. Since the numbers of important scene changes (priority_flag=2) and of other scene changes (priority_flag=1) depend on the content and cannot be generalized, it is assumed that the number of entry points for which priority_flag=3 or 2 is approximately 200, that the number of entry points for which priority_flag=3, 2 or 1 is approximately 2400, and that the total number of entry points is 14400. In this case, it is also assumed that the total number of entry points for which priority_flag=2 or 1 is 1000. In this case, when only the entry points for which priority_flag=3, 2 or 1 are read, the amount of memory becomes approximately 1000/14400, that is, about 1/14 of the amount needed when all the entry points are read. In addition, since one entry point occupies 10 bytes, the memory capacity for one video stream can be reduced by 10 bytes × (14400 − 1000) = 134000 bytes, that is, approximately 134 kilobytes.
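
The arithmetic of the preceding paragraph can be checked directly; the figures are those assumed in the text.

    total_entry_points = 2 * 3600 * 2   # 2 hours x 3600 s/h x 2 points per second
    kept_entry_points = 1000            # entries kept after thinning out
    entry_size_bytes = 10               # one EP_map() entry occupies 10 bytes
    saved = entry_size_bytes * (total_entry_points - kept_entry_points)
    print(total_entry_points)                       # 14400
    print(kept_entry_points / total_entry_points)   # ~0.07, i.e. about 1/14
    print(saved)                                    # 134000 bytes, ~134 kilobytes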

In addition, a two-hour movie is considered to contain 1000 to 2000 subtitle sentences, whereas there are only several tens of sections. Thus, assuming that only the entry points for which priority_flag=3 are read, the memory capacity can be reduced to several tens/1000 or several tens/2000. Since the number of subtitle streams is larger than the number of video streams, the effect of reducing the memory capacity becomes significant.

In this example, the flag takes the values 3, 2, 1 and 0. Instead, it can be represented as individual bits, the corresponding bit being set to 1. In other words, such a field may consist of three bits. When the most significant bit is 1, it may indicate that the entry point is the beginning of a section. When the next bit is 1, it may indicate that the entry point is an entry point at one-minute intervals. When the least significant bit is 1, it may indicate that the entry point is an entry point at five-second intervals. When all the bits are 0, it may be determined that the entry point does not belong to any of these three categories.
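
A short sketch of this bit-field variant; the constant names are hypothetical and serve only to illustrate the three categories just described.

    # Hypothetical bit assignments for the three-bit variant described above.
    SECTION_START     = 0b100  # most significant bit: beginning of a section
    ONE_MINUTE_POINT  = 0b010  # entry point at one-minute intervals
    FIVE_SECOND_POINT = 0b001  # least significant bit: five-second intervals

    flags = SECTION_START | ONE_MINUTE_POINT  # an entry point can carry both
    in_any_category = flags != 0              # all bits 0: none of the categories

Unlike the 3/2/1/0 encoding, this representation lets one entry point belong to several categories at once.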

For the subtitle stream, when the value of priority_flag of an entry point is 1, it means that the entry point corresponds to the beginning of a section. In this case, the value of priority_flag of the other entry points is 0.

Next, with reference to the flowchart shown in Fig, the priority_flag setting processing will be described.

At step S441, the controller 446 determines whether the current entry point of the video stream is the beginning of a section, namely whether the entry point being evaluated corresponds to a section boundary time in the section scene change information 447. When the current entry point corresponds to a section boundary time, the difference is determined to be 0. When the current entry point corresponds to the section boundary time, the processing flow proceeds to step S442. At step S442, the controller 446 sets priority_flag=3 for this entry point and stores it in the output server 426.

When the result determined at step S441 indicates that the current entry point is not the beginning of a section of the video data, the processing flow proceeds to step S443. At step S443, the controller 446 determines whether the current entry point corresponds to an important scene change of the video data, namely whether the entry point being evaluated lies within an interval of one minute from the beginning of an "important scene change" in the section scene change information 447. When the result determined at step S443 indicates that the entry point being evaluated lies within an interval of one minute from the beginning of an "important scene change", the processing flow proceeds to step S444. At step S444, the controller 446 sets priority_flag=2 for the current entry point.

When the result determined at step S443 indicates that the current entry point is not an important scene change, the processing flow proceeds to step S445. At step S445, the controller 446 determines whether the current entry point corresponds to an ordinary scene change of the video data, namely whether the entry point being evaluated lies within an interval of three seconds from the beginning of a "scene change" in the section scene change information 447. When the result determined at step S445 indicates that the entry point being evaluated lies within an interval of three seconds from the beginning of a "scene change", the processing flow proceeds to step S446. At step S446, the controller 446 sets priority_flag=1 for the current entry point.

When the result determined at step S445 indicates that the current entry point is not an ordinary scene change, namely that the current entry point does not correspond to any scene change, the processing flow proceeds to step S447. At step S447, the controller 446 sets priority_flag=0 for the current entry point.

At step S448, the controller 446 determines whether all the entry points of the video stream have been processed. When the determined result indicates that not all the entry points have been processed, the processing flow returns to step S441, where the processing is repeated. In other words, the processing from step S441 to step S448 is repeated until priority_flag has been set for all the entry points of the video stream.

When the result determined at step S448 indicates that the processing has been completed for all the entry points, the processing flow proceeds to step S449.

At step S449, the controller 446 determines whether the current subtitle entry point is the beginning of a section, namely whether the entry point being evaluated corresponds to a section boundary time in the section scene change information 447. When the result determined at step S449 indicates that the current entry point corresponds to a section boundary time, the processing flow proceeds to step S450. At step S450, the controller 446 sets priority_flag=3 for the current entry point.

In contrast, when the result determined at step S449 indicates that the current entry point is not the beginning of a subtitle section, the processing flow proceeds to step S451. At step S451, the controller 446 determines whether the current entry point corresponds to an important scene change for the subtitles, namely whether the entry point being evaluated lies within an interval of one minute from the beginning of an "important scene change" in the section scene change information 447. When the result determined at step S451 indicates that the entry point being evaluated lies within an interval of one minute from the beginning of an "important scene change", the processing flow proceeds to step S452. At step S452, the controller 446 sets priority_flag=2 for the current entry point.

When the result determined at step S451 indicates that the current entry point is not an important scene change, the processing flow proceeds to step S453. At step S453, the controller 446 determines whether the current entry point corresponds to an ordinary subtitle scene change, namely whether the entry point being evaluated lies within an interval of three seconds from the beginning of a "scene change" in the section scene change information 447. When the entry point being evaluated lies within an interval of three seconds from the beginning of a "scene change", the processing flow proceeds to step S454. At step S454, the controller 446 sets priority_flag=1 for the current entry point.

When the result determined at step S453 indicates that the current entry point is not an ordinary scene change, namely that the entry point being evaluated does not correspond to any scene change, the processing flow proceeds to step S455. At step S455, the controller 446 sets priority_flag=0 for the current entry point.

At step S456, the controller 446 determines whether all the subtitle entry points have been processed. When the result determined at step S456 indicates that not all the entry points have been processed, the processing flow returns to step S449, where the processing is repeated. In other words, the processing from step S449 to step S456 is repeated until priority_flag has been set for all the subtitle entry points. When the result determined at step S456 indicates that all the subtitle entry points have been processed, the controller 446 outputs the EP_map() data, in accordance with the syntax shown in Fig, to the output server 426.
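
Since the video branch (steps S441 to S447) and the subtitle branch (steps S449 to S455) apply the same classification, they can be sketched as one routine. Here scene_info is a hypothetical view of the section scene change information 447 exposing the three tests described above.

    def assign_priority_flag(entry_time, scene_info):
        # Classify one entry point by its time stamp.
        if scene_info.is_section_start(entry_time):           # steps S441/S449
            return 3
        if scene_info.is_important_scene_change(entry_time):  # steps S443/S451
            return 2
        if scene_info.is_scene_change(entry_time):            # steps S445/S453
            return 1
        return 0                                              # steps S447/S455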

[Operation on the playback side: thinning out EP_map()]

The disk playback device thins out EP_map() based on the priority_flag set as described above and stores the result in the memory space of its storage device (for example, the storage device 113 shown in Fig. 1). In other words, a disk playback device whose storage capacity is limited because of its low cost stores in the storage device only the entry points whose priority_flag has a large value. Of course, a device that has a memory large enough to hold the entire EP_map() does not need to perform this operation.

At step S106 of the processing in accordance with the flowchart shown in Fig, the video entry points for which priority_flag is 1 or higher are kept in the storage device. Similarly, the subtitle entry points for which priority_flag is 2 or higher are kept in the storage device. In this case, when EP_map() is read for the video stream, the player control module 212 stores in the storage device the entry points whose priority_flag is 3, 2 or 1, and does not store the entry points whose priority_flag is 0, based on the values of stream_id and private_stream_id. For the subtitle stream, the player control module 212 stores in the storage device the entry points whose priority_flag is 3 or 2, and does not store the entry points whose priority_flag is 1 or 0, based on the values of stream_id and private_stream_id.
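
A minimal sketch of this thinning-out rule, assuming entries carry a priority_flag attribute as in the packing sketch above:

    def thin_out_ep_map(entries, is_subtitle):
        # Keep entry points with priority_flag 3, 2 or 1 for a video stream,
        # and 3 or 2 for a subtitle stream, as described above.
        threshold = 2 if is_subtitle else 1
        return [e for e in entries if e.priority_flag >= threshold]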

When the above-described processing is performed, the storage capacity required for EP_map() of one video stream becomes approximately 1/6 to 1/10 of that required when this processing is not performed. In addition, the amount of memory required for EP_map() of one subtitle stream can be reduced to approximately a few hundredths. As a result, in a low-cost disk playback device, entry points can be stored within the available storage capacity, and playback processing can be performed efficiently.

In the example above, priority_flag=3 is assigned to the beginning of a section. Instead, priority_flag=3 can be assigned according to another criterion, for example to an important scene change as well as to the beginning of a section.

In the above embodiment, the processing sequence is performed by software. Instead, it can be implemented by dedicated hardware.

In the above embodiment, a hardware decoder is used as the video decoder 116 (Fig. 1) and a hardware encoder as the video encoder 403 (Fig). Instead, a software decoder can be used as the video decoder 116 (Fig. 1) and a software encoder as the video encoder 403 (Fig). The same applies to the audio decoder 117 (Fig. 1) and the audio encoder 404 (Fig).

In the above embodiment, a software decoder is used as the subtitle decoder. Instead, a hardware decoder may be used as the subtitle decoder.

DESCRIPTION OF REFERENCE NUMERALS

101

102 DISK DRIVE

111 BUS

112 CPU

113 STORAGE DEVICE

114 DRIVE INTERFACE

115 INPUT INTERFACE

116 VIDEO DECODER

116A VIDEO DECODING ENGINE

116B DPB (DECODED PICTURE BUFFER)

116B-0 TO 116B-n DPB-0 TO DPB-n

117 AUDIO DECODER

118 VIDEO OUTPUT INTERFACE

119 AUDIO OUTPUT INTERFACE

120 VIDEO OUTPUT TERMINAL

121 AUDIO OUTPUT TERMINAL

201 OPERATING SYSTEM

210 VIDEO CONTENT PLAYBACK PROGRAM

211 SCRIPT CONTROL MODULE

212 PLAYER CONTROL MODULE

213 CONTENT DATA SUPPLY MODULE

214 DECODING CONTROL MODULE

214A TIME COUNTING UNIT

215 BUFFER CONTROL MODULE

215A BUFFER

217 AUDIO DECODER CONTROL MODULE

218 SUBTITLE DECODER CONTROL MODULE

219 GRAPHICS PROCESSING MODULE

220 VIDEO IMAGE OUTPUT MODULE, 220A PPO

221 AUDIO OUTPUT MODULE

221A PPO

231 DATA START POINTER STORAGE UNIT

232 DATA WRITE POINTER STORAGE UNIT

233 VIDEO READING FUNCTION BLOCK

234 AUDIO READING FUNCTION BLOCK

235 SUBTITLE READING FUNCTION BLOCK

241 VIDEO READ POINTER STORAGE UNIT

242 stream_id REGISTER

243 au_information() REGISTER

251 AUDIO READ POINTER STORAGE UNIT

252 stream_id REGISTER

253 private_stream_id REGISTER

261 SUBTITLE READING FUNCTION FLAG STORAGE UNIT

262 SUBTITLE READ POINTER STORAGE UNIT

263 stream_id REGISTER

264 private_stream_id REGISTER

301 VIDEO BUFFER

302 ADDITIONAL INFORMATION BUFFER

400A VIDEO INPUT TERMINAL

S101 CHECK DISK

S102 ERROR HANDLING

S103 READ SCRIPT.DAT AND PLAYLIST.DAT

S104 SUPPLY FILES

S105 PARSE PLAYLIST.DAT

S106 READ THE CLIP INFORMATION FILE

S107 IS THE FILE READ RESULT OK?

S108 INTERPRET AND EXECUTE SCRIPT.DAT

S121 RECOGNIZE IN_time

S122 CHECK THE PLAYBACK START POSITION

S123 DISPLAY THE TIME CODE

S124 PARSE PlayListMark()

S125 DETERMINE THE ELEMENTARY STREAM TO BE PLAYED

S126 CONTROL THE OUTPUT ATTRIBUTE

S127 PREPARE TO START PLAYBACK

S128 START READING DATA

S129 START DECODER CONTROL

S130 START DECODING

S131 START GRAPHICS PROCESSING

S132 START OUTPUT PROCESSING

S141 DOES THE ACCESS UNIT HAVE A TIME STAMP?

S142 SUBSTITUTE THE TIME STAMP (PTS)

S143 SAVE THE CURRENT pic_struct VALUE

S144 ADD THE INTERVAL CORRESPONDING TO THE PREVIOUS pic_struct

S151 CANCEL DECODING CONTROL

S152 START PLAYBACK CONTROL

S171 DISPLAY A MESSAGE

S172 CONVERT THE TIME CODE

S173 DISPLAY THE TIME CODE

S191 CHECK THE NUMBER OF STREAMS

S192 IDENTIFY THE CURRENT STREAM

S193 RECOGNIZE THE STREAM TO BE PLAYED NEXT

S194 INSTRUCT READING OF THE NEXT STREAM

S211 SEARCH FOR A private_stream_2 PACKET

S212 DOES stream_id MATCH?

S213 READ au_information() INTO THE INTERNAL REGISTER

S214 SET THE VIDEO READ POINTER TO THE NEXT VIDEO PACKET

S215 HAS DATA BEEN REQUESTED?

S216 PARSE THE PROGRAM STREAM AND OUTPUT VIDEO DATA ACCORDING TO AU_length

S217 HAVE number_of_access_unit ACCESS UNITS BEEN PROCESSED?

S230 private_stream_1?

S231 SEARCH FOR A SYNCHRONIZATION CODE

S232 SEARCH FOR AN MPEG AUDIO PACKET

S233 UPDATE THE AUDIO READ POINTER

S234 SEARCH FOR A private_stream_1 PACKET

S235 DOES private_stream_id MATCH?

S236 AUDIO READ POINTER ← POSITION IMMEDIATELY AFTER AU_locator + AU_locator

S237 HAS DATA BEEN REQUESTED?

S238 PARSE THE PROGRAM STREAM AND OUTPUT AN ACCESS UNIT OF FIXED LENGTH

S251 IS THE SUBTITLE READING FLAG SET?

S252 SEARCH FOR A private_stream_1 PACKET

S253 DOES private_stream_id MATCH?

S254 SUBTITLE READ POINTER ← POSITION IMMEDIATELY AFTER AU_locator + AU_locator

S255 HAS DATA BEEN REQUESTED?

S256 PARSE THE PROGRAM STREAM AND OUTPUT SUBTITLES ACCORDING TO THE LENGTH DESCRIBED AT THE BEGINNING

S271 IS THE TIME STAMP DIFFERENCE LARGE?

S272 IS THE VIDEO DELAYED RELATIVE TO THE AUDIO?

S273 INSTRUCT SKIPPING OF A VIDEO ACCESS UNIT

S274 CHECK au_ref_flag OF THE ACCESS UNIT

S275 IS THE ACCESS UNIT A NON-REFERENCE PICTURE?

S276 PROCESS THE VIDEO ACCESS UNIT

S277 SKIP PROCESSING OF THE VIDEO ACCESS UNIT

S278 INSTRUCT CONTINUOUS VIDEO OUTPUT

S279 OUTPUT VIDEO CONTINUOUSLY

S301 DOES THE CURRENT TIME MATCH mark_time_stamp?

S302 IDENTIFY THE TARGET MARK

S303 DOES THE TARGET MARK DESCRIBE entry_ES_stream_id/entry_ES_private_stream_id?

S304 IS THE CURRENT STREAM THE STREAM OF entry_ES_stream_id/entry_ES_private_stream_id?

S305 WHAT IS THE mark_type OF THE MARK?

S306 SECTION/INDEX UPDATE PROCESSING

S307 INFORM THE SCRIPT CONTROL MODULE 211 OF THE EVENT MESSAGE AND mark_data

S308 PROCESS mark_data

S320 TRANSMIT pts_change_point

S321 DOES THE CURRENT TIME MATCH pts_change_point?

S322 RECOGNIZE DynamicInfo() OF THE pts_change_point THAT MATCHES THE CURRENT TIME

S323 PASS THE OUTPUT ATTRIBUTE DESCRIBED IN DynamicInfo()

S324 START CONTROLLING VIDEO/AUDIO OUTPUT IN ACCORDANCE WITH THE OUTPUT ATTRIBUTE

S341 INSTRUCT INITIALIZATION OF THE SUBTITLE DISPLAY MODE

S342 HAS A NEW SUBTITLE DISPLAY MODE COMMAND BEEN INPUT?

S343 IS THERE A SUBTITLE STREAM?

S345 IS THE DISPLAY MODE COMMAND THE DEFAULT?

S346 GET StaticInfo() OF THE SUBTITLE STREAM BEING PLAYED

S347 DETERMINE configurable_flag OF StaticInfo()?

S348 DISPLAY AN ERROR MESSAGE

S349 TRANSMIT THE DISPLAY MODE COMMAND

S350 START SUBTITLE DISPLAY PROCESSING IN ACCORDANCE WITH THE DISPLAY MODE

S351 HAS PlayItem CHANGED?

S371 IS VIDEO BEING PLAYED?

S372 GET capture_enable_flag FROM PlayList() AND Clip() OF THE VIDEO BEING PLAYED

S373 IS VIDEO CAPTURE PERMITTED ACCORDING TO capture_enable_flag?

S374 DISPLAY AN ERROR MESSAGE

S375 TRANSMIT A CAPTURE COMMAND

S376 CAPTURE AND SAVE THE VIDEO

S380 DISPLAY THE IMAGE

S381 ENCODE

S382 MULTIPLEX AND SELECT INTRA-CODED PICTURES

S383 DETECT THE POSITIONS OF THE INTRA-CODED PICTURE AND THE FOLLOWING REFERENCE PICTURES

S384 REWRITE THE RAPI INFORMATION

S385 COPY THE PICTURE POSITION CLOSEST TO 30 SECTORS TO N-th_Ref_picture_copy IN ACCORDANCE WITH THE RAPI INFORMATION

S386 DETERMINE AND RECORD index_N_minus1 IN ACCORDANCE WITH N-th_Ref_picture_copy

S391 SELECT A POSITION SUITABLE FOR STARTING PLAYBACK

S392 DETERMINE THE DATA TO BE READ IN ACCORDANCE WITH RPN_EP_start AND N-th_Ref_picture_copy

S393 READ THE CLIP STREAM FILE THAT CONTAINS THE PROGRAM STREAM IN WHICH THE ELEMENTARY STREAM TO BE PLAYED IS MULTIPLEXED

S394 DEMULTIPLEX THE VIDEO STREAM

S395 EXECUTE ENTRY POINT SELECTION PROCESSING

S396 DECODE AND DISPLAY THE REFERENCE PICTURES

S397 HAS A COMMAND TO STOP FAST-FORWARD PLAYBACK BEEN ISSUED?

S401 HIGH SPEED?

S402 ENTRY POINT NUMBER (N) += 2

S403 READ index_N_minus1 OF EACH OF THE ENTRY POINTS N, N+1 AND N-1

S404 index_N_minus1(N) == 0 OR 1?

S405 USE N

S406 index_N_minus1(N+1) == 0 OR 1?

S407 USE N+1

S408 index_N_minus1(N-1) == 0 OR 1?

S409 USE N-1

S410 USE N

S411 ENTRY POINT NUMBER (N) += 5

S412 READ index_N_minus1 OF EACH OF THE ENTRY POINTS N, N+1 AND N-1

S413 index_N_minus1(N) == 3 OR 2?

S414 USE N

S415 index_N_minus1(N+1) == 3 OR 2?

S416 USE N+1

S417 index_N_minus1(N-1) == 3 OR 2?

S418 USE N-1

S419 USE N

S441 BEGINNING OF A VIDEO SECTION?

S442 priority_flag=3

S443 IMPORTANT VIDEO SCENE CHANGE?

S444 priority_flag=2

S445 ORDINARY VIDEO SCENE CHANGE?

S446 priority_flag=1

S447 priority_flag=0

S448 ALL PROCESSED?

S449 BEGINNING OF A SUBTITLE SECTION?

S450 priority_flag=3

S451 IMPORTANT SUBTITLE SCENE CHANGE?

S452 priority_flag=2

S453 ORDINARY SUBTITLE SCENE CHANGE?

S454 priority_flag=1

S455 priority_flag=0

S456 ALL PROCESSED?

1. A data recording device, comprising: a detection unit that selects an intra-coded picture that is contained in video data and becomes a random access point, and detects position information of a plurality of pictures preceded by the selected intra-coded picture; a designation unit that designates position information of a picture located at a given distance from the intra-coded picture among the position information of the plurality of pictures detected by the detection unit; a recording unit that records the position information of the plurality of pictures detected by the detection unit and the information about the picture designated by the designation unit on a data recording medium together with the video data; and a selection unit that selects the position information of the picture corresponding to a given number of sectors of the data recording medium from the position information of the plurality of pictures detected by the detection unit, wherein the designation unit designates the position information of the picture located at the given distance from the intra-coded picture among the position information of the plurality of pictures detected by the detection unit, in accordance with the position information of the picture corresponding to the given number of sectors among the position information of the plurality of pictures selected by the selection unit, and wherein the recording unit records the position information of the picture corresponding to the given number of sectors of the data recording medium among the position information of the plurality of pictures selected by the selection unit.

2. The data recording device according to claim 1, wherein the given number of sectors is set in accordance with the capacity that a device reproducing the video data from the data recording medium uses as a video data buffer.

3. A data recording method comprising the steps of: selecting an intra-coded picture that is contained in video data and becomes a random access point, and detecting position information of a plurality of pictures preceded by the selected intra-coded picture; selecting the position information of the picture corresponding to a given number of sectors of the data recording medium from the position information of the plurality of pictures; designating position information of a picture located at a given distance from the intra-coded picture among the detected position information of the plurality of pictures, in accordance with the position information of the picture corresponding to the given number of sectors among the position information selected at the selection step; and recording the position information of the plurality of pictures corresponding to the given number of sectors of the data recording medium, selected at the selection step, and the information about the picture designated at the designation step, on the data recording medium together with the video data.

4. A machine-readable program recording medium on which a program is recorded, the program causing a computer to execute the steps of: selecting an intra-coded picture that is contained in video data and becomes a random access point, and detecting position information of a plurality of pictures preceded by the selected intra-coded picture; selecting the position information of the picture corresponding to a given number of sectors of the data recording medium from the position information of the plurality of pictures; designating position information of a picture located at a given distance from the intra-coded picture among the detected position information of the plurality of pictures, in accordance with the position information of the picture corresponding to the given number of sectors among the position information selected at the selection step; and recording the position information of the plurality of pictures corresponding to the given number of sectors of the data recording medium, selected at the selection step, and the information about the picture designated at the designation step, on the data recording medium together with the video data.

5. A data processing device, comprising: a reading unit that reads, from a data recording medium together with video data, the positions of a plurality of intra-coded pictures and of a plurality of pictures preceded by the intra-coded pictures, and designation information that indicates a picture located at a given distance from each intra-coded picture among the plurality of pictures preceded by the intra-coded pictures, the plurality of intra-coded pictures being contained in the video data and becoming random access points; a selection unit that selects intra-coded pictures and pictures that follow the intra-coded pictures, in accordance with the designation information that indicates the picture located at the given distance from each intra-coded picture, from the intra-coded pictures that become random access points of the video data read by the reading unit and recorded on the data recording medium and from the plurality of pictures that follow the intra-coded pictures; and a playback unit that performs fast-forward playback of the video data with all the pictures from the intra-coded picture to the picture located at the given distance from the intra-coded picture, among the plurality of pictures that follow the intra-coded picture, which have been selected by the selection unit, wherein the picture located at the given distance from the intra-coded picture is a picture that is separated from the intra-coded picture by approximately the given number of sectors of the data recording medium.

6. The data processing device according to claim 5, wherein the given number of sectors of the data recording medium is set in accordance with the capacity of the memory unit that is used as a buffer for the video data when the video data are reproduced from the data recording medium.

7. A data processing method comprising the steps of: reading, from a data recording medium together with video data, position information of a plurality of intra-coded pictures and of a plurality of pictures preceded by the intra-coded pictures, and designation information that indicates a picture located at a given distance from each intra-coded picture among the plurality of pictures preceded by the intra-coded pictures, the plurality of intra-coded pictures being contained in the video data and becoming random access points; selecting intra-coded pictures and pictures that follow the intra-coded pictures, in accordance with the designation information that indicates the picture located at the given distance from each intra-coded picture, from the intra-coded pictures that become random access points of the video data read at the reading step and recorded on the data recording medium and from the plurality of pictures that follow the intra-coded pictures; and performing fast-forward playback of the video data with all the pictures from the intra-coded picture to the picture located at the given distance from the intra-coded picture, among the plurality of pictures that follow the intra-coded picture, which have been selected at the selection step, wherein the picture located at the given distance from the intra-coded picture is a picture that is separated from the intra-coded picture by approximately the given number of sectors of the data recording medium.

8. A machine-readable program recording medium on which a program is recorded, the program causing a computer to execute the steps of: reading, from a data recording medium together with video data, position information of a plurality of intra-coded pictures and of a plurality of pictures preceded by the intra-coded pictures, and designation information that indicates a picture located at a given distance from each intra-coded picture among the plurality of pictures preceded by the intra-coded pictures, the plurality of intra-coded pictures being contained in the video data and becoming random access points; selecting intra-coded pictures and pictures that follow the intra-coded pictures, in accordance with the designation information that indicates the picture located at the given distance from each intra-coded picture, from the intra-coded pictures that become random access points of the video data read at the reading step and recorded on the data recording medium and from the plurality of pictures that follow the intra-coded pictures; and performing fast-forward playback of the video data with all the pictures from the intra-coded picture to the picture located at the given distance from the intra-coded picture, among the plurality of pictures that follow the intra-coded picture, which have been selected at the selection step, wherein the picture located at the given distance from the intra-coded picture is a picture that is separated from the intra-coded picture by approximately the given number of sectors of the data recording medium.



 

Same patents:

FIELD: information technology.

SUBSTANCE: method is offered to compress digital motion pictures or videosignals on the basis of superabundant basic transformation using modified algorithm of balance search. The algorithm of residual energy segmentation is used to receive an original assessment of high energy areas shape and location in the residual image. The algorithm of gradual removal is used to decrease the number of balance assessments during the process of balance search. The algorithm of residual energy segmentation and algorithm of gradual removal increase encoding speed to find a balanced basis from the previously defined dictionary of the superabundant basis. The three parameters of the balanced combination form an image element, which is defined by the dictionary index and the status of the basis selected, as well as scalar product of selected basic combination and the residual signal.

EFFECT: creation of simple, yet effective method and device to perform frame-accurate encoding of residual movement on the basis of superabundant basic transformation for video compressing.

10 cl, 15 dwg

FIELD: information systems.

SUBSTANCE: invention refers to video coders using adaptive weighing of master images. The video decoder for decoding data from video signal for the image, having multiple motion boxes, containing: the master image weighting coefficient module for accepting, at least, one master image index, thereat each one from the mentioned master image indexes is intended for independent indication not using any other indexes, one of the multiple master images, used for prediction of current motion box and weighting coefficient from the set of weighting coefficients for current mentioned one from mentioned multiple motion boxes.

EFFECT: increase of efficiency in predicting master images.

20 cl, 7 dwg

FIELD: physics, computing.

SUBSTANCE: invention relates to the field of coding and decoding of a moving image. In the method, at least one reference image for the processing of the field macroblock is selected from at least one reference image list, using information about reference image indexes, each at least one reference image selected is a field, and the parity of at least one reference field selected may be based on the parity of the field macroblock and the reference image index information.

EFFECT: efficient provision of information about reference image compensating motion, by reference image indexes determined in different ways, according to the coded macroblock modes.

10 cl, 12 dwg

FIELD: physics.

SUBSTANCE: said utility invention relates to video encoders and, in particular, to the use of adaptive weighing of reference images in video encoders. A video encoder and a method of video signal data processing for an image block and the specific reference image index for predicting this image block are proposed, which use the adaptive weighing of reference images to increase the video signal compression, the encoder having a reference image weighting factor assigning module for the assignment of the weighting factor corresponding to the said specific reference image index.

EFFECT: increased efficiency of reference image predicting.

8 cl, 7 dwg

FIELD: movement estimation, in particular, estimation of movement on block basis in video image compression application.

SUBSTANCE: method and device are claimed for conducting search for movement in video encoder system using movement vectors which represent difference between coordinates of macro-block of data in current frame of video data and coordinates of corresponding macro-block of data in standard frame of video data. A set of movement vector prediction parameters is received, where movement vector prediction parameters represent approximations of possible movement vectors for current macro-block, movement vector search pattern is determined and search is conducted around each movement vector prediction parameter from the set of movement vector prediction parameters using search pattern, and on basis of search result, the final movement vector is determined.

EFFECT: increased efficiency of video signals compression.

3 cl, 7 dwg

FIELD: video encoding, in particular, methods and devices for ensuring improved encoding and/or prediction methods related to various types of video data.

SUBSTANCE: the method is claimed for usage during encoding of video data in video encoder, containing realization of solution for predicting space/time movement vector for at least one direct mode macro-block in B-image, and signaling of information of space/time movement vector prediction solution for at least one direct mode macro-block in the header, which includes header information for a set of macro-blocks in B-image, where signaling of aforementioned information of space/time movement vector prediction solution in the header transfers a space/time movement vector prediction solution into video decoder for at least one direct mode macro-block in B-image.

EFFECT: creation of improved encoding method, which is capable of supporting newest models and usage modes of bi-directional predictable (B) images in a series of video data with usage of spatial prediction or time distance.

2 cl, 17 dwg

FIELD: compensation of movement in video encoding, namely, method for encoding coefficients of interpolation filters used for restoring pixel values of image in video encoders and video decoders with compensated movement.

SUBSTANCE: in video decoder system for encoding a video series, containing a series of video frames, each one of which has a matrix of pixel values, interpolation filter is determined to restore pixel values during decoding. System encodes interpolation filter coefficients differentially relatively to given base filter, to produce a set of difference values. Because coefficients of base filter are known to both encoder and decoder and may be statistically acceptably close to real filters, used in video series, decoder may restore pixel values on basis of a set of difference values.

EFFECT: efficient encoding of values of coefficients of adaptive interpolation filters and ensured resistance to errors of bit stream of encoded data.

5 cl, 17 dwg
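
A sketch of the differential coefficient coding, assuming a 6-tap base filter (the H.264 luma half-pel filter is used here purely as an example of a filter both sides know):

    BASE_FILTER = [1, -5, 20, 20, -5, 1]

    def encode_coeffs(adaptive):
        # Transmit only the (typically small) differences from the base filter.
        return [a - b for a, b in zip(adaptive, BASE_FILTER)]

    def decode_coeffs(diffs):
        # The decoder knows BASE_FILTER, so the adaptive filter is recoverable.
        return [d + b for d, b in zip(diffs, BASE_FILTER)]

    diffs = encode_coeffs([1, -4, 19, 21, -5, 0])   # -> [0, 1, -1, 1, 0, -1]
    assert decode_coeffs(diffs) == [1, -4, 19, 21, -5, 0]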

FIELD: video decoders; measurement engineering; TV communication.

SUBSTANCE: the values of the motion vectors of blocks adjacent to the block whose motion vector is to be determined are found. Based on the motion vectors of the adjacent blocks, the motion vector search range for the given block is determined. The complexity of the estimation can be reduced significantly without lowering compression efficiency.

EFFECT: reduced complexity of motion vector determination.

7 cl, 2 dwg
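
A sketch of deriving the search range from neighbouring motion vectors; the margin and the fallback range are assumed values:

    def search_range(neighbour_mvs, margin=2):
        # Bound the search window by the largest adjacent motion vector
        # component plus a small safety margin.
        if not neighbour_mvs:
            return 16                    # fallback when no neighbours are coded
        return max(max(abs(x), abs(y)) for x, y in neighbour_mvs) + margin

    rng = search_range([(3, -1), (2, 0), (4, 2)])   # -> 6: search only +/-6 pels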

The invention relates to the field of digital signal processing.

FIELD: information technology.

SUBSTANCE: the invention relates to record media such as the BD-ROM (Blu-ray Disc read-only memory), particularly to systems displaying subtitles and interactive content by graphic means. The AVClip stream recorded on a BD-ROM disc is formed by multiplexing the graphics stream with the video stream. The graphics stream is a succession of PES packets, which include PES packets with graphic data (ODS, Object Definition Segments) and PES packets with control information (PCS, Presentation Composition Segments). In each ODS segment, the DTS (decoding time stamp) and PTS (presentation time stamp) values refer to the decoding start and end moments for the respective graphic data. In each PCS segment, the PTS value refers to the moment at which the respective decoded graphic data is combined with the video stream.

EFFECT: creation of a record medium that realises high-resolution graphics without increasing production costs.

8 cl, 91 dwg
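
A sketch of the two segment types and their time stamps, using Python dataclasses as a stand-in for the PES-packet payloads (field layout simplified; the 90 kHz clock values are hypothetical):

    from dataclasses import dataclass

    @dataclass
    class ODS:                 # Object Definition Segment: the graphic data
        object_id: int
        dts: int               # decoding of this object must start here
        pts: int               # decoding must be finished by here

    @dataclass
    class PCS:                 # Presentation Composition Segment: control info
        composition_id: int
        pts: int               # decoded graphics are composed onto video here

    stream = [ODS(1, dts=900_000, pts=903_000), PCS(1, pts=905_000)]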

FIELD: methods for reproducing graphic data from a data carrier for displaying an on-screen menu.

SUBSTANCE: in accordance with the invention, the data carrier stores a stream of interactive graphic data reproduced on request, which is activated in response to a user command. The effect is achieved by decoding graphic data and outputting the decoded graphic data, where the decoded data is outputted in response to the activation command if it is the first graphic data, or at a certain time if it is the second graphic data.

EFFECT: reproduction from the data carrier is ensured by extending the function of the interactive graphic data stream to support interaction with the user.

5 cl, 26 dwg
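
A sketch of the output rule described above; the "first"/"second" classification and the parameter names are assumptions made for illustration:

    def should_output(kind, activated, now, pts):
        # First graphic data: output only once the user's activation command
        # arrives; second graphic data: output at its scheduled time.
        if kind == "first":
            return activated
        return now >= pts

    should_output("first", activated=True, now=0, pts=0)     # -> True
    should_output("second", activated=False, now=10, pts=5)  # -> True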

FIELD: recording devices and record carriers, for example DVD discs, on which a program for recording onto a record carrier is recorded.

SUBSTANCE: in accordance with the invention, the control information (DK) address is identified via a link to a control table (TV) recorded in a fixed area on the optical disc, and a recording format is selected for the control information (DK) recorded in combination with an extended file (EF).

EFFECT: increased efficiency of use of the information recording area; faster finding of recorded files.

3 cl, 35 dwg

FIELD: methods and devices for processing information, programs and record carriers.

SUBSTANCE: the invention includes a method and device for processing information, a program and a data carrier for recording a file that includes information provided for presentation in a GUI (graphical user interface), information on the main reproduction track, information on the auxiliary reproduction track, information on connections between the corresponding reproduction domains that constitute the main reproduction track, and information about marks and resume points used by the user to set up a required scene. The method includes receiving sound and/or image information; generating map information that describes the correspondence between the presentation timestamp of an entry point and its address, together with track control information associated with the entry point, including main route information describing a presentation track consisting of a first reproduction element and auxiliary track information describing a presentation track consisting of a second reproduction element, where presentation of the second reproduction element of the auxiliary track is synchronised with the first reproduction element of the main track; the track control information and map information are then outputted.

EFFECT: provision of methods and devices for processing and reproducing information.

12 cl, 121 dwg
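
A toy sketch of the map information: entry points pair a presentation timestamp with the address where decoding can start, which also lets the auxiliary track be located at the same time positions as the main one. All values are hypothetical:

    ep_map = []

    def add_entry_point(pts, address):
        ep_map.append((pts, address))       # appended in presentation order

    def address_for(pts):
        # Last entry point at or before the requested time.
        candidates = [a for t, a in ep_map if t <= pts]
        return candidates[-1] if candidates else None

    add_entry_point(0, 0)
    add_entry_point(45_000, 1_024)
    address_for(50_000)                     # -> 1024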

FIELD: hardware engineering for personal computers and television; possible use in personal computer monitors and television receivers.

SUBSTANCE: in the method for creating an image on the screen, previous frame codes are accumulated by three frame code accumulators, all codes are synchronously outputted in parallel form to "code - radiation duration" transformers, and all elements are enabled by control signals from the transformers, producing radiation with durations proportional to the value of each code. The device realising the method is a digital monitor containing a flat-panel screen with three colour signal channels, each of which includes a frame code accumulator and a control signal generation block. The light channel of each emitting cell contains a light-emitting diode and a colour filter, the control input of each diode is connected to the output of the control signal generation block, and the screen has a number of recesses equal to the number of emitting cells, in which the emitting cells are positioned.

EFFECT: creation of image on the screen without horizontal scanning.

2 cl, 9 dwg
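
A sketch of the "code - radiation duration" transform only (the accumulators and LED drive are hardware); the frame period and code depth are assumed values:

    def radiation_duration(code, frame_period=1 / 60, code_max=255):
        # Each emitting cell is driven for a time proportional to its
        # accumulated frame code, so no horizontal scanning is needed.
        return frame_period * code / code_max

    radiation_duration(128)                 # -> ~8.4 ms of the ~16.7 ms frame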

FIELD: methods and devices for processing AV information; engineering of data carriers for recording a file that includes information provided for presentation in a graphical user interface, information on the main and auxiliary reproduction routes, and information about connections between the corresponding reproduction domains along the main route.

SUBSTANCE: the CPI_type is described in the PlayList. CPI_type distinguishes the EP_map type and the TU_map type. If the position of an I-picture can be determined, the EP_map type is used; if it cannot be determined, the TU_map type is used. Therefore, AV stream data recorded after analysis of I-pictures and AV stream data recorded without designation of I-picture positions can be controlled jointly.

EFFECT: joint control of AV streams for which high-speed reproduction is possible and AV streams for which it is not, as well as the possibility of repeated recording.

17 cl, 123 dwg
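
A sketch of the CPI_type selection, assuming the EP_map/TU_map pair named above; the dictionary representation is illustrative:

    def make_cpi(i_picture_positions_known):
        # EP_map: I-picture positions were analysed and can be addressed
        # directly; TU_map: the stream was recorded as-is (e.g. a broadcast)
        # and only time units are available.
        if i_picture_positions_known:
            return {"CPI_type": "EP_map"}
        return {"CPI_type": "TU_map"}

    make_cpi(i_picture_positions_known=False)   # -> {'CPI_type': 'TU_map'}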

The invention relates to electrical engineering and can find application in film production, security television, commercial television, professional television, cable television, and primarily in video technology.

The invention relates to the recording and playback of images.

FIELD: physics.

SUBSTANCE: the invention relates to coding audio signals as audio data streams. The invention consists in combining separate audio data streams into multi-channel audio data streams by modifying the data units of an audio data stream that is divided into data units (for instance, by supplementing, adding to or replacing part of them) so that each includes a length indicator reflecting the amount or length of the audio data of the data unit, or of the data unit itself, in order to obtain a second audio data stream with modified data units. Alternatively, an audio data stream whose determination units carry indicators pointing to the audio data that belongs to those determination units but is distributed among different data units is transformed into an audio data stream in which the audio data of each determination unit are combined into the audio data of a continuous determination unit. The audio data of a continuous determination unit may then be included in a self-contained channel element together with their determination unit.

EFFECT: simplified manipulation of audio data when combining separate audio data streams into multi-channel audio data streams, and simplified general manipulation of an audio data stream.

13 cl, 9 dwg
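
A sketch of the length-indicator modification, assuming a one-byte length prefix and one data unit per stream per access unit; real streams use the format's own length syntax:

    def add_length_indicator(data_unit):
        # A unit that carries its own length is self-delimiting, so units
        # from several streams can be concatenated into one stream.
        assert len(data_unit) < 256
        return bytes([len(data_unit)]) + data_unit

    def combine(streams):
        # Interleave one modified data unit from each stream per access unit.
        return b"".join(add_length_indicator(u)
                        for units in zip(*streams) for u in units)

    mono_a = [b"\x01\x02", b"\x03"]
    mono_b = [b"\x04", b"\x05\x06"]
    multi = combine([mono_a, mono_b])       # each unit now self-describing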
