Broadcasting system, transmission device and transmission method, reception device and reception method, and program

FIELD: information technology.

SUBSTANCE: invention relates to broadcasting systems. Transmission device 11 transmits material data acquired by means of a sensor, and additional information used to process the material data, to reception device 12. Reception device 12 receives the material data and the additional information, and generates output data based on the material data and the additional information. In addition, reception device 12 produces output based on the output data, e.g. generates the output displayed on output means such as a display device, or drives output means such as an air conditioner.

EFFECT: makes it possible to provide an image corresponding to the user's preferences, a sense of reality, or the like, by performing appropriate processing on the reception side of a broadcasting system.

21 cl, 22 dwg

 

Technical field to which the invention relates

The present invention relates to a broadcasting system, a transmission device and transmission method, a reception device and reception method, and a program, and more specifically, for example, to a broadcasting system, transmission device and transmission method, reception device and reception method, and program that make it possible to perform appropriate processing on the receiving side of the broadcasting system.

Background art

For example, in a modern broadcasting system, whether analog or digital, images and sounds are edited as material data at the broadcasting station on the transmission side. The edited images and sounds are transmitted as broadcast programs.

On the other hand, on the receiving side, for example in each home, the programs transmitted from the broadcasting station are output, i.e. the images are displayed and the sounds are output as programs, using a TV (television receiver) or the like.

It should be noted that, when attention is focused on one program, only one type of image and sound, resulting from the editing performed at the broadcasting station, is transmitted as the one program in the modern broadcasting system. Accordingly, users on the receiving side can enjoy only the one type of image and sound.

However, there is potential for requests for user participation in, for example, one program such as a drama program; such requests may be that the user wishes to see the development of the drama program from the point of view of the user's favorite character, or may be directed to the user wishing to change the course of the story depending on that point of view.

Therefore, the author of the present application previously proposed a digital broadcast reception device allowing the user to freely select and follow a variety of materials prepared from different points of view (for example, see Patent document 1).

[Patent document 1] Japanese Unexamined Patent Application Publication No. 2006-217662

The invention

Technical task

At the same time, at the broadcasting station, images and sounds are edited, for example, as described above. However, on the display screen of the TV provided on the receiving side, on which the image is to be displayed, no editing is performed.

Accordingly, when, on the display screen of the TV provided on the receiving side, the face of a person shown on the TV is displayed with a size larger than the size of the face in the real world, the sense of reality may be degraded as a result.

In addition, on the TV screen provided on the receiving side, as described above, the edited images and sounds that were edited at the broadcasting station are displayed and output.

Accordingly, for example, when editing is performed at the broadcasting station in which a large number of telops (in the transmission system used in Japan, text transmitted with the television signal and superimposed on the image) are combined with the image, even a user (viewer) who feels that the effects produced using the telops are unnecessary has to look at the image combined with the large number of telops.

The present invention was developed in view of such circumstances, and makes it possible to perform appropriate processing on the receiving side of the broadcasting system, thereby allowing, for example, a sense of reality to be provided to the user, or an image corresponding to the user's preferences to be displayed.

Technical solution

The broadcasting system in accordance with the first aspect of the present invention is a broadcasting system comprising a transmission device, which transmits data, and a reception device, which receives the data. The transmission device includes transmission means for transmitting material data acquired by a sensor, and additional information used to process the material data in the reception device. The reception device includes reception means for receiving the material data and the additional information; generation means for generating output data based on the material data and the additional information; and output means for producing output based on the output data.

In the first aspect, as described above, the transmission device transmits material data acquired by a sensor, and additional information used to process the material data in the reception device. The reception device receives the material data and the additional information and generates output data based on the material data and the additional information. Output is then produced based on the output data.

The transmission device or program in accordance with the second aspect of the present invention is a transmission device which, together with a reception device that receives data, forms a broadcasting system and which transmits the data, or a program for causing a computer to function as such a transmission device. The transmission device includes transmission means for transmitting material data acquired by a sensor, and additional information used to process the material data in the reception device.

The transmission method in accordance with the second aspect of the present invention is a transmission method for a transmission device which, together with a reception device that receives data, forms a broadcasting system and which transmits the data. The transmission method includes the step of the transmission device transmitting material data acquired by a sensor, and additional information used to process the material data in the reception device.

In the second aspect, as described above, material data acquired by a sensor, and additional information used to process the material data in the reception device, are transmitted.

The reception device or program according to the third aspect of the present invention is a reception device which, together with a transmission device that transmits data, forms a broadcasting system and which receives the data, or a program for causing a computer to function as such a reception device. The reception device includes, in the case where the transmission device transmits material data acquired by a sensor and additional information used to process the material data in the reception device: reception means for receiving the material data and the additional information; generation means for generating output data based on the material data and the additional information; and output means for producing output based on the output data.

The reception method in accordance with the third aspect of the present invention is a reception method for a reception device which, together with a transmission device that transmits data, forms a broadcasting system and which receives the data. The reception method includes the following steps: when the transmission device transmits material data acquired by a sensor and additional information used to process the material data in the reception device, receiving, with the reception device, the material data and the additional information; generating output data based on the material data and the additional information; and producing output based on the output data, using output means for producing output.

In the third aspect, as described above, the material data and the additional information are received, and output data are generated on the basis of the material data and the additional information. Output is then produced based on the output data, using the output means for producing output.

It should be noted that the programs can be provided by transmission through a transmission medium or by recording them on a recording medium.

In addition, the transmission device or the reception device may be an independent device, or may be an internal block forming part of a single device.

In addition, the transmission (broadcasting) of data can be performed via a cable or wireless transmission medium, and can also be performed via a mixed cable and wireless transmission medium.

Preferred effects

In accordance with the first to third aspects of the present invention, appropriate processing can be performed on the receiving side of the broadcasting system.

Brief description of drawings

Figure 1 shows a block diagram illustrating an example configuration of an embodiment of a broadcasting system to which the present invention is applied.

Figure 2 shows a block diagram illustrating a configuration example of the transmission device 11.

Figure 3 shows diagrams intended to explain parts of the material data and the effect information.

Figure 4 shows a diagram intended to explain the switching information.

Figure 5 shows a diagram intended to explain the display image frame information.

Figure 6 shows a diagram intended to explain the data used for combination.

Figure 7 shows a diagram intended to explain the real-world information.

Figure 8 shows a flowchart intended to explain the transmission process.

Figure 9 shows a block diagram illustrating an example configuration of reception device 12.

Figure 10 shows a flowchart intended to explain the processing of the reception device.

Figure 11 shows a diagram intended to explain the data supplied to generation unit 54 during processing in real mode.

Figure 12 shows a block diagram illustrating a configuration example of generation unit 54, which performs processing in real mode.

Figure 13 shows a diagram schematically illustrating a state in which an image of an image-capture subject is shot with a camera, and the display state in output unit 55, which is a display device.

Figure 14 shows diagrams schematically illustrating a state in which an image of an image-capture subject is shot with a camera using optical zoom, and the display state in output unit 55, which is a display device.

Figure 15 shows diagrams schematically illustrating a state in which an image of an image-capture subject is shot with a camera using digital zoom, and the display state in output unit 55, which is a display device.

Figure 16 shows diagrams illustrating images displayed in output unit 55, which is a display device.

Figure 17 shows a flowchart intended to explain the processing in real mode.

Figure 18 shows a diagram explaining the data supplied to generation unit 54 during processing in entertainment mode.

Figure 19 shows a block diagram illustrating a configuration example of generation unit 54, which performs processing in entertainment mode.

Figure 20 shows a block diagram illustrating an example configuration of image processing module 84.

Figure 21 shows a block diagram illustrating another example configuration of reception device 12.

Figure 22 shows a block diagram illustrating an example configuration of an embodiment of a computer system to which the present invention is applied.

Explanation of reference numerals

11 transmission device, 12 reception device, 13 transmission medium, 31 sensor unit, 32 editing device, 33 encoding unit, 34 transmission unit, 51 reception unit, 52 decoding unit, 53 separation unit, 54 generation unit, 55 output unit, 56 input unit, 71 buffer, 72 additional-information analysis module, 73 image processing module, 81 image data buffer, 82 additional-information analysis module, 83 buffer for data intended for combination, 84 image processing module, 91 image selection part, 92 instruction selection part, 93 display area setting part, 94 gate part, 95 overlay processing part, 101 generation unit, 102 output unit, 103 input unit, 111 generation unit, 112 output unit, 113 input unit, 201 bus, 202 CPU, 203 ROM, 204 RAM, 205 hard disk, 206 output module, 207 input module, 208 data transmission module, 209 drive, 210 input/output interface, 211 removable recording medium

Detailed description of the invention

Figure 1 illustrates an example configuration of an embodiment of a broadcasting system (the term "system" refers to a logical set of a variety of devices, regardless of whether the devices having individual configurations are in the same housing) to which the present invention belongs.

In figure 1, the broadcasting system includes transmission device 11 and reception device 12.

Transmission device 11 is placed, for example, in a broadcasting station and transmits different types of data as programs through transmission medium 13, such as a terrestrial wave, a satellite circuit, the Internet or CATV (cable television).

Reception device 12 is placed, for example, at home, and it receives the data transmitted from transmission device 11 through transmission medium 13.

It should be noted that reception device 12 can be made to function, for example, as a TV.

In addition, in the broadcasting system, the number of transmission devices 11 is not limited to one, and a plurality of transmission devices 11 may be provided. A similar configuration also applies to reception devices 12.

Figure 2 shows an example configuration of transmission device 11 shown in figure 1.

In figure 2, transmission device 11 includes sensor unit 31, editing device 32, encoding unit 33 and transmission unit 34.

It should be noted that sensor unit 31 and editing device 32 can be provided as devices different from transmission device 11.

Sensor unit 31 acquires material data used as the material for editing in editing device 32, and transmits the material data to editing device 32 and to encoding unit 33.

Here, one sensor of one type, multiple sensors of one type, or multiple sensors of multiple types can be used as sensor unit 31.

In addition, examples of the types of sensors include a camera, which is sensitive to light and which outputs an image (captures an image), a microphone, which is sensitive to sound and which outputs sound (collects sound), and sensors that individually detect temperature, humidity, air flow direction, air flow rate, vibration, brightness, etc.

Image capture is performed using the camera, and the image (data) is output as material data. In addition, sound is collected using the microphone, and the sound (data) is output as material data. Furthermore, each of the sensors that detect a corresponding one of temperature, humidity, air flow direction, air flow rate, vibration and brightness outputs data indicating the corresponding one of temperature, humidity, air flow direction, air flow rate, vibration and brightness as material data.

Editing device 32 generates, on the basis of the material data transmitted from sensor unit 31, additional information to be used for processing the material data in reception device 12, and transmits the additional information to encoding unit 33.

Here, the additional information includes real-world information, denoting a physical quantity in the real world of the target sensed by sensor unit 31, and effect information, which is information used to process the material data.

The effect information is generated, for example, in accordance with an operation performed on editing device 32 by the producer of the program who produces the program to be broadcast.

The real-world information is generated regardless of the operations performed by the producer of the program.

It should be noted that editing device 32 generates additional information for each type of material data. Accordingly, when multiple types of sensors are used in sensor unit 31 and multiple types of material data are supplied from sensor unit 31 to encoding unit 33, additional information corresponding to each of the multiple types of material data is supplied from editing device 32 to encoding unit 33.

Encoding unit 33 encodes the material data transferred from sensor unit 31 and the additional information transmitted from editing device 32. It should be noted that encoding unit 33 multiplexes the material data and the additional information as necessary. Encoding unit 33 then delivers the coded data, which are the result of the encoding, to transmission unit 34.

Transmission unit 34 transmits the coded data transferred from encoding unit 33, i.e. the material data acquired with the help of sensor unit 31 and the additional information generated by editing device 32, through transmission medium 13 (figure 1).

Next, the details of the effect information included in the additional information generated by editing device 32 will be described.

Figure 3 shows parts of the material data and the effect information corresponding to the parts of the material data.

In other words, part A of figure 3 illustrates multiple streams of images (moving images), which are provided as parts of the material data.

For example, when the image for a specific program A is configured by editing a set of N streams of images S#1, S#2, ... and S#N, the N streams of images S#1 to S#N are used as the parts of the material data for the image.

Here, the N streams of images S#1 to S#N can represent images individually obtained using N cameras provided as N sensors, or can be images obtained using M (<N) cameras provided as M sensors, where M is less than N (and equal to or greater than one).

In other words, for example, N streams of images obtained by shooting landscapes from different locations at different times using a single camera can be used as the N streams of images S#1 to S#N.

In addition, examples of the N streams of images used as the N streams of images S#1 to S#N include the following: N streams of images obtained by individually capturing images of the whole orchestra, the conductor, musicians playing particular musical instruments, the audience, the spectators, etc., for example using N cameras; N streams of images obtained by individually capturing images of the entire stadium, the scoreboard showing the score, specific players, spectators, etc., when covering sporting events, such as a report of a football game; and N streams of images obtained by individually capturing images of the entire studio, each member of the cast, etc., in a music program or the like.

Part B of figure 3 illustrates the effect information corresponding to the N streams of images S#1 to S#N, which are the parts of the material data illustrated in part A of figure 3.

The effect information includes, for example, switching information, display image frame information, data used for combination, etc.

The switching information is, for example, information for switching between the images displayed on the display device, described below, of reception device 12, by selecting one stream of images among the N streams of images S#1 to S#N, which are parts of the material data.

The display image frame information indicates, for example, the area corresponding to the portion, to be displayed on the display device of reception device 12, of the image (in figure 3, a single stream of images among the images S#1 to S#N) which is material data.

The data used for combination represent, for example, data to be used in combination with the image (in figure 3, a single stream of images among the images S#1 to S#N) which is material data, and include data intended for combination, which are to be combined with the image.

Here, examples of the data intended for combination include data (data for the image within the image) related to an image displayed on a small screen using the image-within-the-image display mode (below, a PinP image), data related to telops (data related to subtitles), etc.

Next, the switching information included in the effect information shown in figure 3 will be described with reference to figure 4.

For example, as described above, when the image for a particular program A is configured by editing the N streams of images S#1 to S#N, the switching information is information for switching among the images displayed on the display device of reception device 12, by selecting one stream of images among the N streams of images S#1 to S#N.

For example, in editing device 32 shown in figure 2, an editing operation is performed, in accordance with the operation performed by the producer of the program, by sequentially connecting with each other the portion of image S#1, which is a part of the material data, ranging from time code ts1 to time code te1; the portion of image S#2, which is a part of the material data, ranging from time code ts2 to time code te2; the portion of image S#3, which is a part of the material data, ranging from time code ts3 to time code te3; the portion of image S#1, which is a part of the material data, ranging from time code ts4 to time code te4; and so on, thereby forming (the image of) the specific program A, as shown in figure 4. In this case, for example, information with which the portions of material data that make up the image for program A, and the time codes of those portions of material data, are defined at the individual points in time in program A is used as the switching information.
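Purely as an illustration (not part of the patent disclosure), the switching information can be modelled as an ordered list of segments, each naming a stream and the length of the portion taken from it; the stream names and integer time codes below are hypothetical stand-ins for ts1, te1 and so on. A reception side could then resolve which stream to display at a given program time:

```python
# Hypothetical switching information for program A: each entry gives the
# stream id and the length (in frames) of the portion of that stream used
# for the corresponding segment of the program.
switching_info = [
    ("S#1", 100),  # portion of stream S#1 from ts1 to te1
    ("S#2", 50),   # portion of stream S#2 from ts2 to te2
    ("S#3", 80),   # portion of stream S#3 from ts3 to te3
    ("S#1", 70),   # portion of stream S#1 from ts4 to te4
]

def stream_at(program_time: int) -> str:
    """Return the stream selected by the switching information at program_time,
    by walking the cumulative segment lengths."""
    t = 0
    for stream, length in switching_info:
        if t <= program_time < t + length:
            return stream
        t += length
    raise ValueError("program_time beyond end of program")

# stream_at(120) → "S#2" (program time 120 falls in the second segment, 100..149)
```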

It should be noted that, when editing device 32 applies a special effect to a connection (a so-called edit point) between a particular stream of images S#i and another stream of images S#i', the switching information includes information related to the special effect.

Next, the display image frame information included in the effect information shown in figure 3 will be described with reference to figure 5.

The display image frame information indicates, for example, as described above, the area corresponding to the portion, designated for display on the display device of reception device 12, of the image which represents material data.

In other words, for example, when the material data represent an image, editing device 32 can perform editing in which a rectangular area within the image area of the image which is material data is designated as the area (below called the display area) corresponding to the portion of the image to be displayed on the display device of reception device 12, and in which the image of that area is defined as the image for the program, in accordance with the operation performed by the producer of the program, as shown in figure 5.

In this case, in editing device 32, the producer of the program can use the area of the image which is material data as the maximum area, and can produce a program as if the producer of the program were capturing an image with a camera, using a pan operation, a tilt operation or a zoom operation.

Here, the maximum display area is the image area of the image which represents the material data. In line with this, the angle of view of the camera provided as sensor unit 31 (figure 2) is set to the maximum angle (on the wide-angle side) to capture the image, as a result of which the size of the display area can be made maximum.

It should be noted that any information with which the display area can be specified can be used as the display image frame information.

In other words, for example, a coordinate system is defined with respect to the image area of the image which is material data, and, in this coordinate system, the coordinates of one vertex of the display area and of the vertex diagonally opposite that vertex, or the coordinates of one vertex of the display area together with the horizontal and vertical lengths of the display area, or the like, can be used as the display image frame information.

In addition, for example, a display area which is located in a predetermined position and which has a certain size can be determined as the default display area, and the history of the pan, tilt and zoom operations performed on the default display area during editing by the producer of the program can be used as the display image frame information.

Next, the data used for combination included in the effect information shown in figure 3 will be described with reference to figure 6.

As described above, the data used for combination represent data used for combining with the material data. The data used for combination include data intended for combination, which are to be combined with the material data, and time information indicating the time at which the combination using the data intended for combination is performed.

In other words, for example, the material data represent an image, and, in editing device 32 shown in figure 2, during editing, a PinP image or a telop is overlaid on (combined with) the image which represents the material data, in accordance with the operation performed by the producer of the program, thereby forming a program. In this case, the PinP image or the telop is used as the data intended for combination.

In addition, information indicating the time during which the PinP image or the telop used as data intended for combination is overlaid (for example, the point in time at which the overlay starts and the point in time at which the overlay ends), for example in terms of time codes of the program, is used as the time information.

It should be noted that the data used for combination also include, for example, information relating to the position on the image at which the PinP image or telop used as data intended for combination is overlaid.

In addition, for example, when the material data are sound, BGM (background music) or the like is used as the data intended for combination.

When a specific type of material data is sound, and the sound is the sound accompanying an image (moving image) which is a different type of material data, information indicating a time relative to the sound can be used as the time information for the BGM to be combined with the sound, and the time codes of the image accompanied by the sound can also be used.
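For illustration only, the data used for combination can be modelled as entries carrying the data intended for combination, the time information (start and end time codes), and an overlay position; the field names and values below are hypothetical:

```python
# Hypothetical combination entries: the data intended for combination
# (a telop or a PinP image), the time codes at which the overlay starts
# and ends, and the overlay position on the material image.
combination_data = [
    {"kind": "telop", "text": "GOAL!", "start": 300, "end": 360, "pos": (40, 400)},
    {"kind": "pinp",  "source": "S#2",  "start": 100, "end": 250, "pos": (500, 20)},
]

def overlays_at(t: int):
    """Return the items to be overlaid on the material image at program time t,
    according to the time information of each entry."""
    return [d for d in combination_data if d["start"] <= t < d["end"]]
```

A reception side that honours the viewer's preferences could simply skip entries returned by such a lookup, which is the freedom the separate transmission of material data and additional information provides.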

Next, the details of the real-world information included in the additional information generated by editing device 32 (figure 2) will be described.

As described above, the real-world information is information denoting a physical quantity in the real world of the target sensed by sensor unit 31 (figure 2). For example, when sensor unit 31 is a camera, imaging length information, denoting the length of the imaging range at the position of the target of image capture (below also referred to as the image-capture subject) whose image is shot with the camera, etc., is included in the real-world information.

Here, the imaging length for image capture using the camera will be described with reference to figure 7.

Figure 7 shows a top view schematically illustrating a state in which an image is shot with the camera.

Now, for ease of description, assume that the optical system (lens group) of the camera has no distortion. Assume that the angle of view (below also referred to as the "camera angle of view") of the camera in the horizontal direction is equal to θ. Furthermore, assume that the distance (below also referred to as the "object distance") from the camera to the image-capture subject is X.

It should be noted that it is assumed that the camera angle of view θ is known.

In addition, the object distance X can be measured using, for example, so-called autofocus technology.

In other words, the object distance X can be measured using, for example, an infrared sensor or a sonar. In addition, the object distance X can be measured using a contrast detection scheme, in which the focus is adjusted so that the contrast of the image captured by the camera is maximized, a phase difference detection scheme, in which the focus is adjusted to remove the difference between the phases of images taken with the camera, or the like.

It should be noted that, when the object distance X is known in advance when image capture is performed with the camera, measuring the object distance X is not required.

Now, for ease of description, of the horizontal and vertical lengths of the real world whose image is captured at the position of the image-capture subject, attention is focused, for example, only on the horizontal length. It is assumed that this horizontal length represents the imaging length D at the position of the image-capture subject.

The imaging length D can be determined in accordance with the equation D = 2 × X × tan(θ/2), using the camera angle of view θ and the object distance X.

Therefore, in addition to the image which represents the material data, editing device 32 shown in figure 2 also receives the camera angle of view θ and the object distance X from the camera provided as sensor unit 31. Editing device 32 determines the imaging length D from the camera angle of view θ and the object distance X. Editing device 32 then includes the imaging length D, as imaging length information, in the real-world information, and transmits this real-world information to encoding unit 33.
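The relation D = 2 × X × tan(θ/2) can be checked numerically; the function below is a minimal sketch, with the angle given in degrees for convenience:

```python
import math

def imaging_length(angle_deg: float, distance: float) -> float:
    """Horizontal length D of the real world covered at the position of the
    image-capture subject, for a camera with horizontal angle of view
    angle_deg (degrees) and object distance `distance`:
    D = 2 * X * tan(theta / 2)."""
    theta = math.radians(angle_deg)
    return 2 * distance * math.tan(theta / 2)

# For example, a 60-degree angle of view at an object distance of 5 m gives
# D = 2 * 5 * tan(30°) ≈ 5.77 m.
```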

Next, with reference to figure 8, the processing performed by transmission device 11 shown in figure 2 will be described.

In transmission device 11, the material data acquired by sensor unit 31 are passed to editing device 32 and to encoding unit 33.

In editing device 32, additional information, which includes one of or both of the effect information and the real-world information, is generated on the basis of the material data transmitted from sensor unit 31, and the like, and the additional information is supplied to encoding unit 33.

When the material data are supplied from sensor unit 31 to encoding unit 33 and the additional information is supplied from editing device 32 to encoding unit 33, as described above, at step S11 encoding unit 33 encodes the material data transferred from sensor unit 31 and the additional information provided from editing device 32. Encoding unit 33 transmits the coded data obtained by the encoding to transmission unit 34, and the processing goes to step S12.

At step S12, transmission unit 34 transmits the coded data submitted from encoding unit 33 through transmission medium 13 (figure 1).
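Steps S11 and S12 can be sketched as a toy model; JSON stands in for the actual coding scheme, a Python list stands in for transmission medium 13, and all names are hypothetical rather than taken from the patent:

```python
import json

def encode(material_data: dict, additional_info: dict) -> bytes:
    """Step S11 (sketch): encode and multiplex the material data and the
    additional information into one coded byte stream. A real encoder would
    compress; JSON serialization stands in for the coded data here."""
    return json.dumps({"material": material_data,
                       "additional": additional_info}).encode()

def transmit(coded: bytes, medium: list) -> None:
    """Step S12 (sketch): send the coded data over transmission medium 13,
    modelled here as a simple list."""
    medium.append(coded)

medium = []
transmit(encode({"image": "frame-0001"},
                {"effect": {"switching": []}, "real_world": {"D": 5.77}}),
         medium)
```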

Then, figure 9 illustrates an example configuration of the device 12 receiving, shown in figure 1.

Figure 9 receiver unit 12 includes a block 51, block 52, the decoding unit 53 division, block 54 generating output unit 55 and the input unit 56.

It should be noted that the output unit 55 and the input unit 56 may be provided as a device different from the device 12 receiving.

The receiving block 51 receives the coded data, that is, the material data and the additional information, transmitted from the transmission device 11 through the transmission medium 13, and supplies these coded data to the decoding block 52.

The decoding block 52 decodes the coded data supplied from the receiving block 51 to obtain data, and supplies the data to the separation block 53.

The separation block 53 separates the data supplied from the decoding block 52 into the material data and the additional information, and supplies these material data and additional information to the generation block 54.

The generation block 54 generates output data on the basis of the material data and the additional information supplied from the separation block 53, and supplies this output data to the output unit 55.

The output unit 55 produces a given output on the basis of the output data supplied from the generation block 54.

Here, as the output unit 55, a device that stimulates the five senses of a person can be used.

In other words, a device that stimulates the sense of sight, such as a display device that displays an image or a lighting device that emits light, can be used as the output unit 55. In addition, a device that stimulates the sense of hearing, such as a speaker that emits sound, can be used as the output unit 55.

Furthermore, a device that stimulates the sense of touch, such as an air conditioner, which can perform so-called air conditioning (regulation of the temperature, humidity and flow of the air), or a shaking device that generates vibrations, can be used as the output unit 55.

The user operates the input unit 56 when using the reception device 12. When the user operates the input unit 56 to provide an instruction, the input unit 56 receives the instruction provided by the user and supplies instruction information representing this instruction to the generation block 54.

Here, when the instruction information is supplied from the input unit 56 to the generation block 54, the generation block 54 generates the output data on the basis of the instruction information.

Next, with reference to figure 10, the processing performed by the reception device 12 shown in Fig.9 will be described.

The receiving block 51 waits for the coded data to be transmitted from the transmission device 11 through the transmission medium 13. At step S31, the receiving block 51 receives the coded data and supplies these coded data to the decoding block 52. The processing proceeds to step S32.

At step S32, the decoding block 52 decodes the coded data supplied from the receiving block 51 to obtain data, and supplies these data to the separation block 53. The processing proceeds to step S33.

At step S33, the separation block 53 separates the data (decoded data) supplied from the decoding block 52 into the material data and the additional information, and supplies the material data and the additional information to the generation block 54. The processing proceeds to step S34.

At step S34, the generation block 54 generates output data on the basis of the material data and the additional information supplied from the separation block 53, and supplies this output data to the output unit 55. The processing proceeds to step S35.

At step S35, the output unit 55 produces a given output on the basis of the output data supplied from the generation block 54.
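As a rough sketch, the receiving-side flow of steps S31 to S35 can be expressed as follows; all function names are illustrative stand-ins (not from this description) for the decoding block 52, the separation block 53 and the generation block 54:

```python
def decode(coded):
    # decoding block 52 (stub): recover the data from the coded data
    return coded

def split(decoded):
    # separation block 53 (stub): separate the material data from
    # the additional information
    return decoded["material"], decoded["additional"]

def generate(material, additional):
    # generation block 54 (stub): build the output data from the
    # material data and the additional information
    return {"material": material, **additional}

def receive(coded_data):
    decoded = decode(coded_data)                  # step S32
    material, additional = split(decoded)         # step S33
    output_data = generate(material, additional)  # step S34
    return output_data                            # step S35: to output unit 55
```

The point of the sketch is the ordering of the blocks: decoding precedes separation, and generation consumes both outputs of the separation block.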

As described above, in the reception device 12 shown in Fig.9, the processing of generating the output data is performed in the generation block 54 on the basis of the material data and the additional information. This processing includes various types of processing, and these types of processing are classified into processing in the real mode and processing in the entertainment mode.

In other words, the real-world information and the effect information are included in the additional information. Among the processes for generating the output data, a process performed using the real-world information is a process in the real mode, and a process performed using the effect information is a process in the entertainment mode.

Next, the process in the real mode and the process in the entertainment mode will be described.

Figure 11 illustrates the data (information) supplied to the generation block 54 when the reception device 12 performs the process in the real mode.

In the process in the real mode, the material data and the real-world information included in the additional information are supplied from the separation block 53 to the generation block 54.

In addition, in the process in the real mode, when the user operates the input unit 56 to provide an instruction to change the scale of the image, instruction information corresponding to the operation is supplied from the input unit 56 to the generation block 54.

Furthermore, in the process in the real mode, characteristic information indicating the characteristics of the output unit 55 is supplied to the generation block 54, for example, from the output unit 55.

Then, in the real mode, the generation block 54 generates the output data on the basis of the material data, the real-world information and the characteristic information related to the output unit 55, so that the physical quantity recognized in the output of the output unit 55 will be identified with the physical quantity denoted by the real-world information.

Here, when instruction information is supplied from the input unit 56 to the generation block 54, the generation block 54 generates the output data also on the basis of the instruction information.

It should be noted that, for the process in the real mode, the reception device 12 can be configured without providing the input unit 56. When the reception device 12 is configured without the input unit 56, no instruction information is supplied to the generation block 54.

Next, the process in the real mode will be described for the case in which, for example, the output unit 55 is a display device and the output data represent an image displayed on the output unit 55 serving as the display device.

Here it is assumed that the imaged-range length information D, described with reference to Fig.7, is included in the real-world information. In addition, it is assumed that the size information F, indicating the size of the display screen on which the output unit 55, that is, the display device, displays the image, is included in the characteristic information.

It should be noted that here, for simplicity of description, it is assumed that the size information F indicates, of the horizontal and vertical lengths of the display screen of the output unit 55, which is the display device, for example, the horizontal length, as in the case of the imaged-range length information D.

Fig. illustrates an example of the configuration of the generation block 54 when the reception device 12 performs the process in the real mode.

In Fig., the generation block 54 includes a buffer (BF) 71, an additional-information analysis module 72 and an image processing module 73.

The image serving as the material data is supplied from the separation block 53 (Fig.11) to the buffer 71. The buffer 71 stores the image serving as the material data supplied from the separation block 53.

The real-world information included in the additional information is supplied from the separation block 53 to the additional-information analysis module 72. The additional-information analysis module 72 analyzes the real-world information supplied from the separation block 53, extracts, for example, the imaged-range length information D (Fig.7) included in the real-world information, and supplies the imaged-range length information D to the image processing module 73.

As described above, the imaged-range length information D is supplied from the additional-information analysis module 72 to the image processing module 73. In addition, the image serving as the material data stored in the buffer 71 is supplied to the image processing module 73. Furthermore, the size information F, which is characteristic information related to the display device, is supplied to the image processing module 73 from the output unit 55 (Fig.11), which is the display device.

In addition, when the user operates the input unit 56 (Fig.11) to provide an instruction to change the scale of the image, instruction information corresponding to the operation is supplied from the input unit 56 to the image processing module 73.

The image processing module 73 processes the image serving as the material data, on the basis of the image serving as the material data supplied from the buffer 71, the imaged-range length information D supplied from the additional-information analysis module 72, and the size information F, which is the characteristic information supplied from the output unit 55, thereby generating an image serving as the output data, so that the size of the subject (subject image) recognized in the image displayed on the output unit 55, which is the display device, will be identified with the size of the subject in the real world. The image processing module 73 supplies the output data to the output unit 55 (Fig.11), which is the display device.

In addition, the image processing module 73 processes the image serving as the material data, on the basis of the image serving as the material data supplied from the buffer 71, the imaged-range length information D supplied from the additional-information analysis module 72, and the size information F, which is the characteristic information supplied from the output unit 55, thereby generating the image serving as the output data, so that the size of the subject recognized in the image displayed on the output unit 55, which is the display device, does not exceed the size of the subject in the real world. The image processing module 73 supplies the output data to the output unit 55 (Fig.11), which is the display device.

In other words, when the instruction information is supplied from the input unit 56 to the image processing module 73, the image processing module 73 performs processing to magnify the image serving as the material data supplied from the buffer 71 by exactly the zoom factor stated in the instruction information supplied from the input unit 56, thereby generating the output data.

However, in this case the image processing module 73 limits the magnification of the image serving as the material data, so that the size of the subject recognized in the image displayed on the output unit 55, which is the display device, will not exceed the size of the subject in the real world.

It should be noted that, denoting by the zoom factor z the magnification applied when scaling the image serving as the material data, a zoom factor z less than one (and greater than zero) indicates a reduction.

In addition, in Fig. it is assumed that the size information F is supplied from the output unit 55, which is the display device, to the image processing module 73 of the generation block 54. However, alternatively, the size information F can be stored in advance, for example, in the generation block 54.

Next, the processing performed by the image processing module 73 will be described.

It should be noted that here, for simplicity of description, it is assumed that the display region (figure 5) designated by the display image frame information is equal to the entire region of the image serving as the material data.

Fig. schematically illustrates a state in which the image of a subject is shot with the camera, and a state in which the image serving as the material data obtained by shooting is displayed on the output unit 55, which is the display device.

In other words, part A of Fig. is a top view schematically illustrating the state in which the image is shot with the camera, and is the same drawing as Fig.7.

As described above, the imaged-range length information D can be determined in accordance with the equation D = 2X·tan(θ/2), using the angle of view θ of the camera and the distance X to the subject.
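As a minimal illustration of this equation, the imaged-range length D can be computed as follows (the function name is illustrative):

```python
import math

def imaged_range_length(theta_deg: float, x: float) -> float:
    """Length D of the real-world range covered by the camera, from
    its angle of view theta (in degrees) and the subject distance X:
    D = 2 * X * tan(theta / 2)."""
    return 2.0 * x * math.tan(math.radians(theta_deg) / 2.0)
```

For example, a 60-degree angle of view at a distance of 10 m covers a real-world range of about 11.5 m.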

Part B of Fig. illustrates a display state in which the image serving as the material data obtained by shooting with the camera, as shown in part A of Fig., is displayed on the output unit 55, which is the display device, without changing its size.

On the output unit 55, which is the display device, the real-world range whose length is indicated by the imaged-range length information D is displayed on a display screen whose size is indicated by the size information F.

Accordingly, the subject is displayed with a size that is F/D times the size of the subject in the real world.

Now, if the imaged-range length information D is equal to or larger than the size information F (F≤D), then, provided that the zoom factor z is equal to or less than D/F, the size of the subject displayed on the output unit 55, which is the display device, is equal to or smaller than the size of the subject in the real world.

Accordingly, when the imaged-range length information D is equal to or larger than the size information F, as long as the zoom factor z is equal to or less than D/F, the reality of the subject displayed on the output unit 55, which is the display device, is not lost by displaying the subject on the output unit 55 with a size larger than the size of the subject in the real world.
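The size relationship just described can be sketched numerically as follows (the function name is illustrative):

```python
def displayed_to_real_ratio(z: float, f: float, d: float) -> float:
    # The screen of width F shows a real-world range of length D, so an
    # unmagnified subject appears F/D times its real size; magnifying the
    # image by the zoom factor z scales this to z * F / D.
    return z * f / d

# Reality is preserved exactly when the ratio does not exceed one,
# that is, when z <= D / F.
```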

Therefore, in the image processing module 73 (Fig.), when the imaged-range length information D is equal to or larger than the size information F, the zoom factor z used when magnifying the image serving as the material data is limited so as to be equal to or less than D/F.

In contrast, when the imaged-range length information D is less than the size information F (F>D), if the image serving as the material data is displayed on the output unit 55, which is the display device, without any image processing, the size of the subject displayed on the output unit 55 exceeds the size of the subject in the real world.

Accordingly, when the subject is displayed on the output unit 55, which is the display device, with a size larger than the size of the subject in the real world, the reality of the subject displayed on the output unit 55 is lost, so that the sense of reality is lost.

As described above, a case in which the imaged-range length information D is less than the size information F occurs, for example, when the output unit 55 is a display device having a large screen.

In addition, for example, also in the case in which the image serving as the material data is shot using the optical zoom or the digital zoom, the imaged-range length information D may be smaller than the size information F.

Fig. schematically illustrates a state in which the image of a subject is shot with the camera using the optical zoom, and a state in which the image serving as the material data obtained by shooting is displayed on the output unit 55, which is the display device.

In other words, part A of Fig. is a top view schematically illustrating the state in which the image is shot with the camera using the optical zoom.

The angle of view θ of the camera is reduced (made smaller) by using the optical zoom. As a result, the imaged-range length information D becomes smaller than, for example, in the case shown in part A of Fig.

Part B of Fig. presents a display state in which the image serving as the material data obtained by shooting, as shown in part A of Fig., is displayed on the output unit 55, which is the display device.

On the output unit 55, which is the display device, the real-world range whose length is indicated by the imaged-range length information D is displayed on a display screen whose size is indicated by the size information F.

In Fig., the imaged-range length information D is less than the size information F. For this reason, the subject is displayed on the output unit 55, which is the display device, with a size, F/D times the size of the subject in the real world, that is larger than the size of the subject in the real world.

Fig. schematically illustrates a state in which the image of a subject is shot with the camera using the digital zoom, and a state in which the image serving as the material data obtained by shooting is displayed on the output unit 55, which is the display device.

In other words, part A of Fig. shows a top view schematically illustrating the state in which the image is shot with the camera using the digital zoom.

In the case of the digital zoom, an image obtained by trimming, in which, for example, the central portion of the image captured by the camera is cut out and enlarged through signal processing, is output as the image serving as the material data.
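A toy sketch of this trimming and enlarging, assuming a nearest-neighbour enlargement (the actual signal processing is not specified in this description):

```python
def digital_zoom(image, zoom):
    """Trim the central 1/zoom portion of `image` (a list of pixel rows)
    and enlarge it back to the original size by nearest-neighbour
    repetition -- a toy stand-in for the signal processing described."""
    h, w = len(image), len(image[0])
    ch, cw = max(1, int(h / zoom)), max(1, int(w / zoom))  # trimmed size
    top, left = (h - ch) // 2, (w - cw) // 2               # centre the crop
    crop = [row[left:left + cw] for row in image[top:top + ch]]
    # enlarge the trimmed region back to h x w
    return [[crop[y * ch // h][x * cw // w] for x in range(w)]
            for y in range(h)]
```

Because the output is an enlarged crop, the range of the real world it covers (and hence D) shrinks by the zoom factor, which is the point made in the following paragraph.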

As for the image serving as the material data obtained using the digital zoom, the angle of view θ of the camera is effectively reduced, as in the case of the optical zoom (Fig.). As a result, the imaged-range length information D becomes, for example, smaller than in the case shown in part A of Fig.

Part B of Fig. illustrates a display state in which the image serving as the material data obtained by shooting, as shown in part A of Fig., is displayed on the output unit 55, which is the display device.

On the output unit 55, which is the display device, the real-world range whose length is indicated by the imaged-range length information D is displayed on a display screen whose size is indicated by the size information F.

In Fig., the imaged-range length information D is less than the size information F. For this reason, the subject is displayed on the output unit 55, which is the display device, with a size, F/D times the size of the subject in the real world, that is larger than the size of the subject in the real world.

As described above, when the image serving as the material data is shot using the optical zoom or the digital zoom, the imaged-range length information D may be less than the size information F. In this case, the subject is displayed with a size larger than the size of the subject in the real world, so that the reality is lost.

Therefore, in the image processing module 73 (Fig.), when the imaged-range length information D is less than the size information F, the zoom factor z used when scaling the image is set to D/F, and the image serving as the material data is scaled by D/F (here the image is reduced, since D/F is less than one).

In this case, the subject, which would be displayed with a size F/D times the size of the subject in the real world if no magnification (reduction) were performed in the image processing module 73, is displayed with a size equal to the size of the subject in the real world.

Fig. illustrates images displayed on the output unit 55, which is the display device, when the imaged-range length information D is less than the size information F.

In other words, part A of Fig. illustrates a display state in which the image serving as the material data, shot using the optical zoom, is displayed on the output unit 55, which is the display device. Part B of Fig. illustrates a display state in which the image serving as the material data, shot using the digital zoom, is displayed on the output unit 55, which is the display device.

In the case in which the imaged-range length information D is less than the size information F, the image processing module 73 sets the zoom factor z used when scaling the image to D/F (<1), scales the image serving as the material data by D/F, and supplies the scaled image as the output data to the output unit 55.

As a result, on the output unit 55, which is the display device, the subject is displayed in a state in which its size is equal to the size of the subject in the real world.

Here, in the image processing module 73, the image serving as the material data is scaled (reduced) by D/F (<1), and, as a result, the image obtained by this scaling, which serves as the output data, is an image whose horizontal length is equal to the imaged-range length D.

And, in this case, since the imaged-range length information D is less than the size information F, the horizontal length F' (=D) of the image serving as the output data will be less than the size information F, that is, the horizontal length of the display screen of the output unit 55, which is the display device.

For this reason, the output unit 55, which is the display device, displays black (a so-called black frame) or the like in the portion of the display screen other than the portion in which the image serving as the output data is displayed, for example, as shown in part A of Fig.

It should be noted that, when the image serving as the material data is shot using the digital zoom, the portion of the image shot by the camera other than the central portion enlarged using the digital zoom (the portion shaded with diagonal lines in part A of Fig.; below also called the "trimmed region") may remain.

In this case, the output unit 55, which is the display device, displays the trimmed region in the portion of the display screen other than the portion in which the image serving as the output data is displayed, for example, as shown in part B of Fig.

Next, with reference to Fig., the processing in the real mode performed by the image processing module 73 shown in Fig. will be described.

At step S51, the image processing module 73 determines the maximum zoom factor D/F, which is the maximum value of the zoom factor z that may be applied to the image serving as the material data, on the basis of the imaged-range length information D included in the real-world information supplied from the additional-information analysis module 72 and the size information F, which is the characteristic information related to the output unit 55, which is the display device. In addition, the image processing module 73 establishes a reference zoom factor z1, which is the zoom factor z used in the case in which no instruction information is supplied from the input unit 56 (Fig.11). The processing proceeds to step S52.

Here, when the imaged-range length information D is equal to or larger than the size information F, the reference zoom factor z1 is set equal to one. When the imaged-range length information D is less than the size information F, the reference zoom factor z1 is set to the maximum zoom factor, equal to D/F.

At step S52, the image processing module 73 determines whether or not instruction information providing an instruction to change the scale of the image has been supplied from the input unit 56.

When it is determined at step S52 that no instruction information has been supplied from the input unit 56 to the image processing module 73, the processing proceeds to step S53. The image processing module 73 sets the zoom factor z of the displayed image to the reference zoom factor z1. The processing proceeds to step S57.

In addition, when it is determined at step S52 that instruction information has been supplied from the input unit 56 to the image processing module 73, the processing proceeds to step S54. The image processing module 73 determines whether or not the zoom factor defined in the instruction information supplied from the input unit 56 (below also called the zoom factor A in accordance with the instruction) is equal to or less than the maximum zoom factor D/F.

When it is determined at step S54 that the zoom factor A in accordance with the instruction is equal to or less than the maximum zoom factor D/F, the processing proceeds to step S55. The image processing module 73 sets the zoom factor z of the displayed image to the zoom factor A in accordance with the instruction. The processing proceeds to step S57.

In addition, when it is determined at step S54 that the zoom factor A in accordance with the instruction is not equal to or less than the maximum zoom factor D/F, that is, when the instruction indicates a zoom factor that exceeds the maximum zoom factor D/F and at which the subject would be displayed with a size larger than the size of the subject in the real world, the processing proceeds to step S56. The image processing module 73 sets the zoom factor z to the maximum zoom factor, equal to D/F. The processing proceeds to step S57.

Then, the output unit 55, which is the display device, displays the scaled image serving as the output data supplied from the image processing module 73. Thus, as described with reference to Fig., the subject is displayed with a size that is z×F/D times the size of the subject in the real world.
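The selection of the zoom factor z in steps S51 to S56 can be sketched as follows (the function name and signature are illustrative):

```python
def select_zoom_factor(d, f, requested=None):
    """Choose the zoom factor z (steps S51-S56). `d` is the imaged-range
    length D, `f` the screen size F, and `requested` the zoom factor A
    from the user's instruction, or None when no instruction was given."""
    z_max = d / f                      # step S51: maximum zoom factor D/F
    z_ref = 1.0 if d >= f else z_max   # reference zoom factor z1
    if requested is None:
        return z_ref                   # step S53: no instruction
    if requested <= z_max:
        return requested               # step S55: instruction within limit
    return z_max                       # step S56: clamp to D/F

# Scaling the material image by z then gives a displayed subject size of
# z * F/D times its real-world size, which never exceeds the real size
# because z <= D/F in every branch.
```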

Accordingly, when no instruction information has been supplied from the input unit 56 to the image processing module 73, that is, when the user has not performed an operation to provide an instruction to change the scale of the image, the subject is displayed on the output unit 55, which is the display device, either with a size ((z=D/F)×F/D) that is equal to the size of the subject in the real world, or with a size ((z=1)×F/D), that is, F/D times the size of the subject in the real world.

In other words, when the imaged-range length information D is smaller than the size information F, as described above, the reference zoom factor z1, and therefore the zoom factor z of the displayed image, is set to the maximum zoom factor D/F. Thus, the subject is displayed with a size that is D/F×F/D times the size of the subject in the real world, that is, with a size equal to the size of the subject in the real world.

In addition, when the imaged-range length information D is equal to or larger than the size information F, as described above, the reference zoom factor z1, and therefore the zoom factor z of the displayed image, is set equal to one. Thus, the subject is displayed with a size that is F/D (≤1) times the size of the subject in the real world.

In contrast, when instruction information has been supplied from the input unit 56 to the image processing module 73, that is, when the user has performed an operation to provide an instruction to scale the image by the zoom factor A, then, when the zoom factor A in accordance with the instruction is equal to or less than the maximum zoom factor D/F, the subject is displayed with a size ((z=A)×F/D), that is, A×F/D times the size of the subject in the real world. When the zoom factor A in accordance with the instruction is not equal to or less than the maximum zoom factor D/F, the subject is displayed with a size ((z=D/F)×F/D) that is equal to the size of the subject in the real world.

In other words, when the zoom factor A in accordance with the instruction is equal to or less than the maximum zoom factor D/F, as described above, the zoom factor z of the displayed image is set to the zoom factor A in accordance with the instruction. Thus, the subject is displayed with a size that is A (≤D/F)×F/D times the size of the subject in the real world.

In addition, when the zoom factor A in accordance with the instruction is not equal to or less than the maximum zoom factor D/F, as described above, the zoom factor z of the displayed image is set to the maximum zoom factor D/F. Thus, the subject is displayed with a size that is D/F×F/D times the size of the subject in the real world, that is, with a size equal to the size of the subject in the real world.

In any case, the subject is displayed with a size that is equal to or smaller than the size of the subject in the real world.

It should be noted that the process in the real mode, shown in Fig., is performed, for example, for each frame (or field).

In addition, the process in the real mode, shown in Fig., can be performed only for the case in which the subject is displayed with a size that exceeds the size of the subject in the real world.

In other words, the process in the real mode can be performed, for example, for the case in which a user who has noticed a subject displayed with a size exceeding the size of the subject in the real world operates the input unit 56 to provide an instruction to correct the display.

Fig. shows the data (information) supplied to the generation block 54 when the reception device 12 performs the process in the entertainment mode.

In the process in the entertainment mode, the material data and the effect information included in the additional information are supplied from the separation block 53 to the generation block 54.

In addition, in the process in the entertainment mode, when the user operates the input unit 56 to provide an instruction for processing the material data, instruction information corresponding to the operation is supplied from the input unit 56 to the generation block 54.

Then, in the process in the entertainment mode, the generation block 54 subjects the material data supplied from the separation block 53 to processing based on the effect information supplied from the separation block 53 or on the instruction information supplied from the input unit 56, thereby generating the output data.

In other words, when no instruction information is supplied from the input unit 56 to the generation block 54, the generation block 54 subjects the material data supplied from the separation block 53 to processing based on the effect information supplied from the separation block 53, thereby generating the output data.

In addition, when the instruction information is supplied from the input unit 56 to the generation block 54, the generation block 54 subjects the material data supplied from the separation block 53 to processing based on the instruction information, thereby generating the output data.

Further, the process in the entertainment mode will be described, for example, on the assumption that the output unit 55 is a display device and the output data represent an image intended for display on the output unit 55, which is the display device.

On Fig illustrates an example of the configuration of the block 54 generation when the device 12 performs reception process in the mode of amusement.

Here it is assumed that the information of the switch, the information display frame image and the data used for the combination, which is described with reference to Fig.3-6, included in the information effect. It is assumed that the N streams of images S No. 1 to S # N exist as part of the data material.

On Fig block 54 generating includes BF (buffer) 81 of the image data, the module 82 analysis of additional information, the buffer 83 data intended for combination and module 84 image processing.

N streams of images S No. 1 to S # N, which are part of the data material is passed from block 53 division (Fig) in the buffer 81 of the image data. The buffer 81, the image data stores the N streams of images S No. 1 to S # N, which are pieces of material data transmitted from the block 53 division.

The information effect is included in the additional information passed from block 3 of the separation module 82 analysis of additional information.

Module 82 analysis of additional information, analyzes the information effect, which was transferred from block 53 division, to highlight, for example, data switch, the data frame of the displayed image and the data used for combining, which were included in the information effect.

In addition, the additional-information analysis module 82 extracts the combining data to be combined with the pieces of material data, and timing information indicating the times at which the combining of the combining data is to be performed.

Then the additional-information analysis module 82 transfers the combining data to the combining-data buffer 83. In addition, the additional-information analysis module 82 transfers the switching information, the display frame information, and the timing information to the image processing module 84.

The switching information, the display frame information, and the timing information that the additional-information analysis module 82 transfers to the image processing module 84, that is, the switching information, the display frame information, and the timing information included in the effect information, are collectively referred to below as the default information.
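The split performed by the additional-information analysis module 82 can be pictured with a short sketch. The field names (`switching`, `display_frame`, `timing`, `combining_data`) are illustrative assumptions, not identifiers from the patent:

```python
# Hypothetical sketch of the split performed by analysis module 82:
# the combining data goes to the combining-data buffer 83, while the
# remaining fields form the "default information" handed to module 84.

def analyze_effect_info(effect_info):
    """Split effect information into combining data and default information."""
    combining_data = effect_info.get("combining_data", [])
    default_info = {
        "switching": effect_info.get("switching"),          # which stream S#i to show
        "display_frame": effect_info.get("display_frame"),  # display area rectangle
        "timing": effect_info.get("timing"),                # when/what to overlay
    }
    return combining_data, default_info

combining, default = analyze_effect_info({
    "switching": 2,
    "display_frame": (100, 50, 640, 360),
    "timing": [{"t": 5.0, "piece": 0}],
    "combining_data": ["subtitle: hello"],
})
print(default["switching"])  # -> 2
```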

The combining-data buffer 83 stores the combining data supplied from the additional-information analysis module 82.

The default information is transferred from the additional-information analysis module 82 to the image processing module 84, as described above. In addition, the N streams of images S#1 to S#N, which are the pieces of material data stored in the image data buffer 81, are supplied to the image processing module 84. Furthermore, the combining data stored in the combining-data buffer 83 is supplied to the image processing module 84.

In addition, when the user performs an operation on the input unit 56 (Fig.) to give an instruction for processing the pieces of material data or the like, instruction information corresponding to the operation is supplied from the input unit 56 to the image processing module 84.

Here, by performing an operation on the input unit 56, the user can give an instruction (below also referred to as a PTZ (pan, tilt, and zoom) instruction) according to which the camera that captured the image displayed on the output unit 55 (Fig.), which is the display device, is made to behave as if it were performing a panning operation, a tilting operation, or a zooming operation.

In addition, by performing an operation on the input unit 56, the user can give an instruction (below also referred to as a switching instruction) to switch the image displayed on the output unit 55, which is the display device, from one stream of images S#i to another stream of images S#i' among the N streams of images S#1 to S#N, which are the material data.

In addition, by performing an operation on the input unit 56, the user can give an instruction (below also referred to as a combining instruction) to combine (overlay) combining data with (on) the image displayed on the output unit 55, which is the display device.

When the user gives a PTZ instruction by performing an operation on the input unit 56, the display area (figure 5) is moved, or the size of the display area is changed, in accordance with the PTZ instruction. The input unit 56 generates display frame information setting the display area accordingly, and transfers instruction information including the display frame information to the image processing module 84.

In addition, when the user gives a switching instruction by performing an operation on the input unit 56, the input unit 56 generates switching information (figure 4) in accordance with the switching instruction, and supplies instruction information including the switching information to the image processing module 84.

In addition, when the user gives a combining instruction by performing an operation on the input unit 56, the input unit 56 generates selection information for selecting the combining data to be combined with the pieces of material data, in accordance with the combining instruction, and transfers instruction information including the selection information to the image processing module 84.

The image processing module 84 processes the pieces of material data stored in the image data buffer 81, using the combining data stored in the combining-data buffer 83, on the basis of the default information transferred from the additional-information analysis module 82 or of the instruction information transferred from the input unit 56, as necessary, thereby generating an image that serves as the output data. The image processing module 84 transfers the image to the output unit 55 (Fig.), which is the display device.

In other words, when no instruction information is supplied from the input unit 56 to the image processing module 84, the image processing module 84 processes the pieces of material data stored in the image data buffer 81 on the basis of the default information transferred from the additional-information analysis module 82, thereby generating an image that serves as the output data.

In particular, for example, the image processing module 84 selects, as the image to be used for the output data (below also referred to as the output image), one stream of images among the N streams of images S#1 to S#N, which are the pieces of material data stored in the image data buffer 81, on the basis of the switching information included in the default information.

In addition, for example, the image processing module 84 extracts the image corresponding to the display area indicated by the display frame information from the output image, on the basis of the display frame information included in the default information.

In addition, for example, the image processing module 84 combines the combining data stored in the combining-data buffer 83 with the image corresponding to the display area, on the basis of the timing information included in the default information. It should be noted here that, when multiple pieces of combining data are stored in the combining-data buffer 83, it is assumed that information indicating which piece of combining data among the multiple pieces is to be combined with the image corresponding to the display area is included in the timing information included in the default information.
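The three default processing steps above can be sketched as follows. This is a minimal illustration, not the patent's implementation: images are modeled as nested lists of pixel values, and all names are assumptions:

```python
# Sketch of module 84's default processing: (1) pick one stream via the
# switching information, (2) crop the display area via the display frame
# information, (3) collect the combining pieces that the timing
# information makes applicable at time t.

def process_default(streams, default_info, combining_buffer, t):
    # (1) switching information selects the output image among S#1..S#N
    image = streams[default_info["switching"]]
    # (2) display frame information gives the display area (x, y, w, h)
    x, y, w, h = default_info["display_frame"]
    area = [row[x:x + w] for row in image[y:y + h]]
    # (3) timing information says which combining pieces apply at time t
    overlays = [combining_buffer[e["piece"]]
                for e in default_info["timing"] if e["t"] <= t]
    return area, overlays

streams = [[[s * 10 + c for c in range(4)] for _ in range(4)] for s in range(3)]
area, overlays = process_default(
    streams,
    {"switching": 1, "display_frame": (1, 1, 2, 2),
     "timing": [{"t": 0.0, "piece": 0}]},
    ["subtitle"], t=1.0)
print(area)      # -> [[11, 12], [11, 12]]
print(overlays)  # -> ['subtitle']
```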

As described above, when no instruction information is supplied from the input unit 56 to the image processing module 84, the material data is processed in the image processing module 84 on the basis of the default information transferred from the additional-information analysis module 82, that is, the switching information, the display frame information, and the timing information included in the effect information generated by the editing device 32 (figure 2) of the transmission device 11. Thus, an image reflecting the editing performed on the basis of the operations that the program producer carried out on the editing device 32 is displayed on the output unit 55, which is the display device.

In contrast, when instruction information is supplied from the input unit 56 to the image processing module 84, the image processing module 84 subjects the pieces of material data stored in the image data buffer 81 to processing based on the instruction information, thereby generating an image that serves as the output data.

In particular, for example, when switching information is included in the instruction information, the image processing module 84 selects, as the output image, one stream of images among the N streams of images S#1 to S#N, which are the pieces of material data stored in the image data buffer 81, on the basis of the switching information included in the instruction information instead of the switching information included in the default information.

In addition, for example, when display frame information is included in the instruction information, the image processing module 84 extracts the image corresponding to the display area indicated by the display frame information from the output image, on the basis of the display frame information included in the instruction information instead of the display frame information included in the default information.

In addition, for example, when selection information is included in the instruction information, the image processing module 84 selects the piece of combining data to be combined with the image corresponding to the display area, among the pieces of combining data stored in the combining-data buffer 83, in accordance with the selection information included in the instruction information instead of the timing information included in the default information. The image processing module 84 combines the selected piece of combining data with the image corresponding to the display area.

It should be noted that, when the combining data is selected using the timing information or the selection information, multiple pieces of combining data may be selected (for example, a set of subtitles, or one subtitle and one PinP image).

In addition, the selection information may indicate that no combining data is to be selected. When the selection information indicates that no combining data is to be selected, the combining of combining data is not performed in the image processing module 84, even when the timing information included in the default information indicates that certain combining data is to be combined with the image corresponding to the display area.

As described above, when instruction information is supplied from the input unit 56 to the image processing module 84, the image processing module 84 subjects the pieces of material data stored in the image data buffer 81 to processing based on the instruction information instead of the default information, thereby generating an image that serves as the output data.

It should be noted that, when the supply of instruction information stops after instruction information has been supplied from the input unit 56 to the image processing module 84, the image processing module 84 subjects the pieces of material data stored in the image data buffer 81 to processing based on the default information, thereby generating the image that serves as the output data.

Accordingly, for example, in a case where the timing information indicates that subtitles serving as certain combining data are to be combined with the image corresponding to the display area, when the user performs an operation on the input unit 56 to turn off (not perform) the combining of combining data, the subtitles are not displayed, in accordance with the operation performed on the input unit 56.

Then, when the user performs an operation on the input unit 56 to cancel the muting of the combining of combining data, the subtitles are combined with the image corresponding to the display area in accordance with the timing information included in the default information, and are displayed.

It should be noted that, in the generation block 54 shown in Fig., real world information can be transferred from the separation block 53 to the image processing module 84. In addition, characteristic information can be supplied from the output unit 55 to the image processing module 84. The processing in the real mode can then be performed in the image processing module 84, as in the image processing module 73 shown in Fig.

In other words, when no instruction information is supplied from the input unit 56 to the image processing module 84, the image processing module 84 can perform the processing in steps S51, S53, and S57 of the real mode processing shown in Fig. In this case, the image processing module 84 generates the image that serves as the output data so that the size (physical quantity) of the subject recognizable in the image displayed (output) on the output unit 55, which is the display device, is identified as the size (the size of the subject in the real world) indicated by the imaging length information D included in the real world information.
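As one way to picture the real mode constraint described above, the sketch below computes a display scale from an assumed real-world subject width (taken from the real world information) and an assumed physical screen width (taken from the characteristic information), capping magnification at life size. Both the names and the formula are assumptions for illustration, not taken from the patent:

```python
# Illustrative real-mode scale: never let the subject appear larger on
# the display than it is in the real world.

def real_mode_scale(subject_real_width_m, subject_pixels,
                    screen_width_m, screen_width_pixels):
    # pixels per metre offered by the display (characteristic information)
    display_ppm = screen_width_pixels / screen_width_m
    # scale that would render the subject exactly at real-world size
    life_size_scale = subject_real_width_m * display_ppm / subject_pixels
    # never magnify beyond life size (prevents e.g. a larger-than-life face)
    return min(1.0, life_size_scale)

# a 0.2 m wide face occupying 400 px, shown on a 1.0 m wide, 1920 px screen:
print(real_mode_scale(0.2, 400, 1.0, 1920))  # -> 0.96
```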

Next, Fig. illustrates an example configuration of the image processing module 84 shown in Fig.

In Fig., the image processing module 84 includes an image selection part 91, an instruction selection part 92, a display area setting part 93, a logic gate part 94, and an overlay processing part 95.

The N streams of images S#1 to S#N, which are the pieces of material data stored in the image data buffer 81, are supplied to the image selection part 91.

In addition, the default information is supplied from the additional-information analysis module 82 (Fig.) to the instruction selection part 92. Furthermore, instruction information is supplied from the input unit 56 (Fig.) to the instruction selection part 92.

The instruction selection part 92 supplies the switching information included in the default information, which was supplied from the additional-information analysis module 82, to the image selection part 91. In addition, the instruction selection part 92 transfers the display frame information included in the default information to the display area setting part 93.

In addition, the instruction selection part 92 generates, on the basis of the timing information included in the default information, selection information for selecting the combining data to be used for combining at the times at which that combining data is to be combined. The instruction selection part 92 supplies the selection information to the logic gate part 94.

However, when instruction information is supplied from the input unit 56 (Fig.) to the instruction selection part 92, the instruction selection part 92 preferentially selects the instruction information.

In other words, when instruction information is supplied from the input unit 56 to the instruction selection part 92 and switching information is included in the instruction information, the instruction selection part 92 supplies the switching information included in the instruction information, instead of the switching information included in the default information, to the image selection part 91.

In addition, when instruction information is supplied from the input unit 56 to the instruction selection part 92 and display frame information is included in the instruction information, the instruction selection part 92 supplies the display frame information included in the instruction information, instead of the display frame information included in the default information, to the display area setting part 93.

In addition, when instruction information is supplied from the input unit 56 to the instruction selection part 92 and selection information is included in the instruction information, the instruction selection part 92 supplies the selection information included in the instruction information, instead of the selection information generated from the timing information included in the default information, to the logic gate part 94.
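The preference rule applied by the instruction selection part 92 can be sketched as a per-field fallback: each field of the instruction information, when present, overrides the corresponding field of the default information, and the selection information is otherwise derived from the default timing information. All field names are assumptions:

```python
# Sketch of instruction selection part 92: instruction information wins
# field by field; absent fields fall back to the default information.

def select_instruction(default_info, instruction_info, t):
    switching = instruction_info.get("switching", default_info["switching"])
    frame = instruction_info.get("display_frame", default_info["display_frame"])
    if "selection" in instruction_info:
        selection = instruction_info["selection"]      # user's explicit choice
    else:                                              # derive from timing info
        selection = [e["piece"] for e in default_info["timing"] if e["t"] <= t]
    return switching, frame, selection

default = {"switching": 0, "display_frame": (0, 0, 8, 8),
           "timing": [{"t": 2.0, "piece": 1}]}
print(select_instruction(default, {"switching": 3}, t=5.0))
# -> (3, (0, 0, 8, 8), [1])
print(select_instruction(default, {"selection": []}, t=5.0))
# -> (0, (0, 0, 8, 8), [])   # user muted all combining
```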

The image selection part 91 selects, as the output image, one stream of images among the N streams of images S#1 to S#N, which are the pieces of material data transferred from the image data buffer 81, on the basis of the switching information transferred from the instruction selection part 92. The image selection part 91 supplies the output image to the display area setting part 93.

The display area setting part 93 extracts the image corresponding to the display area indicated by the display frame information, which was transferred from the instruction selection part 92, from the output image supplied from the image selection part 91. The display area setting part 93 supplies the image corresponding to the display area to the overlay processing part 95.

Meanwhile, the selection information is supplied from the instruction selection part 92 to the logic gate part 94, as described above. In addition, the pieces of combining data stored in the combining-data buffer 83 are supplied to the logic gate part 94.

The logic gate part 94 selects zero or more pieces of combining data among the pieces of combining data stored in the combining-data buffer 83, in accordance with the selection information supplied from the instruction selection part 92. The logic gate part 94 supplies the selected pieces of combining data to the overlay processing part 95.

The overlay processing part 95 combines (overlays) the zero or more pieces of combining data transferred from the logic gate part 94 with (on) the image corresponding to the display area, which was transferred from the display area setting part 93. The overlay processing part 95 supplies the combined image as the output data to the output unit 55 (Fig.).

It should be noted that, when zero pieces of combining data are supplied, that is, when no combining data is supplied from the logic gate part 94 to the overlay processing part 95, the overlay processing part 95 transfers the image corresponding to the display area, which was transferred from the display area setting part 93, to the output unit 55 as the output data, without performing any processing on that image.
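The behaviour of the logic gate part 94 and the overlay processing part 95, including the pass-through when zero pieces are selected, can be sketched as follows. Strings stand in for pixel data here; the real parts operate on images:

```python
# Sketch of logic gate part 94 (select zero or more combining pieces)
# and overlay processing part 95 (pass the image through untouched when
# nothing is selected).

def logic_gate(combining_buffer, selection):
    return [combining_buffer[i] for i in selection]   # zero or more pieces

def overlay(area_image, pieces):
    if not pieces:                   # nothing selected: pass through as-is
        return area_image
    return area_image + " + " + " + ".join(pieces)

buffer_83 = ["subtitle", "PinP"]
print(overlay("frame", logic_gate(buffer_83, [0, 1])))  # -> frame + subtitle + PinP
print(overlay("frame", logic_gate(buffer_83, [])))      # -> frame
```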

Next, Fig. illustrates an example of another configuration of the reception device 12 shown in figure 1.

It should be noted that, in the drawing, elements corresponding to those shown in Fig.9 are denoted by the same reference numerals, and their descriptions are accordingly omitted below.

As shown in figure 2, various types of sensors, such as a camera that detects light and outputs an image, a temperature sensor that detects temperature, a humidity sensor that detects humidity, an air flow speed sensor that detects the speed of air flow, a vibration sensor that detects vibration, and a brightness sensor that detects brightness, can be used as the sensor block 31 of the transmission device 11.

When, for example, a camera, a temperature sensor, a humidity sensor, and a vibration sensor are used as the sensor block 31, the transmission device 11 transmits each of the pieces of data, namely image data obtained by the camera, temperature data obtained by the temperature sensor, humidity data obtained by the humidity sensor, and vibration data obtained by the vibration sensor, together with the corresponding piece of additional information.

In this case, the separation block 53 of the reception device 12 separates the data supplied from the decoding block 52 individually into the image data, the temperature data, the humidity data, and the vibration data, which are the pieces of material data, and the pieces of additional information corresponding to the individual pieces of material data, as shown in Fig.

Then the separation block 53 transfers the image data, which is a piece of the material data, and the corresponding piece of additional information to the generation block 54.

In addition, the separation block 53 transfers the temperature data and the humidity data, which are pieces of the material data, and the corresponding additional information to the generation block 101. Furthermore, the separation block 53 transfers the vibration data, which is a piece of the material data, and the corresponding piece of additional information to the generation block 111.

As described above, the generation block 54 processes the image data, which is a piece of the material data transferred from the separation block 53, on the basis of the additional information or of the instruction information transferred from the input unit 56, thereby generating an image that constitutes a piece of the output data. The generation block 54 supplies the piece of output data to the output unit 55.

The output unit 55 is, for example, a display device, as described above, and performs (forms) the display (output) of an image on the basis of the output data supplied from the generation block 54.

Meanwhile, the temperature data and the humidity data, which are pieces of the material data, and the corresponding pieces of additional information are supplied from the separation block 53 to the generation block 101. In addition, instruction information is supplied to the generation block 101 from the input unit 103, which is operated by the user, as in the case of the input unit 56.

The generation block 101 processes the temperature data and the humidity data, which are pieces of the material data transferred from the separation block 53, on the basis of the additional information or of the instruction information transferred from the input unit 103, thereby generating a piece of output data. The generation block 101 transfers the piece of output data to the output unit 102.

In other words, the output unit 102 is, for example, an air-conditioning unit, such as an air conditioner. The generation block 101 processes the temperature data and the humidity data, thereby generating air-conditioning control information for controlling the output unit 102, that is, the air-conditioning unit. The generation block 101 outputs the air-conditioning control information as the piece of output data to the output unit 102.

The output unit 102 outputs an air flow (air) whose direction, temperature, speed, humidity, and the like are controlled on the basis of the air-conditioning control information, which is the piece of output data supplied from the generation block 101.
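As an illustration of how the generation block 101 might map temperature and humidity material data to air-conditioning control information, with an optional user instruction adjusting the result, consider the sketch below. All field names and the control rule are assumptions, not taken from the patent:

```python
# Hypothetical sketch of generation block 101: sensed scene conditions
# become control targets; an instruction from input unit 103 may offset
# them, e.g. to soften the reproduced conditions.

def make_aircon_control(temperature_c, humidity_pct, instruction=None):
    control = {
        "target_temperature": temperature_c,  # reproduce the sensed scene
        "target_humidity": humidity_pct,
        "fan_speed": "high" if temperature_c >= 30 else "low",
    }
    if instruction and "temperature_offset" in instruction:
        control["target_temperature"] += instruction["temperature_offset"]
    return control

print(make_aircon_control(32.0, 65.0, {"temperature_offset": -4.0}))
# -> {'target_temperature': 28.0, 'target_humidity': 65.0, 'fan_speed': 'high'}
```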

In addition, the vibration data, which is a piece of the material data, and the corresponding piece of additional information are supplied from the separation block 53 to the generation block 111. Furthermore, instruction information is supplied to the generation block 111 from the input unit 113, which the user operates, as in the case of the input unit 56.

The generation block 111 processes the vibration data, which is a piece of the material data supplied from the separation block 53, on the basis of the additional information or of the instruction information transferred from the input unit 113, thereby generating a piece of output data. The generation block 111 supplies the piece of output data to the output unit 112.

In other words, the output unit 112 is, for example, a chair or a shaking device that shakes (vibrates) the output unit 55, which is the display device. The generation block 111 processes the vibration data, thereby generating vibration information for controlling the output unit 112, which is the shaking device. The generation block 111 outputs the vibration information as the piece of output data to the output unit 112.

The output unit 112 shakes the chair or the like on the basis of the vibration information, which is the piece of output data supplied from the generation block 111.

In addition, for example, the transmission device 11 can transmit, as a piece of the material data, brightness data obtained by the brightness sensor, which detects brightness, together with the corresponding piece of additional information. In this case, the reception device 12 can adjust, on the basis of output data generated from the piece of material data and the additional information, for example, the illumination provided by a lighting device installed in the room where the reception device 12 is located.

In addition, for example, the transmission device 11 can transmit, as a piece of the material data, air flow speed data obtained by the air flow speed sensor, which detects the speed of air flow, together with the corresponding piece of additional information. In this case, the reception device 12 can adjust, on the basis of output data generated from the piece of material data and the additional information, the speed of the air flow output from the output unit 102, which is the air-conditioning device.

As described above, the transmission device 11 transmits the pieces of material data related to image, temperature, humidity, air flow, vibration, brightness, and the like, obtained by the sensor block 31, together with the corresponding pieces of additional information. The reception device 12 generates the output data on the basis of the pieces of material data and the pieces of additional information. The display device, the air-conditioning unit, the shaking device, the lighting device, and the like, which form the output, receive this output data, with the result that the landscape, temperature, air speed, humidity, vibration, brightness, and the like of the environment in which the pieces of material data were obtained by the sensor block 31 are realistically reproduced, so that the user can experience a sense of reality.

Next, the sequence of processing described above can be performed using hardware, and can likewise be performed using software. When the processing sequence is performed using software, a program configuring the software is installed in a general-purpose computer or the like.

Thus, Fig. illustrates an example configuration of an embodiment of a computer in which the program that performs the above-described processing sequence is installed.

The program can be recorded beforehand on a hard disk 205 or in a ROM 203 used as a recording medium built into the computer.

Alternatively, the program can be temporarily or permanently stored (recorded) on a removable recording medium 211, such as a flexible disk, a CD-ROM (compact disc read-only memory), an MO (magneto-optical) disk, a DVD (digital versatile disc), a magnetic disk, or a semiconductor memory device. A removable recording medium 211 of this type can be provided as so-called packaged software.

It should be noted that the program is installed in the computer from the removable recording medium 211, as described above. Otherwise, the program can be transferred to the computer wirelessly from a download site via an artificial satellite for digital satellite broadcasting, or can be transferred to the computer by wire via a network, such as a LAN (local area network) or the Internet. In the computer, the program thus transferred can be received by a communication module 208 and installed on the hard disk 205 built into the computer.

The computer is equipped with a CPU (central processing unit) 202. The CPU 202 is connected to an input/output interface 210 via a bus 201. When the user inputs an instruction to the CPU 202 via the input/output interface 210 by performing an operation on an input module 207, which is constituted by a keyboard, a mouse, a microphone, or the like, the CPU 202 executes the program stored in the ROM (read-only memory) 203 in accordance with the instruction. Alternatively, the CPU 202 loads into a RAM (random access memory) 204 the program stored on the hard disk 205, the program transferred via a satellite or a network, received by the communication module 208, and installed on the hard disk 205, or the program read from the removable recording medium 211 mounted in a drive 209 and installed on the hard disk 205. The CPU 202 then executes the program. The CPU 202 thereby performs the processing in accordance with the flowcharts described above, or the processing performed by the configurations shown in the block diagrams described above. Furthermore, as necessary, for example via the input/output interface 210, the CPU 202 outputs the processing result from an output module 206, which is constituted by an LCD (liquid crystal display), a speaker, or the like, transmits it from the communication module 208, or records it on the hard disk 205.

Here, in the present description, the processing steps describing the program for causing the computer to perform various kinds of processing do not necessarily have to be processed chronologically in the order described in the flowcharts, and include processing executed in parallel or individually (for example, parallel processing or object-based processing).

In addition, the program may be processed by one computer, or may be subjected to distributed processing by multiple computers. Furthermore, the program may be transferred to a remote computer and executed there.

As described above, in the broadcasting system shown in figure 1, the transmission device 11 transfers the material data obtained by the sensor block 31 and the corresponding additional information. Since the reception device 12 generates output data on the basis of the material data and the additional information and produces output based on the output data, appropriate processing can be performed on the side of the reception device 12, which makes it possible, for example, to provide the user with a sense of reality or to display an image suited to the user's preferences.

In other words, the reception device 12 can perform the processing in the real mode as the appropriate processing. With the processing in the real mode, for example, display of a subject with a size larger than the size of the subject in the real world is prevented, with the result that a loss of the sense of reality is avoided.

In particular, for example, a mismatch, so to speak, such as displaying a person's face with a size larger than the actual size of the face, is eliminated, so that the user is provided with a sense of reality.

It should be noted that, when a display device having a large display screen (for example, 100 inches or more) is used as the output unit 55 in the processing in the real mode, the image can be displayed so that it looks, so to speak, like borrowed scenery.

In addition, the processing in the real mode is particularly useful, for example, in a case where an image of a landscape or a musical piece played by an orchestra is provided as the material data.

In other words, for example, image data of an actual landscape is obtained using a camera serving as the sensor, and the processing in the real mode is performed on the image data, using the image data as the material data, with the result that an image of the trees, mountains, and the like that make up the landscape can be displayed with a size that gives the user the sensation of viewing the landscape from the place where the user could see the actual landscape.

In addition, for example, data of a musical piece actually played by an orchestra is obtained using a microphone serving as the sensor, and the processing in the real mode is performed on the data of the musical piece, using the data of the musical piece as the material data, with the result that the musical piece (sound) can be output at a volume that gives the user the sensation of listening to the musical piece at the place where the orchestra actually performed it.

In addition, in the reception device 12, entertainment-mode processing can be performed as the corresponding processing. With entertainment-mode processing, for example, an image suited to the user's preferences can be displayed.

In other words, with entertainment-mode processing, an emphasis effect that was applied by the producer of the program during editing but is unnecessary for the user can be eliminated, and an emphasis effect corresponding to the user's preferences, such as selection or deformation of the display, can be added.

In particular, for example, in entertainment-mode processing, when the user performs an operation on the input unit 56 giving a combining instruction, selection information for selecting the data to be combined with the material data is generated in the input unit 56 in accordance with the combining instruction. In the image processing module 84 (Fig.), selection (or deselection) of the data to be combined, such as subtitles (a telop) or a PinP image, to be combined with the image is performed in accordance with the selection information.

Accordingly, a user who feels that the number of telops or PinP images is too large, and that emphasis effects using telops and PinP images are unnecessary in a news program, performs operations on the input unit 56, as a result of which the user can view the news program in a state where no telop or PinP image is present.
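The telop/PinP selection just described can be sketched as follows; representing a frame as a dictionary and the key names `"telop"` and `"pinp"` are illustrative assumptions of this sketch, not the patent's data model:

```python
def compose_frame(material_image, effect_info, selection_info=None):
    """Combine (or skip) telop / PinP data with the material image.

    material_image -- the image carried as material data
    effect_info    -- default combining data chosen by the program producer
    selection_info -- the user's selection information from the input unit;
                      when present it overrides the producer's defaults
    """
    # With no user selection the producer's effect information applies.
    chosen = effect_info if selection_info is None else selection_info
    frame = dict(material_image)
    if chosen.get("telop"):
        frame["telop"] = chosen["telop"]  # overlay subtitles
    if chosen.get("pinp"):
        frame["pinp"] = chosen["pinp"]    # overlay picture-in-picture
    return frame
```

Passing an empty selection corresponds to the user cancelling all combining, so the news program is viewed without any telop or PinP image.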

In addition, for example, in entertainment-mode processing, when the user performs an operation on the input unit 56 giving a PTZ instruction, display-image-frame information specifying the display region (figure 5) is generated in the input unit 56 in accordance with the PTZ instruction. In the image processing module 84 (Fig.), the image corresponding to the display region indicated by the display-image-frame information is extracted from the material data.

Accordingly, in a sports program or a music program, a user who is a fan of a particular player or celebrity performs operations on the input unit 56, so that the user can view an image obtained by following that particular player or celebrity.
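Extraction of the display region indicated by the display-image-frame information amounts to cropping the material image. A minimal sketch, assuming the frame information is a dictionary of pixel offsets and the image is a list of pixel rows (both my assumptions, not the patent's representation):

```python
def extract_display_region(material, frame_info):
    """Crop the region given by display-image-frame information (PTZ).

    material   -- material-data image as a list of pixel rows
    frame_info -- assumed dict with x, y, width, height of the region
                  the reception side should display
    """
    x, y = frame_info["x"], frame_info["y"]
    w, h = frame_info["width"], frame_info["height"]
    # Slice out h rows starting at y, then w pixels starting at x in each.
    return [row[x:x + w] for row in material[y:y + h]]
```

Driving `frame_info` from the user's PTZ operations rather than from the producer's default frame is what lets the viewer keep a chosen player in view.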

It should be noted that, as described above, when instruction information is supplied from the input unit 56 to the image processing module 84 (Fig.), the image processing module 84 preferentially selects the instruction information. The image processing module 84 processes the material data on the basis of the instruction information, thereby displaying an image corresponding to the user's preferences. However, when no instruction information is supplied from the input unit 56 to the image processing module 84, the material data are processed on the basis of default information, i.e. the switching information, the display-image-frame information and the time information included in the effect information generated by the editing device 32 of the transmission device 11.

Accordingly, the user performs operations on the input unit 56, so that the user can cause the output unit 55, which is a display device, to display an image corresponding to the user's preferences. Furthermore, when the user performs an operation on the input unit 56 such that the PTZ instruction, switching instruction or combining instruction previously provided by input operations on the input unit 56 is cancelled, the user can cause the output unit 55, which is a display device, to display the image obtained by the editing performed by the program producer operating the editing device 32.
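The precedence rule just described (user instruction information first, with the producer's default effect information as the fallback) could look like the following sketch; the key names and the dictionary shape are hypothetical:

```python
def select_processing_info(instruction_info, effect_info):
    """Pick the information the image processing module should act on.

    instruction_info -- assumed dict of PTZ / switching / combining
                        instructions from the input unit 56 (may be None
                        or contain only some of the keys)
    effect_info      -- default switching, display-frame and combining
                        information produced by the editing device 32
    """
    keys = ("switching", "display_frame", "combining")
    instruction_info = instruction_info or {}
    # User instructions take precedence; the producer's edit is the fallback
    # used whenever the input unit supplied nothing for a given key.
    return {k: instruction_info.get(k, effect_info.get(k)) for k in keys}
```

Cancelling all instructions (passing `None`) returns exactly the producer's edit, matching the behaviour described above.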

It should be noted that embodiments of the present invention are not limited to the above-described embodiments. Various modifications can be made without departing from the essence of the present invention.

1. A broadcasting system comprising a transmission device that transmits data and a reception device that receives data, the transmission device comprising transmission means for transmitting material data, which were obtained using a sensor, and additional information, which is used for processing the material data in the reception device,
the reception device comprising
reception means for receiving the material data and the additional information,
generating means for generating output data based on the material data and the additional information, and
output means for generating output based on the output data.

2. The broadcasting system according to claim 1,
in which the additional information includes real-world information denoting a physical quantity in the real world, and in which the generating means generates the output data based on the material data, the real-world information and characteristic information indicating characteristics of the output means, so that the physical quantity detected from the output generated by the output means is identified as the physical quantity indicated by the real-world information.

3. The broadcasting system according to claim 2,
in which the sensor is image capturing means for capturing an image,
in which the material data represent the image captured by the image capturing means,
in which the output means is a display device that displays an image,
in which the output data represent the image displayed by the display device,
in which the real-world information includes imaging-range information indicating the range captured at the time of image capture, and
in which the characteristic information includes size information indicating the size of the display screen on which the display device displays the image.

4. The broadcasting system according to claim 3,
in which the generating means generates the image serving as the output data, based on the image which represents the material data, the imaging-range information, which is the real-world information, and the size information, which is the characteristic information, so that the size of an object recognized in the image displayed by the display device does not exceed the size of the object in the real world.

5. The broadcasting system according to claim 3,
in which the generating means generates the image serving as the output data, based on the image which represents the material data, the imaging-range information, which is the real-world information, and the size information, which is the characteristic information, so that the size of an object recognized in the image displayed by the display device is identified as the size of the object in the real world.

6. The broadcasting system according to claim 2,
in which the transmission means transmits multiple types of material data obtained from multiple types of sensors, and pieces of additional information corresponding to the individual types of material data, and
in which the reception device includes
the reception means,
multiple types of generating means for individually generating multiple types of output data based on the multiple types of material data and the pieces of additional information corresponding to the multiple types of material data, and
multiple types of output means for generating output based on the multiple types of output data.

7. The broadcasting system according to claim 6,
in which one specific type of output means among the multiple types of output means is air-conditioning means, and one given type of output data among the multiple types of output data represents data related to temperature or air flow rate, and
in which the air-conditioning means generates, on the basis of the data related to temperature or air flow rate, output that reproduces the temperature or air flow rate of the environment in which the multiple types of material data were obtained using the multiple types of sensors.

8. The broadcasting system according to claim 1,
in which the additional information includes effect information, which is information used for processing the material data, and
in which the generating means generates the output data by processing the material data based on the effect information.

9. The broadcasting system according to claim 8,
in which the reception device additionally includes instruction input means for accepting a processing instruction provided by the user, and
in which the generating means processes, on the basis of the effect information or instruction information related to the processing instruction provided by the user, the material data as required by the user.

10. The broadcasting system according to claim 9,
in which the sensor is image capturing means for capturing an image,
in which the material data represent the image captured by the image capturing means,
in which the output means is a display device that displays an image,
in which the output data represent the image displayed by the display device,
in which the instruction information includes display-image-frame information indicating a region, corresponding to a region to be displayed by the display device, of the image which represents the material data, and
in which the generating means generates the output data by extracting the image corresponding to the region indicated by the display-image-frame information from the image which represents the material data.

11. The broadcasting system according to claim 8,
in which the sensor is image capturing means for capturing an image,
in which the material data represent the image captured by the image capturing means,
in which the output means is a display device that displays an image,
in which the output data represent the image displayed by the display device, and
in which the effect information includes data to be combined with the image which represents the material data, and the generating means generates the output data by combining, based on the effect information, the data to be combined with the image which represents the material data.

12. The broadcasting system according to claim 11,
in which the data to be combined represent data related to subtitles, or data representing an image of another program.

13. The broadcasting system according to claim 8,
in which the sensor is image capturing means for capturing an image,
in which the material data represent the image captured by the image capturing means,
in which the output means is a display device that displays an image,
in which the output data represent the image displayed by the display device,
in which the effect information includes display-image-frame information indicating a region, corresponding to a region to be displayed by the display device, of the image which represents the material data, and
in which the generating means generates the output data by extracting the image corresponding to the region indicated by the display-image-frame information from the image which represents the material data.

14. The broadcasting system according to claim 8,
in which the sensor is image capturing means for capturing an image,
in which the material data include pieces of material data, and the pieces of material data are a plurality of moving images captured by the image capturing means,
in which the output means is a display device that displays an image,
in which the output data represent the image displayed by the display device,
in which the effect information includes switching information for switching between the images displayed by the display device by selecting one moving image among the plurality of moving images, which are the pieces of material data, and
in which the generating means generates the output data by selecting, on the basis of the switching information, a moving image among the plurality of moving images, which are the pieces of material data.

15. The broadcasting system according to claim 9,
in which the additional information additionally includes real-world information denoting a physical quantity in the real world, and
in which, when the user does not enter a processing instruction, the generating means generates the output data based on the material data, the real-world information and characteristic information representing characteristics of the output means, so that the physical quantity detected from the output generated by the output means is identified as the physical quantity indicated by the real-world information.

16. The broadcasting system according to claim 11,
in which the reception device additionally includes instruction input means for accepting a processing instruction provided by the user, and
in which the generating means processes, on the basis of the effect information or instruction information related to the processing instruction provided by the user, the material data as required by the user.

17. The broadcasting system according to claim 16,
in which the effect information includes time information representing the time at which combining of the data to be combined is performed, and
in which, when the user enters a processing instruction, the generating means generates the output data by combining, based on the time information, the data to be combined with the image which represents the material data.

18. A transmission device which, together with a reception device that receives data, forms a broadcasting system and which transmits data, the transmission device comprising
transmission means for transmitting material data, which were obtained using a sensor, and additional information, which is used for processing the material data, to the reception device.

19. A transmission method for a transmission device which, together with a reception device that receives data, forms a broadcasting system and which transmits data, the transmission method comprising
a step of transmitting, by the transmission device, material data obtained using a sensor and additional information, which is used for processing the material data, to the reception device.

20. A reception device which, together with a transmission device that transmits data, forms a broadcasting system and which receives data, the reception device comprising,
when the transmission device transmits material data obtained using a sensor and additional information used for processing the material data to the reception device:
reception means for receiving the material data and the additional information;
generating means for generating output data based on the material data and the additional information; and
output means for generating output based on the output data.

21. A reception method for a reception device which, together with a transmission device that transmits data, forms a broadcasting system and which receives data, the method comprising the following steps:
when the transmission device transmits material data, which were obtained using a sensor, and additional information used for processing the material data, to the reception device, in the reception device,
receiving the material data and the additional information;
generating output data based on the material data and the additional information; and
generating, by output means, output based on the output data.

Same patents:

FIELD: information technology.

SUBSTANCE: when a High Definition Multimedia Interface (HDMI) (R) source (71) performs bidirectional transmission of Internet Protocol (IP) data with an HDMI (R) user (72) using a consumer electronics control (CEC) line (84) and a signal line (141), a switching control module (121) controls the switch (133) so that when data are transmitted, the switch (133) selects the component of the signal which forms a differential signal output from a conversion module (131), and when data are received, the switch (133) selects the component of the signal which forms the differential signal output from the receiver (82). During bidirectional data transmission using the CEC line (84) only, the switching control module (121) controls the switch (133) such that the CEC signal coming from the HDMI (R) source (71) or the receiver (82) is selected.

EFFECT: providing a high-speed bidirectional data transmission interface, which is compatible with a data transmission interface which enables one-way transmission of pixel data of uncompressed images with high speed.

12 cl, 24 dwg

FIELD: information technology.

SUBSTANCE: system is suggested to provide multimedia services where intermediate service rendering software receives information on multimedia services location which is updated by users, multimedia services planning policy and information on device service from service control agent and loads it in service location register, and starts or stops corresponding service control agent according to information of services control agent device service. Service location register authenticates request for subscriber multimedia services management according to information about multimedia services location and determines service management agent for user via authentication according to multimedia services planning policy. Request for user multimedia services control is forwarded to certain service control agent which provides multimedia service online control through an online electronic program guide (EPG) and multimedia services management using service rendering server.

EFFECT: providing effective management for multiple multimedia services of various contents and types.

16 cl, 11 dwg, 3 tbl

FIELD: information technologies.

SUBSTANCE: method includes stages of multiple data production on preferences on content, containing the first data on preferences on content of the first user and the second data on preferences on content of the second user, production of dependence data that specifies dependence of the first data on content preferences from the second data on content preferences, and using the specified multiple data on content preferences to select combined content under control of the specified dependence data. The invention also relates to a combining device to make it possible for the first user and the second user to receive the combined content.

EFFECT: expansion of functional capabilities through making it possible for one user to give priority to the other user to include a specific content element into combined content.

10 cl, 2 dwg, 1 tbl

FIELD: information technology.

SUBSTANCE: like or dislike of a content element played on a personalised content channel is determined based on feedback from the user; the profile is updated based on the determined like or dislike, wherein that profile is associated with the personalised content channel and contains a plurality of attributes and attribute values associated with said content element, where during update, if like has been determined, a classification flag associated with each of said attributes and attribute values is set; the degree of liking is determined for at least on next content element based on said profile; and that at least one next content element is selected for playing on the personalised content channel based on the calculated degree of liking.

EFFECT: method for personalised filtration of content elements which does not require logic input or user identification procedures.

5 cl, 1 dwg

FIELD: information technologies.

SUBSTANCE: user is offered a set of types of typical virtual channels based on a certain previously determined or to be determined category, for instance, a news channel, which contains previously determined default settings and a procedure of actions. Templates of typical virtual channels considerably simplify setting a virtual channel to a viewer. For instance, for a news channel a procedure of default actions consists in storing only the latest news, for series - a procedure of defaults actions is in storing everything until viewed.

EFFECT: provision of simple and efficient device to determine television channels for users in compliance with their own interests and habits, using previously specified templates.

21 cl, 7 dwg

FIELD: information technology.

SUBSTANCE: when an HDMI (R) source (71) performs bidirectional transmission of IP data with an HDMI (R) user (72) using a CEC line (84) and a signal line (141), a switching control module (121) controls the switch (133) so that when data are transmitted, the switch (133) selects the component of the signal which forms a differential signal output from a conversion module (131), and when data are received, the switch (133) selects the component of the signal which forms the differential signal output from the receiver (82); during bidirectional data transmission using the CEC line (84) only, the switching control module (121) controls the switch (133) such that the CEC signal coming from the HDMI (R) source (71) or the receiver (82) is selected.

EFFECT: high-speed bidirectional data transmission while maintaining compatibility.

34 cl, 21 dwg

FIELD: information technology.

SUBSTANCE: method involves providing a communication line between a user interface and a multimedia data management apparatus, providing a second communication line between the multimedia data management apparatus and a digital multimedia data processor connected to a memory device for storing and extracting digital multimedia data, wherein said digital multimedia data processor is connected with possibility of communication with at least one digital audio-visual player.

EFFECT: faster processing of user requests.

19 cl, 7 dwg

FIELD: information technology.

SUBSTANCE: device and method for providing and presenting customised channel information involve receiving service attribute information corresponding to a basic service where the basic service is for providing a device at least part of presentation. The device and methods also involve customising service attribute information with the received customised definition of attributes, thus defining customised channel information and providing a user device with customised channel attribute information. Channel information contains customised service attribute information, thus provides the type of channel setting.

EFFECT: possibility of setting up an access channel and associated additional data based on content given by the vendor.

57 cl, 22 dwg

FIELD: information technologies.

SUBSTANCE: servers of electronic program schedules, having one and the same processing ability and/or one and the same flow of service processing, are grouped, compliance is set between a number of server group and characteristics of users determined by attributes to generate dispatching policy, which is then stored at dispatching server. User terminal initialises request of dispatching to dispatching server. Afterwards dispatching server assigns server of electronic program schedules for user terminal, thus setting up an interactive connection between user terminal and server of electronic program schedules. As a result, data flow bypasses city-wide network, and pass band of manifold communication network is therefore less involved, so that the number of users capable of using the service can continuously expand.

EFFECT: reduced data flow in city-wide network and increased system working characteristics and efficiency of Internet television system.

17 cl, 5 dwg

FIELD: information technology.

SUBSTANCE: system for releasing programs includes: program release unit designed for releasing programs in an IP television channel; a switching data recording unit designed for recording and monitoring switching data in the channel, as well as for issuing activation commands to the channel program switching unit in accordance with monitored data; a channel program switching unit designed for receiving said activation commands and switching the program source of the corresponding channel in the program release unit in accordance with channel switching data.

EFFECT: less manual operations and high reliability of the system.

14 cl, 5 dwg

FIELD: terminals for processing digital audio-video and multimedia data.

SUBSTANCE: device has decoder for receiving transmitted data, data-processing system and memory, while data-processing system stores data of user profiles, related to parameters or preferences of multiple users of terminal. Also, user profiles match terminal operation modes, and profiles data include data about priorities, pointing out rights of each user for using terminal resources.

EFFECT: higher efficiency, broader functional capabilities.

2 cl, 7 dwg

FIELD: cable television.

SUBSTANCE: system is made with possible reversed transfer of signals to local networks, included in system, back to main station of system in preset range of carrying frequencies, system has means for controlling interference, which include detection means, which are made with possible comparison of signals power level to one preset control level, and with possible generation of logic signals, memory devices, and also means in main station of system, using which information stored in memory means of local networks can be queried for transfer to main station for estimating this information at main station.

EFFECT: higher efficiency, broader functional capabilities.

10 cl, 3 dwg

FIELD: cabletelecasting.

SUBSTANCE: proposed local network 10 has subscriber leads 21 coupled with cabletelecasting head station 11 in desired carrier band. Network 10 also has noise suppression unit 21 whose input 24 is connected to leads 21 and output, to station 11. Unit 23 has band filters 28 - 31 connected to input 24 and using different passbands and transducers 32 - 35 designed for comparing output signal level of respective filter 28 - 31 with preset reference level and for feeding comparison result dependent logic signals to logic control unit 36 that functions to find out if these signals comply with conditions stored in unit 36 so as to convey control signal to unitized interlocking circuit 26 inserted in separate line 25 used to transfer signals from input 24 to output 27. Circuit 26 passes signals when logic signals received by unit 36 meet conditions stored therein.

EFFECT: provision for integrating interlocking means of noise-suppression unit in unitized interlocking circuit common for all carrier passbands.

9 cl, 2 dwg

FIELD: television.

SUBSTANCE: device has scaling block, two delay registers, block for forming pixel blocks, buffer register, block for calculating movement vectors, two subtracters, demultiplexer, enlargement block, pulsation filtering block, mathematical detectors block, multiplexer, reverse scaling block, as a result of interaction of which it is possible to detect and remove some series of TV frames from programs, which cause harmful effect to viewer, specifically pulsations of brightness signals and color signals with frequency 6-13 Hz.

EFFECT: higher efficiency.

1 dwg

FIELD: television.

SUBSTANCE: device has blocks: first interface block, providing receipt of data about switching of programs by subscriber, electronic watch block, first memory block for archiving data about time of viewing of each selected program, second memory block, containing electronic addresses of broadcast companies, block for rearranging data about viewing time, processor, forming packet of data about which TV program and time of its viewing, third interface block, providing output along phone network of data about viewing time of each TV program to server of company, which broadcast current TV program.

EFFECT: higher efficiency.

1 dwg

FIELD: communication systems.

SUBSTANCE: transfer system for transfer of transport flow of MPEG-2 standard from transmitter 10 to receiver 14, having check channel 16, along which receiver 14 can transfer selection criterion for selection of information blocks of MPEG-2 standard to transmitter 10, transmitter 10 has selector 38, which receivers required criteria and then filters information blocks in accordance to these criteria prior to transfer.

EFFECT: higher efficiency.

4 cl, 3 dwg

FIELD: technology for broadcast transmissions of digital television, relayed together with multimedia applications.

SUBSTANCE: method includes transmission of digital signal, having additional data flow, appropriate for compressed video images and data flow, appropriate for at least multimedia application, and also service signals, meant for controlling aforementioned data flows, service signals is determined, appropriate for series of synchronization signal, including series, meant for assignment of multimedia signal, meant for recording execution parameters of aforementioned assigned multimedia application, after that multimedia application is loaded and multimedia application is initialized with aforementioned execution parameters.

EFFECT: possible interactive operation of multimedia application with user.

2 cl, 2 dwg

FIELD: engineering of receivers-decoders used in broadcasting systems such as television broadcasting system, radio broadcasting system, cell phone communication system or other similar systems.

SUBSTANCE: method includes performing transmission to receiver-decoders through broadcasting system of a command, ordering receiver-decoders to perform an action; when command is receiver, identifier of command stored in current command is compared to identifiers of commands stored in memory of current decoder-receiver; aforementioned action is only executed in case when command identifier is not stored in memory.

EFFECT: transmitted commands are only executed once and information related to each receiver-decoder malfunction and may be useful for detecting and repairing malfunction is extracted remotely.

2 cl, 10 dwg

FIELD: technology for providing centralized remote control over digital television systems.

SUBSTANCE: interface of global WAN network is emulated for IP datagram over original remote interface of adapter and simple IP datagram transfer function is added between global WAN network interface and original Ethernet network interface in accordance to protocols stack. Therefore, system for controlling local network of digital television system performs IP connection to systems for controlling local area networks LANs of other digital television systems, then datagram is transformed to transport packets and transferred jointly with other transport packets via one and the same channel.

EFFECT: possible exchange of control data via network without mounting an additional commutation network.

9 cl, 8 dwg

FIELD: computer science, technology for dynamic control over volume of video data, sent to terminal from server, on basis of data transfer speed in the network.

SUBSTANCE: in the method for providing video data stream transfer service, server determines, whether filling is less than threshold value, or not less, than second threshold value of service, while filling represents amount of data, filling queue generation buffer in a terminal, while first threshold value is less than second threshold value. If the filling is less than first threshold value, than server provides data stream transfer service at predetermined bit transfer speed of service, which is less than current bit transfer speed of service. If the filling is equal to or greater than second threshold value, then server provides service of data stream transfer at predetermined service data transfer speed, which is greater than current bit transfer speed of service.

EFFECT: prevented sudden interruption or delay of data reproduction in the terminal.

6 cl, 3 dwg
