Image capturing device and control method thereof

FIELD: physics.

SUBSTANCE: disclosed is an image capturing device which captures a plurality of primary images in continuous capturing mode. The image capturing device includes an image capturing means configured to capture a plurality of auxiliary images during the interval between capturing a primary image and capturing the next primary image. The device also comprises a primary object determination means configured to determine a primary object. Furthermore, the device comprises first and second object tracking processing means. The first object tracking processing means is configured to detect an area where an object identical to the primary object exists, from a first area which is part of a first auxiliary image among the plurality of auxiliary images.

EFFECT: high accuracy of the object tracking function of an image capturing device during continuous capture owing to elimination of time delay between detection of an object and obtaining focus information at the position of the object.

10 cl, 12 dwg

 

Technical field

[0001] The present invention relates to an image capturing device having an object tracking function and a continuous shooting function for continuously capturing a plurality of still images, and to a control method thereof.

Background art

[0002] The brightness of an object or the position of an object changes over time. For this reason, when continuous shooting of still images is performed using an image capturing device having a continuous shooting function, the exposure and focus have to be adjusted every time a still image is captured. Accordingly, there are image capturing devices that determine the exposure adjustment value for the next still image based on the still image captured immediately before, and image capturing devices that have a continuous shooting function and automatically adjust the exposure and focus by using information obtained from an AE (automatic exposure) sensor, an AF (autofocus) sensor, etc., which are provided separately from the image capturing element. In addition, there are cameras with features such as object tracking using image data obtained during continuous shooting, and detection of information on a human face.

[0003] For example, PTL 1 below describes a single-lens reflex camera which, in continuous shooting mode, detects an object in the still image captured immediately before and captures a still image of the next frame with the focus set to follow this object.

Citation list

Patent literature

[0004] PTL 1: Japanese Patent Laid-Open No. 2010-72283

Summary of the invention

Technical problem

[0005] However, as described in the above PTL 1, one "mirror down" operation is performed between the image capture used in the object detection process and the focus detection process carried out on the basis of the object position. Since there is some time delay between detecting the object and obtaining the focus information at the position of the object, there may be an undesirable difference between the position of the object during the object detection process and the position of the object during the focus detection process. Accordingly, it is considered that there remains room for improvement in object tracking in the image capturing device.

[0006] In addition, in a mirrorless camera, which does not include such a mirror, the shorter the time from the start of object detection to the adjustment of the focus on the object, the more effectively the above-described difference can be suppressed.

[0007] The present invention has been made in view of this problem, and its object is to provide a mechanism for improving object tracking in an image capturing device during continuous shooting.

Solution to the problem

[0008] The invention according to claim 1 of the present invention is an image capturing device that captures a plurality of main images in continuous shooting mode. The image capturing device includes image capturing means configured to capture a plurality of auxiliary images during the interval between capturing a main image and capturing the next main image; main object determination means configured to determine a main object; first object tracking processing means configured to detect an area where an object identical to the main object exists, from a first area which is part of a first auxiliary image among the plurality of auxiliary images; and second object tracking processing means configured to detect an area where an object identical to the main object exists, from a second area of a second auxiliary image among the plurality of auxiliary images, the second area being larger than the first area, wherein the result of detection performed by the first object tracking processing means is used when adjusting the focus, which is performed before capturing the next main image, and the result of detection performed by the second object tracking processing means is used in detecting an area where an object identical to the main object exists, the detection being carried out after capturing the next main image.

Advantages of the invention

[0009] According to the present invention, it is possible to improve the accuracy of object tracking in an image capturing device during continuous shooting.

Brief description of the drawings

[0010] Fig. 1 is a diagram showing an example of the mechanical configuration of the image capturing device according to the first embodiment.

Fig. 2 is a diagram showing the arrangement of focus detection points of the image capturing device according to the first embodiment.

Fig. 3 is a diagram showing an example of the electrical configuration of the image capturing device according to the first embodiment.

Fig. 4 is a flowchart illustrating an example of the procedure of the process performed by the image capturing device according to the first embodiment in continuous shooting mode.

Fig. 5 is a diagram explaining the tracking range according to the first embodiment.

Fig. 6 is a schematic diagram for explaining the timings at which individual processes are carried out according to the first embodiment.

Fig. 7 is a diagram showing an example of the mechanical configuration of the image capturing device according to the second embodiment.

Fig. 8 is a diagram showing the configuration of a pixel for phase difference detection of the image capturing element according to the second embodiment.

Fig. 9 is a diagram showing the arrangement of focus detection points of the image capturing device according to the second embodiment.

Fig. 10 is a diagram showing an example of the electrical configuration of the image capturing device according to the second embodiment.

Fig. 11 is a flowchart illustrating an example of the procedure of the process performed by the image capturing device according to the second embodiment in continuous shooting mode.

Fig. 12 is a diagram explaining the tracking range according to the second embodiment.

Description of embodiments

[0011] Preferred embodiments of the present invention will be described below with reference to the drawings.

[0012] First embodiment

First of all, the image capturing device according to the first embodiment of the present invention will be described with reference to Fig. 1 to 6.

[0013] Fig. 1 is a diagram showing an example of the mechanical configuration of a digital single-lens reflex camera serving as the image capturing device according to the first embodiment.

[0014] In the image capturing device 200 shown in Fig. 1, an image capturing lens unit 202 is attached to the front surface of a camera body 201. The camera body 201 and the image capturing lens unit 202 are connected in a manner that allows replacement with another image capturing lens unit. The camera body 201 and the image capturing lens unit 202 communicate through mount contacts, not shown. The image capturing lens unit 202 includes a lens group 213 and a diaphragm 214. The camera body 201 can regulate the amount of light entering the camera by adjusting the opening diameter of the diaphragm 214, and can adjust the focus position by adjusting the position of the lens group 213, through control based on communication via the mount contacts.

[0015] The main mirror 203 is formed by a semitransparent mirror. The main mirror 203 is disposed obliquely in the image capturing optical path except during main image capture, when a still image is captured. In this state, part of the light flux coming from the image capturing lens unit 202 is reflected by the main mirror 203 and directed into the viewfinder optical system, while the remaining light flux that has passed through the main mirror 203 is reflected by an auxiliary mirror 204 and directed to an AF unit 205.

[0016] The AF unit 205 is formed by an AF sensor based on a phase difference detection method. The AF unit 205 forms a secondary imaging plane of the image capturing lens unit 202 on a linear focus detection sensor. The camera body 201 detects the focus adjustment state of the image capturing lens unit 202 from the output signal of the linear focus detection sensor, and outputs a control signal for driving the lens group 213 on the basis of the detection result, thus enabling automatic focus adjustment. Since focus detection based on phase difference detection is known in the art, the description of the specific control is omitted here.

[0017] Fig. 2 is a diagram showing the arrangement of focus detection points of the image capturing device according to the first embodiment.

[0018] The AF unit 205 has, for example, the arrangement of focus detection points shown in Fig. 2.

[0019] The focus screen 206 shown in Fig. 1 is disposed in the imaging plane of the image capturing lens unit 202, which forms part of the viewfinder optical system. The pentaprism 207 is a pentaprism for changing the optical path of the viewfinder.

[0020] The operator observes the focus screen 206 through an eyepiece 209, which allows him to visually recognize the object to be captured.

[0021] The AE unit 208 (AE sensor) can obtain an output signal related to the brightness of the object from the output signals of a large number of two-dimensionally arranged pixels, so as to observe the brightness of the object. In this embodiment, the AE sensor observes the range within the outer frame (image capturing range) shown in Fig. 2. In the AE sensor, pixels with color filters of R (red), G (green) and B (blue) are arranged in stripes, separately for each color filter. In this embodiment, the brightness of the object is observed by the AE sensor 208 and, at the same time, the object tracking process is performed using the image data obtained by the AE sensor 208.

[0022] The camera body 201 also includes a shutter 210 in the focal plane (focal plane shutter) and an image capturing element 211. When exposure is carried out, the main mirror 203 and the auxiliary mirror 204 are retracted toward the focus screen 206 so as not to block the light flux, and the focal plane shutter 210 is opened, whereby the image capturing element 211 is exposed to light. To avoid confusion, image capture carried out by the image capturing element 211 for the purpose of saving image data is hereinafter referred to as "main image capture", whereas image capture performed by the AE sensor 208 is hereinafter referred to as "auxiliary image capture". In addition, image data generated by main image capture is referred to as "main image data", whereas image data generated by auxiliary image capture is referred to as "auxiliary image data". Thus, the image capturing element 211, which generates main image data, functions as a first image capturing element, while the AE sensor 208, which generates auxiliary image data, functions as a second image capturing element.

[0023] Furthermore, a display unit 212 displays image capture information and captured images, which allows the user to check their contents.

[0024] Next, the electrical configuration of the image capturing device according to this embodiment will be described.

Fig. 3 is a diagram showing an example of the electrical configuration of the image capturing device 200 according to the first embodiment. The components shown in Fig. 1 are denoted by the same reference numerals.

[0025] The operation detection unit 308 shown in Fig. 3 detects an operation performed by the user via a button, switch or rotary dial attached to the camera body 201, and sends a signal corresponding to the substance of the operation to the system control unit 303. In particular, the operation detection unit 308 outputs a signal SW1 to the system control unit 303 when the shutter button is half pressed, and outputs a signal SW2 to the system control unit 303 when the shutter button is fully pressed. Here, the state in which the shutter button is held halfway down by the user is referred to as "holding SW1", whereas the state in which the shutter button is held fully depressed is referred to as "holding SW2". Additionally, the operation detection unit 308 outputs an SW1 cancellation signal to the system control unit 303 when the user stops half-pressing the shutter button while holding SW1, and outputs an SW2 cancellation signal to the system control unit 303 when the user stops fully pressing the shutter button while holding SW2.

[0026] The mirror control unit 309 controls the movement of the main mirror 203 and the auxiliary mirror 204 based on a control signal sent from the system control unit 303.

[0027] Upon receiving the signal SW1 from the operation detection unit 308 in the "mirror down" state in continuous shooting mode, the system control unit 303 reads the accumulated data from the linear sensor corresponding to each focus detection point of the AF unit 205, selects the focus detection point at which to perform focus adjustment, and carries out the focus adjustment calculation. Then the system control unit 303 sends a lens drive control signal based on the calculation result to the lens drive unit 314.

[0028] The lens drive unit 314 moves the lens group 213 on the basis of the lens drive signal supplied from the system control unit 303, thus performing a focusing operation.

[0029] The image capturing element 211 converts light entering through the lens group 213 into an electric signal to generate image data, and outputs the image data to the system control unit 303. The system control unit 303 outputs the image data received from the image capturing element 211 to the display control unit 312, and records (stores) the image data in the image storage device 311.

[0030] The display control unit 312 displays an image on the display unit 212 on the basis of the image data received from the system control unit 303.

[0031] The main memory 307 is a storage device for storing data required in the calculations carried out by the system control unit 303 and the AE image processing unit 304.

[0032] The AE image processing unit 304 carries out the exposure adjustment calculation on the basis of the auxiliary image data read from the AE sensor 208, and outputs the calculation result to the system control unit 303. The system control unit 303 sends a diaphragm control signal to the diaphragm control unit 313 on the basis of the exposure adjustment calculation result received from the AE image processing unit 304. In addition, the system control unit 303 sends a shutter control signal to the shutter control unit 310 at the time of shutter release.

[0033] The diaphragm control unit 313 drives the diaphragm 214 based on the diaphragm control signal received from the system control unit 303.

[0034] The shutter control unit 310 drives the focal plane shutter 210 based on the shutter control signal supplied from the system control unit 303.

[0035] The AE image processing unit 304 also carries out the object tracking process in continuous shooting mode, and detects the position of the main object in the auxiliary image data read from the AE sensor 208. Here, the above-mentioned object tracking process has two kinds of algorithms, namely a first object tracking process and a second object tracking process. The result of the first object tracking process is output to the system control unit 303. The result of the second object tracking process is written to the main memory 307. The two kinds of object tracking algorithms will be described below.

[0036] The system control unit 303 selects a single focus detection point from the focus detection points shown in Fig. 2, on the basis of the result of the first object tracking process obtained from the AE image processing unit 304. Then the system control unit 303 sends a lens drive control signal to the lens drive unit 314 based on the result of the focus adjustment calculation with respect to the selected focus detection point.

[0037] Next, the operation of the image capturing device according to this embodiment in continuous shooting mode will be described.

[0038] Fig. 4 is a flowchart showing an example of the procedure of the process performed by the image capturing device according to the first embodiment in continuous shooting mode. The image capturing device according to this embodiment captures still images of a plurality of frames in continuous shooting mode.

[0039] Steps S401-S405 represent processing which is performed while holding SW1, and correspond to the preparation operation for continuous shooting. Steps S406-S416 represent processing which is performed when the shutter button is fully pressed upon completion of the above preparation operation for continuous shooting and the state transitions to holding SW2.

[0040] When the user half-presses the shutter button and the signal SW1 is output from the operation detection unit 308 to the system control unit 303, the process represented by the flowchart in Fig. 4 starts. Each step will be described below.

[0041] First, in step S401, the system control unit 303 performs selection of a focus detection point and the focus adjustment calculation based on the output signal of the AF unit 205. For example, the system control unit 303 selects the focus detection point superimposed on the object that presumably exists at the position closest to the image capturing device 200.

[0042] Then, in step S402, the AE image processing unit 304 reads auxiliary image data from the AE sensor 208 and extracts a predetermined area centered on the focus detection point chosen in step S401. The AE image processing unit 304 (or the system control unit 303) writes the image data of the extracted area and the center coordinates of the extracted area within the auxiliary image data into the main memory 307. These are used in the subsequent tracking process. Hereinafter, the image data of the extracted area recorded in the main memory 307 is referred to as "template image data", while the coordinates of the center of the extracted area are referred to as the "immediately preceding object position". Here, as shown in Fig. 5, described further below, the template image data is associated with the main object. The AE image processing unit 304, which extracts the template image data, forms main object determination means configured to determine the main object at the start of continuous shooting.
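The template registration of step S402 can be pictured with the following minimal sketch. It is purely illustrative: the NumPy dependency, the array shapes and the helper name extract_template are assumptions introduced here, not part of the patent.

```python
import numpy as np

def extract_template(aux_image: np.ndarray, focus_point: tuple,
                     half_h: int = 16, half_w: int = 16):
    """Extract a template region centered on the selected focus detection
    point (step S402) and return it together with its center coordinates,
    which play the role of the 'immediately preceding object position'."""
    cy, cx = focus_point
    h, w = aux_image.shape[:2]
    # Clamp the region to the auxiliary image boundaries.
    top, bottom = max(0, cy - half_h), min(h, cy + half_h)
    left, right = max(0, cx - half_w), min(w, cx + half_w)
    template = aux_image[top:bottom, left:right].copy()
    return template, (cy, cx)  # stored in main memory 307 in the patent

# Hypothetical usage: register the template at the start of continuous shooting.
aux = np.random.rand(120, 180)               # stand-in for AE sensor data
template, prev_pos = extract_template(aux, focus_point=(60, 90))
```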

[0043] Then, in step S403, the AE image processing unit 304 performs the exposure calculation using the auxiliary image data read from the AE sensor 208 in step S402, and outputs the result to the system control unit 303.

[0044] Then, in step S404, the system control unit 303 sends control signals to the diaphragm control unit 313 and the lens drive unit 314 based on the focus adjustment calculation result obtained in step S401 and the exposure calculation result obtained in step S403. Thus, the diaphragm control unit 313 adjusts the aperture (exposure) on the basis of its control signal, and the lens drive unit 314 adjusts the focus based on its control signal.

[0045] The next step, S405, is the stage at which the next user operation is awaited. In particular, in step S405 the system control unit 303 waits until the user either releases the shutter button (cancelling SW1) or fully presses the shutter button (entering SW2). When the system control unit 303 receives the SW1 cancellation signal or the signal SW2 from the operation detection unit 308, the process moves to step S406.

[0046] After the process moves to step S406 upon a signal cancelling the waiting state in step S405, the system control unit 303 determines whether the input signal is the signal SW2. If this determination gives the result that the input signal is not the signal SW2 (it is the SW1 cancellation signal), the continuous shooting process represented by the flowchart in Fig. 4 ends. On the other hand, if the input signal is the signal SW2, the process proceeds to step S407, where continuous shooting starts.

[0047] In the processing after the start of continuous shooting, steps S407-S409 and steps S415-S416 are performed in parallel. Accordingly, in this embodiment, the system control unit 303 performs steps S407-S409, and the AE image processing unit 304 performs steps S415-S416, thus enabling parallel processing.
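As a software analogy only, this split between the capture branch (system control unit 303) and the tracking branch (AE image processing unit 304) can be mimicked with two concurrent tasks; all function names here are invented placeholders, not part of the patent.

```python
from concurrent.futures import ThreadPoolExecutor

def capture_main_image():
    """Stand-in for steps S407-S409: mirror up, expose, mirror down."""
    return "main image data"

def second_tracking_pass():
    """Stand-in for steps S415-S416: check/correct the first tracking result."""
    return "corrected object position"

# The two branches of the flowchart run in parallel after shutter release.
with ThreadPoolExecutor(max_workers=2) as pool:
    capture_future = pool.submit(capture_main_image)
    tracking_future = pool.submit(second_tracking_pass)
    main_data = capture_future.result()
    corrected_pos = tracking_future.result()
```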

[0048] Steps S407-S409 are processing in which the system control unit 303 controls main image capture.

[0049] First, in step S407, the system control unit 303 sends a control signal to the mirror control unit 309 to raise the main mirror 203 and remove it from the optical path of main image capture.

[0050] Then, in step S408, the system control unit 303 sends a control signal to the shutter control unit 310 to release the shutter and expose the image capturing element 211 to light, thus carrying out main image capture. Then the system control unit 303 reads the image data generated by the image capturing element 211 and records (stores) it in the image storage device 311.

[0051] Then, in step S409, the system control unit 303 sends a control signal to the mirror control unit 309 to lower the main mirror 203 and the auxiliary mirror 204 and place them into the optical path of main image capture.

[0052] The processing at steps S415-S416 will be described below.

[0053] Then, in step S410, the system control unit 303 determines whether the holding of SW2 has been cancelled. If the SW2 cancellation signal has been received and, as a result of this determination, it is established that holding SW2 is cancelled, the continuous shooting process represented by the flowchart in Fig. 4 ends. On the other hand, if the SW2 cancellation signal has not been received and holding SW2 is not cancelled, the process proceeds to step S411.

[0054] After the process moves to step S411, the AE image processing unit 304 carries out detection of the position of the main object in the first object tracking process. At this time, the first object tracking process is performed using a template matching method.

[0055] In step S411, the AE sensor 208 first carries out charge accumulation only during a period (first accumulation period) which is shorter than the period required to carry out an appropriate exposure adjustment calculation. After the AE sensor 208 finishes the charge accumulation, the AE image processing unit 304 reads the first auxiliary image data from the AE sensor 208 and also reads the template image data from the main memory 307. Then the AE image processing unit 304 determines the correlation between the two pieces of image data, thus detecting the position of the main object in the first auxiliary image data. Then the AE image processing unit 304 outputs the detection result to the system control unit 303. By setting the first accumulation period for obtaining the first auxiliary image data shorter than the period required to carry out an appropriate exposure adjustment calculation, the process of detecting the position of the main object can start earlier.

[0056] Here, in this embodiment, the AE sensor 208 forms auxiliary image capturing means configured to capture a plurality of pieces of auxiliary image data during the main image capture interval in continuous shooting mode. Additionally, for example, the result of detecting the position of the main object obtained in the first object tracking process is used for the focus adjustment calculation for capturing the next main image.

[0057] Fig. 5 is a diagram explaining the tracking range according to the first embodiment.

Here, in the first object tracking process, the area where matching is carried out is limited to a range (the tracking range of the first object tracking process) which extends from the immediately preceding object position, read from the main memory 307, by one focus detection point in each of the up and down directions and by two focus detection points in each of the left and right directions, as shown in Fig. 5. Thus, in the first object tracking process, the position of the main object is detected within a partial range of the auxiliary image data. In addition, matching is carried out in a state in which the resolution of the auxiliary image data and the template image data is converted to half of the original resolution. Thus, it is possible to carry out the matching at high speed and to secure time for moving the lenses, etc.
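A rough sketch of this fast, coarse pass is given below: both the auxiliary image and the template are downsampled to half resolution, and a sum-of-absolute-differences (SAD) match is evaluated only inside a small window around the immediately preceding object position. The SAD criterion, the window size and all names are illustrative assumptions; the patent specifies only limited-range template matching at reduced resolution.

```python
import numpy as np

def sad(a: np.ndarray, b: np.ndarray) -> float:
    # Sum of absolute differences: lower means a better match.
    return float(np.abs(a - b).sum())

def first_tracking(aux1: np.ndarray, template: np.ndarray,
                   prev_pos: tuple, half_range=(20, 40)):
    """Coarse/fast pass (step S411): match at half resolution inside a
    limited tracking range around the previous object position."""
    # Halve the resolution of both images by simple decimation.
    aux_s, tpl_s = aux1[::2, ::2], template[::2, ::2]
    th, tw = tpl_s.shape
    cy, cx = prev_pos[0] // 2, prev_pos[1] // 2
    ry, rx = half_range[0] // 2, half_range[1] // 2
    best, best_pos = np.inf, (cy, cx)
    for y in range(max(0, cy - ry), min(aux_s.shape[0] - th, cy + ry) + 1):
        for x in range(max(0, cx - rx), min(aux_s.shape[1] - tw, cx + rx) + 1):
            score = sad(aux_s[y:y + th, x:x + tw], tpl_s)
            if score < best:
                best, best_pos = score, (y, x)
    # Map the half-resolution top-left corner back to the full-resolution
    # coordinates of the template center.
    return (best_pos[0] * 2 + th, best_pos[1] * 2 + tw)
```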

[0058] Hereinafter, the first auxiliary image data read from the AE sensor 208 in step S411 is referred to as "auxiliary image data 1". In addition, the range where matching is carried out is referred to as the "tracking range".

[0059] Then, in step S412, the AE sensor 208 carries out charge accumulation only for a second accumulation period, which is determined by subtracting the first accumulation period from the accumulation period necessary for carrying out an appropriate exposure adjustment calculation. The AE image processing unit 304 reads the second auxiliary image data from the AE sensor 208 and generates combined auxiliary image data obtained by summing the second auxiliary image data and the auxiliary image data 1 read from the AE sensor 208 in step S411. The AE image processing unit 304 carries out the exposure adjustment calculation using the generated combined auxiliary image data, and outputs the exposure adjustment calculation result to the system control unit 303.

[0060] Hereinafter, the auxiliary image data read from the AE sensor 208 in step S412 is referred to as "auxiliary image data 2", while the combined auxiliary image data obtained from auxiliary image data 1 and auxiliary image data 2 is referred to as "auxiliary image data 12".

[0061] The sum of the accumulation periods in the AE sensor 208 for obtaining auxiliary image data 1 and auxiliary image data 2 equals the accumulation period required to carry out an appropriate exposure adjustment calculation. Accordingly, the AE image processing unit 304 can perform an accurate exposure adjustment calculation using auxiliary image data 12. In addition, by combining the pieces of auxiliary image data, the exposure calculation can be expected to be more stable, owing to reduced noise and an increased amount of light, than in the case of uncombined image data.
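The arithmetic of the split accumulation can be illustrated as follows, under the simplifying assumption of a sensor that responds linearly to accumulation time; the time values and the log2 brightness metric are invented for the example.

```python
import numpy as np

# Suppose an adequate exposure measurement needs T_total of accumulation,
# but tracking should start after a shorter first period T1.
T_total, T1 = 20.0, 8.0          # milliseconds, illustrative values
T2 = T_total - T1                # second accumulation period (step S412)

def accumulate(scene: np.ndarray, t_ms: float) -> np.ndarray:
    # Idealized linear sensor: signal grows with accumulation time.
    return scene * t_ms

scene = np.random.rand(120, 180)         # stand-in for scene radiance
data1 = accumulate(scene, T1)            # used immediately for tracking
data2 = accumulate(scene, T2)
data12 = data1 + data2                   # combined data ("data 12")

# data12 has the brightness of a single T_total accumulation, so the
# exposure calculation is as accurate as an uninterrupted measurement,
# with noise averaged down by the combination.
brightness = np.log2(data12.mean() + 1e-9)  # illustrative exposure metric
```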

[0062] Then, in step S413, the system control unit 303 performs the focus adjustment calculation at the focus detection point corresponding to the object position detected in step S411.

[0063] Then, in step S414, the system control unit 303 sends control signals to the diaphragm control unit 313 and the lens drive unit 314 to carry out exposure adjustment and focus adjustment based on the results of the calculations performed in step S412 and step S413.

[0064] At the end of step S414, the process continues to the shutter release operation. The system control unit 303 carries out the processing of steps S407-S409, whereas the AE image processing unit 304 carries out the processing of steps S415-S416.

[0065] In step S415 the AE image processing unit 304 determines whether the image capture is for the first frame. If this determination indicates that the image capture is not for the first frame (the image capture is for the second or a subsequent frame), the process moves to step S416.

[0066] After the process moves to step S416, the AE image processing unit 304 carries out detection of the position of the main object in the second object tracking process, checking and correcting the result of the first object tracking process which was performed immediately before (for the immediately preceding frame). Like the first object tracking process, the second object tracking process is performed using a template matching method. The AE image processing unit 304 determines the correlation between the auxiliary image data 2 read in step S412 and the template image data stored in the main memory 307, thereby detecting the position of the main object. However, unlike the first object tracking process, in the second object tracking process the range covered by matching (the tracking range of the second object tracking process) is set to the entire auxiliary image data 2, and the matching is performed without changing the resolution, as shown in Fig. 5. Thus, it is possible to obtain a tracking result for a wider range and with higher precision than the result of the first object tracking process.
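For contrast with the first pass, here is a sketch of the second pass under the same illustrative SAD criterion as above: the search covers the whole of auxiliary image data 2 and the resolution is left unchanged. Again, all names are assumptions for illustration.

```python
import numpy as np

def second_tracking(aux2: np.ndarray, template: np.ndarray):
    """Accurate/slow pass (step S416): exhaustive full-resolution template
    matching over the entire auxiliary image data 2."""
    th, tw = template.shape
    h, w = aux2.shape
    best, best_pos = np.inf, (0, 0)
    for y in range(h - th + 1):          # no tracking-range restriction
        for x in range(w - tw + 1):
            score = float(np.abs(aux2[y:y + th, x:x + tw] - template).sum())
            if score < best:
                best, best_pos = score, (y, x)
    # Return the template center at full resolution.
    return (best_pos[0] + th // 2, best_pos[1] + tw // 2)
```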

[0067] Using this tracking result, checking and correction of the result of the first object tracking process, which was carried out immediately before in step S411, are performed. Here, the following three points are checked.

[0068] First, since the tracking range is limited in the first object tracking process, a correct result is not obtained when the main object moves quickly beyond the tracking range. Accordingly, in the second object tracking process the search is performed over the entire auxiliary image data 2 to check whether the main object has moved out of the tracking range of the first object tracking process.

[0069] Second, if an object similar to the main object is present within the tracking range, these objects may be indistinguishable and the wrong object may be selected for tracking, because the resolution of the image data is reduced in the first object tracking process. Accordingly, carrying out the second object tracking process using auxiliary image data 2, whose resolution is not reduced, makes it possible to distinguish between the main object and the similar object and to check the correctness of the result of the first object tracking process.

[0070] Third, since the resolution of the image data used in the first object tracking process is converted to half of the original resolution, the detected object position is approximate, and there may be a difference between the detected position and the true position of the object. Accordingly, carrying out the second object tracking process at the higher resolution allows the object to be detected more accurately.

[0071] If the first object tracking process and the second object tracking process give different results when checking the above three points, the information about the immediately preceding object position recorded in the main memory 307 is corrected in accordance with the result of the second object tracking process.

[0072] Additionally, when the probability that the object detected in the first object tracking process and the object detected in the second object tracking process are the same is high, and when the difference between the positions of these objects is greater than or equal to a predetermined threshold, the movement of the object can be predicted on the basis of the change in the object position.
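The checking, correction and motion prediction of paragraphs [0068]-[0072] might be combined roughly as below; the probability threshold, the distance threshold and the linear extrapolation are assumptions, since the patent does not fix the prediction model.

```python
def reconcile(first_pos, second_pos, prev_pos, same_object_prob,
              prob_threshold=0.8, dist_threshold=10.0):
    """Check/correct the first tracking result against the second pass."""
    dy = second_pos[0] - first_pos[0]
    dx = second_pos[1] - first_pos[1]
    dist = (dy * dy + dx * dx) ** 0.5
    # If the two processes disagree, trust the wider, full-resolution pass
    # and correct the stored 'immediately preceding object position'.
    corrected_prev = second_pos if dist > 0 else prev_pos
    predicted = None
    if same_object_prob >= prob_threshold and dist >= dist_threshold:
        # Same object, large displacement: extrapolate its motion linearly.
        predicted = (second_pos[0] + dy, second_pos[1] + dx)
    return corrected_prev, predicted
```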

[0073] If in step S415 it is determined that the image capture is for the first frame, or when step S416 ends, the process moves to step S410 upon completion of steps S407-S409, and the subsequent processing is then performed.

[0074] Fig. 6 is a schematic diagram for explaining the timings at which the individual processes are carried out in this embodiment.

[0075] After main image capture and the subsequent lowering of the main mirror 203 and the auxiliary mirror 204, the AE image processing unit 304 instructs the AE sensor 208 to perform charge accumulation only during the first accumulation period, and reads auxiliary image data 1. Because the accumulation period for this auxiliary image data 1 is set short, its brightness is insufficient for carrying out the exposure adjustment calculation.

[0076] Next, the AE image processing unit 304 carries out a development process on the read auxiliary image data 1 and performs the first object tracking process using the auxiliary image data 1 that has undergone the development process. Then the result of the first object tracking process is output to the system control unit 303. Based on this result, the system control unit 303 selects a new focus detection point. Then the system control unit 303 calculates the focus adjustment state of the image capturing lens unit 202 at the newly selected focus detection point from the output signal of the linear focus detection sensor sent from the AF unit 205, and generates a control signal for driving the lens group 213.

[0077] After instructing the AE sensor 208 to perform charge accumulation only for the first accumulation period, the AE image processing unit 304 instructs the AE sensor 208 to perform charge accumulation only for the second accumulation period, and reads auxiliary image data 2. Since the accumulation period specified for this auxiliary image data 2 is also short, its brightness is insufficient for carrying out the exposure adjustment calculation.

[0078] Next, the AE image processing unit 304 carries out the development process on the read auxiliary image data 2 and combines auxiliary image data 1 and auxiliary image data 2 that have undergone the development process. By combining auxiliary image data 1 and auxiliary image data 2, it is possible to obtain auxiliary image data 12 having the brightness necessary for carrying out an appropriate exposure adjustment calculation. The AE image processing unit 304 carries out the exposure adjustment calculation using auxiliary image data 12, and outputs the calculation result to the system control unit 303.

[0079] The system control unit 303 sends control signals to the diaphragm control unit 313 and the lens drive unit 314 based on the results of the focus adjustment calculation and the exposure adjustment calculation, so as to carry out exposure adjustment and focus adjustment. Upon completion of this exposure adjustment and focus adjustment, the system control unit 303 raises the main mirror 203 and the auxiliary mirror 204 and performs main image capture.

[0080] In addition, the AE image processing unit 304 carries out the second object tracking process, whose accuracy is higher than that of the first object tracking process, using auxiliary image data 2. Since this second object tracking process requires more time than the first object tracking process and begins later than the first object tracking process, its result cannot be reflected in the immediately following main image capture. Moreover, while the AE image processing unit 304 continues the second object tracking process, the capture of the next main image begins.

[0081] Instead, by carrying out this high-precision object tracking process, it is possible to correctly set the tracking range for the first object tracking process carried out subsequently. Accordingly, the second object tracking process can contribute to improving the accuracy of the subsequently performed first object tracking process.

[0082] In this embodiment, because the first piece of the plurality of pieces of auxiliary image data is read at a relatively early stage, as described above, it is possible to secure processing time for the object tracking process. Auxiliary image data 1 and the auxiliary image data 2 obtained afterwards are combined, and the resulting auxiliary image data 12 is used in the exposure adjustment calculation, whose processing time is relatively short. Thus, it is possible both to secure time for the object tracking process and to carry out the exposure adjustment calculation using image data with an adequate accumulation period.

[0083] In addition, by carrying out the detailed object tracking process using auxiliary image data 2 in parallel with the main image capture process, it is possible to correct the object position and to properly set the tracking range of the next object tracking process. According to the aforementioned method, the accuracy of object tracking can be improved, by checking and correcting the result of the first object tracking process, compared with the case where the object tracking process is performed only once per frame. Although the second object tracking process begins before the mirror restoration operation in Fig. 6, the second object tracking process may also commence after the mirror restoration operation. The start of the second object tracking process can be set arbitrarily, provided that the second object tracking process is completed before the subsequent execution of the first object tracking process.

[0084] Although the above describes a preferred embodiment of the present invention, the present invention is not limited to this embodiment, and various modifications and changes are possible within the scope of its essence. For example, as the method of detecting the position of an object, it is possible to use not only object tracking based on template matching but also tracking using color information or face recognition results. In addition, it is possible to use moving object analysis using optical flow, and scene detection technology based on edge detection. Although only the one piece of image data stored first is used as the template for template matching in this embodiment, the main object may be extracted from image data newly captured during continuous shooting, and the extracted area may be set as a new template. Here, the first and second object tracking processes need only be configured such that both algorithms are designed to track the same object by analyzing the auxiliary images, the first object tracking process having relatively low accuracy but high speed, and the second object tracking process tracking with low speed but high accuracy.

[0085] Additionally, auxiliary image data 2 generated in step S412 is used as the image data whose correlation with the template image data is determined in the second object tracking process in step S416 shown in Fig. 4. However, the present invention is not necessarily limited to this configuration. For example, any piece of auxiliary image data from the set of captured auxiliary image data may be used as this image data in the present invention. For example, the correlation with the template image data can be determined using a single piece or multiple pieces of the second or subsequent pieces of image data. Alternatively, the second object tracking process can be performed using auxiliary image data 12, obtained by combining auxiliary image data 1 and auxiliary image data 2, instead of auxiliary image data 2.

[0086] Furthermore, although the selected focus detection point is placed on the object whose existence is assumed at the position closest to the image capturing device 200 in the processing of step S401 in Fig. 4, the focus detection point may instead be selected in accordance with a user command. If the AE image processing unit 304 has a face detection function for detecting a human face from the image data read from the AE sensor 208, the face may be chosen as the object, and the image data of the area surrounding the detected face may be used as the template image data.

[0087] Second embodiment

Next, the image capturing device according to the second embodiment will be described with reference to Fig. 7 to 12.

Fig. 7 is a diagram showing an example of the mechanical configuration of a mirrorless digital single-lens camera serving as the image capturing device according to the second embodiment.

[0088] In the image capturing device 700 shown in Fig. 7, an image capturing lens unit 702 is attached to the front surface of a camera body 701. The camera body 701 and the image capturing lens unit 702 are connected in a manner that allows replacement with another image capturing lens unit. The camera body 701 and the image capturing lens unit 702 communicate through mount contacts, not shown. The image capturing lens unit 702 includes a lens group 706 and a diaphragm 707. The camera body 701 can regulate the amount of light entering the camera by adjusting the opening diameter of the diaphragm 707, and can adjust the focus position by adjusting the position of the lens group 706, through control based on communication via the mount contacts.

[0089] The camera body 701 includes a focal plane shutter 703 and an image capturing element 704. When exposure is carried out, the focal plane shutter 703 is opened, whereby the image capturing element 704 is exposed to light. The image capturing element 704 has pixels for image generation and pixels for phase difference detection. Here, the pixels for image generation are pixels for generating image data during exposure, whereas the pixels for phase difference detection are pixels for detecting the phase difference and carrying out focus adjustment.

[0090] The structure of the pixels for carrying out focus adjustment in this image capturing element 704 will now be described.

[0091] Fig. 8(a) illustrates a plan view of a pixel of the image capturing element 704, whereas Fig. 8(b) illustrates a sectional view of the pixel. In Fig. 8, reference numeral 801 denotes a microlens, and each of reference numerals 802 and 803 denotes a photodiode. Image data is read using two photodiodes for a single microlens, whereby pupil division is performed into the left and right sides of the drawing. By comparing the image formed by collecting the output signals of the left pixels with the image formed by collecting the output signals of the right pixels, it is possible to detect the focus adjustment state of the image capturing lens unit 702 on the basis of phase difference detection, as with the linear focus detection sensor of the AF unit.
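As a toy model of the phase difference detection described here, one can collect the left-photodiode and right-photodiode outputs along a row into two 1-D signals and search for the relative shift that best aligns them; the SAD matching and the shift-to-defocus interpretation are illustrative assumptions, not the patented computation.

```python
import numpy as np

def phase_difference(left: np.ndarray, right: np.ndarray,
                     max_shift: int = 8) -> int:
    """Return the shift (in pixels) between the left- and right-pupil
    images that minimizes a sum-of-absolute-differences score."""
    best_shift, best_score = 0, np.inf
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        score = float(np.abs(left[lo:hi] - right[lo - s:hi - s]).sum())
        # Normalize by overlap length so short overlaps are not favored.
        score /= max(1, hi - lo)
        if score < best_score:
            best_score, best_shift = score, s
    return best_shift  # near 0 means in focus under this toy model

# Hypothetical usage: left/right signals from photodiodes 802 and 803.
x = np.linspace(0, 6.28, 64)
left_img = np.sin(x)
right_img = np.roll(np.sin(x), 3)   # defocus modeled as a 3-pixel shift
print(phase_difference(left_img, right_img))  # -> -3 under this convention
```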

[0092] Fig. 9 is a diagram showing the arrangement of focus detection points of the image capturing device (image capturing element 704) according to the second embodiment. Focus adjustment is performed using the output signals of the pixels for phase difference detection which correspond to the mesh areas serving as focus detection points.

[0093] The display unit 705 shown in Fig. 7 displays image capture information and captured images, which allows the user to check their contents.

[0094] Next, the electrical configuration of the image capturing device according to this embodiment will be described.

[0095] Fig. 10 is a diagram showing an example of the electrical configuration of the image capturing device according to the second embodiment. The components shown in Fig. 7 are denoted by the same reference numerals.

[0096] The operation detection unit 1005 shown in Fig. 10 detects an operation performed by the user via a button, switch or rotary dial attached to the camera body 701, and sends a signal corresponding to the substance of the operation to the system control unit 1003. In particular, the operation detection unit 1005 outputs a signal SW1 to the system control unit 1003 when the shutter button is half pressed, and outputs a signal SW2 to the system control unit 1003 when the shutter button is fully pressed. Additionally, the operation detection unit 1005 outputs an SW1 cancellation signal to the system control unit 1003 when the user stops half-pressing the shutter button while holding SW1, and outputs an SW2 cancellation signal to the system control unit 1003 when the user stops fully pressing the shutter button while holding SW2.

[0097] The system control unit 1003 reads a moving image generated by the image capturing element 704 and displays the moving image via the display control unit 1008. This is done to implement the so-called live view display. The display control unit 1008 displays images based on the received moving image on the display unit 705.

[0098] Upon receiving the signal SW1 from the operation detection unit 1005, the system control unit 1003 reads the phase difference information from the pixels for phase difference detection of the image capturing element 704 and performs the focus adjustment calculation. After completing the focus adjustment calculation, the system control unit 1003 sends a lens drive control signal based on the focus adjustment calculation result to the lens drive unit 1010.

[0099] The lens drive unit 1010 moves the lens group 706 on the basis of the lens drive signal received from the system control unit 1003, thus performing a focusing operation.

[0100] The image capturing element 704 converts light entering through the image capturing lens unit 702 into an electric signal to generate image data, and outputs the image data to the system control unit 1003. The system control unit 1003 outputs the image data obtained from the pixels for image generation of the image capturing element 704 to the display control unit 1008, and records (stores) the image data in the image storage device 1007 during main image capture.

[0101] The display control unit 1008 displays an image on the display unit 705 on the basis of the image data received from the system control unit 1003. Here, among the pieces of image data output from the image capturing element 704, image capture for obtaining image data to be written to the image storage device 1007 for saving in response to the signal SW2 is referred to as "main image capture". An image that is not written to the image storage device 1007 and is intended for live view display is referred to as an "auxiliary image".

[0102] The main memory 1004 is a storage device for storing data required in the calculations carried out by the system control unit 1003.

[0103] The diaphragm control unit 1009 drives the diaphragm 707 on the basis of the diaphragm control signal received from the system control unit 1003.

[0104] The shutter control unit 1006 drives the focal plane shutter 703 on the basis of the shutter control signal supplied from the system control unit 1003.

[0105] In addition, the system control unit 1003 carries out the object tracking process in continuous shooting mode and detects the position of the main object from the image data read from the image capturing element 704. Here, the above-mentioned object tracking process has two kinds of algorithms, namely the first object tracking process and the second object tracking process. These two kinds of object tracking processes are performed using the same algorithms as in the first embodiment.

[0106] The system control unit 1003 selects a single focus detection point from the focus detection points illustrated in Fig. 9, on the basis of the result of the first object tracking process. Then the system control unit 1003 sends a lens drive control signal to the lens drive unit 1010 based on the result of the focus adjustment calculation with respect to the selected focus detection point.

[0107] Next, the operation of the image capturing device according to this embodiment in continuous shooting mode will be described.

[0108] Fig. 11 is a flowchart illustrating an example of the procedure of the process performed by the image capturing device according to the second embodiment in continuous shooting mode. The image capturing device according to this embodiment captures still images of a plurality of frames in continuous shooting mode.

[0109] Steps S1101-S1105 represent processing which is performed while holding SW1 and correspond to the preparation operation for continuous shooting. Steps S1106-S1115 represent processing which is performed when the shutter button is fully pressed upon completion of the above preparation operation for continuous shooting and the state transitions to holding SW2.

[0110] When the user half-presses the shutter button and the signal SW1 is output from the operation detection unit 1005 to the system control unit 1003, the process according to the flowchart shown in Fig. 11 starts. Each step will be described below. It is assumed that the live view display starts before the beginning of the flowchart in Fig. 11.

[0111] First, in step S1101, the system control unit 1003 performs the focus adjustment calculation based on the output signals of the pixels for phase difference detection of the image capturing element 704. For example, the system control unit 1003 selects the focus detection point superimposed on the object that presumably exists at the position closest to the image capturing device 700.

[0112] Then, in step S1102, the system control unit 1003 reads image data from the pixels for image generation of the image capturing element 704 and extracts a predetermined area centered on the focus detection point chosen in step S1101. Then the system control unit 1003 writes the image data of the extracted area and the center coordinates of the extracted area within the image data into the main memory 1004. These are used in the subsequent tracking process. Hereinafter, the image data of the extracted area recorded in the main memory 1004 is referred to as "template image data", while the coordinates of the center of the extracted area are referred to as the "immediately preceding object position". Here, as shown in Fig. 12, which will be described below, the template image data is associated with the main object. The system control unit 1003, which extracts the template image data, forms main object determination means configured to determine the main object at the start of continuous shooting.

[0113] Then, in step S1103, the system control unit 1003 performs the exposure calculation using the image data read in step S1102.

[0114] Then, in step S1104, the system control unit 1003 sends control signals to the diaphragm control unit 1009 and the lens drive unit 1010 based on the focus adjustment calculation result obtained in step S1101 and the exposure calculation result obtained in step S1103. Thus, the diaphragm control unit 1009 adjusts the aperture (exposure) on the basis of its control signal, and the lens drive unit 1010 adjusts the focus based on its control signal.

[0115] In step S1105 the system control unit 1003 waits until the user either releases the shutter button (cancelling SW1) or fully presses the shutter button (entering SW2). When the system control unit 1003 receives the SW1 cancellation signal or the signal SW2 from the operation detection unit 1005, the process moves to step S1106.

[0116] After the process moves to step S1106 upon input of a signal cancelling the waiting state in step S1105, the system control unit 1003 determines whether the input signal is the signal SW2. If this determination gives the result that the input signal is not the signal SW2 (it is the SW1 cancellation signal), the continuous shooting process represented by the flowchart in Fig. 11 ends. On the other hand, if the input signal is the signal SW2, the process proceeds to step S1107, where the live view display ends and continuous shooting starts.

[0117] When the process proceeds to step S1107, the system control unit 1003 ends the live view display and sends a control signal to the shutter control unit 1006 to release the shutter and expose the image capturing element 704 to light, thus carrying out main image capture. Then the system control unit 1003 reads the image data generated by the image capturing element 704 and records (stores) it in the image storage device 1007.

[0118] Then, in step S1108, the system control unit 1003 determines whether the holding of SW2 has been cancelled. If the SW2 cancellation signal has been received and, as a result of this determination, it is established that holding SW2 is cancelled, the continuous shooting process represented by the flowchart in Fig. 11 ends. On the other hand, if the SW2 cancellation signal has not been received and holding SW2 is not cancelled, the process proceeds to step S1109.

[0119] In step S1109 the image capturing element 704 first carries out charge accumulation only during a period (first accumulation period) which is shorter than the period required to carry out an appropriate exposure adjustment calculation. After the image capturing element 704 finishes the charge accumulation, the system control unit 1003 reads the first auxiliary image data (auxiliary image data 1) from the pixels for image generation of the image capturing element 704, and also reads the template image data from the main memory 1004. Then the system control unit 1003 determines the correlation between the two pieces of image data, thereby detecting the position of the main object in the auxiliary image data 1. As described above, the same tracking algorithms as in the first embodiment are used. In this embodiment, the image capturing element 704 forms auxiliary image capturing means configured to capture a plurality of pieces of auxiliary image data during the main image capture interval in continuous shooting mode.

[0120] Fig. 12 is a diagram explaining the tracking range in the second embodiment.

[0121] In the first object tracking process according to this embodiment, the area where matching is carried out is set to a range (the tracking range of the first object tracking process) that extends from the immediately preceding object position by one focus detection point in each of the up and down directions and by three focus detection points in each of the left and right directions, as shown in Fig. 12.

[0122] Then, in step S1110, the image capturing element 704 accumulates charges only during a second accumulation period, which is determined by subtracting the first accumulation period from the accumulation period necessary for carrying out an appropriate exposure adjustment calculation. The system control unit 1003 reads the second auxiliary image data (auxiliary image data 2) from the image capturing element 704 and generates combined auxiliary image data obtained by summing auxiliary image data 2 and the auxiliary image data 1 read from the image capturing element 704 in step S1109. The system control unit 1003 carries out the exposure adjustment calculation using this combined auxiliary image data.

[0123] Then, in step S1111, the system control unit 1003 performs the focus control calculation using the output signal from the phase difference detection pixels corresponding to the object position detected in step S1109.
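
Phase difference detection pixels conventionally yield two pupil-divided signals whose lateral shift is proportional to defocus; the patent does not describe this computation, so the following is only a sketch under that conventional assumption (the SAD criterion and the conversion factor are assumed):

```python
import numpy as np

def phase_difference_defocus(sig_a, sig_b, k_conversion, max_shift=32):
    """Estimate defocus from pupil-divided line signals (sketch).

    sig_a, sig_b -- 1D arrays from the A/B phase difference pixels taken
                    at the object position detected in step S1109
                    (assumed longer than max_shift)
    k_conversion -- assumed shift-to-defocus conversion factor
    Picks the relative shift minimizing the sum of absolute differences.
    """
    n = len(sig_a)
    best_shift, best_cost = 0, float("inf")
    for s in range(-max_shift, max_shift + 1):
        a = sig_a[max(0, s):n + min(0, s)]    # sig_a shifted by s
        b = sig_b[max(0, -s):n - max(0, s)]   # aligned slice of sig_b
        cost = float(np.abs(a - b).mean())
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift * k_conversion
```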

[0124] Then, in step S1112, the system control unit 1003 sends control signals to the iris control unit 1009 and the lens drive unit 1010 to perform exposure control and focus control based on the results of the calculations performed in steps S1110 and S1111.

[0125] While the lens drive unit 1010 moves the lens group 706 in step S1112, the system control unit 1003 handles the live view display and the second object tracking process in steps S1113 to S1115.

[0126] First, in step S1113, the system control unit 1003 restarts the live view display. The image capturing element 704 continues the live view display using the combined auxiliary image data only for the first frame, and using image data obtained in a single accumulation, as usual, for the second and subsequent frames. This live view display continues until the capture of a main image is performed again.

[0127] Then, in step S1114, the system control unit 1003 sets the entire combined auxiliary image data obtained in step S1110 as the tracking range and starts detecting the position of the main object in the second object tracking process. Specifically, the system control unit 1003 calculates the correlation between the combined auxiliary image data generated in step S1110 and the template image data stored in the main memory 1004, thereby detecting the position of the main object. This second object tracking process uses the same algorithm as in the first embodiment.

[0128] After starting the second object tracking process in step S1114, the system control unit 1003 determines in step S1115 whether the exposure adjustment and focus adjustment of step S1112 have been completed. If this determination finds that they have not been completed, the system control unit 1003 enters a standby state. If the exposure adjustment and focus adjustment have been completed, the system control unit 1003 ends the live view display and the process returns to step S1107. Note that if step S1115 determines that the exposure adjustment and focus adjustment have been completed, the process returns to step S1107 even if the second object tracking process has not yet been completed. The second object tracking process only has to be completed before the subsequent first object tracking process.

[0129] Here, the first object tracking process in step S1109 is carried out between one main image capture and the next during continuous shooting; thus, the faster this process is, the higher the continuous shooting speed. In contrast, since the second object tracking process in step S1114 is independent of main image capture, it can take its time and perform high-precision tracking. When the second object tracking process is completed during the main image capturing process, the result of the first object tracking process is verified and corrected using its result, and the information on the immediately preceding object position recorded in the main memory 1004 is updated.
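
Putting the above sketches together, the overall control flow of steps S1107-S1115 (including the update of paragraph [0129]) might look like the following. This reuses the sketch functions defined earlier; the `camera` object and all of its methods are hypothetical stand-ins for units 1003-1010 and are not taken from the patent:

```python
def continuous_shooting_loop(camera, memory):
    """Simplified control flow of steps S1107-S1115 (sketch only;
    assumes track_object, first_tracking_range, combine_and_meter and
    phase_difference_defocus from the earlier sketches are in scope)."""
    while camera.sw2_held():                              # S1108
        camera.capture_main_image()                       # S1107
        aux1 = camera.accumulate(camera.first_period)     # S1109: short read
        window = first_tracking_range(memory["prev_pos"],
                                      fp_pitch=(64, 64),
                                      image_shape=aux1.shape)
        pos_fast, _ = track_object(aux1, memory["template"], window)
        aux2 = camera.accumulate(camera.required_period
                                 - camera.first_period)   # S1110: remainder
        combined, ev = combine_and_meter(aux1, aux2)
        sig_a, sig_b = camera.af_signals_at(pos_fast)
        defocus = phase_difference_defocus(sig_a, sig_b,
                                           k_conversion=camera.k_af)  # S1111
        camera.start_adjustment(ev, defocus)              # S1112 (runs async)
        camera.restart_live_view(combined)                # S1113
        full = (0, combined.shape[0], 0, combined.shape[1])
        pos_precise, _ = track_object(combined,
                                      memory["template"], full)  # S1114
        camera.wait_adjustment_done()                     # S1115
        memory["prev_pos"] = pos_precise                  # [0129]: update
```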

[0130] In addition, in this embodiment, because the first of the plurality of pieces of auxiliary image data is read out at a relatively early stage as described above, processing time for the object tracking process can be secured. This first auxiliary image data and the auxiliary image data obtained thereafter are combined, and the resulting combined auxiliary image data is used in the exposure control calculation, whose processing time is relatively short. It is thus possible both to secure time for the object tracking process and to perform the exposure control calculation using image data with an adequate accumulation period.

[0131] Furthermore, by carrying out the detailed object tracking process using the combined data in parallel with the main image capturing process, it is possible to correct the object position and to properly set the tracking range for the next object tracking process.

[0132] Although a preferred embodiment of the present invention has been described above, the present invention is not limited to this embodiment, and various modifications and changes are possible within the scope of its essence. For example, although the description concerned an example of a mirrorless digital single-lens camera with a detachable image capturing lens unit, a similar configuration can be applied to a camera in which the image capturing lens unit and the camera body are integrated.

[0133] Other embodiments

Additionally, the present invention can also be implemented by executing the following process.

Specifically, this is a process in which software (a program) implementing the functions of the above-described embodiments is supplied to a system or device via a network or various storage media, and a computer (or CPU, MPU, etc.) of the system or device reads and executes the program. This program, and a computer-readable non-volatile recording medium storing the program, are included in the present invention.

[0134] The present invention is not limited to the above embodiments, and various modifications and changes are possible without departing from the spirit and scope of the present invention. Accordingly, the following claims are attached to clarify the scope of the present invention.

[0135] This application claims priority from Japanese patent application No. 2011-059425, filed on March 17, 2011, which is hereby incorporated into this description by reference in its entirety.

List of reference numerals

[0136] 201 camera body

202 image capturing lens unit

303 system control unit

304 AE image processing unit

307 main memory

308 operation detection unit

309 mirror control unit

310 shutter control unit

311 image storage device

312 display control unit

313 iris control unit

314 lens drive unit

1. An image capturing device that captures a plurality of main images in continuous shooting mode, the image capturing device comprising:
image capturing means configured to capture a plurality of auxiliary images during the interval between the capture of a main image and the capture of the next main image;
main object determining means configured to determine a main object;
first object tracking processing means configured to detect an area where an object identical to the main object exists, from a first area which is part of a first auxiliary image from the plurality of auxiliary images; and
second object tracking processing means configured to detect an area where an object identical to the main object exists, from a second area of a second auxiliary image from the plurality of auxiliary images, the second area being larger than the first area,
wherein the result of the detection performed by the first object tracking processing means is used in the focus adjustment performed before the capture of the next main image, and the result of the detection performed by the second object tracking processing means is used in detecting an area where an object identical to the main object exists, this detection being carried out after the capture of the next main image.

2. The image capturing device according to claim 1, wherein the detection of an area where an object identical to the main object exists, carried out after the capture of the next main image, is performed by the first object tracking processing means.

3. The image capturing device according to claim 2, wherein the first object tracking processing means sets the first area based on the result of the detection performed by the second object tracking processing means.

4. The image capturing device according to any one of claims 1-3, further comprising: combining means configured to combine the plurality of auxiliary images; and exposure adjusting means configured to perform the exposure control calculation of the image capturing device using the auxiliary image obtained as a result of the combination carried out by the combining means.

5. The image capturing device according to claim 1, further comprising focus adjusting means configured to perform focus control in an area that is highly correlated with the main object and is detected by the first object tracking processing means.

6. The image capturing device according to claim 1, wherein the auxiliary image used by the second object tracking processing means has a resolution higher than that of the auxiliary image used by the first object tracking processing means.

7. The image capturing device according to claim 1, wherein the image capturing means configured to capture the plurality of auxiliary images includes a second image capturing element, configured to generate the plurality of auxiliary images, separate from a first image capturing element configured to generate the main images.

8. The image capturing device according to claim 1, further comprising:
a mirror configured to direct the light flux that has passed through the lens group to the second image capturing element,
wherein the first object tracking processing means detects the position of the main object during the interval between the mirror returning onto the optical path of main image capture after the capture of a main image and the mirror retreating from that optical path before the capture of the next main image.

9. The image capturing device according to claim 1, wherein the capture of the next main image is performed while the second object tracking processing means continues detecting the area highly correlated with the main object.

10. A method of controlling an image capturing device that captures a plurality of main images in continuous shooting mode, the method comprising:
an image capturing step of capturing a plurality of auxiliary images during the interval between the capture of a main image and the capture of the next main image;
a main object determining step of determining a main object;
a first object tracking processing step of detecting an area where an object identical to the main object exists, from a first area which is part of a first auxiliary image from the plurality of auxiliary images; and
a second object tracking processing step of detecting an area where an object identical to the main object exists, from a second area of a second auxiliary image from the plurality of auxiliary images, the second area being larger than the first area,
wherein the result of the detection performed in the first object tracking processing step is used in the focus adjustment performed before the capture of the next main image, and the result of the detection performed in the second object tracking processing step is used in detecting an area where an object identical to the main object exists, this detection being carried out after the capture of the next main image.



 
