Image processing device and image processing method

FIELD: physics, computer engineering.

SUBSTANCE: the group of inventions relates to image processing technologies. An image processing device performs reconstruction processing for correcting image quality deterioration due to aberration in an image-forming optical system. The image processing device comprises a dividing means for dividing image data of the colours of colour filters into image data of the corresponding colours of the colour filters. The device also includes a plurality of image processing means, each designed to perform reconstruction processing by filtering the image data of one of the corresponding colours divided by said dividing means.

EFFECT: fewer false colours resulting from image reconstruction processing of a RAW image, as well as a reduced image reconstruction processing load.

10 cl, 33 dwg

 

Background of the invention

The technical field to which the invention relates

[0001] The present group of inventions relates to an image processing device and an image processing method, and more particularly to an image processing device and method that correct a degraded image by using image reconstruction processing.

Description of the related art

[0002] Since the digitization of information makes it possible to handle an image as signal values, various correction processing methods for a read image have been proposed. When an object is read and imaged by a digital camera, the resulting image is degraded to some degree. The deterioration in image quality is caused, in particular, by aberrations of the image-forming optical system used to form the image of the object.

[0003] The causes of blur components in an image include spherical aberration, coma, field curvature, and astigmatism of the optical system. Each of these aberration-induced blur components indicates that a light beam emerging from one point of the object, which in the absence of any aberration or influence of diffraction should converge to a single point on the image-forming plane, instead forms an image with a spread. This state is called a PSF (point spread function) from the optical point of view, but is called a blur component herein from the image point of view. Blur in an image may indicate an out-of-focus image, but the term is used herein for an image blurred by the influence of the above aberrations of the optical system even when it is in focus. In addition, colour fringing (or fringes) in a colour image due to axial chromatic aberration, chromatic spherical aberration, and chromatic coma of the optical system can be regarded as different ways of blurring at different wavelengths.

[0004] An OTF (optical transfer function), obtained by the Fourier transform of the PSF, is frequency-component information of the aberration and is expressed by a complex number. The absolute value of the OTF, i.e. the amplitude component, is hereinafter called the MTF (modulation transfer function), and the phase component is hereinafter called the PTF (phase transfer function). The MTF and PTF are thus the frequency characteristics of, respectively, the amplitude component and the phase component of the image quality deterioration due to aberration. The phase component is represented here as a phase angle:

PTF = tan⁻¹(Im(OTF)/Re(OTF)) (1)

where Re(OTF) and Im(OTF) represent the real part and the imaginary part of the OTF, respectively.
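Equation (1) and the MTF definition can be sketched in a few lines of Python; the function names `ptf` and `mtf` are illustrative and not part of the disclosure:

```python
import cmath

def ptf(otf_value: complex) -> float:
    """Phase transfer function value (radians) of one OTF sample,
    i.e. the phase angle atan2(Im(OTF), Re(OTF)) from equation (1)."""
    return cmath.phase(otf_value)

def mtf(otf_value: complex) -> float:
    """Modulation transfer function value: the magnitude |OTF|."""
    return abs(otf_value)

# Example: an OTF sample of 0.6 + 0.0j has no phase deterioration
# (PTF = 0) and an amplitude attenuation to 0.6 (MTF = 0.6).
```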

[0005] As described above, the OTF of the image-forming optical system degrades the amplitude component and the phase component of the image. For this reason, the degraded image is asymmetrically blurred at each point of the object, as with coma.

[0006] In addition, there is chromatic aberration of magnification: the imaging position shifts due to differences in imaging magnification at different wavelengths of light, and the image pickup device detects these shifts as shifts of the R, G, and B color components according to its spectral characteristics. Image spread arises from changes of the imaging positions at the different wavelengths within each color component, i.e. phase shifts, as well as from shifts of the imaging position between the R, G, and B components. More precisely, therefore, chromatic aberration of magnification is not simply color fringing due to horizontal shifts. Nonetheless, color fringing is used herein as a synonym of chromatic aberration of magnification.

[0007] As a method of correcting the deterioration in amplitude (MTF) and phase (PTF), there is a method that corrects them by using information on the OTF of the image-forming optical system. This method is called "image recovery" or "image reconstruction". Processing that corrects degradation in an image by using information on the OTF of the image-forming optical system is hereinafter called image reconstruction processing.

[0008] The outline of image reconstruction processing is as follows. Let g(x, y) be the degraded image, f(x, y) the original image, and h(x, y) the PSF obtained by the inverse Fourier transform of the optical transfer function; then equation (2) below holds:

g(x, y)=h(x, y)*f(x, y) (2),

where * represents convolution, and (x, y) represents coordinates on the image.
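Equation (2) can be illustrated with a discrete convolution. The disclosure's equation is two-dimensional; this reduced one-dimensional sketch only shows how a point source is spread into the shape of the PSF:

```python
def convolve1d(f, h):
    """Full discrete convolution of a signal f with a kernel h (pure Python),
    the 1-D analogue of g(x, y) = h(x, y) * f(x, y) in equation (2)."""
    n = len(f) + len(h) - 1
    g = [0.0] * n
    for i, fv in enumerate(f):
        for j, hv in enumerate(h):
            g[i + j] += fv * hv
    return g

f = [0.0, 0.0, 1.0, 0.0, 0.0]   # a single bright point in the original image
h = [0.25, 0.5, 0.25]           # a 3-tap blur PSF with unit sum
g = convolve1d(f, h)            # degraded signal: the point spreads into the PSF shape
```

Because the kernel has unit sum, the total signal energy is preserved; only its spatial distribution is degraded.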

[0009] When this equation is converted into a representation on the frequency plane by the Fourier transform, it takes the form of a product at each frequency, as shown in equation (3):

G(u, v) = H(u, v)·F(u, v) (3),

where H is the function obtained by the Fourier transform of the PSF and therefore represents the OTF, and (u, v) represents coordinates on the two-dimensional frequency plane, i.e. the frequency.

[0010] In other words, to obtain the original image from the read degraded image, both sides of equation (3) can be divided by H, as shown in equation (4) below.

G(u, v)/H(u, v)=F(u, v) (4)

By returning F(u, v) to the real plane through the inverse Fourier transform, the original image f(x, y) can be obtained as the reconstructed image.
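The frequency-domain restoration of equations (3) and (4) can be sketched with a naive discrete Fourier transform. This is a pure-Python illustration assuming circular convolution, no noise, and an OTF with no zeros; a real implementation would use an FFT:

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(n^2); shown only for illustration)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * u * k / n) for k in range(n))
            for u in range(n)]

def idft(X):
    """Inverse DFT returning complex values."""
    n = len(X)
    return [sum(X[u] * cmath.exp(2j * cmath.pi * u * k / n) for u in range(n)) / n
            for k in range(n)]

def restore(g, h):
    """Inverse filtering per equation (4): F = G / H at each frequency,
    then back to the spatial domain.  Assumes no noise and H without zeros."""
    G, H = dft(g), dft(h)
    return [v.real for v in idft([Gu / Hu for Gu, Hu in zip(G, H)])]

# Degrade a point source by circular convolution with a blur PSF, then restore.
f = [0.0, 1.0, 0.0, 0.0]
h = [0.6, 0.2, 0.0, 0.2]     # unit-sum blur kernel whose DFT has no zeros
n = len(f)
g = [sum(h[j] * f[(k - j) % n] for j in range(n)) for k in range(n)]
f_rec = restore(g, h)        # recovers f up to rounding error
```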

[0011] Letting R(x, y) be the value obtained by applying the inverse Fourier transform to equation (4), the original image can also be obtained by performing convolution processing for the image on the real plane, as shown in equation (5):

g(x, y)*R(x, y)=f(x, y) (5),

where R(x, y) is called an image reconstruction filter. An actual image, however, includes noise components. For this reason, using an image reconstruction filter formed by taking the ideal inverse function of the OTF in the above manner amplifies the noise components together with the degraded image, so in general a good image cannot be obtained. In this regard, a known method, such as one using a Wiener filter, suppresses the reconstruction gain on the high-frequency side of the image in accordance with the intensity ratio between the image signal and the noise signal. As a method of correcting the deterioration of the color fringing component of an image, for example, the deterioration is corrected by correcting the above blur components so that the amounts of blur become uniform for the respective color components of the image.
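The Wiener-type suppression mentioned above can be sketched as a per-frequency gain. The formula below is a standard textbook form shown for illustration; the disclosure does not fix this exact expression:

```python
def wiener_gain(H: complex, snr: float) -> complex:
    """Frequency response of a Wiener-type reconstruction filter:
    conj(H) / (|H|^2 + 1/SNR).  With a high SNR this approaches the
    ideal inverse 1/H; with a low SNR the gain is suppressed so that
    noise components are not amplified."""
    return H.conjugate() / (abs(H) ** 2 + 1.0 / snr)

# At a frequency where the optical system attenuates the signal to 10 %:
H = complex(0.1, 0.0)
strong_signal = abs(wiener_gain(H, snr=1000.0))  # close to the ideal inverse 10
weak_signal = abs(wiener_gain(H, snr=1.0))       # strongly suppressed gain
```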

[0012] In this case, since the OTF varies in accordance with the image pickup conditions, such as the zoom position and the aperture diameter, the image reconstruction filter used for image reconstruction processing must be changed accordingly.

[0013] For example, Japanese Patent Laid-Open No. 2006-238032 discloses image reconstruction processing performed by setting a PSF with a small spread after image restoration. Japanese Patent No. 03532368 discloses a technique of eliminating image blur, in an endoscope for observing the inside of a living organism, in a range outside the in-focus range of the image pickup means by using a PSF corresponding to the luminescence wavelength to be used. Since the luminescence is weak, an objective optical system with a small f-number is required, which reduces the depth of focus. This technique is therefore designed to obtain an in-focus image by performing image reconstruction processing for the range in which the optical system is out of focus.

[0014] As described above, performing image reconstruction processing on a read input image can improve image quality by correcting aberrations.

[0015] Methods of image reconstruction processing include a method of applying image reconstruction processing to a RAW image in which each pixel has a signal corresponding to one color component, namely one of the R, G, and B color components, and a method of applying image reconstruction processing to each color plane after performing interpolation so that each pixel has signals corresponding to all the color components, namely the R, G, and B color components.

[0016] The method of applying image reconstruction processing to each color plane exceeds the method of applying it to a RAW image in both the number of pixels subjected to image reconstruction processing and the number of taps of the reconstruction filter. This leads to a significant increase in the image reconstruction processing load.

[0017] In general, the color components of the respective pixels constituting a RAW image are often arranged according to the Bayer layout, for example, as shown in Fig. 2. In this case, the number of pixels of the G component exceeds the number of pixels of the R or B component. For this reason, the frequency characteristic of the pixel layout of the G component in the RAW image differs from that of the pixel layouts of the R and B components. Since, as described above, image reconstruction processing is equivalent to correction of frequency characteristics, the frequency band of the G component differs from the frequency band of the R component or the B component. In this case, the G component can be restored in a higher frequency band than the R component or the B component. If only the G component among the R, G, and B components is restored up to the high-frequency band, image reconstruction processing sometimes generates, in an area of the image that includes high-frequency components, false colors not present in the original image. This is because the ratio between the frequency characteristics of the R, G, and B components in the high-frequency band of the image changes between before and after image reconstruction processing. As described above, performing image reconstruction processing on signal components in different frequency bands generates false color. The false color in this case is formed by changes in the pixel data read by the image sensor, unlike the false color generated by pixel interpolation for an image in the Bayer layout. Consequently, even a pixel interpolation algorithm designed to suppress the formation of false colors cannot suppress false color generated by image reconstruction processing.

Disclosure of invention

[0018] The present invention was created in view of the above circumstances and reduces the false color generated by image reconstruction processing in a RAW image, and also reduces the image reconstruction processing load.

[0019] According to the present invention, there is provided an image processing device for performing reconstruction processing on image data to correct image quality deterioration due to aberration in an image-forming optical system, the image data being obtained by reading an image of an object that has passed through the image-forming optical system using an image sensor having multiple pixels, each pixel of the image sensor being covered with one of color filters of multiple colors, the image processing device comprising: a dividing means for dividing the image data of the multiple colors of the color filters into image data of the respective colors of the color filters; multiple image processing means, each designed to perform reconstruction processing by filtering the image data of one of the respective colors divided by the dividing means; and an interpolation processing means for performing color interpolation processing on each pixel of the image data subjected to the reconstruction processing, wherein the dividing means is further configured to divide image data of one color, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another color because of the layout of the color filters of the multiple colors, into multiple image data of said one color, so that each of the multiple image data of said one color has the same spatial frequency characteristic as the image data of the other color, and the interpolation processing means is further configured to perform color interpolation processing by using the multiple image data of said one color as if they were image data of one color.

[0020] Further, according to the present invention, there is provided an image processing device for performing reconstruction processing on image data to correct image quality deterioration due to aberration in an image-forming optical system, the image data being obtained by reading an image of an object that has passed through the image-forming optical system using an image sensor having multiple pixels, each pixel of the image sensor being covered with one of color filters of multiple colors, the image processing device comprising: multiple image processing means arranged in series, each designed to perform reconstruction processing by filtering a portion of the image data of the multiple colors; and an interpolation processing means for performing color interpolation processing on each pixel of the image data subjected to the reconstruction processing, wherein each of the multiple image processing means is configured to perform reconstruction processing on the image data of each color that has not yet been subjected to reconstruction processing, handling image data of one color, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another color because of the layout of the color filters of the multiple colors, separately as image data of multiple colors, so that each portion of the image data that is handled separately as image data of multiple colors and processed by the multiple image processing means has the same frequency characteristic as the image data of the other color, and the interpolation processing means is further configured to perform color interpolation processing by using the multiple image data as if they were image data of one color.

[0021] In addition, according to the present invention, there is provided an image processing method for performing reconstruction processing on image data to correct image quality deterioration due to aberration in an image-forming optical system, the image data being obtained by reading an image of an object that has passed through the image-forming optical system using an image sensor having multiple pixels, each pixel of the image sensor being covered with one of color filters of multiple colors, the method comprising: a dividing step of dividing the image data of the multiple colors of the color filters into image data of the respective colors; an image processing step of performing reconstruction processing by filtering each of the image data of the respective colors divided in the dividing step; and an interpolation processing step of performing color interpolation processing on each pixel of the image data subjected to the reconstruction processing, wherein in the dividing step, image data of one color, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another color because of the layout of the color filters of the multiple colors, is divided into multiple image data of said one color so that each of the multiple image data of said one color has the same spatial frequency characteristic as the image data of the other color, and in the interpolation processing step, color interpolation is performed by using the multiple image data of said one color as if they were image data of one color.

[0022] Additionally, according to the present invention, there is provided an image processing method for performing reconstruction processing on image data to correct image quality deterioration due to aberration in an image-forming optical system, the image data being obtained by reading an image of an object that has passed through the image-forming optical system using an image sensor having multiple pixels, each pixel of the image sensor being covered with one of color filters of multiple colors, the method comprising: an image processing step of sequentially performing reconstruction processing by filtering each portion of the image data corresponding to the multiple colors using multiple corresponding image processing means arranged in series; and an interpolation processing step of performing color interpolation processing on each pixel of the image data subjected to the reconstruction processing, wherein in the image processing step, each of the multiple image processing means performs reconstruction processing on the image data of each color that has not yet been subjected to reconstruction processing, handling image data of one color, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another color because of the layout of the color filters of the multiple colors, separately as image data of multiple colors, so that each portion of the image data that is handled separately as image data of multiple colors and processed by the multiple image processing means has the same frequency characteristic as the image data of the other color, and in the interpolation processing step, color interpolation is performed by using the multiple image data as if they were image data of one color.

[0023] Additional features of the present invention will become clear from the following description of embodiments of the invention (with reference to the accompanying drawings).

Brief description of drawings

[0024] the Accompanying drawings, which are contained in and form part of the specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention.

[0025] Fig. 1 is a block diagram showing the layout of an image pickup device according to an embodiment of the present invention;

[0026] Fig. 2 is a view showing an example of the arrangement of color components;

[0027] Fig. 3 is a block diagram showing the layout of the image processing module according to the first embodiment;

[0028] Fig. 4A-4E are views showing color components and image reconstruction components according to the first embodiment;

[0029] Fig. 5A and 5B are views showing the frequency characteristics defined for the color components according to the first embodiment;

[0030] Fig. 6 is a flowchart of image reconstruction processing according to the first embodiment;

[0031] Fig. 7A and 7B are schematic views for explaining an image reconstruction filter;

[0032] Fig. 8A and 8B are schematic views for explaining the image reconstruction filter according to the first embodiment;

[0033] Fig. 9A and 9B are graphs showing an example of the reconstruction gain of the image reconstruction filter and an example of the MTF of the image according to the first embodiment;

[0034] Fig. 10A-10K are views showing an example of color interpolation processing according to the first embodiment;

[0035] Fig. 11A and 11B are views showing an example of the pixel arrangement of another image sensor according to the first embodiment;

[0036] Fig. 12 is a view showing other color components and image reconstruction components according to the first embodiment;

[0037] Fig. 13 is a block diagram showing the layout of the image processing module according to the second embodiment; and

[0038] Fig. 14 is a flowchart showing image reconstruction processing according to the second embodiment.

Implementation of the invention

[0039] Embodiments of the present invention are described in detail below with reference to the accompanying drawings.

[0040] First embodiment

Fig. 1 shows an example of the basic layout of an image pickup device according to an embodiment of the present invention. An image of an object (not shown) is formed on the image sensor 102 after passing through the image-forming optical system 101, which includes the aperture 101a and the focusing lens 101b. The image sensor 102 is covered, for example, with color filters arranged in the so-called Bayer layout shown in Fig. 2. Each pixel constituting the image sensor 102 outputs the color component signal corresponding to the color filter, from among the red (R), green (G), and blue (B) color filters, with which that pixel is covered. The image sensor 102 converts the light forming the image into an electrical signal. The analog-to-digital converter 103 converts this signal into a digital signal and inputs it to the image processing module 104. The image processing module 104 consists of the image reconstruction processing module 111 and another image processing module 112, which performs predetermined processing. The module 112 performs, among other things, color interpolation processing. Each pixel of the image output by the image reconstruction processing module 111 includes only the color component signal corresponding to a single color filter. For this reason, the image processing module 112 performs color interpolation processing on the reconstructed image so that each pixel is assigned the signals of the color components corresponding to all the filter colors.

[0041] First of all, the image processing module 104 obtains information about the image pickup conditions of the image pickup device from the condition detection module 107. The condition detection module 107 may obtain the image pickup condition information directly from the system controller 110, or may receive the image pickup condition information, for example regarding the image-forming optical system, from the optical system control module 106. The image reconstruction processing module 111 then selects the image reconstruction filter corresponding to the image pickup conditions from the storage module 108 and performs image reconstruction processing on the image input to the image processing module 104. The data held in the storage module 108 may be information related to the OTF required for forming the image reconstruction filter, rather than image reconstruction filters themselves. In this case, the image reconstruction processing module 111 selects the OTF information corresponding to the image pickup conditions from the storage module 108 and generates the image reconstruction filter corresponding to the image pickup conditions. The image reconstruction processing module 111 then performs image reconstruction processing on the image input to the image processing module 104.

[0042] The image recording medium 109 holds the output image processed by the image processing module 104 in a predefined format. The display module 105 may display an image obtained by applying predetermined display processing to the image subjected to image reconstruction processing, or may display an image that has not been subjected to image reconstruction processing or that has been subjected to simple reconstruction processing.

[0043] The system controller 110 performs overall sequence control. The optical system control module 106 mechanically actuates the image-forming optical system in accordance with instructions from the system controller 110.

[0044] The system controller 110 controls the diameter of the aperture 101a, set as the image pickup condition of the f-number. An autofocus (AF) mechanism or a manual focusing mechanism controls the position of the focusing lens 101b so as to adjust the focus according to the distance to the object. The image-forming optical system 101 may include an optical element such as a low-pass filter or an infrared cut filter. When an element that affects the OTF characteristics, such as a low-pass filter, is used, the change in the OTF caused by the optical element must be taken into account when forming the image reconstruction filter. An infrared cut filter also affects the PSFs of the RGB channels, each of which is the integral of the PSFs of the spectral wavelengths, and in particular the PSF of the R channel. Therefore, the change in the PSF due to the infrared cut filter is taken into account when forming the image reconstruction filter.

[0045] Additionally, the image-forming optical system 101 is shown as part of the image pickup device, but may be removable, as in a single-lens reflex camera.

[0046] Fig. 3 shows the layout of the image processing module 104 according to the first embodiment. As described above, the image input to the image reconstruction processing module 111 is RAW data in which each pixel has one of the color components, namely the R, G, and B color components, according to the Bayer layout shown, for example, in Fig. 2.

[0047] In the first embodiment, the signal division module 1101 of the image reconstruction processing module 111 divides the G component into G1 and G2 to obtain four image reconstruction components: R, G1, G2, and B. The four image reconstruction components are then input to the reconstruction filter application modules 1110-1113, which apply image reconstruction filters to the components.

[0048] Fig. 4A-4E show an example of each color component in the RAW data and of each image reconstruction component. Fig. 4A-4C show the three color components in the RAW data: Fig. 4A shows the G component, Fig. 4B the R component, and Fig. 4C the B component. Each pixel represented by a white square in Fig. 4A-4E holds the corresponding color component. In the first embodiment, the G component shown in Fig. 4A is divided into the G1 and G2 components shown in Fig. 4D and 4E, and image reconstruction processing is applied. Assume that a G component signal output from a pixel that is horizontally adjacent to a pixel outputting an R component signal is a G1 component signal, and a G component signal output from a pixel adjacent to a pixel outputting a B component signal is a G2 component signal. Among Fig. 4A-4E, Fig. 4B shows the R image reconstruction component, Fig. 4C the B image reconstruction component, Fig. 4D the G1 image reconstruction component, and Fig. 4E the G2 image reconstruction component.
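The assignment of G pixels to G1 and G2 can be sketched as a coordinate rule. This assumes an RGGB phase of the Bayer layout (R at even row/even column); other phases shift the pattern:

```python
def bayer_component(row: int, col: int) -> str:
    """Component of the pixel at (row, col) for an RGGB Bayer phase, with
    G split into G1 (G sharing a row with R) and G2 (G sharing a row
    with B), as in Figs. 4D and 4E.  The RGGB phase is an assumption."""
    if row % 2 == 0:                    # rows containing R: R G1 R G1 ...
        return "R" if col % 2 == 0 else "G1"
    else:                               # rows containing B: G2 B G2 B ...
        return "G2" if col % 2 == 0 else "B"

# The four reconstruction components tile every 2x2 cell of the sensor:
# (0,0)=R   (0,1)=G1
# (1,0)=G2  (1,1)=B
```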

[0049] Fig. 5A and 5B are views showing the spatial frequency characteristics defined for the color components by the pixel layout of the image sensor. The respective components shown in Fig. 4A-4E are represented by m_G(x, y), m_R(x, y), m_B(x, y), m_G1(x, y), and m_G2(x, y), where 1 represents each pixel (shown as a white square) that can read light of the component and 0 represents each pixel (shown as a black square) that cannot. The spatial frequency characteristics shown in Fig. 5A and 5B correspond to the data obtained by the Fourier transforms of m_G(x, y), m_R(x, y), m_B(x, y), m_G1(x, y), and m_G2(x, y).

[0050] Fig. 5A shows the spatial frequency characteristic of the G component in Fig. 4A, which is a comb function whose values of 1 (one) are present only at the positions marked "•". Fig. 5B shows the spatial frequency characteristics of the R and B components shown in Fig. 4B and 4C; it differs from Fig. 5A, which shows the spatial frequency characteristic of the G component. In addition, the spatial frequency characteristics obtained when the G component is divided into the G1 and G2 image reconstruction components are identical to the spatial frequency characteristics of the R and B components shown in Fig. 5B.

[0051] When image reconstruction processing is performed directly on the three color components R, G, and B, then, since the spatial frequency characteristic of the G component differs from those of the R and B components as shown in Fig. 5A and 5B, false colors that do not exist in the original image can be generated in an area of the image that includes high-frequency components, as described above. In contrast, dividing the G component into the G1 and G2 image reconstruction components causes the pixel layouts of the four image reconstruction components R, G1, G2, and B to show the same spatial frequency characteristic. This makes it possible to perform image reconstruction processing for a common frequency band, and thus to suppress the generation of false color by image reconstruction processing.

[0052] When image reconstruction processing is performed directly on the three color components R, G, and B, it is possible, depending on the method of forming the image reconstruction filter to be applied to the G component, to make the frequency band of the G component to be corrected coincide with the frequency band of the R and B components. However, the frequency band to be restored by that processing is equivalent to the frequency band restored when the G component is divided into the G1 and G2 image reconstruction components. Dividing the G component into the G1 and G2 image reconstruction components is preferable from the viewpoint of the computational load of convolving the image reconstruction filters, as described below.

[0053] The procedure of image reconstruction processing performed by the image reconstruction processing module 111 in the first embodiment is described in detail below with reference to the flowchart of Fig. 6.

[0054] At step S11, the image reconstruction processing module 111 obtains information about the actual image pickup conditions from the condition detection module 107 as described above. The image pickup conditions include, for example, the zoom position, the aperture diameter, and the distance to the object. At step S12, the signal division module 1101 divides the RAW data consisting of the R, G, and B components into the four image reconstruction components R, G1, G2, and B. More specifically, four copies of the image data may be prepared, for R, G1, G2, and B respectively, in each of which 0 (zero) is set in every pixel corresponding to a color component other than the target image reconstruction component. Alternatively, four copies of the image data may be prepared, for R, G1, G2, and B respectively, each of 1/4 size, obtained by decimating the pixels corresponding to the color components other than the target image reconstruction component.
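The two variants of step S12 can be sketched as follows; this is an illustration assuming an RGGB Bayer phase, and the helper names are ours:

```python
def split_zero_filled(raw):
    """Variant 1 of step S12: four full-size copies of the RAW data, where
    every pixel not belonging to the target component is set to 0.
    Assumes RGGB phase: R even/even, G1 even/odd, G2 odd/even, B odd/odd."""
    h, w = len(raw), len(raw[0])
    planes = {name: [[0] * w for _ in range(h)] for name in ("R", "G1", "G2", "B")}
    for r in range(h):
        for c in range(w):
            name = ("R" if c % 2 == 0 else "G1") if r % 2 == 0 else \
                   ("G2" if c % 2 == 0 else "B")
            planes[name][r][c] = raw[r][c]
    return planes

def split_decimated(raw):
    """Variant 2 of step S12: four quarter-size planes obtained by
    decimating the pixels of the other components."""
    return {
        "R":  [row[0::2] for row in raw[0::2]],
        "G1": [row[1::2] for row in raw[0::2]],
        "G2": [row[0::2] for row in raw[1::2]],
        "B":  [row[1::2] for row in raw[1::2]],
    }
```

Both variants give each of the four components the same pixel layout, which is what makes a common restoration frequency band possible.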

[0055] At step S13, the image reconstruction processing module 111 selects, from the storage module 108, the image reconstruction filters suitable for the detected image pickup conditions and for the four image reconstruction components R, G1, G2, and B. At this time, the selected image reconstruction filters can be corrected if necessary. This is an operation of discretely preparing image pickup condition data in advance, in order to reduce the amount of image reconstruction filter data held beforehand in the storage module 108, and then correcting the image reconstruction filters during the actual image reconstruction processing. In addition, if the storage module 108 holds the OTF information required for forming the image reconstruction filters rather than the image reconstruction filters themselves, the image reconstruction filters are formed from the selected OTF information in accordance with the image pickup conditions.
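The selection of a stored filter by image pickup conditions can be sketched as a nearest-neighbour lookup; the table layout and the simple squared-distance metric are illustrative assumptions standing in for the discrete preparation and correction described above:

```python
def select_filter(filters, zoom, aperture, distance):
    """Pick the stored reconstruction filter whose (zoom, aperture,
    object-distance) key is nearest to the detected image pickup
    conditions.  `filters` maps condition tuples to filter data; both
    the key layout and the metric are illustrative assumptions."""
    def condition_distance(key):
        kz, ka, kd = key
        return (kz - zoom) ** 2 + (ka - aperture) ** 2 + (kd - distance) ** 2
    return filters[min(filters, key=condition_distance)]

# Hypothetical filter table keyed by (zoom position, f-number, distance):
table = {(1.0, 2.8, 1.0): "filter_A", (2.0, 4.0, 3.0): "filter_B"}
chosen = select_filter(table, zoom=1.9, aperture=4.1, distance=2.5)
```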

[0056] The following describes the image reconstruction filter.

[0057] Fig. 7A and 7B are schematic views showing an example of an image reconstruction filter to be applied to each colour plane of an image in which each pixel contains all the colour components R, G and B. The number of taps of the image reconstruction filter can be determined in accordance with the aberration amount of the optical image-forming system. In this case, a two-dimensional filter with 11×11 taps is used. Since the filter is convolved with the image in the image reconstruction process, each tap of the filter corresponds to one pixel of the image. As shown in Fig. 7A, using as the image reconstruction filter a two-dimensional filter divided into 100 or more taps makes it possible to correct aberrations that spread widely from the image-formation position, such as spherical aberration, coma, axial chromatic aberration and off-axis colour flare of the optical image-forming system.

[0058] Fig. 7A does not show the values in the respective taps. Fig. 7B shows one cross-section of this filter. The image reconstruction filter can be formed by the method described above, i.e. by calculating or measuring the OTF of the optical elements of the optical image-forming system and performing an inverse Fourier transform of its inverse function. In general, since the influence of noise must be considered, a method of forming a Wiener filter or a related reconstruction filter may be selectively used. Using a Wiener filter makes it possible to restore the phase degradation (PTF) and the amplitude degradation (MTF) to different levels in the respective frequency bands. In addition, the OTF can incorporate factors that degrade the OTF with respect to the input image other than the optical image-forming system itself. For example, a low-pass filter suppresses the high-frequency components of the frequency characteristic of the OTF. The shape and aperture ratio of the pixel apertures of the image sensor also affect the frequency characteristic, as do the spectral characteristics of the light source and the spectral characteristics of filters of various wavelengths. It is preferable to form the image reconstruction filters on the basis of an OTF in a broad sense that covers these possible factors contributing to degradation of the OTF characteristics.
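The construction described in this paragraph (a Wiener-type frequency-domain filter built from the OTF, then an inverse Fourier transform to obtain the spatial filter taps) can be sketched as follows. The noise-to-signal constant `nsr` and the function name are illustrative assumptions, not part of the patent.

```python
import numpy as np

def wiener_kernel(otf, nsr=0.01):
    """Frequency-domain Wiener filter H* / (|H|^2 + NSR), transformed
    back to a spatial convolution kernel by an inverse FFT.
    `nsr` (noise-to-signal ratio) is a hypothetical tuning constant."""
    freq_filter = np.conj(otf) / (np.abs(otf) ** 2 + nsr)
    kernel = np.fft.ifft2(freq_filter).real
    return np.fft.fftshift(kernel)  # centre the kernel taps

# Toy sanity check: a degradation-free OTF (all ones) with NSR = 0
# must yield an identity (delta) kernel.
otf = np.ones((8, 8), dtype=complex)
kernel = wiener_kernel(otf, nsr=0.0)
```

With a measured or computed OTF in place of the toy one, `kernel` would play the role of the 11×11-tap filter of Fig. 7A (cropped to the chosen tap count).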

[0059] If the input image is a colour RGB image, it is preferable to form three image reconstruction filters corresponding to the respective colour components, i.e. the R, G and B components. The optical image-forming system has chromatic aberration, and the manner of blurring differs for each colour component. For this reason, the characteristics of the image reconstruction filters for the respective colour components differ slightly from one another on the basis of the chromatic aberration. In other words, the cross-section of Fig. 7A differs for the respective colour components. The numbers of taps of the image reconstruction filter in the vertical and horizontal directions need not form a square layout and can be changed arbitrarily, provided they are taken into account in the convolution processing.

[0060] An example of an image reconstruction filter to be applied to a RAW image in which each pixel has one colour component, used in the first embodiment, will be described with reference to Fig. 8A and 8B, in comparison with the above-described image reconstruction filter to be applied to each colour plane of an image in which each pixel has all the colour components R, G and B. This filter is an image reconstruction filter holding coefficients for the pixels in which the target colour component exists: each part holding a coefficient is represented by a white square, and the remaining parts, holding 0, are represented by black squares.

[0061] When image reconstruction processing is performed for the three colour components, i.e. the R, G and B components, without dividing the G component, the image reconstruction filter to be applied to the R and B components is the filter shown in Fig. 8A, and the image reconstruction filter to be applied to the G component is the filter shown in Fig. 8B. In contrast, the first embodiment applies the image reconstruction filters after dividing the G component into the G1 and G2 components, and can therefore use the image reconstruction filter shown, for example, in Fig. 8A for each of the R, G1, G2 and B components.

[0062] Referring again to Fig. 6, at step S14, using the image reconstruction filters selected at step S13, the reconstruction filter application modules 1110-1113 perform convolution filter processing on each pixel of the reconstruction components R, G1, G2 and B of the captured input image. This makes it possible to eliminate or reduce the blur components of the image caused by aberrations of the optical image-forming system. As described above, using an image reconstruction filter appropriate for each of the colour reconstruction components R, G1, G2 and B also makes it possible to correct chromatic aberration.

[0063] The convolution processing in the first embodiment is convolution filter processing using the reconstruction components R, G1, G2 and B shown in Fig. 4B-4E and the image reconstruction filter shown in Fig. 8A. It is preferable to change the manner of holding the image reconstruction filter, or the filter itself, in accordance with the manner of holding the data of each reconstruction component separated at step S12, as necessary. If, for example, four sets of image data are used for R, G1, G2 and B, respectively, with zeros (0) set in the parts other than the target reconstruction component, unnecessary computation can be eliminated by limiting the pixels to be convolved to those of the target reconstruction component. In addition, when four sets of image data, each having 1/4 size and obtained by decimating the parts other than the reconstruction component to be processed, are prepared for R, G1, G2 and B, respectively, the device likewise holds the image reconstruction filter itself with the coefficients other than those to be used decimated. This makes it possible to apply the filter directly to the 1/4-size image data.
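For the zero-filled variant, the saving described above (convolving only at pixels of the target reconstruction component) can be sketched as follows. Function and variable names are hypothetical; a small identity kernel stands in for the actual reconstruction filter.

```python
import numpy as np

def restore_masked(plane, kernel, mask):
    """Convolve the reconstruction filter only at pixels belonging to
    the target reconstruction component (mask == True); all other
    output pixels stay 0, so no computation is spent on them."""
    kh, kw = kernel.shape
    py, px = kh // 2, kw // 2
    padded = np.pad(plane, ((py, py), (px, px)))
    out = np.zeros_like(plane)
    for y, x in zip(*np.nonzero(mask)):
        out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * kernel)
    return out

# Toy check with an identity kernel: target pixels pass through unchanged.
plane = np.zeros((4, 4))
plane[::2, ::2] = [[10.0, 20.0], [30.0, 40.0]]  # zero-filled R plane
identity = np.zeros((3, 3)); identity[1, 1] = 1.0
restored = restore_masked(plane, identity, plane > 0)
```

Only one pixel in four is visited here, which is the source of the reduced convolution load noted in the following paragraph.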

[0064] In either case, the number of effective coefficients of the filter to be used is clearly smaller than the number of effective coefficients of the image reconstruction filter shown in Fig. 7A, which is applied to an image all of whose pixels have all the colour components R, G and B, and smaller than the number of effective coefficients of the image reconstruction filter shown in Fig. 8B, which is applied to the undivided G component. This reduces the load of the convolution processing.

[0065] After image reconstruction processing has been performed for every pixel of the image, the image reconstruction processing module 111 completes the processing. Since the OTF changes in accordance with the angle of view (image height) of the optical image-forming system even under the same image capture conditions, it is preferable to perform the image reconstruction processing according to the present invention while changing the OTF for each segmented region of the image in accordance with the image height. It is preferable to scan the image reconstruction filter over the image while performing the convolution processing, and to change the filter sequentially for each region. In other words, the device executes steps S13 and S14 for each target pixel of each reconstruction component.
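The per-region filter switching described above (repeating steps S13 and S14 per target pixel while scanning) can be sketched as follows. The region map, kernel dictionary and names are assumptions for illustration; in practice the regions would follow image height.

```python
import numpy as np

def restore_by_region(plane, kernels, region_map):
    """While scanning the image, select the reconstruction kernel for the
    region (image height) of each target pixel. `kernels` maps a region
    id to a 2-D filter; `region_map` gives the region id per pixel."""
    kh, kw = next(iter(kernels.values())).shape
    py, px = kh // 2, kw // 2
    padded = np.pad(plane, ((py, py), (px, px)))
    out = np.empty_like(plane)
    for y in range(plane.shape[0]):
        for x in range(plane.shape[1]):
            k = kernels[region_map[y, x]]  # step S13 per pixel
            out[y, x] = np.sum(padded[y:y + kh, x:x + kw] * k)  # step S14
    return out

# Toy check: region 0 uses an identity kernel, region 1 doubles the pixel.
ident = np.zeros((3, 3)); ident[1, 1] = 1.0
plane = np.ones((4, 4))
region_map = np.zeros((4, 4), dtype=int); region_map[:, 2:] = 1
out = restore_by_region(plane, {0: ident, 1: 2.0 * ident}, region_map)
```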

[0066] The image data on which the image reconstruction processing module 111 has performed image reconstruction processing is input to the other image processing module 112. Since the image data having undergone image reconstruction processing remains in the Bayer layout, the other image processing module performs colour interpolation processing for each of the three colour components contained in the image sensor. The other image processing module 112 then generates an image file in JPEG or a similar format by performing known image formation processing on the RAW data, such as gamma correction and colour balance adjustment, in addition to the colour interpolation processing.

[0067] The reconstruction gain of the image reconstruction filter and the frequency characteristics of the image depending on whether image reconstruction processing is performed are described with reference to Fig. 9A and 9B. Note that the reconstruction gain is defined as the gain of the MTF in the image reconstruction processing. The graph shown in Fig. 9A is an example of the reconstruction gain of the image reconstruction filter in the first embodiment. From the spatial-frequency characteristics of the above-described pixel layout of the reconstruction components, the frequency band that can be reconstructed by the image reconstruction filter can be considered to extend to 1/2 the Nyquist frequency of the image sensor. In contrast, when image reconstruction processing is performed for each colour plane of an image in which each pixel has all the colour components R, G and B, the band that can be reconstructed by the image reconstruction filter extends to the Nyquist frequency of the image sensor, according to the spatial-frequency characteristics of the pixel layout of the reconstruction components.

[0068] The improvement of the MTF in an output image in JPEG or a similar format processed by image reconstruction according to the first embodiment is, however, not limited to the frequency band up to 1/2 the Nyquist frequency of the image sensor. Fig. 9B shows an example of the MTF of the output image according to the first embodiment. Obviously, when image reconstruction processing is performed, the MTF is improved even in the frequency band equal to or higher than 1/2 the Nyquist frequency of the image sensor, compared with the output image obtained without image reconstruction processing. This results from the colour interpolation processing performed by the other image processing module 112. Colour interpolation processing for an image sensor with, for example, the Bayer layout has long been studied, and various interpolation techniques have been described. A commonly used technique is adaptive colour interpolation processing, which forms an interpolated pixel by using the information of other colour components of neighbouring pixels. This is a technique of determining, when forming the pixel value of the R component of a given pixel by interpolation processing, the interpolation method for the R component by using the information of the G and B components of neighbouring pixels. In contrast to simple linear interpolation of a single colour component, such adaptive colour interpolation processing can suppress the generation of false colour and the loss of sharpness caused by interpolation processing.

[0069] The following explains an example of an adaptive colour interpolation processing method, using the example of pixel interpolation in an edge portion shown in Fig. 10A-10K. Fig. 10A is a sectional view of this region. Assume that the edge is an achromatic, monochrome portion and that each of the colour components R, G and B has a pixel layout composed of pixel values of 100 and 200, as shown in Fig. 10B, when the values of the respective colour components R, G and B are obtained at the corresponding pixels of the image sensor 102. In practice, since a RAW image captured by the Bayer-layout image sensor 102 has one colour component per pixel, extracting the values for each colour component yields the pixel layouts shown in Fig. 10C-10E. Each pixel represented by a black square in the pixel layouts of the respective colour components shown in Fig. 10C-10E is a pixel requiring colour interpolation processing. In this case, the respective colour components after colour interpolation processing should obviously, ideally, have the pixel values shown in Fig. 10B. In the following, the pixel layouts shown in Fig. 10C-10E are written as G(x, y), R(x, y) and B(x, y), where x represents the coordinate in the horizontal direction and y the coordinate in the vertical direction, each taking values from 0 to 4 in Fig. 10A-10K.

[0070] First, an example of performing linear interpolation for each colour component in Fig. 10C-10E is described below. Equation (6) is computed by using the four pixels adjacent to the G component when performing linear interpolation of the G component:

G(x, y)=(G(x, y-1)+G(x-1, y)+G(x+1, y)+G(x, y+1))/4 (6)

Linear interpolation of the R component is performed by using different templates depending on the position of the pixel to be interpolated. In other words, linear interpolation is performed by using one of the following three templates, represented by equations (7):

If the adjacent left and right pixels have values (for example, R(2, 0)):

R(x, y)=(R(x-1, y)+R(x+1, y))/2

If adjacent upper and lower pixels have values (for example, R(1, 1)):

R(x, y)=(R(x, y-1)+R(x, y+1))/2

If diagonally adjacent pixels have the values (for example, R(2, 1)):

R(x, y)=(R(x-1, y-1)+R(x+1, y-1)+R(x-1, y+1)+R(x+1, y+1))/4 (7)

[0071] The device performs linear interpolation for the B component in the same way as for the R component, by applying one of the three templates represented by equations (7), as described above, in accordance with the position of the pixel to be interpolated.

[0072] Fig. 10F-10H show an example of applying the above linear interpolation to each colour component. The interpolation processing generates pixel values other than 100 and 200. It is therefore obvious that the sharpness of these pixel values is lower than that of the pixel values shown in Fig. 10B.
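The loss of sharpness caused by equation (6) can be reproduced numerically. The sketch below uses a hypothetical vertical 100/200 edge rather than the exact layout of Fig. 10, and assumes G samples at odd-parity Bayer sites; it shows how a pixel whose ideal value is 200 acquires an intermediate value under linear interpolation.

```python
import numpy as np

# Hypothetical achromatic edge: columns 0-1 hold 100, columns 2-4 hold 200.
ideal = np.full((5, 5), 200.0)
ideal[:, :2] = 100.0

# G samples on a Bayer grid (G present where x + y is odd; other sites 0).
parity = np.add.outer(np.arange(5), np.arange(5)) % 2
G = np.where(parity == 1, ideal, 0.0)

def linear_g(G, y, x):
    """Equation (6): average of the four adjacent G pixels."""
    return (G[y - 1, x] + G[y + 1, x] + G[y, x - 1] + G[y, x + 1]) / 4

value = linear_g(G, 2, 2)  # ideal value would be 200; averaging across
                           # the edge yields an intermediate value instead
```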

[0073] The following is an example of adaptive colour interpolation processing, which forms an interpolated pixel by using the pixel information of the other colour components around a given pixel. This adaptive colour interpolation processing is referred to as "adaptive interpolation". The device performs adaptive interpolation for the G component as follows.

[0074] When forming a G-component value at a pixel that has a value in the R component (for example, G(1, 2)):

H_DIFF=(R(x, y)-R(x-2, y))+(R(x, y)-R(x+2, y))

V_DIFF=(R(x, y)-R(x, y-2))+(R(x, y)-R(x, y+2))

IF (|H_DIFF|>|V_DIFF|) {

G(x, y)=(G(x, y-1)+G(x, y+1))/2

}

ELSE {

G(x, y)=(G(x-1, y)+G(x+1, y))/2

}

[0075] Determining the interpolation direction on the basis of H_DIFF and V_DIFF computed from the R component can thus suppress the loss of sharpness caused by linear interpolation. Although the above description relates to forming a G-component value at a pixel that has a value in the R component, the device can interpolate the G component at a pixel that has a value in the B component (for example, G(2, 1)) in the same way. As described above, when performing this interpolation, the device treats the G component as a single colour component without dividing it into the G1 and G2 components, and can therefore use the values of a larger number of neighbouring pixels than when the G component is divided into the G1 and G2 components. This improves the MTF in the high-frequency band.
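The pseudocode of paragraph [0074] can be written in runnable form as follows. Arrays are indexed [y, x] (vertical, horizontal), and the toy edge and component layouts are illustrative assumptions; note that, on the same vertical edge used earlier, direction selection recovers the exact value that plain averaging missed.

```python
import numpy as np

def adaptive_g(G, R, y, x):
    """Paragraph [0074]: choose the interpolation direction for G at an
    R-site pixel from second differences of the R component."""
    h_diff = (R[y, x] - R[y, x - 2]) + (R[y, x] - R[y, x + 2])
    v_diff = (R[y, x] - R[y - 2, x]) + (R[y, x] - R[y + 2, x])
    if abs(h_diff) > abs(v_diff):
        return (G[y - 1, x] + G[y + 1, x]) / 2   # interpolate vertically
    return (G[y, x - 1] + G[y, x + 1]) / 2       # interpolate horizontally

# Hypothetical vertical edge (columns 0-1: 100, columns 2-4: 200).
ideal = np.full((5, 5), 200.0)
ideal[:, :2] = 100.0
parity = np.add.outer(np.arange(5), np.arange(5)) % 2
G = np.where(parity == 1, ideal, 0.0)  # G at odd-parity sites
R = np.where(parity == 0, ideal, 0.0)  # toy: R at all even-parity sites
value = adaptive_g(G, R, 2, 2)  # the ideal value here is 200
```

The horizontal second difference is large at this pixel, so the routine interpolates vertically, along the edge, and preserves sharpness.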

[0076] The device performs adaptive interpolation for the R component by using the G signal interpolated as described above, as indicated in equations (8) below.

[0077] If the adjacent left and right pixels have values (for example, R(2, 0)):

Cr=(R(x-1, y)-G(x-1, y)+R(x+1, y)-G(x+1, y))/2

R(x, y)=G(x, y)+Cr

[0078] If the adjacent upper and lower pixels have values (for example, R(1, 1)):

Cr=(R(x, y-1)-G(x, y-1)+R(x, y+1)-G(x, y+1))/2

R(x, y)=G(x, y)+Cr

[0079] If the diagonally adjacent pixels have the values (for example, R(2, 1)):

Cr=(R(x-1, y-1)-G(x-1, y-1)+R(x+1, y-1)-G(x+1, y-1)+R(x-1, y+1)-G(x-1, y+1)+R(x+1, y+1)-G(x+1, y+1))/4

R(x, y)=G(x, y)+Cr (8)

[0080] Thus, the device performs adaptive interpolation by interpolating the colour-difference information (R-G) obtained from the adjacent pixels.

[0081] The device performs adaptive interpolation for the B component in the same way as for the R component, by applying one of the three templates represented by equations (8), described above, depending on the position of the pixel to be interpolated, and by interpolating the colour-difference information (B-G) obtained from the adjacent pixels.
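The colour-difference principle of equations (8) can be sketched for the left/right-neighbour case as follows. The constant colour difference in the toy data is an assumption chosen to make the behaviour visible; the function name is hypothetical.

```python
import numpy as np

def adaptive_r_lr(R, G, y, x):
    """Equations (8), left/right-neighbour case: interpolate the colour
    difference Cr = R - G from the horizontal neighbours and add it to
    the (already interpolated) G value of the target pixel."""
    cr = (R[y, x - 1] - G[y, x - 1] + R[y, x + 1] - G[y, x + 1]) / 2
    return G[y, x] + cr

# Toy check: with a constant colour difference R - G = 10 everywhere,
# the interpolated R reproduces the true R exactly.
G = np.full((3, 3), 150.0)
R = G + 10.0
value = adaptive_r_lr(R, G, 1, 1)
```

Because the colour difference varies more slowly than the intensities themselves, interpolating R-G (or B-G) rather than R directly is what suppresses false colour at edges.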

[0082] Fig. 10I-10K show an example of applying the above adaptive interpolation to each colour component. The values of the R, G and B pixels coincide with one another, and each pixel value matches the corresponding pixel value shown in Fig. 10B. Performing adaptive interpolation so as to generate an interpolated pixel by using the pixel information of the other neighbouring colour components makes it possible to form an image without loss of sharpness for the pixel layout shown in Fig. 10B.

[0083] As described above, the device performs image reconstruction processing on each reconstruction component with matched frequency bands and then performs adaptive colour interpolation processing on each reconstructed colour component, thereby improving the MTF even in a frequency band exceeding the band that the pixel layout of each colour component has. This effect is not limited to the one interpolation method described above: the MTF corrected by the image reconstruction processing is maintained up to the Nyquist frequency of the image sensor by the colour interpolation processing following the image reconstruction processing, while the amount of MTF improvement varies depending on the pixel interpolation method used.

[0084] The arrangement and processing of the image processing module 104 to which the present invention is applied have been described above. The first embodiment has illustrated the common Bayer layout consisting of R, G and B components. However, the present invention is applicable not only to a pixel layout comprising the colour components R, G and B, but also to pixel layouts comprising colour components of many colours other than R, G and B. In addition, the present invention is applicable to various types of pixel configurations of image sensors. The common Bayer layout consisting of R, G and B components can be expressed as shown in Fig. 11A by a pixel layout pattern without specifying the colours, where C1, C2 and C3 each indicate one colour component.

[0085] In contrast, consider a colour component layout such as that shown in Fig. 11B. It represents the pixel layout of an image sensor consisting of four colour components C1, C2, C3 and C4. Fig. 12 shows an example of each colour component of this pixel layout and of each reconstruction component obtained by applying the present invention. The C1 component is divided into four reconstruction components. The C2 and C3 components are used as reconstruction components without any change. The C4 component is divided into two reconstruction components. That is, a pixel colour component whose spatial-frequency characteristic is higher than those of the other colour components is divided into a plurality of colour components so as to have the same spatial-frequency characteristic as the other colour components. This makes it possible to perform image reconstruction processing after unifying the spatial-frequency characteristics of the pixel layouts of the respective reconstruction components.

[0086] Thus, the present invention can be applied not only to the common Bayer layout consisting of R, G and B components, but also to various types of pixel configurations consisting of various types of colour components. Obviously, each pixel layout is not limited to a matrix layout. The present invention can be applied to any layout that allows the frequency characteristics of the respective reconstruction components to be unified by dividing a colour component of the image sensor.

[0087] Second embodiment

The following describes image reconstruction processing according to the second embodiment of the present invention. Since the basic arrangement of the image capture device according to the second embodiment is the same as that of the first embodiment shown in Fig. 1, its description is omitted.

[0088] Fig. 13 shows the arrangement of the image processing module 104 in the second embodiment. The image input to the image reconstruction processing module 111 is RAW data in which each pixel has a colour component of one of the colours R, G and B in the Bayer layout shown in Fig. 2. The second embodiment applies the image reconstruction filters to the input RAW data in the Bayer layout without any change and without dividing the RAW data into reconstruction components. The image reconstruction filters for the four reconstruction components are applied respectively in the reconstruction filter application modules 1114-1117, which are connected in series.

[0089] The following describes in detail the image reconstruction processing procedure of the image reconstruction processing module 111 in the second embodiment, with reference to Fig. 14. At step S21, the image reconstruction processing module 111 obtains information on the actual image capture conditions from the condition detection module 107, as described above. The image capture conditions include, for example, the position of the magnification-varying (zoom) lens, the aperture diameter and the object distance.

[0090] At step S22, an image reconstruction filter for the R component suitable for the detected image capture conditions is selected from the image reconstruction filters stored in the storage module 108 of Fig. 1. At this time, the selected image reconstruction filter may be corrected as needed. This is an operation of discretely preparing image capture condition data in advance, in order to reduce the number of image reconstruction filters prepared in the storage module 108, and of correcting the image reconstruction filter at the time of actual image reconstruction processing.

[0091] At step S23, using the image reconstruction filter for the R component selected at step S22, the reconstruction filter application module 1114 performs convolution filter processing on each pixel of the R component of the captured input image. This makes it possible to eliminate or reduce the blur component of the R-component image caused by aberrations of the optical image-forming system. As described above, using an image reconstruction filter appropriate for each reconstruction component also makes it possible to correct chromatic aberration.

[0092] In steps S24-S29, the device performs image reconstruction processing for the reconstruction components G1, G2 and B. Note that the content of the image reconstruction processing performed in this case is identical to that of steps S22 and S23, except for the colour components to be processed, and its description is therefore omitted.

[0093] The convolution processing for each reconstruction component at steps S23, S25, S27 and S29 is convolution filter processing using the corresponding reconstruction components shown in Fig. 4B-4E and the image reconstruction filter shown in Fig. 8A. Limiting the pixels subjected to the convolution processing to those of the target reconstruction component eliminates unnecessary computation. Obviously, the number of effective image reconstruction filter coefficients in this case is smaller than the number of effective coefficients of the image reconstruction filter applied to an image in which each pixel has all the colour components R, G and B, shown in Fig. 7A, and smaller than the number of effective coefficients of the image reconstruction filter applied to the undivided G component (Fig. 8B). This reduces the load of the convolution processing. In addition, since the RAW data in the Bayer layout can be used as the input image without any change, there is no need for the signal separation module 1101 or for any new memory. This helps to reduce memory consumption.

[0094] The device then performs the image reconstruction processing for every pixel of the image, and the image reconstruction processing module 111 completes the processing. Since the OTF changes in accordance with the angle of view (image height) of the optical image-forming system even under the same image capture conditions, it is preferable to perform the image reconstruction processing according to the present invention while changing the OTF for each segmented region of the image in accordance with the image height. It is preferable to scan the image reconstruction filter over the image while performing the convolution processing, and to change the filter sequentially for each region. In other words, the device executes steps S22 and S23 for each pixel of the R component, and executes steps S24-S29 in the same way for each reconstruction component to be processed, as described above.

[0095] Although the first and second embodiments have been described using the application of image reconstruction filters as the image reconstruction processing, other types of processing, such as distortion correction processing, peripheral light amount correction processing and noise reduction processing, can be handled before, after or in the middle of the procedure according to the present invention, and the resulting procedure can likewise be handled as image reconstruction processing.

[0096] Aspects of the present invention can also be realized by a computer of a system or device (or by devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method whose steps are performed by the computer of a system or device by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s). For this purpose, the program is provided to the computer, for example, via a network or from a recording medium of various types serving as the memory device (for example, a computer-readable medium).

[0097] Although the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed embodiments.

1. An image processing device for performing image reconstruction processing on image data to correct deterioration of image quality due to aberration in an optical image-forming system, said image data being obtained by capturing, using an image sensor having a plurality of pixels, an image of an object that has passed through the optical image-forming system, each pixel of the image sensor being covered with one of colour filters of a plurality of colours, the image processing device comprising:
- dividing means for dividing the image data of the plurality of colours of the colour filters into image data of the respective colours of the colour filters;
- a plurality of image processing means, each for performing reconstruction processing by filter processing on the image data of one of the respective colours divided by said dividing means; and
- interpolation processing means for performing colour interpolation processing on each pixel of the image data that has undergone the reconstruction processing,
wherein said dividing means is further configured to divide image data of one colour, whose spatial-frequency characteristic is higher than the spatial-frequency characteristic of another colour owing to the layout of the colour filters of the plurality of colours, into a plurality of image data of said one colour, so that each of the plurality of image data of said one colour has the same spatial-frequency characteristic as the image data of the other colour, and
said interpolation processing means is further configured to perform the colour interpolation processing by using the plurality of image data of said one colour as if they were image data of one colour.

2. The device according to claim 1, wherein the colour filters of the plurality of colours include colour filters in the Bayer layout, and
said dividing means is further configured to divide the G-component image data into two sets of G-component image data so that the frequency characteristic of the G component coincides with the frequency characteristics of the R component and the B component.

3. The device according to claim 1, wherein the filter of the filter processing performed by said image processing means comprises a two-dimensional filter obtained by performing an inverse Fourier transform of a function based on the inverse function of the optical transfer function of an optical element of the optical image-forming system, and
said image processing means is further configured to perform convolution processing with said filter.

4. An image processing device for performing image reconstruction processing on image data to correct deterioration of image quality due to aberration in an optical image-forming system, said image data being obtained by capturing, using an image sensor having a plurality of pixels, an image of an object that has passed through the optical image-forming system, each pixel of the image sensor being covered with a colour filter, the device comprising:
- a plurality of image processing means arranged in series, each for performing reconstruction processing by filter processing on a portion of the image data of a plurality of colours; and
- interpolation processing means for performing colour interpolation processing on each pixel of the image data that has undergone the reconstruction processing,
wherein each of said plurality of image processing means is configured to perform the reconstruction processing on the respective image data of each colour that has not yet undergone the reconstruction processing, image data of one colour, whose spatial-frequency characteristic is higher than the spatial-frequency characteristic of another colour owing to the layout of the colour filters of the plurality of colours, being handled separately as image data of a plurality of colours, so that each portion of the image data that is handled separately as image data of the plurality of colours and processed by said plurality of image processing means has the same frequency characteristic as the image data of the other colour, and
said interpolation processing means is further configured to perform the colour interpolation processing by using the image data of the plurality of colours as if they were image data of one colour.

5. The device according to claim 4, wherein the colour filters of the plurality of colours are colour filters in the Bayer layout, and
said plurality of image processing means further comprise image processing means configured to divide the G-component image data into two G components whose frequency characteristics coincide with the frequency characteristics of the R-component and B-component image data and to process the image data of one G component, image processing means configured to process the image data of the other G component, image processing means configured to process the R-component image data, and image processing means configured to process the B-component image data.

6. The device according to claim 4, wherein the processing using a filter performed by said plurality of image processing means uses a filter comprising a two-dimensional filter obtained through an inverse Fourier transform of a function based on the inverse of the optical transfer function of an optical element of the imaging optical system, and
said plurality of image processing means is further configured to perform convolution processing with said filter.
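As a hedged illustration of claim 6 (not the patent's own implementation): a function based on the inverse of the OTF — a regularised, Wiener-like inverse is assumed here, since a plain inverse diverges where the OTF is near zero — is brought back to the spatial domain by an inverse Fourier transform to yield the two-dimensional filter, and applying that filter is a convolution, which may equivalently be done as a product in the frequency domain. Python/NumPy sketch with a hypothetical Gaussian OTF:

```python
import numpy as np

def restoration_filter(otf, eps=1e-3):
    """Spatial-domain restoration kernel (sketch of claim 6).

    Builds a regularised inverse of the optical transfer function and
    returns the two-dimensional filter via an inverse Fourier transform.
    """
    inv_otf = np.conj(otf) / (np.abs(otf) ** 2 + eps)
    return np.real(np.fft.fftshift(np.fft.ifft2(inv_otf)))

def apply_recovery(plane, otf, eps=1e-3):
    """Convolution processing with the restoration filter,
    performed equivalently as a product in the frequency domain."""
    inv_otf = np.conj(otf) / (np.abs(otf) ** 2 + eps)
    return np.real(np.fft.ifft2(np.fft.fft2(plane) * inv_otf))

# Demonstration: blur a low-frequency plane with an assumed Gaussian OTF,
# then largely undo the blur with the restoration filter.
n = 32
fy, fx = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n), indexing="ij")
otf = np.exp(-(fx**2 + fy**2) / (2 * 0.1**2))

x = np.arange(n)
img = np.cos(2 * np.pi * x / n)[None, :] * np.ones((n, 1))
blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * otf))   # simulated aberration
recovered = apply_recovery(blurred, otf)
print(np.max(np.abs(recovered - img)) < 0.01)  # True: blur largely undone
```

The regularisation constant `eps` is an assumption of this sketch; it limits noise amplification at frequencies where the OTF is small.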

7. An image processing method for performing recovery processing on image data to correct image quality deterioration due to aberration in an imaging optical system, said image data being obtained by reading an image of an object that has passed through the imaging optical system using an image sensor having a plurality of pixels, each pixel of the image sensor being covered by one of colour filters of a plurality of colours, the method comprising:
- a dividing step of dividing the image data of the plurality of colours of the colour filters into image data of the respective colours;
- an image processing step of performing recovery processing by filter processing on each of the image data of the respective colours divided in the dividing step; and
- an interpolation processing step of performing colour interpolation processing on each pixel of the image data subjected to recovery processing;
- wherein, in the dividing step, image data of one colour, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another colour because of the layout of the colour filters of the plurality of colours, is divided into a plurality of image data of said one colour, so that each of the plurality of image data of said one colour has the same spatial frequency characteristic as the image data of the other colour, and
- in the interpolation processing step, colour interpolation is performed by using the image data of the plurality of colours as if it were image data of one colour.
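The three method steps can be sketched end to end. This is a hedged illustration only: `recover` stands for any per-plane filter-based recovery processing, and a crude nearest-neighbour upsampling stands in for the claimed colour interpolation; all names are hypothetical. Note how the two G data sets are recombined and used as a single colour at the interpolation step, as the claim requires.

```python
import numpy as np

def process_raw(raw, recover):
    """Dividing step -> per-colour recovery -> colour interpolation (sketch)."""
    # Dividing step: split the RGGB mosaic into four colour planes;
    # G is divided into two data sets with the same sampling as R and B.
    planes = {
        "r":  raw[0::2, 0::2],
        "gr": raw[0::2, 1::2],
        "gb": raw[1::2, 0::2],
        "b":  raw[1::2, 1::2],
    }
    # Image processing step: recovery processing applied per colour plane.
    planes = {k: recover(v) for k, v in planes.items()}
    # Interpolation step: treat Gr and Gb as one colour (G) again and
    # interpolate each colour to full resolution (nearest neighbour here).
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    rgb[..., 0] = np.kron(planes["r"], np.ones((2, 2)))
    rgb[..., 1] = np.kron((planes["gr"] + planes["gb"]) / 2, np.ones((2, 2)))
    rgb[..., 2] = np.kron(planes["b"], np.ones((2, 2)))
    return rgb

# Usage: identity "recovery" on a flat grey mosaic
raw = np.full((4, 4), 0.5)
out = process_raw(raw, recover=lambda p: p)
print(out.shape)  # (4, 4, 3)
```

Because recovery runs on planes of equal sampling density, the same filter machinery serves every colour, which is the stated motivation for the split.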

8. An image processing method for performing recovery processing on image data to correct image quality deterioration due to aberration in an imaging optical system, with respect to image data obtained by reading an image of an object that has passed through the imaging optical system using an image sensor having a plurality of pixels, each pixel of the image sensor being covered by one of colour filters of a plurality of colours, the method comprising:
- an image processing step of sequentially performing recovery processing by filter processing on each portion of the image data corresponding to the plurality of colours, using a plurality of corresponding image processing means arranged in series; and
- an interpolation processing step of performing colour interpolation processing on each pixel of the image data subjected to recovery processing;
- wherein, in the image processing step, each of the plurality of image processing means performs recovery processing on the image data of each colour that has not yet been subjected to recovery processing, image data of one colour, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another colour because of the layout of the colour filters of the plurality of colours, being handled separately as image data of a plurality of colours, so that each portion of the image data that is handled separately as a plurality of colours and processed by the plurality of image processing means has the same spatial frequency characteristic as the image data of the other colour, and
- in the interpolation processing step, colour interpolation is performed by using the image data of the plurality of colours as if it were image data of one colour.

9. A non-transitory computer-readable storage medium having stored thereon a program to be executed by an imaging device, the program having program code for implementing an image processing method for performing recovery processing on image data to correct image quality deterioration due to aberration in an imaging optical system, said image data being obtained by reading an image of an object that has passed through the imaging optical system using an image sensor having a plurality of pixels, each pixel of the image sensor being covered by one of colour filters of a plurality of colours, the method comprising:
- a dividing step of dividing the image data of the plurality of colours of the colour filters into image data of the respective colours;
- an image processing step of performing recovery processing by filter processing on each of the image data of the respective colours divided in the dividing step; and
- an interpolation processing step of performing colour interpolation processing on each pixel of the image data subjected to recovery processing;
- wherein, in the dividing step, image data of one colour, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another colour because of the layout of the colour filters of the plurality of colours, is divided into a plurality of image data of said one colour, so that each of the plurality of image data of said one colour has the same spatial frequency characteristic as the image data of the other colour, and
- in the interpolation processing step, colour interpolation is performed by using the image data of the plurality of colours as if it were image data of one colour.

10. A non-transitory computer-readable storage medium having stored thereon a program to be executed by an imaging device, the program having program code for implementing an image processing method for performing recovery processing on image data to correct image quality deterioration due to aberration in an imaging optical system, with respect to image data obtained by reading an image of an object that has passed through the imaging optical system using an image sensor having a plurality of pixels, each pixel of the image sensor being covered by one of colour filters of a plurality of colours, the method comprising:
- an image processing step of sequentially performing recovery processing by filter processing on each portion of the image data corresponding to the plurality of colours, using a plurality of corresponding image processing means arranged in series; and
- an interpolation processing step of performing colour interpolation processing on each pixel of the image data subjected to recovery processing;
- wherein, in the image processing step, each of the plurality of image processing means performs recovery processing on the image data of each colour that has not yet been subjected to recovery processing, image data of one colour, whose spatial frequency characteristic is higher than the spatial frequency characteristic of another colour because of the layout of the colour filters of the plurality of colours, being handled separately as image data of a plurality of colours, so that each portion of the image data that is handled separately as a plurality of colours and processed by the plurality of image processing means has the same spatial frequency characteristic as the image data of the other colour, and
- in the interpolation processing step, colour interpolation is performed by using the image data of the plurality of colours as if it were image data of one colour.



 

Same patents:

FIELD: physics.

SUBSTANCE: apparatus for adjusting a magnetooptical system for forming a beam of protons consists of a pulsed electromagnet which is formed by a pair or a system of pairs of thin conductors directed along the axis of a proton graphic channel spread in a transverse plane. A scaling array of metal plates mounted in a frame is placed at the output of the electromagnet. The method of adjusting a magnetic system for forming a beam of protons and a method of matching magnetic induction of an imaging system involve generating a magnetic field, through which the beam of protons is passed, the direction of said beam through the imaging system to a recording system by which the image of the scaling array is formed. Upon obtaining a distorted image, the magnetic beam forming system is adjusted and magnetic induction of the magnetooptical imaging system is adjusted by varying current of lenses of said systems and retransmitting the beam of protons until the required images are formed.

EFFECT: high quality of adjustment.

4 cl, 14 dwg

FIELD: radio engineering, communication.

SUBSTANCE: user sets, in a photograph display device 370B, the fact that a physical address 2000 represents a recording device which controls 370B display of photographs in place of the physical address 2000. According to that setting, the photograph display device 370B defines a logic address as a recording device controlled by consumer electronics control (CEC) devices. When the user performs operations with the recording device 210B on a disc, which is a CEC-incompatible device, using a remote control transmitter 277, a television receiver 250B generates a CEC control command addressed to the disc recording device 210B. The photograph display device 370B detects a CEC control command, converts the CEC control command to an infrared remote control command and transmits the infrared remote control command from the infrared transmission module 384 to the disc recording device 210B.

EFFECT: controlling operations of a controlled device, which processes only a control signal in a second format based on a control signal in a first format.

11 cl, 31 dwg

FIELD: physics.

SUBSTANCE: disclosed apparatus includes a means (100) for providing an aerosol designed to generate an aerosol stream (108) with average particle diameter of the disperse phase of less than 10 mcm in a screen formation area, a means (200) of providing a protective air stream designed to generate a protective air stream (210, 211) on two sides of the aerosol stream (108), wherein the aerosol stream (108) and the protective air stream (210, 211) have a non-laminar, locally turbulent flow near an obstacle on the flow path, wherein the Reynolds number for said streams near outlet openings (134, 215, 216) is in the range from 1300 to 3900.

EFFECT: improved method.

17 cl, 9 dwg

FIELD: physics.

SUBSTANCE: image forming process includes a first image forming process for outputting image data in accordance with illumination of a detector with radiation or light in an illumination field A, which corresponds to part of a plurality of pixels, and a second image forming process for outputting image data in accordance with illumination of a detector 104 with radiation or light in an illumination field B which is wider than the illumination field A. In accordance with transfer from illumination in the illumination field A to illumination in the illumination field B, operation of the detector is controlled such that the detector performs an initiation process for initiating conversion elements during the period between the first and second image forming processes.

EFFECT: weaker ghost image effect which can appear in an image resulting from FPD operation, and which is caused by the illumination region, and preventing considerable drop in image quality without complex image processing.

7 cl, 21 dwg

FIELD: physics.

SUBSTANCE: computer has a video card; in the prototype television camera, the first television signal sensor is based on a charge-coupled device (CCD) matrix with "row-frame transfer" and the use of an additional pulse former for clock power supply of the photodetector provides summation in the memory section of charge signals accumulated in its photodetector section. As a result, sensitivity is levelled on the entire field of the composite image.

EFFECT: high signal-to-noise ratio at the output of the CCD matrix of the first television signal sensor owing to summation, in its memory section, of charge packets formed in the photodetector section.

4 cl, 12 dwg, 3 tbl

FIELD: radio engineering, communication.

SUBSTANCE: invention provides an optical-electronic system which enables to measure density of fluorescence radiance in the UV spectral range, arising from ionisation of atmospheric nitrogen, and converting the obtained information to a visual image of distribution of levels of radioactive contamination on the underlying surface.

EFFECT: faster aerial radiological survey of an area due to shorter flight time of the aircraft and high reliability of instrument measurement data.

3 cl, 2 dwg

FIELD: radio engineering, communication.

SUBSTANCE: apparatus for obtaining location information is configured to wirelessly transmit location information obtained by a means of obtaining location information in response to a location information request from an image obtaining device, wherein a determining means is configured to determine whether the means of obtaining location information transmits location information, and a signal transmission means is configured to wirelessly transmit a signal to the image obtaining device if the determining means determines that the means of obtaining location information does not transmit location information, wherein the signal prohibits the location information request.

EFFECT: facilitating wireless communication with an image obtaining device.

14 cl, 5 dwg

FIELD: electricity.

SUBSTANCE: in a lighting device (20), which is formed by many flat light sources (21) arranged in direction of the plane and having displaced distribution of brightness at a light-emitting plane (43), flat light sources (21) have identical direction of displacement in brightness distribution. Flat light sources (21) are arranged so that some of flat light sources (21) have displacement direction that differs from other light sources (21), and a section with high brightness and a section with low brightness of the light-emitting plane (43) are arranged on the straight line. With such configuration sections with high brightness are not arranged in one row, which does not cause bright strips and accordingly improves quality of an image.

EFFECT: achievement of even distribution of brightness on a surface for light emission and higher quality of an image.

18 cl, 15 dwg

FIELD: electricity.

SUBSTANCE: the lighting device in this invention includes multiple point light sources 32 and a conducting pattern 34. Each point light source 32 includes two electrodes 33a, 33b of different polarities, and the point light sources 32 are set in multiple light-source matrices 40, each of which includes a predetermined number of point light sources 32, with excitation energy sent to each light-source matrix 40. Adjacent light-source matrices 40 are arranged so that a point light source 32 of one of the adjacent matrices 40 is opposite a point light source 32 of the other adjacent matrix 40, with the electrodes facing one another. The opposing electrodes 33a, 33a are arranged so that they have identical polarity and are connected to the same conducting pattern 34.

EFFECT: in the lighting device, light sources are arranged evenly and closely together in order to improve lighting brightness and achieve a homogeneous distribution of lighting brightness.

5 cl, 12 dwg

FIELD: physics, acoustics.

SUBSTANCE: present group of inventions relates to an audio/video stand. The stand has storage space for multiple electronic devices, a connection module which is connected with identification of an identifier ID, used to transmit/receive an operation signal through electronic devices to be placed in the corresponding storage spaces, a storage for the ID of the electronic device for linking the connection ID of the connection module with the ID of the electronic device to identify the electronic device using the connection ID, and which stores these IDs, a collection module for the operation signal, a collection module for the signal for selecting a device receiving the signal for selecting a device associated with the ID of the electronic device, a collection module for the connection ID which searches in the storage for the ID of the electronic device using the ID of the electronic device, associated with the received signal for selecting the device as a key, and which receives and stores the connection ID of the connection module as the recipient of the transmission for the operation signal, and a collection module for the operation signal for the electronic device receiving the operation signal to be transmitted for the stored connection ID.

EFFECT: broader functional capabilities of the audio/video stand owing to use of connection IDs for electronic devices in the stand.

20 cl, 26 dwg

FIELD: information technology.

SUBSTANCE: image processing device stores in memory first image data having the highest frequency among a plurality of image data fragments of different frequency ranges in a state in which each pixel of the first image data includes a colour component signal of any of a plurality of colours, and further stores in memory second image data and third image data whose frequency ranges are lower than that of the first image data, in a state in which some or all pixels of the second and third image data have colour component signals of a plurality of colours.

EFFECT: image processing device can perform corresponding image processing, while simultaneously preventing increase in memory capacity for storing a plurality of image data fragments of different frequency ranges.

8 cl, 6 dwg

FIELD: physics, computer engineering.

SUBSTANCE: invention relates to digital image processing means. In the method, a multi-centre scanning (MCS) initial mesh is represented by a discrete square of nine cells; the initial mesh is scanned from the centre to the edge of the square and then while bypassing the remaining meshes around a circle; the priority is the path with the bypass direction to the left side from the centre of the square and then clockwise; the formed structure is a facet (pFas), where p is the recursion step, if p=1, the initial mesh described above is present; four types of bypassing are distinguished to construct the furthest recursion directions: bypass w1 as the initial (1Fas1), bypass w2 as the mirror from 1Fas1 to the left side (1Fas2), bypass w3 as the mirror from 1Fas2 upwards (1Fas3), bypass w4 as the mirror from 1Fas3 to the right side; bypass is performed successively with initial movement to the square to the left from 1Fas and then clockwise around 1Fas.

EFFECT: high degree of image compression without losses.

4 dwg

FIELD: physics, computation hardware.

SUBSTANCE: in compliance with this invention, a sequence of images including multiple lower-resolution images is compressed. Motion vectors between a reference image in the sequence and one or several next images in the sequence are defined. The next forecast image is generated by applying the motion vectors to a reconstructed version of the reference image. The difference between the next actual image and the next forecast image is generated. The images in the sequence are decoded set by set, and super-resolution (SR) technology is applied to every decoded set to generate a higher-resolution image by time interpolation and/or spatial interpolation of the reference and difference images. Compression of the sequence of images includes steps of determining motion vectors between the reference image and at least one extra image of the sequence of images. Note here that the obtained motion vector is applied to forecast the at least one extra image, in order to calculate the difference between the at least one extra image and the forecast of the at least one extra image, respectively.

EFFECT: high-resolution imaging by superhigh resolution technology.

13 cl, 5 dwg

FIELD: information technology.

SUBSTANCE: method involves selecting a hue conversion function based on analysis of distribution of hue of elements of the entire image, varying parameters of the hue conversion function for each image element based on analysis of said distribution of the local surrounding region, where parameters are varied gradually for neighbouring image elements; and converting hue of each image element using the hue conversion function with parameters obtained for that image element.

EFFECT: high quality of digital images by increasing global and local contrast without generating undesirable artefacts and distortions.

12 cl, 8 dwg

FIELD: information technology.

SUBSTANCE: automatic photograph retouching method involves creating a data array from photographs of different themes of classes, creating a database of references therefrom by interactive processing, based on Photoshop CS2, of predefined textures which give a comfortable perception of images of objects on the photographs, constructing a function of a photometric correction signal between the original and reference photographs, determining barcodes of the original photograph and the reference photograph by decoding brightness I(x,y) of matrices of images with the size |m×n| of elements in the matrix of intensities of tonal transitions with dimensions ||k×k|| of elements of the original and reference retouched photographs, algebraic subtraction of the matrix of barcodes while setting a threshold for positive identification of the reference, creating a reference address with barcode extension therefrom, a threshold difference and a photometric correction signal function, automatic search of the reference for the analysed photograph and retouching thereof based on the calculated barcode at the address in a reference database and the photometric correction signal function.

EFFECT: automatic search for the reference for a processed photograph by creating code features of each reference located in a reference database, and subsequent automatic photometric correction of the processed photograph on a reference texture mask stored in the reference database.

4 dwg

FIELD: information technology.

SUBSTANCE: method involves pixel by pixel browsing of each line of the depth or disparity map, formed by a pixel matrix, in a predefined direction and assigning each invalid pixel encountered on a current line a value determined as a function of pixel values associated with a predefined vicinity around the first valid pixel that follows the invalid pixel on the current line and the value of which corresponds to a greater depth or a lesser disparity relative to the value of the last valid pixel.

EFFECT: higher definition around outlines of objects when displaying a three-dimensional reconstructed image.

13 cl, 14 dwg

FIELD: medicine.

SUBSTANCE: the group of inventions relates to medicine, namely visualisation of vessels and their connection with pathological changes. Three-dimensional image data are created which reflect the spatially varying degree of vessel connectedness between areas of the 3-dimensional image data and a pathological change. The data can be presented by displaying a maximum intensity projection (MIP), where image brightness represents the degree of vessel participation in the blood supply of the pathological change. Corresponding computer-readable media are used in realising the method. The described visualisation methods can be useful for visualising connectedness with structures which are not pathological changes, and for visualising connectedness which is not vessel connectedness.

EFFECT: inventions provide representation of degree of connectedness of vessels by means of relative brightness of vessel with increased reliability with respect to image interference.

12 cl, 4 dwg

FIELD: information technologies.

SUBSTANCE: in some embodiments of the video sequence processing method: an input video sequence is received, having a given input resolution; the images of the input video sequence are aligned; noise is reduced in the aligned images; and an output video sequence is generated from the noise-reduced images, in which the output video sequence has the same resolution as the input video sequence.

EFFECT: reduced noise in a video image.

21 cl, 19 dwg

FIELD: information technology.

SUBSTANCE: deblocking filter 113 adjusts the value of disable_deblocking_filter_idc, slice_alpha_c0_offset_div2 or slice_beta_offset_div2 based on the Activity of an image calculated by an activity calculation unit 141, the total sum of orthogonal transformation coefficients of the image calculated by an orthogonal transformation unit 142, Complexity of the image calculated by the rate control unit 119, or the total sum of prediction errors of the image calculated by a prediction error addition unit 120.

EFFECT: improved image quality through correct deblocking.

8 cl, 7 dwg

Video camera // 2473968

FIELD: information technology.

SUBSTANCE: video camera has a portable housing having a light focusing lens, a light-sensitive device which converts the focused light into source video data, a storage device installed in the housing, and an image processing system configured to introduce predistortions into the source video data and compression thereof, wherein the compressed source video data remain essentially visual without loss after decompression, and also configured to store compressed source video data in the storage device.

EFFECT: reduced loss of quality of a compressed image during decompression and display.

22 cl, 18 dwg

FIELD: technology for processing remote sensing data for detection and recognition of objects based on their images.

SUBSTANCE: the method uses source digitized images of one and the same scene, geometrically combined pixel-wise and acquired concurrently in n spectral ranges. A source feature matrix is formed, each element of which represents an n-dimensional vector of the signal values of pixels with the same coordinates in the source images. A reference is selected in the form of an arbitrary element of the source feature matrix. A final numeric matrix is formed, each current element of which is assigned a value equal to the distance in feature vector space between the vector corresponding to the reference and the vector corresponding to the element of the source matrix with the same row and column numbers as the current element. The final matrix is transformed into a digital image. Pixel brightness values, as well as textural and gradient characteristics of the source image pixels, are used as features.

EFFECT: simplified operations concerning generation of synthesized image for visual interpretation, its adaptation to objects targeted by observer, detailed reflection of chosen objects on synthesized image and compact representation of information.

4 cl, 2 dwg
