RussianPatents.com

Display control device and method of display control. RU patent 2517723.

IPC classes for Russian patent Display control device and method of display control (RU 2517723):

G06F3/041 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements (typewriters B41J; conversion of physical variables F15B0005000000, G01; image acquisition G06T0001000000, G06T0009000000; coding, decoding or code conversion, in general H03M; transmission of digital information H04L)

FIELD: physics, computation hardware.

SUBSTANCE: the invention relates to a device for controlling the display of images that detects the proximity of an object to the display. Depending on the movement of the object relative to the display, the controller displays images on the display device. Images can be scrolled on the display consecutively, or, in response to a defined movement of the object, the display can jump to specific pictures. The interval of this jump can be set in a menu which, if necessary, can be presented on the display.

EFFECT: enables browsing of a large number of pictures.

19 cl, 11 dwg

 

Background art

The present invention relates to an image display control device and an image display control method, and in particular to an image display control device and an image display control method that make it easier to view multiple images.

In certain electronic devices that include a touch panel as part of the display module, the image displayed on the display module is "scrolled" by "drag" operations and "flick" operations (sharp sweeping motions) performed on the touch panel. In addition, the movement of a user's finger can be recognized by a camera built into a mobile phone, and the image displayed on the display module can be scrolled in accordance with the moving finger (see, for example, Japanese Unexamined Patent Application Publication No. 2009-260907).

Disclosure of the invention

In recent years, recording media with a large capacity for recording content have been developed, making it possible to store a large number of images in portable electronic devices. Consequently, there is a need for a convenient way to easily view multiple images.

It is therefore desirable to be able to view more images more easily.

According to an example embodiment, an image display control device comprises: a sensor that detects an object located near the display device; a display device; and a controller configured to replace a first image with a second image in response to movement of the object while it remains close to the display device.

According to one aspect, the sensor is also capable of detecting a touch of the object on the display device, and the controller processes a touch differently from proximity to the display device.

According to another aspect, the sensor is configured to detect multiple touches of the object on the display device as a "tapping" operation.

According to another aspect, the controller is configured to respond to the sensor detecting that the object remains near the display device while moving a specified distance in the lateral direction across the surface of the display device by scrolling the first image off the display device while scrolling the second image onto the display device.

According to another aspect, the sensor is configured to determine the speed of the object in the lateral direction, and when the final lateral speed of the object is less than a specified threshold, the controller is configured to perform a "flick" operation in the proximity state, and the "flick" operation in the proximity state animates the scrolling of the second image onto the display device.

According to another aspect, the controller is configured to determine whether the magnitude of the lateral movement of the object, while it remains close to the display device, is smaller than a specified threshold value, and when the magnitude is smaller, the controller is configured to return the first image to the original position on the display device that it occupied before the sensor detected the nearby object.

According to another aspect, the controller is configured to determine whether the magnitude of the lateral movement of the object, while it remains close to the display device, is greater than the specified threshold, and when the magnitude is greater, the controller is configured to scroll the second image onto the display device.
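
As an illustration only, the snap-back versus advance rule in these two aspects can be sketched in Python; the threshold value and all names below are assumptions, not taken from the patent:

```python
# Hover-release decision sketched from the two aspects above: small
# lateral travel restores the first image, larger travel scrolls in the
# second image. THRESHOLD_PX and the function name are illustrative.
THRESHOLD_PX = 80  # assumed lateral-travel threshold, in pixels

def on_hover_release(lateral_travel_px: float) -> str:
    """Return which animation the controller should run."""
    if abs(lateral_travel_px) < THRESHOLD_PX:
        return "restore_first_image"   # snap back to the original position
    return "scroll_in_second_image"    # advance to the next image
```

Movement in either lateral direction counts, hence the absolute value.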

According to another aspect, the controller is configured to display a menu in a "flick in the proximity state" mode of operation and to jump to a specified image in the ordered set of stored images, so that when it is detected that the object performs a "flick" movement in the proximity state, the specified image is displayed.

According to another aspect, the ordered set of stored images is ordered at least by date and/or by storage folder.
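
As a sketch of such an ordering, the images could be sorted on a (folder, date) key; the record structure below is an assumption made for illustration:

```python
# Illustrative ordering of stored images by storage folder and then by
# date, as described in the aspect above. The dictionary keys are assumed.
images = [
    {"folder": "2010-trip", "date": "2010-05-02", "file": "b.jpg"},
    {"folder": "2009-trip", "date": "2009-01-10", "file": "a.jpg"},
    {"folder": "2010-trip", "date": "2010-04-30", "file": "c.jpg"},
]

# ISO-format date strings sort correctly as plain strings.
ordered = sorted(images, key=lambda im: (im["folder"], im["date"]))
```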

According to another aspect, when the time during which the object remains in the proximity state exceeds a specified threshold time, the controller displays a menu for the proximity state.
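
A minimal sketch of this dwell-time rule; the threshold value and the function name are illustrative assumptions:

```python
# The proximity menu is shown only once the object has hovered longer
# than a threshold time. MENU_DWELL_S is an assumed value.
MENU_DWELL_S = 1.0  # assumed hover time, in seconds, before showing the menu

def should_show_proximity_menu(hover_duration_s: float) -> bool:
    """True once the object has stayed in the proximity state long enough."""
    return hover_duration_s > MENU_DWELL_S
```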

According to another aspect, the controller is configured to display the menu for the proximity state in accordance with the position of the object relative to the display device.

According to another aspect, the controller is configured to dismiss the menu for the proximity state when the final speed of the object, as it moves away from the display device, is less than a specified threshold.

According to another aspect, the controller is configured to determine that the object performs a "flick" movement in the proximity state when the final speed of the object, as it moves away from the display device, is greater than the specified threshold, in which case the controller performs a scrolling animation of the second image on the display device corresponding to the contents of the menu for the proximity state.

According to another aspect, the device comprises an electrostatic touch panel that includes the sensor and the display device.

According to another aspect, the controller is capable of running a scrolling animation of the first image and the second image, with a first effect in response to a touch and with a second effect in response to proximity to the display device.

According to another example embodiment, an image display control device includes: a sensor configured to detect an object located near the display device; a display device; and a controller configured to respond to a change in the proximity of the object by replacing a first image with a second image, the second image being selected from a set of images stored in a specified order.

According to one aspect, the sensor is also capable of detecting a touch of the object on the display device, and the controller processes a touch differently from proximity to the display device.

According to an example embodiment, a method of controlling the display of images includes: detecting, by means of a proximity sensor, an object located near a display device; and replacing, by means of a controller, a first image with a second image on the display device in response to movement of the object while the object is near the display device.

According to another example embodiment, a method of controlling the display of images includes: detecting, by means of a proximity sensor, an object located near a display device; and responding, by means of a controller, to a change in the proximity of the object by replacing a first image with a second image, the second image being selected from a set of images stored in a specified order.

Accordingly, a large number of images can be viewed easily.

Brief description of drawings

Fig.1 is a block diagram illustrating the configuration of an image input device that serves as an image display control device according to one embodiment of the present disclosure;

Figs.2A and 2B are perspective views illustrating the external appearance of the image input device shown in Fig.1;

Fig.3 is a diagram illustrating a screen image displayed during a first display control process;

Fig.4 is a diagram illustrating another screen image displayed during the first display control process;

Fig.5 is a diagram illustrating a further screen image displayed during the first display control process;

Fig.6 is a flowchart illustrating the first display control process;

Fig.7 is a diagram illustrating a screen image displayed during a second display control process;

Fig.8 is a flowchart illustrating the second display control process;

Fig.9 is a flowchart illustrating the proximity-state menu process shown in Fig.8; and

Fig.10 is a block diagram illustrating the configuration of a computer according to another embodiment of the present disclosure.

Embodiments of the invention

Example configuration of the image input device

Fig.1 is a block diagram illustrating the configuration of an image input device that serves as an image display control device according to one embodiment of the present disclosure.

The image input device 1 shown in Fig.1 includes components ranging from a lens module 11 to a random access memory (RAM) 27.

The lens module 11 includes a camera lens, an aperture, a focusing lens, and the like. An image input element 12, such as a charge-coupled device (CCD) sensor, is located on the optical path of the light from an object that passes through the lens module 11.

In addition, a display module 17 and a recording device 19 are connected to a digital signal processor 15. A touch panel 16 is placed on the image display screen of the display module 17. The touch panel 16 and the display module 17 form a touch screen 18. The display module 17 includes a liquid crystal display (LCD) or the like.

An actuator 20 is connected to the lens module 11 and is used to control the aperture that is part of the lens module 11 and to move the focusing lens included in the lens module 11. A motor driver 21 is also connected to the actuator 20. The motor driver 21 controls the operation of the actuator 20.

A central processing unit (CPU) 23 controls the entire image input device 1. Accordingly, the following are connected to the CPU 23: an analog signal processor 13, an analog-to-digital converter 14, the digital signal processor 15, the motor driver 21, a timing generator (TG) 22, an operating module 24, an electrically erasable programmable read-only memory (EEPROM) 25, a program read-only memory (program ROM) 26, the random access memory (RAM) 27, and the touch panel 16.

The touch panel 16, which is, for example, an electrostatic touch panel, detects a tap (touch) performed on the touch panel 16 and reports information about the place of the touch on the touch panel 16 to the CPU 23. The touch panel 16 can also detect a user's finger approaching the touch panel 16 within a preset distance (hereinafter referred to, where appropriate, as "proximity") based on changes in the level of electrostatic capacitance, even though the user's finger has not yet touched the touch panel 16. The distance within which the touch panel 16 can detect proximity is approximately 20 mm from the screen. Within a distance of 10 mm, the place on the touch panel 16 to which the finger is close can be recognized. Note that the target object whose contact with or proximity to the touch panel 16 is detected is not limited to a user's finger; a similar dielectric object can also be detected. In the following, it is assumed that a user's finger is detected.
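
The two ranges described here can be summarized in a small sketch; only the 20 mm and 10 mm distances come from the text, and all names are assumptions:

```python
# Proximity classification sketched from the description above: within
# about 20 mm the panel detects that a finger is near; within 10 mm it
# can also resolve where on the panel the finger is.
DETECT_MM = 20.0   # proximity is detectable inside this range
LOCATE_MM = 10.0   # the position on the panel is resolvable inside this range

def classify_proximity(distance_mm: float) -> str:
    if distance_mm <= 0:
        return "touch"
    if distance_mm <= LOCATE_MM:
        return "near_with_position"
    if distance_mm <= DETECT_MM:
        return "near"
    return "out_of_range"
```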

The user can perform a "drag" operation and a "flick" operation by touching the touch panel 16 or by holding a finger close to the touch panel 16. The CPU 23, which receives a signal from the touch panel 16, can detect this kind of user operation.

The "drag" operation and the "flick" operation are similar to each other in that both are planar motions, in which a finger or some object is run along a plane parallel to the touch panel 16. However, the "drag" operation and the "flick" operation differ in that, when a "drag" is performed, the speed of the finger as it moves away (or is about to move away) from this plane is low (less than a predefined value), whereas when a "flick" operation, which is a sharp movement, is performed, the speed of the finger as it moves away (or is about to move away) from this plane is high (exceeding a preset value). In other words, the "drag" operation is an operation in which the finger performing the planar motion stops and is then lifted away from the plane, while the "flick" operation is an operation in which the finger performing the planar motion moves away from the plane in a transverse direction while maintaining its speed.
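
A minimal sketch of this distinction, keyed on the finger's speed at the moment it leaves the plane; the threshold value is an illustrative assumption:

```python
# Drag vs. flick, as described above: both are planar motions, and what
# differs is the release speed. FLICK_SPEED is an assumed threshold.
FLICK_SPEED = 300.0  # assumed release-speed threshold, in px/s

def classify_gesture(release_speed_px_s: float) -> str:
    if release_speed_px_s < FLICK_SPEED:
        return "drag"   # the finger stops, then lifts away slowly
    return "flick"      # the finger leaves the plane while keeping its speed
```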

The recording device 19 is formed by a removable recording medium, such as an optical medium, for example a digital versatile disc (DVD), or a semiconductor memory device, such as a memory card. The recording device 19 records the image (image signal) obtained by image input. The recording device 19 can be removed from the housing of the image input device 1.

The electrically erasable programmable read-only memory 25 stores various setting information. In addition, the EEPROM 25 stores other information, such as information that must be retained even when the power supply is switched off.

The program read-only memory 26 stores the programs to be executed by the CPU 23 and the data used for the execution of these programs.

The random access memory 27, serving as a work area, temporarily stores the programs and data used when the CPU 23 performs various processes.

The following is a brief overview of each function of the image input device 1 having the configuration shown in Fig.1.

CPU 23 manages the various modules that are part of unit 1 input image, executing programs written in the ROM 26 for the program. Then the CPU 23 performs pre-defined processes, including the process image input and process control the display of images on the module 17 display in accordance with the signal coming from the touch panel 16, or a signal from the operational module 24.

The user performs an operation on the operating module 24, the module sends a signal corresponding to this operation, CPU speed, 23. Operational module 24 includes, for example, the lever 41 change the focal length (TELEPHOTO/WIDE angle LENS) and click 42 optical shutter, which should be described with reference to figure 2.

When adjusting the actuator 20 in action 11 of the lens out of the enclosure 1 of the staff of the input image or slide into this case. In addition, when adjusting the actuator 20 is done aperture control, included with module 11 lens, and moves focusing lens, included with module 11 of the lens.

Timing generator 22 running CPU 23 delivers chronology signal to the item 12 of the input image. In accordance with roniroosid signal in item 12 of the input image adjusts the exposure time and the like.

The image capturing element 12 operates in response to the timing signal supplied from the timing generator 22 so as to receive the light from a subject entering through the lens module 11 and perform photoelectric conversion. The image capturing element 12 then supplies an analog image signal corresponding to the amount of received light to the analog signal processor 13. The motor driver 21 drives the actuator 20 under the control of the CPU 23.

The analog signal processor 13, under the control of the CPU 23, performs analog signal processing, such as amplification, on the analog image signal supplied from the image capturing element 12. The resulting analog image signal is supplied from the analog signal processor 13 to the analog-to-digital (A/D) converter 14.

The A/D converter 14, under the control of the CPU 23, performs analog-to-digital conversion on the analog image signal supplied from the analog signal processor 13. The resulting digital image signal is supplied from the A/D converter 14 to the digital signal processor 15.

The digital signal processor 15, under the control of the CPU 23, performs digital signal processing, such as noise removal, on the digital image signal supplied from the A/D converter 14. The digital signal processor 15 causes the display module 17 to display the image corresponding to this digital image signal.

In addition, the digital signal processor 15 performs compression encoding on the digital image signal supplied from the A/D converter 14 in accordance with a predetermined compression encoding scheme, such as JPEG (Joint Photographic Experts Group). The digital signal processor 15 causes the recording device 19 to record the compression-encoded digital image signal.
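The capture chain described so far (photoelectric conversion in the image capturing element 12, amplification in the analog signal processor 13, quantization in the A/D converter 14 and noise removal in the digital signal processor 15) can be pictured in a minimal sketch. The function name, gain value and 3-tap averaging filter below are illustrative assumptions, not details taken from the patent:

```python
def capture_pipeline(light_amounts, gain=2.0, quant_levels=256):
    """Illustrative model of the capture chain of Fig. 1 (assumed values)."""
    # Photoelectric conversion: light amount -> analog signal (element 12)
    analog = list(light_amounts)
    # Analog processing: amplification (analog signal processor 13)
    amplified = [v * gain for v in analog]
    # A/D conversion: clip and quantize to integer levels (converter 14)
    digital = [min(quant_levels - 1, int(v)) for v in amplified]
    # Digital processing: 3-tap moving average as a stand-in for
    # noise removal (digital signal processor 15)
    out = []
    for i in range(len(digital)):
        window = digital[max(0, i - 1):i + 2]
        out.append(sum(window) // len(window))
    return out
```

A uniform input simply passes through amplified, e.g. `capture_pipeline([10, 10, 10])` yields `[20, 20, 20]`.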

The digital signal processor 15 also reads a compression-encoded digital image signal from the recording device 19 and performs decompression decoding in accordance with the decompression decoding scheme corresponding to the predetermined compression encoding scheme. The digital signal processor 15 causes the display module 17 to display the image corresponding to this digital image signal.

In addition, the digital signal processor 15, under the control of the CPU 23, generates an autofocus frame image (AF frame) used to perform the autofocus function, as well as menu button images, and causes the display module 17 to display these images.

The image captured by the image capturing element 12 is displayed on the display module 17. In this case, the AF frame is set on the image displayed on the display module 17. Focus is controlled in accordance with the image inside the AF frame.

As described above, the image input apparatus 1 has an autofocus (AF) function. In addition, the image input apparatus 1 has an automatic exposure (AE) function and an automatic white balance (AWB) function. These functions are implemented when the CPU 23 reads and executes programs stored in the program ROM 26. Furthermore, the AF, AE and AWB functions are merely examples of the functions provided in the image input apparatus 1; the image input apparatus 1 has various functions related to photography.

Figs. 2A and 2B are perspective views illustrating an example of the external configuration of the image input apparatus 1.

Note that, among the surfaces of the image input apparatus 1, the surface that faces the subject when the user photographs the subject, that is, the surface in which the lens module 11 is arranged, is referred to as the "front surface". On the other hand, among the surfaces of the image input apparatus 1, the surface that faces the user when the user photographs the subject, that is, the surface opposite the front surface, is referred to as the "rear surface". In addition, among the surfaces of the image input apparatus 1, the surface located on the upper side and the surface located on the lower side when the user photographs the subject are referred to as the "top surface" and the "bottom surface", respectively.

Fig. 2A is a perspective view illustrating an example of the external configuration of the front surface of the image input apparatus 1 shown in Fig. 1. Fig. 2B is a perspective view illustrating an example of the external configuration of the rear surface of the image input apparatus 1.

The front surface of the image input apparatus 1 can be covered with a lens cover 47. When the lens cover 47 is opened downward in the drawing, the state shown in Fig. 2A is obtained. As shown in Fig. 2A, the upper portion of the front surface, from which the lens cover 47 has been removed, contains a lens 45 included in the lens module 11 and an AF illuminator 46, arranged in this order from the right side.

The AF illuminator 46 also serves as a self-timer lamp. The top surface of the image input apparatus 1 includes the zoom lever 41 (TELE/WIDE), the shutter button 42, a playback button 43 and a power button 44, arranged on the top surface in this order from the left side of Fig. 2A. The zoom lever 41, the shutter button 42, the playback button 43 and the power button 44 are part of the operation module 24 shown in Fig. 1.

As shown in Fig. 2B, a touch screen 18 is arranged across the entire rear surface of the image input apparatus 1.

In photographing mode, intended for capturing an image of a subject, the touch screen 18 displays the image captured by the image capturing element 12; in playback mode, intended for displaying previously captured images, it displays an image recorded in the recording device 19. In addition, the touch screen 18 displays, as graphical user interfaces (GUIs): menu buttons used for setting (changing) the various setting items of the image input apparatus 1; a list display button used to display a list of the saved images; a delete button used to delete the displayed image; and a photographing mode button used to enter the photographing mode.

Image display control in the first embodiment of the invention

The following describes a first display control method performed by the image input apparatus 1 (CPU 23), which is an image display control method corresponding to the first embodiment of the present disclosure.

In the image input apparatus 1, a captured image that is read from the recording device 19 and displayed on the display module 17 can be "scrolled" in playback mode by an operation of touching the touch screen 18 with the user's finger and moving the finger (a "drag" operation or a "flick" operation). In addition, in the image input apparatus 1, as in the case of the touch-and-move operation on the touch screen 18, a captured image can be "scrolled" by an operation of bringing the user's finger close to the touch screen 18 and moving it while keeping it near the touch screen 18.

An example screen in the first display control method

Fig. 3 shows a state in which the user's finger is held close to the touch screen 18 in playback mode.

A captured image P1 is displayed in the center of the display screen in playback mode. Displayed along the left edge of the display area are: a menu button M1, a calendar display button M2, a list display button M3, a slideshow mode button M4 and a delete button M5; displayed (as images) along the right edge of the display area are: a wide zoom button M6 and a photographing mode button M7. When the user's finger is simply brought close to the touch screen 18, the displayed screen image does not change.

Fig. 4 shows the screen image displayed in a state in which the user's finger, brought close to the touch screen 18 as shown in Fig. 3, is moved in the lateral direction (to the right) while remaining in proximity.

The image input apparatus 1 (CPU 23) detects that the user's finger, detected in the proximity state, has moved a certain distance or more. Then, in the image input apparatus 1, the captured image P1 located in the center of the display screen is "scrolled" in accordance with the movement of the finger. As shown in Fig. 4, when the captured image P1 is "scrolled" to the right, the left side of the captured image P1 adjoins part of a captured image P2 to be displayed next.

The mutual arrangement of the captured images P1 and P2 will now be described. Many captured images, obtained by image capturing performed on the image input apparatus 1, are recorded in the recording device 19. In the image input apparatus 1, the captured images are displayed one after another in the forward or backward direction in a predetermined order, such as order of date, order of file name (alphanumeric characters) or order of recording in the recording device 19. The captured image P2 is the image displayed after the captured image P1 in the forward or backward direction of the display order of the captured images.
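The selection of the "next" image in a chosen order can be sketched as follows. The dict field names (`date`, `name`, `rec_no`) and the wrap-around behavior are hypothetical illustrations, not details stated in the patent:

```python
def next_image(images, current, key="date", backward=False):
    """Return the image displayed after `current` in the chosen order.
    `images` is a list of dicts with hypothetical 'date', 'name' and
    'rec_no' fields; the order wraps around at either end."""
    ordered = sorted(images, key=lambda im: im[key])
    i = ordered.index(current)
    step = -1 if backward else 1
    return ordered[(i + step) % len(ordered)]
```

For example, with two images ordered by date, the image following the older one is the newer one, regardless of list order in storage.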

Fig. 5 shows the screen image displayed after the user's finger has moved further from the state shown in Fig. 4 and has been withdrawn from the touch screen 18 without its travel speed decreasing. That is, Fig. 5 shows the screen image displayed after the user's finger, in the proximity state, performs a "flick" operation (a "proximity flick" operation).

The image input apparatus 1 determines the speed at the moment when the user's finger in the proximity state moves away from the touch screen 18. When this speed is equal to or greater than a predetermined threshold, the image input apparatus 1 determines that a "proximity flick" operation has been performed. Thereafter, the image input apparatus 1 displays the whole of the captured image P2 to be displayed next, part of which was shown when the captured image P1 was "scrolled" to the right as shown in Fig. 4. That is, the captured image P1 shown in Fig. 3 is replaced by the captured image P2. On the display screen, the captured image P2 also "scrolls" together with the rightward "scrolling" of the captured image P1, so that the captured image P2 gradually appears, with a "scroll" animation, starting from its right-hand portion.
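The decision just described, comparing the finger's speed at the moment of withdrawal against a threshold, can be sketched as below. The sample format and the threshold value are assumptions for illustration only:

```python
def is_proximity_flick(positions, timestamps, speed_threshold=300.0):
    """Classify a withdrawal as a 'proximity flick' when the final speed
    (displacement over the last sampling interval, in px/s) is at or
    above the threshold. `positions` are lateral coordinates in pixels;
    `timestamps` are the corresponding sample times in seconds."""
    if len(positions) < 2:
        return False
    dx = positions[-1] - positions[-2]   # last lateral displacement
    dt = timestamps[-1] - timestamps[-2] # last sampling interval
    if dt <= 0:
        return False
    return abs(dx) / dt >= speed_threshold
```

A fast final movement (e.g. 40 px in 0.1 s, i.e. 400 px/s) is classified as a flick; a slow one (100 px/s) is not.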

Flowchart of the first display control method

Fig. 6 is a flowchart illustrating the first display control method described with reference to Figs. 3 to 5.

First, in step S1, the image input apparatus 1 determines whether a touch or the proximity of the user's finger to the touch panel 16 has been detected.

The process of step S1 is repeated until a touch or the proximity of the user's finger is detected. Then, when it is determined in step S1 that a touch of the finger has been detected, the process proceeds to step S2, in which the image input apparatus 1 executes a predetermined process (for the "touch") corresponding to the touch of the finger. Thereafter, the process returns to step S1.

On the other hand, when it is determined in step S1 that the proximity of the user's finger has been detected, the process proceeds to step S3, in which the image input apparatus 1 determines whether a movement of the finger in the proximity state has been detected. When it is determined in step S3 that no movement of the finger has been detected, the process returns to step S1.

On the other hand, when it is determined in step S4 that the finger has moved a certain distance DS or more from the position at which proximity was first detected, the process proceeds to step S5, in which the image input apparatus 1 "scrolls" the captured image in accordance with the movement of the finger detected in the proximity state.

Thereafter, in step S6, the image input apparatus 1 determines whether the finger detected in the proximity state has been withdrawn from the screen. When it is determined in step S6 that the finger detected in the proximity state has not been withdrawn from the screen, the process returns to step S5, in which the captured image is "scrolled" in accordance with the movement of the finger.

On the other hand, when it is determined in step S6 that the finger detected in the proximity state has been withdrawn from the screen, the process proceeds to step S7, in which the image input apparatus 1 determines whether the final speed of the finger (the speed immediately before the finger was withdrawn from the screen) is equal to or greater than a predetermined threshold THa.

When it is determined in step S7 that the final speed of the finger is equal to or greater than the predetermined threshold THa, the process proceeds to step S8, in which the image input apparatus 1 determines that a "proximity flick" operation has been performed, and performs a "scroll" animation so as to display the next captured image.

On the other hand, when it is determined in step S7 that the final speed of the finger is less than the predetermined threshold THa, the process proceeds to step S9, in which the image input apparatus 1 determines whether the amount of movement of the finger from the position at which proximity was first detected to the position at which the finger was withdrawn from the screen is equal to or greater than a predetermined threshold THb.

When it is determined in step S9 that the amount of movement of the finger is equal to or greater than the predetermined threshold THb, the process proceeds to step S10, in which the image input apparatus 1 determines that a "proximity drag" operation has been performed, and performs a "scroll" animation so as to display the next captured image.

On the other hand, when it is determined in step S9 that the amount of movement of the finger is less than the predetermined threshold THb, the process proceeds to step S11, in which the image input apparatus 1 performs a "scroll" animation so that the "scrolled" captured image is restored to the center of the screen (its position before the "proximity drag" operation was started).

After the processes of steps S8, S10 and S11, the process returns to step S1, and the processes from step S1 onward are performed again.

The first display control process described above is executed until the power supply of the image input apparatus 1 is turned off.
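The branch structure of steps S7 to S11 of Fig. 6 can be summarized in a small sketch. The concrete values of THa and THb are arbitrary placeholders, since the patent leaves them unspecified:

```python
def classify_release(total_move, final_speed, th_a=300.0, th_b=50.0):
    """Decide what to do when the finger leaves the proximity state,
    mirroring steps S7-S11 of the first display control method."""
    if final_speed >= th_a:           # step S7 -> S8: proximity flick
        return "proximity flick: scroll to next image"
    if total_move >= th_b:            # step S9 -> S10: proximity drag
        return "proximity drag: scroll to next image"
    return "restore image to center"  # step S11: snap back
```

Note that the speed test takes precedence: a fast withdrawal is a flick even if the total movement is small, while a slow withdrawal is judged by the distance moved.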

As described above, in accordance with the first display control method of the present disclosure, just as in the case of touch operations, the user can "scroll" the display of captured images by holding a finger close to the touch screen 18 and moving it (a "drag" operation or a "flick" operation). In this case, because the finger does not come into contact with the touch screen 18, a sense of ease in operation can be achieved. In addition, since the touch screen 18 is not touched, the failure rate of user operations can be lowered, and the accumulation of fingerprints and dirt caused by touch operations can be avoided.

The following describes a second display control method performed by the image input apparatus 1 (CPU 23), which is an image display control method corresponding to the second embodiment of the present disclosure. The second display control method in the image input apparatus 1 is obtained by adding a specific function to the first display control method described above.

In particular, when the user performs a "proximity hold" operation, an operation menu is displayed that differs from the menus for the normal "proximity drag" operation and the normal "proximity flick" operation. The "proximity hold" operation is an operation of keeping the user's finger in the proximity state with respect to the touch screen 18 for a predetermined period of time.

An example screen in the second display control method

Fig. 7 shows an example of the operation menu displayed when execution of a "proximity hold" operation is detected.

In the example shown in Fig. 7, unlike the menus for the normal "proximity drag" operation and the normal "proximity flick" operation, a menu Q1 is displayed for jumping, when a "proximity flick" operation is performed, to the captured image located one hundred images forward or backward in the predetermined order. When the user performs a "proximity flick" operation to the right or to the left, after the "scroll", that is, after the scroll animation has been performed, the captured image one hundred images forward or backward in the predetermined order is displayed.

Note that, although with the menu contents of the example shown in Fig. 7 a jump is made to the captured image one hundred images forward or backward, the contents of the menu displayed upon the "proximity hold" operation can be defined as appropriate. For example, a jump may be made to a captured image whose date precedes or follows that of the displayed image, or to a captured image belonging to a different folder. Moreover, the user may select (define) the menu to be displayed on a settings screen.

Flowchart of the second display control method

Fig. 8 is a flowchart illustrating the second display control method performed by the image input apparatus 1 (CPU 23).

As described above, the second display control method is obtained by adding a new function to the first display control method and is identical to the method shown in Fig. 6, except for the part relating to the added function. More specifically, the second display control method is the same as the method shown in Fig. 6 except that the processes of steps S25 and S26 have been added. Steps S21 to S24 and steps S27 to S33 shown in Fig. 8 correspond respectively to steps S1 to S4 and steps S5 to S11 shown in Fig. 6. Therefore, descriptions other than those of the processes associated with the newly added steps S25 and S26 are omitted.

When it is determined in step S23 that a movement of the finger in the proximity state has been detected, the process proceeds to step S24. On the other hand, when it is determined in step S23 that no movement of the finger in the proximity state has been detected, the process proceeds to step S25. In addition, the process also proceeds to step S25 when it is determined that the finger in the proximity state has not moved from the position at which proximity was first detected by the predetermined distance DS or more. In other words, when it is determined that the finger has essentially not moved from its initial proximity detection position, the process of step S25 is executed.

In step S25, the image input apparatus 1 determines whether a predetermined time interval (DT) has elapsed since proximity was first detected. When it is determined in step S25 that the predetermined time interval (DT) has not elapsed since proximity was first detected, the process returns to step S21.

On the other hand, when it is determined that the predetermined time interval (DT) has elapsed since proximity was first detected, the process proceeds to step S26, in which the image input apparatus 1 executes a menu display process for the proximity state. After the menu display process for the proximity state has been executed, the process returns to step S21.

Now, with reference to the flowchart shown in Fig. 9, the menu display process for the proximity state performed in step S26 will be described in detail.

Thereafter, the image input apparatus 1 determines the subsequent state of the user's finger detected in the proximity state. In particular, the image input apparatus 1 determines whether the finger detected in the proximity state is in the touch state, remains in the proximity state, or has been withdrawn.

When it is determined in step S52 that a touch of the finger detected in the proximity state has been detected, the process proceeds to step S53, in which the image input apparatus 1 executes a predetermined process (for the "touch") corresponding to the touch of the finger. Thereafter, the process returns to step S21.

On the other hand, when it is determined in step S52 that the proximity state of the finger is maintained, the process proceeds to step S54, in which the image input apparatus 1 determines whether the finger has moved the distance DS or more from the position at which proximity was first detected. When it is determined that the finger has not moved the predetermined distance DS or more from the position at which proximity was first detected, the process returns to step S52.

On the other hand, when it is determined in step S54 that the finger has moved the predetermined distance DS or more from the position at which proximity was first detected, the process proceeds to step S55, in which the image input apparatus 1 "scrolls" the captured image in accordance with the movement of the finger detected in the proximity state.

In other words, in the processes of steps S54 and S55, when the proximity of the user's finger has been detected, the menu is displayed, and when the finger moves, the captured image is "scrolled", as in the case of the normal "proximity drag" operation.

On the other hand, when it is determined in step S52 that the finger detected in the proximity state has been withdrawn, the process proceeds to step S56. Then, in step S56, the image input apparatus 1 determines whether the final speed of the finger (the speed immediately before the finger was withdrawn from the screen) is equal to or greater than the predetermined threshold THa.

When it is determined in step S56 that the final speed of the finger is less than the threshold value THa, the process proceeds to step S57, in which the image input apparatus 1 removes the menu displayed in step S51.

On the other hand, when it is determined in step S56 that the final speed of the finger is equal to or greater than the threshold value THa, the process proceeds to step S58, in which the image input apparatus 1 determines that a "proximity flick" operation has been performed, and performs a "scroll" animation to the captured image corresponding to the contents of the displayed menu.

In steps S56 to S58, when the user withdraws the finger slowly from the proximity state, it is determined that a menu cancellation operation is to be performed, and the menu is removed. When a "proximity flick" operation is performed, a "scroll" animation to the captured image corresponding to the contents of the menu is executed.

After step S57 or step S58, the process returns to step S21 shown in Fig. 8.
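The added behavior of the second method — display a special menu after the finger dwells in place for the interval DT, then either act on a fast withdrawal or cancel on a slow one — can be sketched as one decision function. The values of DT, DS and THa, and the returned action strings, are placeholders assumed for illustration:

```python
def hold_menu_action(dwell_time, moved_dist, released, final_speed,
                     dt=1.0, ds=20.0, th_a=300.0):
    """Mirror steps S25-S26 and S52-S58 of the second method: a menu is
    shown after dwelling for DT without moving DS; on release, a fast
    withdrawal triggers the menu's jump (e.g. +/-100 images), a slow
    one cancels the menu."""
    if moved_dist >= ds:
        return "normal proximity drag"        # steps S54-S55: finger moved
    if dwell_time < dt:
        return "waiting"                      # step S25: DT not yet elapsed
    if not released:
        return "menu displayed"               # step S26: hold detected
    if final_speed >= th_a:
        return "menu flick: jump per menu"    # step S58: fast withdrawal
    return "menu cancelled"                   # step S57: slow withdrawal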

In the second display control process, as described above, in addition to the functions of the first display control method, in the state in which proximity is maintained at a certain position on the touch screen 18, a "proximity flick" operation that differs from the normal "proximity flick" operation becomes possible. Because many operations can be assigned to the "proximity flick" operation, more operations can be performed. Usability is consequently improved.

As described above, because the image input apparatus, which serves as an image display control device in accordance with an embodiment of the present disclosure, has a "scroll" animation function corresponding to operations in the proximity state (the "hold" and "move" operations), a large number of images recorded in the recording device 19 can be browsed with greater ease.

An example computer configuration

The sequence of processes described above can be executed by hardware or by software.

In this case, the sequence of processes may of course be executed by the image input apparatus 1 shown in Fig. 1, and, alternatively, the sequence of processes may be executed by the personal computer shown in Fig. 10.

In Fig. 10, a CPU 101 performs various processes in accordance with programs stored in a read-only memory (ROM) 102 and programs loaded into a random access memory (RAM) 103 from a storage module 108. The RAM 103 also stores, as appropriate, the data used when the CPU 101 performs various processes.

The CPU 101, the ROM 102 and the RAM 103 are connected to each other via a bus 104. In addition, an input/output interface 105 is also connected to the bus 104.

Connected to the input/output interface 105 are: an input module 106 including a keyboard and a mouse; an output module 107 including a display device with a touch panel and a speaker; the storage module 108, which includes a hard disk; and a communication module 109 including a modem and a terminal adapter. The communication module 109 manages communication with another device (not shown in the drawing) through a network such as the Internet.

A drive 110 is also connected to the input/output interface 105 as appropriate, and removable media 111, such as a magnetic disk, an optical disc, a magneto-optical disc or a solid-state memory device, are attached as appropriate. A computer program read from the removable media 111 is installed in the storage module 108 as appropriate.

In the case where the sequence of processes is to be executed by software, the programs constituting that software are installed, for example through a network or from a recording medium, onto a computer incorporated into dedicated hardware, or onto a general-purpose personal computer capable of performing various functions when various programs are installed.

In this description, the steps describing the program recorded on a recording medium obviously include processes performed sequentially in time in a specific order, and also include processes that are not performed sequentially in time, that is, processes performed in parallel or individually.

In the above description, a liquid crystal display device is used as the display module 17 whose display is controlled by the image display control device corresponding to the present disclosure. However, the present disclosure can be applied not only to liquid crystal displays but also to the following display devices. In particular, the present disclosure can be applied to display devices that accept a display command in units of a frame, a field or the like constituting a moving image (hereinafter, such a unit is referred to as a "segment"). In this kind of display device, the set of pixels constituting a "segment" is formed of display elements, and some of the display elements hold their display for a predetermined period of time. Note that such a display element is referred to as a hold-type display element, and a display device whose screen includes such hold-type display elements is referred to as a hold-type display device. In other words, a liquid crystal display device is only an example of a hold-type display device, and the present disclosure can be applied to all hold-type display devices.

Furthermore, in addition to hold-type display devices, the present disclosure can be applied to flat self-luminous display devices based on organic electroluminescent (EL) light-emitting elements. In particular, the present disclosure can be applied to all display devices including display elements that represent the pixels constituting an image. Note that this kind of display device is referred to as a pixel-type display device. In a pixel-type display device it is not necessary for one pixel to correspond to one display element.

In other words, any display device whose display is controlled by the image display control device corresponding to the present disclosure may be used, as long as the display device is able to execute the above-described sequence of processes.

In addition, in the above embodiments a case was described in which the present disclosure is applied to an image input device (such as a digital camera) that includes a display device (display module). However, the image display control according to the present disclosure can be applied to other electronic devices that include display devices, such as a personal digital assistant (PDA), a cellular phone, a portable game device, a portable playback device, a television receiver and the like.

Embodiments of the present disclosure are not limited to the above-described embodiments, and various modifications can be made without departing from the scope of the present disclosure.

The present disclosure contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2010-284323 filed in the Japan Patent Office on December 21, 2010, the entire contents of which are hereby incorporated by reference.

1. An image display control device comprising: a sensor controller provided with a sensor configured to detect an object placed near a display device; and a display device controller configured to control the display device so as to replace the display of a first image with the display of a second image in response to a movement in the lateral direction of said object while remaining near said display device.

2. The device according to claim 1, wherein said sensor is further configured to detect a touch of said object on the display device, and said display device controller is configured to process a touch differently from the presence of said object near said display device.

3. The device according to claim 2, wherein said sensor is configured to detect multiple touches of said object on the display device as a "tapping" operation.

4. The device according to claim 1, wherein said display device controller is configured, in response to detection by said sensor that said object remains near said display device while said object moves a specified distance in the lateral direction across the surface of said display device, to "scroll" the first image off the display device while simultaneously "scrolling" the second image onto the display device.

5. The device according to claim 1, wherein said sensor is configured to determine the speed of said object in the lateral direction, and, in the case where the final speed of said object in the lateral direction is not less than a threshold, said display device controller is configured to perform a "proximity flick" operation, the "proximity flick" operation animating the "scrolling" of the second image onto said display device.

6. The device according to claim 1, wherein said display device controller is configured to determine whether the amount of lateral movement of said object while placed near said display device is less than a threshold, and, in the event that said amount is less than the threshold, the controller is configured to return the first image to its original position on said display device, occupied before the detection by said sensor of the proximity of said object.

7. The device according to claim 1, wherein said display device controller is configured to determine whether the amount of lateral movement of said object while placed near said display device exceeds a preset threshold, and, when said amount exceeds the preset threshold, the controller is configured to "scroll" the second image onto the display device.

8. The device according to claim 1, wherein said display device controller is configured to control the display device to display a menu in a "proximity flick" operation mode and to navigate to a specified image in an ordered set of stored images, so that, when it is detected that the object performs a "proximity flick" movement, said specified image is displayed.

9. The device according to claim 8, wherein said ordered set of stored images is ordered at least by date and/or by storage folder.

10. The device according to claim 1, wherein, when the time during which said object remains in the proximity state exceeds a threshold time, said display device controller is configured to control the display device to display a menu for the proximity state.

11. The device of claim 10, wherein the display controller is configured to control the display device to display the proximity-state menu in accordance with the position of the object relative to the display device.

12. The device of claim 10, wherein the display controller is configured to dismiss the proximity-state menu when the most recent velocity of the object as it moves away from the display device is less than a specified threshold.

13. The device of claim 10, wherein the display controller is configured to determine that the object is moving for a "quick flip in the proximity state" when the most recent velocity of the object as it moves away from the display device is greater than a specified threshold, the controller being configured to perform a scrolling animation of the second image on the display device corresponding to the contents of the proximity-state menu.

14. The device of claim 1, further comprising an electrostatic touch panel that includes the sensor and the display device.

15. The device of claim 1, wherein the display controller is configured to animate scrolling of the first image and the second image with a first effect in response to a touch and with a second effect in response to proximity to the display device.

16. A device for controlling the display of images, comprising: a sensor configured to detect an object positioned near the device; a display device; and a controller configured to respond to a change in proximity of the object by changing a first image to a second image, the second image being selected from a set of images stored in a specified order.

17. The device of claim 16, wherein the sensor is also configured to detect a touch of the object on the display device, and the controller is configured to handle a touch differently from proximity to the display device.

18. A method of controlling the display of an image, comprising: detecting, by means of a proximity sensor, an object positioned near a display device; and changing, by means of a controller, a first image displayed on the display device to a second image in response to a lateral movement of the object while the object is near the display device.

19. A method of controlling the display of an image, comprising: detecting, by means of a proximity sensor, an object positioned near a display device; and responding, by means of a controller, to a change in proximity of the object by changing a first image to a second image, the second image being selected from a set of images stored in a specified order.
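The threshold-based decisions recited in claims 5–7 and 10–13 can be sketched as a simple dispatch function. The sketch below is illustrative only and is not part of the patent; all function names, action labels, and threshold values are assumptions made for this example.

```python
# Illustrative sketch (not from the patent) of the proximity-gesture
# decisions in claims 5-7 and 10-13. All names and threshold values
# are assumptions made for this example.
from dataclasses import dataclass


@dataclass
class Thresholds:
    lateral_speed: float = 50.0        # units/s; claim 5
    lateral_displacement: float = 20.0  # units; claims 6-7
    hover_time: float = 1.0            # s; claim 10
    retreat_speed: float = 30.0        # units/s; claims 12-13


def handle_proximity_gesture(lateral_speed, lateral_displacement,
                             hover_time, retreat_speed,
                             th=Thresholds()):
    """Return the actions a display controller might take."""
    actions = []
    # Claim 5: slow lateral motion near the display triggers the
    # "quick flip", animating scrolling of the second image.
    if lateral_speed < th.lateral_speed:
        actions.append("quick_flip_scroll_second_image")
    # Claim 6: a small lateral displacement restores the first image
    # to its position before proximity was detected.
    if lateral_displacement < th.lateral_displacement:
        actions.append("restore_first_image")
    # Claim 7: a large lateral displacement scrolls in the second image.
    elif lateral_displacement > th.lateral_displacement:
        actions.append("scroll_second_image")
    # Claim 10: hovering longer than the threshold shows the menu.
    if hover_time > th.hover_time:
        actions.append("show_proximity_menu")
        # Claims 12-13: a slow retreat dismisses the menu; a fast
        # retreat performs the quick-flip scroll animation instead.
        if retreat_speed < th.retreat_speed:
            actions.append("dismiss_proximity_menu")
        else:
            actions.append("quick_flip_scroll_from_menu")
    return actions
```

For example, an object hovering for 2 s, moving laterally only 5 units, and then retreating slowly would trigger the quick flip, restore the first image, show the menu, and then dismiss it.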

 
