RussianPatents.com

Method and device for compact graphical user interface. RU patent 2519059.

IPC classes for Russian patent Method and device for compact graphical user interface. RU patent 2519059. (RU 2519059):

G06F3/0484 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements (typewriters B41J; conversion of physical variables F15B0005000000, G01; image acquisition G06T0001000000, G06T0009000000; coding, decoding or code conversion, in general H03M; transmission of digital information H04L)
Other patents in the same IPC classes:
Transacted double buffering for graphical user interface rendering / 2519034
The invention relates to computer engineering and particularly to applications which include a graphical user interface. A method of updating a graphical user interface (GUI) comprises the steps of identifying a requested action which leads to repainting of part of the GUI, determining that the requested action includes a delay in repainting said part of the GUI, transacting the requested action through parallel initiation of a double-buffering control means which renders GUI updates and keeps them invisible and of a splash-screen stream which is displayed on a display, and sending the rendered GUI updates to the display at the completion of the transaction of the requested action.
Image capturing device and control method thereof / 2518987
Invention relates to computer engineering and specifically an image capturing device. The image capturing device has an image capturing means, a photographic preparation means for adjusting the predetermined photographic setting of photographing using the image capturing means, a photographic processing means for improving photographing using the image capturing means based on the photographic setting, a detection processing means, a control means for controlling the photographic preparation means. The method realises control of said device.
Vehicle control device / 2518404
The vehicle control device contains a steering wheel, an optical emitter and radiation receivers optically linked with it and connected to a dedicated computer. The optical receivers are digital television cameras whose field of view is the surface of the optical emitter. In the first version, the optical emitter is made as a strip attached to the inner surface of the steering wheel. In the second version, the device contains a second optical receiver, the receivers being light-emitting diodes which provide illumination of objects falling into the fields of view of the first and second optical receivers. The zone in which vehicle operator gestures can be recognised is the intersection of the fields of view of the first and second television cameras on the surface of the optical emitter. The appearance of the operator's finger in this zone interrupts the emitted light flux and consequently produces shaded segments in the images generated by the first and second television cameras; from the coordinates of these segments the dedicated computer calculates a number of spatial positions of the operator's finger at successive time points and determines the motion trajectory from them.
Menu display device, menu display method and programme / 2518318
Invention relates to computer engineering. The menu display device comprises an obtaining module configured to obtain content display data from each application, a display controller configured to control content display in each region according to installation conditions for the region, an operations module for performing operations over the menu, wherein during selection of a region in the menu through focusing thereon using the operations module, the display controller is configured to control display of an application submenu which corresponds to the selected region and display of content in the submenu for content which is controlled by the application, wherein if focus moves into the region in the menu through the operations module, the display controller is configured to control display of the rotation of the menu area such that the angle of rotation in the region towards which the focus is directed is equal to 0° in accordance with the region towards which the focus is directed through the focus moving on a spiral, wherein the centre of the menu area is set as the reference point.
Display control device and method of display control / 2517723
The invention relates to a device for controlling the display of images that detects the proximity of an object to the display. Depending on the object's displacement relative to the display, the controller displays images on the display device. Images can be scrolled on the display consecutively, or in response to a defined object displacement it is possible to switch to particular pictures. The interval of this switching can be set in a menu which, if necessary, can be presented on the display.
Method and device for web-page browsing simulation / 2517381
The invention relates to computers, particularly to controlling navigation over web pages. The proposed method comprises collecting data on navigation over a browser application page. Said data includes page structure data corresponding to the page layout, and page browse area data that define the correspondence between a page area and the page structure data. It also includes data on the display time of an area during navigation, including the start and finish time points of browsing the page area. Said data is stored, a forecasting model is generated to forecast the page browse area, and this model is applied during browsing.
Touch actuated sensor configuration integrated with OLED structure / 2515710
The invention relates to input devices. The device has a first substrate and a second substrate. The first substrate has a touch actuated sensor having, on one of its two sides, at least a plurality of drive lines or a plurality of sense lines, and a first metal coating formed on said first of the two sides of the first substrate. The second substrate has, on the first of its two sides, a first layer having driven thin-film transistors, a second layer having OLED material, first metal coating sublayers lying on one side of the OLED material, and a second metal coating sublayer lying on the other side of the OLED material. The first and second layers are mutually adjacent and are arranged such that at least some of the thin-film transistors of the first layer of the second substrate can electrically drive at least a portion of the OLED material. The two substrates are oriented such that the second side of the first substrate is further from the second side of the second substrate, and the first metal coating formed on said first of the two sides of the first substrate is connected to the second metal coating on the second substrate.
Linking visual properties of charts to cells within tables / 2514102
The invention relates to applications for processing electronic spreadsheets and more specifically to linking visual properties of charts to cells within tables. The method involves first selecting a visual property of the chart data display presented in an electronic spreadsheet, wherein said visual property is associated with a visually perceptible chart display property. A cell within the electronic spreadsheet is then selected, and the visual property is linked with that cell. Further, the method involves testing for changes in the cell contents and updating the visual property of the chart display in response to changes in the contents of said cell.
Digital image processing device and touch-based image scaling method / 2514099
Invention relates to a digital image processing device which employs image scaling to change the scale of a displayed image using a touch procedure. The digital image processing device comprises an interface processor to identify a touch gesture of a user applied to the surface of a touch-screen display when image data are displayed, the touch gesture of the user describing an arched trajectory, and a display control processor for controlling scaling up and scaling down of the displayed image based on the centre angle of the arched trajectory described by said gesture.
Method for displaying audiovisual series on screen of mobile communication terminal, primarily relating to one event, formed by capturing with multiple cameras and user reproducing device - mobile communication terminal for realising said method / 2513722
The invention relates to mobile communication. The method of displaying an audiovisual series, primarily relating to one event, on the screen of a mobile communication terminal includes a step of simultaneously obtaining audiovisual information from at least two sources displaying an event of the ambient environment in different perspectives, which enables obtaining a video image and/or sound. The method involves merging frames of the original images into a single video stream, and transmitting the obtained audiovisual information from said sources to at least one mobile communication terminal while providing synchronisation thereof. Further, the audiovisual information is received by at least one mobile communication terminal. The method also involves forming on the screen of the mobile communication terminal an audiovisual series field which displays audiovisual information received from at least one source of such information, while simultaneously forming at least one audiovisual series field closed from the view of the user by a mask of graphic primitives.
ATM and method for its operation (variants) / 2248044
The methods for inputting data into an ATM include the following steps: forming an input signal when a function key or auxiliary keyboard key on the ATM panel is pressed, the data to be input into the ATM being assigned to the pressed key; converting the input signal from the key into an input signal matching the coordinate data of a point on the ATM screen previously matched with the pressed key; and redirecting the coordinate data input signal into the computer data flow meant for data input from a mouse. The data input method may have additional steps: forming a signal by pressing an auxiliary keyboard key; converting the input signal from the auxiliary keyboard key into an input signal matching a key on a standard alphanumeric keyboard previously matched with the pressed key; and redirecting the received input into the computer data flow meant for input from a standard keyboard. The ATMs have a computer and at least one device for performing financial operations in response to data input into the computer in accordance with the methods for inputting data into the ATM. Software storage devices carry software providing for control of the ATM in accordance with the methods for inputting data into the ATM.
Device for inputting image into personal computer / 2256210
Device has CPU, control block, receipt register, buffer memory block, address counter, first and second channel transmitters blocks, PC connection block, amplifier, pulse generator, control signals generator, second receipt register, second buffer memory block, first, second and third buffer registers, receipt-transmission register, strings counter, adder, first string counter and digital comparator.
Navigation / 2265245
Objects are linked to multiple directions on the basis of the positions of these objects; ascent or descent by one level in the hierarchy of such objects is performed, while a certain group of objects can be passed as a whole at the first level, and shifting between its other elements or a sub-group can take place at the second level.
Method for connecting semi-bridge indicator to personal computer / 2265878
The semi-bridge indicator is connected to the computer through a sound card whose central processor unit is used for forming a reference alternating voltage, and the two stereo output channels form an alternating-current bridge together with the semi-bridge indicator, from the diagonal of which a signal is fed into the central processor unit of the sound card.
Device for making image input into computer / 2267150
Device has alphanumeric transformer, control unit and additionally introduced amplifier and transceiver register. Data input is carried out by sequentially reading each byte saved in the transceiver register.
Method for selecting product by means of data transfer network / 2287176
A server system 100 controls selection of a product through a data transfer network 120 by means of a series of screens. The server system 100 processes user data from user input signals to determine whether the user data are synchronized with at least one of a set of products. The server system 100 transfers the earliest one of the screen signals, corresponding to the earliest screen in the series that has no synchronized data, if the selected screen is next in the order of screens and if the previous screens in order, prior to the selected one, all have no synchronized data.
Information input device / 2291476
Device contains commutator, storage device, two counters, synchronizer, three gates. Address inputs of commutator and storage device are connected to each other and to byte outputs of counter, which continuously changes its state influenced by clock signal. Each counter state corresponds to the number of external device and to information written in storage device, which defines this external device. Address digit count of storage device is greater than commutator address digit count by one. That is, each external device has an identifier of two words, which is passed to computer during current query cycle.
Computer spherograph, combined computer control device based on it / 2292072
Device is equipped with a lever with controlling and controlled arms, hanging on a support with the capability of moving controlled arm inside a spatial angle of a fixed spherical surface. Buttons and scroll wheel are located on controlling arm. Controlled arm, as well as the scanned interior spherical surface, is located inside the device.
I/O device / 2295748
The I/O device can be used in devices for manual data input/output. The device has a keyboard mounted on a panel, radiation receiver/source pairs, two redundant data processing channels which have a decoder, switches and an optical module, a control bus, a data bus and a microcontroller. The radiation receivers and radiation sources are placed in the case of the optical module.
Data input method / 2300129
The technical effect is achieved by repeated execution of the following: setting the start of counting of the values of the parameters that determine parameter values; comparing these values to sets of conditions for selection of spatial areas from a set of such areas, including at least the condition of whether the parameter values corresponding to the manipulator movement belong to a given spatial area; recording a series of inputs of spatial areas; comparing this record to a certain set of series of selections of spatial areas; and inputting data from the set of data connected to the matching series.

FIELD: physics, computer engineering.

SUBSTANCE: the invention relates to an input device which obtains operation input from the hand of a user, an information processing device, and a method of obtaining input values used in said device. A GUI screen image serving as the standard screen image renders a first combined GUI region, which is a combination of the GUI of a cross-shaped directional pad and the GUI of a joystick, and a second combined GUI region, which is a combination of the GUI of the buttons of four types of operations and the GUI of a joystick, in the bottom left and bottom right parts of the screen image respectively. The combined GUI to be used is determined depending on the region within the first combined GUI region or the second combined GUI region touched by the user first, and the screen image is switched accordingly; when the finger is lifted from the touch panel, the screen image is switched back.

EFFECT: design of a combined GUI used in compact data input means.

15 cl, 10 dwg

 

TECHNICAL FIELD

The present invention relates to an input device that receives operation input from a user's hands, an information-processing device, and a method of obtaining input values used in such a device.

DESCRIPTION OF THE PRIOR ART

In recent years, small-sized portable information devices such as mobile gaming devices, mobile phones and PDAs (personal digital assistants) have come into wide circulation. Such small devices have substantially limited means of data entry, which follows from their limited size. As a result, specific input functions and means oriented towards such compact devices have been developed. For example, covering the display surface with a touch panel and allowing the user to make input with a finger or a stylus can give the user the impression of directly manipulating the objects that appear on the display screen.

On the other hand, such small devices are also fully applicable in the environments in which information is processed by console gaming machines, personal computers and similar stationary devices. For example, by allowing the user to work on a small device and connecting this compact device through a network to a stationary device which in fact carries out the data processing, it is possible to obtain all high-level features regardless of the user's location. In addition, a game for a stationary-type device can be emulated in a version playable on a small device.

Thus, in recent years a technological trend has emerged that allows information, such as a game, to be processed without regard to the size of the device or the environment in which the device is used. However, when such well-developed information processing is attempted on a small-sized device, there is a problem of insufficient convenience and ease of use, which is a consequence of the limited input means available, as described above.

SUMMARY OF THE INVENTION

The present invention solves the above problem; the aim of the invention is to create technology that implements a data entry means providing sufficient convenience and ease of use even with small-sized input means.

In accordance with one embodiment of the present invention, a data input device is proposed. The proposed data input device contains: a GUI image generation unit that generates an image of a GUI (graphical user interface); a display that reproduces the GUI image generated by the GUI image generation unit; a touch panel that covers the screen and recognizes the positions at which a user touches the screen; and an operation conversion unit that identifies the content of the operations performed by the user based on the correspondence between the point of contact recognized by the touch panel and the displayed GUI image. The GUI image generation unit creates a combined GUI region in the GUI image, the combined GUI region combining several GUIs by means of a combined graphic image composed of at least part of the graphic images of the several GUIs. When the user first touches such a combined GUI, the operation conversion unit identifies, among the several GUIs united in the combined GUI, the one GUI whose corresponding graphic image contains the point at which the contact began, and the GUI image generation unit allows the several GUIs to share the same detection area on the touch panel by switching the combined GUI to the one GUI identified by the operation conversion unit.

In accordance with another embodiment of the present invention, an information-processing device is proposed. The information-processing device contains: a GUI image generation unit that generates an image of a GUI (graphical user interface); a processing unit that performs information processing in accordance with user actions made through the GUI; a display that displays the GUI image generated by the GUI image generation unit and the output image generated as a result of the information processing performed by the information-processing device; a touch panel that covers the screen and recognizes the positions at which a user touches the screen; and an operation conversion unit that identifies the content of the operations performed by the user based on the correspondence between the point of contact recognized by the touch panel and the displayed GUI image. The GUI image generation unit creates a combined GUI region in the GUI image, the combined GUI region combining several GUIs by means of combined graphic images which are combinations of at least part of the graphic images of the several GUIs. When the user first touches such a combined GUI, the operation conversion unit identifies, among the several GUIs united in the combined GUI, the one GUI whose corresponding graphic image contains the point at which the contact began, and the GUI image generation unit allows the several GUIs to share the same recognition area on the touch panel by switching the combined GUI to the one GUI identified by the operation conversion unit.

According to a further embodiment of the present invention, a method of obtaining an input value is proposed. The method of obtaining an input value contains: a stage at which an image of a GUI (graphical user interface) is generated; a stage at which the GUI image is reproduced on the display screen together with the output image generated as a result of the information processing; a stage at which the positions at which the user touches the touch panel covering the screen of the display are recognized; and a stage at which the content of the operations performed by the user is identified based on the correspondence between the detected point of contact and the GUI image on the screen. At the stage at which the GUI image is generated, a combined GUI region is created in the GUI image, this combined GUI region being a union of several GUIs by means of combined graphic images which are combinations of at least part of the graphic images of the several GUIs. When the user first touches such a combined GUI, the stage at which the content of the operations is identified identifies, among the several GUIs united in the combined GUI, the one GUI whose corresponding graphic image contains the point at which the contact began, and the stage at which the GUI image is generated provides the joint use by the several GUIs of a common recognition area on the touch panel by switching the combined GUI to the one identified GUI.
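The staged method above amounts to simple state logic: the sub-region that receives the first touch selects the active GUI, and lifting the finger restores the combined screen image. The following is a minimal sketch of that logic; all class, method and region names are our own illustration, not the patented implementation:

```python
# Hypothetical sketch of the combined-GUI switching described above.
# A combined GUI region overlays several GUIs; the sub-region that
# receives the first touch decides which GUI becomes active, and
# lifting the finger restores the combined (standard) screen image.

class CombinedGuiRegion:
    def __init__(self, sub_regions):
        # sub_regions: mapping of GUI name -> (x0, y0, x1, y1) rectangle
        self.sub_regions = sub_regions
        self.active_gui = None  # None means the combined image is shown

    def touch_down(self, x, y):
        """First contact: pick the GUI whose graphic contains the point."""
        if self.active_gui is None:
            for name, (x0, y0, x1, y1) in self.sub_regions.items():
                if x0 <= x < x1 and y0 <= y < y1:
                    self.active_gui = name  # switch the screen image
                    break
        return self.active_gui

    def touch_up(self):
        """Finger separated from the touch panel: switch back."""
        self.active_gui = None

# Example layout loosely mirroring the first combined GUI region
# (directional pad plus joystick); coordinates are invented.
region = CombinedGuiRegion({
    "directional_pad": (0, 0, 50, 100),
    "joystick": (50, 0, 100, 100),
})
assert region.touch_down(10, 20) == "directional_pad"
region.touch_up()
assert region.touch_down(70, 20) == "joystick"
```

Once a GUI is active, further touch points would be routed to it until the finger is lifted, which is what lets several GUIs share one recognition area.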

In practice, arbitrarily selected combinations of the above components of the invention, and implementations of the invention in the form of methods, devices, systems, computer programs and the like, may also be used as additional embodiments of the present invention.

In accordance with the present invention, a wide range of operations can be implemented while ensuring ease of use, even when working on a small-sized device.

A BRIEF DESCRIPTION OF DRAWINGS

Examples of the invention will now be described with reference to the accompanying drawings, which are illustrative only and do not limit the invention; similar items in different drawings are marked with the same numerals, where:

Figure 1 is a drawing illustrating an example of the structure of a commonly used controller;

Figure 2 is a drawing illustrating an example of an information-processing device using the input device in accordance with the embodiment;

Figure 3 is a drawing illustrating the detailed structure of the information-processing device in accordance with the embodiment;

Figure 4 is a drawing illustrating an example of the placement of several GUIs on a GUI screen in accordance with the embodiment;

Figure 5 is a drawing illustrating an enlarged view of the first combined GUI region and the recognition areas on the touch panel in accordance with the embodiment;

Figure 6 is a drawing illustrating an enlarged view of the joystick input area and a way of manipulating the joystick in accordance with the embodiment.

In accordance with the embodiment, the input device is implemented in a small information-processing device such as a mobile phone, a mobile terminal or a similar device. The input device provides convenience of manipulation similar to that of, for example, the controller of a game console. A traditional controller will be described first as an example. Figure 1 shows an example of the structure of a commonly used controller. The controller 120 contains a directional pad 121, joysticks 127a and 127b, a group of buttons for four types of operations 126, L1/L2 buttons 130a and R1/R2 buttons 130b, which are the working components that give the user the possibility of manipulation. The group of buttons for four types of operations 126 consists of a button with a circle 122, a button with a cross 123, a button with a square 124 and a button with a triangle 125.

The directional pad 121 is organized in such a way as to give the user the ability to indicate one of four directions (up, down, right and left), one of eight directions (the four directions plus the four directions between them), or an arbitrary direction. For example, the directional pad 121 is used to move a cursor on the display screen image or to scroll through various types of information on the on-screen image. The buttons of the group of four types of operations 126 are respectively assigned various functions by the application.
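Resolving an input into one of four or eight discrete directions, as the directional pad 121 does, can be sketched by quantising the angle of the input vector. The patent does not specify this computation; the function and direction names below are assumptions, and a mathematical y-up convention is used (screen coordinates usually grow downward):

```python
import math

# Illustrative only: resolve an input vector (dx, dy), relative to the
# pad centre, into one of four or eight discrete directions.
DIRS_8 = ["right", "up-right", "up", "up-left",
          "left", "down-left", "down", "down-right"]
DIRS_4 = ["right", "up", "left", "down"]

def resolve_direction(dx, dy, n_dirs=8):
    """Quantise the angle of (dx, dy) into n_dirs equal sectors."""
    angle = math.atan2(dy, dx)                    # -pi..pi, 0 = right
    sector = round(angle / (2 * math.pi / n_dirs)) % n_dirs
    return (DIRS_8 if n_dirs == 8 else DIRS_4)[sector]

assert resolve_direction(1, 0) == "right"
assert resolve_direction(0, 1) == "up"
assert resolve_direction(1, 1) == "up-right"
assert resolve_direction(-1, 0, n_dirs=4) == "left"
```

The same quantisation covers the four-direction, eight-direction and (with `n_dirs` left unquantised) arbitrary-direction modes mentioned above.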

The joysticks 127a and 127b contain sticks which are mounted with the ability to tilt in any direction, and a sensor that reads the tilt value. The sticks are biased by a biasing device (such as a spring) towards the neutral position and return to the neutral position when they are not being manipulated. The sensor contains a variable resistor whose resistance value varies in accordance with the tilt of the stick, and an analog-to-digital converter circuit which converts the resistance value into a numeric value. When the stick is tilted, the tilt values along several basic directions are converted into their corresponding numeric values, and these values are passed to the gaming device as control signals.
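The resistance-to-number conversion described above can be sketched as follows. The resistance range, bit width and function names are assumptions made for illustration only, not values from the patent:

```python
# Hedged sketch of the joystick sensor path described above: the
# variable resistor's value tracks the stick's tilt, and an ADC maps
# it to a numeric code which is then centred around the neutral
# position. Concrete ranges are invented for the example.

def adc_read(resistance, r_min=1_000.0, r_max=11_000.0, bits=8):
    """Map a resistance in [r_min, r_max] ohms to a 0..2**bits - 1 code."""
    resistance = min(max(resistance, r_min), r_max)  # clamp out-of-range
    span = r_max - r_min
    return round((resistance - r_min) / span * (2 ** bits - 1))

def tilt_to_axis(code, bits=8):
    """Centre the ADC code so the neutral stick position reads as 0.0."""
    mid = (2 ** bits - 1) / 2
    return (code - mid) / mid  # -1.0 (full tilt one way) .. +1.0

assert adc_read(1_000.0) == 0          # full tilt one way
assert adc_read(11_000.0) == 255       # full tilt the other way
assert abs(tilt_to_axis(adc_read(6_000.0))) < 0.01  # neutral is about 0
```

One such axis value per basic direction (e.g. X and Y) is then sent to the gaming device as a control signal.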

The L1/L2 buttons 130a and R1/R2 buttons 130b are each composed of two buttons, namely the L1 and L2 buttons and the R1 and R2 buttons respectively. In both combinations the two buttons are located at the top and bottom, respectively, on the side of the controller 120. For example, such a button is used for changing the direction of the line of sight in a game, or for adding another movement when the user manipulates it together with another button. However, the application program can also assign a variety of other functions to these buttons.

The user holds the left grip 128a with the left hand and the right grip 128b with the right hand and manipulates the controller 120. The directional pad 121, the joysticks 127a and 127b and the group of buttons for four types of operations 126 are located on the top of the controller 120, so the user can manipulate them while holding the left grip 128a and the right grip 128b with the left and right hand respectively. The L1/L2 buttons 130a and R1/R2 buttons 130b are installed on surfaces situated on the back of the left grip 128a and the right grip 128b respectively, so these buttons can be manipulated with the index fingers of the left and right hands.

In accordance with the embodiment, every means of manipulation that is part of the controller shown in figure 1 is reproduced on a flat surface in the form of a GUI. An explanation will be given later using the example of the input device in accordance with the embodiment. Figure 2 shows an example of an information-processing device using the input device in accordance with this embodiment. The information-processing device 10 is a small-sized device which can be carried by the user and can be any of such devices as a mobile phone, a PDA, a portable gaming device and so on. Alternatively, the information-processing device 10 can have a function that is a combination of the functions of these devices. Accordingly, the information-processing device 10 may contain various processing mechanisms corresponding to these functions, but their explanation is omitted here, since traditionally used technologies can be applied in this case.

In addition, the information-processing device 10 may contain a loudspeaker for issuing sound signals, a port for connecting headphones, an infrared port or wireless local area network (LAN) interface for communicating with other devices, and a battery compartment or other power source. However, these items are not shown in figure 2.

The display 14 reproduces a screen image which, in accordance with the assigned function, the user needs in order to enter an operation (for example, an on-screen menu image, icons, etc.), or a screen image created as a result of information processing (for example, an image displayed on a game screen, a moving image played on the screen, displayed text, displayed pictures, etc.). In addition, a GUI (graphical user interface) is reproduced as an on-screen display, giving the user the ability to enter operations while looking at the image on the screen.

The user enters an operation into the information-processing device 10 by touching the touch panel 12 with a finger or sliding a finger over the touch panel 12, as if he/she were manipulating the GUI itself. The GUI in figure 2 presents the input area of the cross-shaped directional pad 42, which is a graphic image of the directional pad, and the input area of the buttons 44, which is a graphic image of the buttons of four types of operations (that is, the button with a circle, the button with a cross, the button with a square and the button with a triangle). The dotted lines in figure 2 surrounding the input area of the cross-shaped directional pad 42 and the input area of the buttons 44 are presented only to illustrate the boundaries of these areas and are therefore not associated with real output images or functions. The same is true for subsequent figures.

For example, when selecting an item of the on-screen menu, the user first moves, within the displayed list of menu items or icons, to the point that he wants to highlight by touching one of the direction keys in the input area of the cross-shaped directional pad 42, and then confirms the selection of this menu item or icon by touching the button with a circle in the input area of the buttons 44. The user can likewise change the direction of a character appearing in a game by touching one of the direction keys in the input area of the cross-shaped directional pad 42. In another embodiment, where the game is interactive, the user expresses an intention, for example "yes" by touching the button with a circle or "no" by touching the button with a cross.
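Operations like the menu selection above reduce to hit-testing the touch point against the on-screen button graphics. A minimal sketch, with button layout and coordinates invented for the example:

```python
# Illustrative hit test for the four operation buttons; the circular
# hit areas and their coordinates are our own example, not the patent's.
BUTTONS = {
    "circle":   (80, 40, 10),   # (centre_x, centre_y, radius)
    "cross":    (60, 60, 10),
    "square":   (40, 40, 10),
    "triangle": (60, 20, 10),
}

def button_at(x, y):
    """Return the button whose graphic contains the touch point, if any."""
    for name, (cx, cy, r) in BUTTONS.items():
        if (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2:
            return name
    return None

# The "yes"/"no" answer in an interactive game, as in the text:
assert button_at(80, 40) == "circle"   # confirm / "yes"
assert button_at(60, 60) == "cross"    # cancel / "no"
assert button_at(0, 0) is None         # touch outside all buttons
```

The same test, applied to the direction keys of the pad 42, covers menu navigation as well.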

Thus, the input operations that can be realized by providing the direction switch input area 42 and/or the button input area 44 can be varied in many ways by assigning buttons and functions in accordance with the functions performed by the information processing device 10. By reproducing, on the touch panel, input means familiar from a game console or personal computer, a great variety of input information can be entered even on a small-sized device such as a handheld game console.
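As a minimal sketch of how a touch on the panel could be assigned to one of the two input areas of Figure 2, the touch coordinates can simply be hit-tested against the area boundaries. The coordinate values and names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical hit-test of a touch point against the input areas of
# Figure 2. Rectangle coordinates are placeholder assumptions.

def in_rect(point, rect):
    """Return True if point (x, y) lies inside rect = (left, top, right, bottom)."""
    x, y = point
    left, top, right, bottom = rect
    return left <= x <= right and top <= y <= bottom

# Assumed layout: direction switch area 42 at the bottom left,
# button area 44 at the bottom right of the screen.
DPAD_RECT = (0, 300, 200, 500)
BUTTON_RECT = (600, 300, 800, 500)

def classify_touch(point):
    if in_rect(point, DPAD_RECT):
        return "direction_switch"   # direction switch input area 42
    if in_rect(point, BUTTON_RECT):
        return "buttons"            # button input area 44
    return None                     # touch outside both areas

print(classify_touch((100, 400)))   # direction_switch
```

Touches outside both rectangles are simply ignored, so the rest of the screen remains available to the content image.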

In addition, on a small-sized information device the user can also play games to which he is accustomed on game consoles, with the same ease of use and without discomfort. The illustrated shapes and symbols of the direction switch input area 42 and the button input area 44 are shown only as examples; the shapes and symbols are not intended to be limited to what is shown in Figure 2. The direction switch input area 42 and the button input area 44 may be replaced by other input means matching the controller that is to be reproduced.

Figure 3 shows the detailed structure of the information processing device 10. Besides the touch panel 12 and display 14 described above, the information processing device 10 contains a content memory 16, a GUI image memory 18, an input/output control unit 20, an operation data conversion unit 22, a content processing unit 24, a GUI image generation unit 26, a GUI image buffer memory 28, a content image generation unit 30, a content image buffer memory 32, and an image synthesis unit 34. The content memory 16 stores content programs and/or various types of data. The GUI image memory 18 stores graphic data used when creating the GUI. The input/output control unit 20 manages the acquisition of input signals from the touch panel 12 and/or the input and output of image data. The operation data conversion unit 22 converts input signals from the touch panel 12 into information about the content of the operation. The content processing unit 24 processes the content in accordance with the information about the content of the operation. The GUI image generation unit 26 generates the GUI image. The GUI image buffer memory 28 temporarily stores the generated GUI image. The content image generation unit 30 generates the content image. The content image buffer memory 32 temporarily stores the generated content image. The image synthesis unit 34 generates an image in which the GUI image is displayed as an on-screen display over the content image.

The elements shown in Figure 3 as functional blocks executing different types of processing are implemented by hardware, such as a CPU, storage devices, or other LSI, and by software, such as programs that process the content or perform image processing and other operations. Hence, it will be clear to a specialist in the art that these functional blocks can be implemented in various ways: by hardware only, by software only, or by a combination of both.

The input/output control unit 20 is connected to the touch panel 12, the display 14, the content memory 16, and the GUI image memory 18 by existing methods and controls the input and output of data. The input received from the touch panel 12 consists of the coordinates of the point at which the user touches the touch panel 12, the coordinates of a motion path when the point of contact moves continuously, and so on. Because the method of recognizing touch points on the touch panel 12 differs for different types of panels, no concrete method is described here. In addition, the input/output control unit 20 outputs the video image to the display 14.

Further, the input/output control unit 20 accesses the content memory 16 and reads from it the program or various types of data required to process the content. In addition, the input/output control unit 20 accesses the GUI image memory 18 and reads from it the graphic data of the direction switch, buttons, and similar components described above. The "content" is not restricted to content types that can be processed and presented by a computer, such as a computer game, movie, music, novel, photograph, and the like. This embodiment, besides ordinary "content", can be applied to general information processing, such as data transfer, schedule management, an address book, a spreadsheet, and so on; the subsequent explanation assumes that "content" can include all of the above-mentioned types.

In the case where the content is a game, the content memory 16 stores the program of this game, information about the level the player reached when the game was last played, and other data. In the case where the content is a movie or music, the content memory 16 stores compressed and encoded video and audio data, a program for decoding and playing back that data, and so on. The content memory 16 may be a hard disk drive, or may be a combination of removable recording media (for example, a memory card, a CD-ROM, an optical disc, etc.) and a device that reads them.

The GUI image memory 18 is a memory (for example, a hard disk drive, etc.) that stores image data which can be used as graphic data for GUI images, such as the d-pad, various types of buttons, and the like. As will be described later, the GUI image itself can change as a result of operations on the GUI in accordance with this embodiment. Therefore, the GUI image memory 18 stores image data corresponding to such a wide range of GUI variants.

The GUI image generation unit 26 generates a new GUI image when the need arises, based on the content of an operation, and stores the generated image in the GUI image buffer memory 28. Although specific examples of GUI image changes will be given later, note that in addition to, for example, a color change shown when the user touches the touch panel, or a button shown as if pressed, the keys themselves may change, and/or keys and buttons may be replaced by a different GUI.

Therefore, the GUI image generation unit 26 internally stores information relating the content of operations to the changes that must be made to the image, identification information of the image data to be used, and other information. Further, the GUI image generation unit 26 reads the necessary GUI image data from the GUI image memory 18 and generates new on-screen image data in which the changes associated with the content of the operations are applied to the GUI currently displayed.

In the case where no GUI change depends on the content of the operations, the GUI image generation unit 26 need not perform the process of generating a new image. The content image generation unit 30 generates the image data to be displayed as the result of processing by the content processing unit 24, and saves the data in the content image buffer memory 32.

The image synthesis unit 34, by executing a rendering process using the image data stored in the GUI image buffer memory 28 and the image data stored in the content image buffer memory 32, generates an image in which the GUI image is reproduced as an on-screen display over the content image, and saves the image in an internal frame buffer memory. Because the video signal of the image stored in the frame buffer memory is transmitted to the display 14 under control of the input/output control unit 20, the display 14 presents an image corresponding to the operations performed on the GUI by the user.
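The compositing step performed by the image synthesis unit 34 can be sketched as a per-pixel alpha blend of the GUI image over the content image. This is a pure-Python illustration under assumed data layouts; a real implementation would use the GPU or an imaging library.

```python
# Sketch of blending the GUI image (with per-pixel alpha) over the
# content image, as the image synthesis unit 34 is described to do.
# Image representation (lists of rows of RGB tuples) is an assumption.

def blend_pixel(content, gui, alpha):
    """Alpha-blend one channel; alpha is a float in [0, 1]."""
    return round(gui * alpha + content * (1.0 - alpha))

def composite(content_img, gui_img, alpha_img):
    """content_img/gui_img: rows of (r, g, b) tuples; alpha_img: rows of floats."""
    out = []
    for crow, grow, arow in zip(content_img, gui_img, alpha_img):
        out.append([tuple(blend_pixel(c, g, a) for c, g in zip(cp, gp))
                    for cp, gp, a in zip(crow, grow, arow)])
    return out

content = [[(100, 100, 100)]]
gui = [[(200, 200, 200)]]
alpha = [[0.5]]
print(composite(content, gui, alpha))   # [[(150, 150, 150)]]
```

Where the GUI image has alpha 0 the content shows through unchanged, which is how the translucent circles described later can avoid obscuring the content image.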

Below we provide a concrete example of the GUI screen in accordance with this embodiment. In the present embodiment, the GUI screen is reproduced as an on-screen display over the content image, such as a game. It is therefore important to provide the same ease of use as with a game console controller or similar device, without interfering with the content image on the screen.

In the present embodiment, in addition to the cross-shaped direction switch and the four operation buttons, a joystick is also reproduced as a GUI. This allows the user to enter data through the reproduced joystick in the same way as with a real joystick, including an arbitrary direction and an arbitrary magnitude, or, as circumstances require, to issue direction commands in the same way as on a real controller by manipulating the cross-shaped direction switch. This GUI allows an arbitrary direction and arbitrary magnitude to be entered and can be used to control the direction and/or speed of movement of objects appearing in a game, the angle of the field of view, and other operations. In the following explanation, input produced through such a pseudo-joystick represented as a GUI will also be referred to as "joystick input".

If the three types of GUI described above, or even more, were placed directly on the GUI screen image, all these GUIs would impede the visibility of the content image on the screen. Therefore, in accordance with the present embodiment, a combined GUI area is created, which is a combination of several GUIs, allowing these multiple GUIs to share the same recognition area. When the user first touches this area, which of the GUIs will be used for the entire combined GUI area is determined by the position of the point of contact within the combined GUI area.

Figure 4 shows an example of a screen image in which several GUIs are placed. Each of the GUI screen images 50a, 50b, 50c, 50d, and 50e is an image reproduced as an on-screen display over the content image on the display 14. An operation performed by the user switches the GUI image 50a to one of the GUI images 50b, 50c, 50d, and 50e. The GUI screen image 50a is the standard screen image and contains a first combined GUI area 52 and a second combined GUI area 56 at the bottom left and bottom right of the screen image, respectively.

The first combined GUI area 52 is a GUI area combining the direction switch GUI and the joystick GUI, and has the same configuration as the direction switch input area 42 shown in Figure 2. This first combined GUI area 52 is formed by a graphic image of the cross-shaped direction switch 51, which is at least part of the graphic image of the direction switch GUI, and a graphic image of the joystick 53, which is represented by a symbol (such as a circle or similar character) at the center of the direction switch graphic 51 and which is at least part of the graphic image of the joystick GUI.

The second combined GUI area 56, in the same way as the first combined GUI area, is a GUI area combining the GUI of the four operation buttons and the joystick GUI, and has the same configuration as the button input area 44 shown in Figure 2. The second combined GUI area 56 is a combination of a graphic image of the operation buttons 55, which is at least part of the graphic image of the four-operation-button GUI, and a graphic image of the joystick 57, which is represented by a figure (for example, a circle or similar symbol) at the center of the operation button graphics 55 and which is at least part of the graphic image of the joystick GUI.

If, on the GUI screen image 50a, which is the standard image, the user first touches the joystick graphic 53 of the first combined GUI area 52, the process of obtaining input data via the joystick starts, and the first combined GUI area 52 switches to a joystick input area 58, which does not contain the graphic image of the cross-shaped direction switch 51 (GUI image 50b). In more detail, when the user places a finger on the joystick graphic 53 of the first combined GUI area 52, the area switches to the joystick input area 58, and thereafter, as the finger slides over the touch panel, the direction of movement of the finger and the distance the finger moves are received as input.

During the period while the finger is continuously in contact with the touch panel, this area remains the joystick input area 58, continuously receiving input from the movement of the finger. Then, when the user lifts the finger, the area returns to the first combined GUI area 52 (GUI image 50a).

On the other hand, if on the GUI image 50a the user first touches the graphic of the cross-shaped direction switch 51 of the first combined GUI area 52, the process of obtaining input data through the d-pad starts, and the first combined GUI area 52 switches to the direction switch input area 42 (GUI image 50c). In this case as well, during the period while the finger continues to touch the touch panel, the area works as the direction switch input area 42 and provides input through the d-pad, and when the user lifts the finger, the area returns to the first combined GUI area 52 (GUI image 50a).

Alternatively, the joystick graphic 53 may remain displayed in the direction switch input area 42, in which case the first combined GUI area 52 and the direction switch input area 42 look as presented in Figure 4. This eliminates the inconvenience of the joystick graphic 53 disappearing and reappearing even when touching of the direction switch is interrupted during data entry through the d-pad.

The second combined GUI area 56 works in a similar way: if the user first touches the joystick graphic 57, the process of obtaining input data via the joystick starts, and this area switches to the joystick input area 58, which does not contain the graphic images of the four operation buttons 55 (GUI image 50d); if the finger is lifted from the touch panel, the area returns to the second combined GUI area 56 (GUI image 50a). During the interval while the finger continues to touch, this area works as the joystick input area 58 and tracks the movement of the finger.
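The mode switching described for a combined GUI area can be sketched as a small state machine: a first touch near the center enters joystick mode, a first touch on the surrounding keys enters d-pad mode, and lifting the finger returns to the combined area. The geometry values and names below are illustrative assumptions.

```python
# Hedged sketch of the switching behaviour of the first combined GUI
# area 52. CENTER, JOYSTICK_RADIUS, and AREA_RADIUS are placeholder
# values, not taken from the patent.
import math

CENTER = (100, 400)     # assumed center of combined area 52 on screen
JOYSTICK_RADIUS = 20    # recognition area 68 around joystick graphic 53
AREA_RADIUS = 80        # circle 60 around the d-pad keys

class CombinedGuiArea:
    def __init__(self):
        self.mode = "combined"   # combined area 52

    def touch_down(self, point):
        d = math.dist(point, CENTER)
        if self.mode == "combined":
            if d <= JOYSTICK_RADIUS:
                self.mode = "joystick"   # joystick input area 58
            elif d <= AREA_RADIUS:
                self.mode = "dpad"       # direction switch input area 42
        return self.mode

    def touch_up(self):
        self.mode = "combined"           # finger lifted: back to area 52
        return self.mode

area = CombinedGuiArea()
print(area.touch_down((105, 395)))   # joystick
print(area.touch_up())               # combined
print(area.touch_down((100, 340)))   # dpad
```

The same machine, with button rectangles instead of d-pad keys, would model the second combined GUI area 56.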

By thus providing the ability to switch between the direction switch input area 42 and the joystick input area 58, and between the button input area 44 and the joystick input area 58, respectively, the size of the area occupied by the GUI image can be reduced. As a result, even when reproduced as an on-screen display, the problems connected with displaying the content on the screen are small, and the content image and the GUI image can coexist in the limited space of the screen. Also, the action needed to switch looks natural, because sliding the finger from the center of the area, which initiates the joystick input process, corresponds to tilting the stick of a real joystick in the desired direction with the center as the starting point.

Further, because both the first combined GUI area 52 at the bottom left and the second combined GUI area 56 at the bottom right can switch to the joystick input area 58, the combination of inputs reproduced simultaneously on the screen can be changed, and the hand performing the manipulation (that is, the left hand or the right hand) can be quickly identified and optionally selected, depending on the content type, the stage of the content, and other factors.

In the example above, the GUIs of the direction switch input area 42 and the button input area 44 can all be realized as GUIs for binary data entry, that is, GUIs that enable or disable the function provided by a single button by touching or not touching a separate area commonly used as a button. Thus, these GUIs are not limited to the direction switch and the circle, cross, square, and triangle buttons. Likewise, the GUI of the joystick input area 58 can be implemented as a GUI for analog input that receives analog values in accordance with the position of the contact within a particular area, and is thus not limited to the joystick function alone.

In any case, the combined GUI is a combination of a GUI for digital on/off input and a GUI for analog input. Further, such a combined GUI can switch to the on/off-input GUI while the screen displays the graphic image of the on/off-input GUI; at the same time, a small recognition area is set for switching to the analog-input GUI, and after switching, the recognition area expands. In this way, the process of switching from the combined GUI into the respective GUIs, and the operations in those GUIs, are performed naturally in a single continuous sequence of operations. When switching to the on/off-input GUI, the graphic image for switching to the analog-input GUI remains on the screen as part of the GUI, thus eliminating the inconvenience associated with graphics disappearing and reappearing during intermittent work with the on/off-input GUI.

In addition, a switching button 54 can also be reproduced on the GUI screen image 50a. Tapping this switching button 54 can bring up images of buttons not currently displayed on the screen (for example, a button required by the current function, a select button, a menu button, a start button, a button triggering the playback of a sound or video, etc.). Configured in this way, the present embodiment can hide buttons that are little used, and the content image on the screen can be made very easy to see.

Figure 5 illustrates, for the first combined GUI area 52, the recognition areas on the touch panel. The left part of Figure 5 shows the image of the first combined GUI area, and the right part shows the recognition areas superimposed on that image. The first combined GUI area 52 is built of a circle 60, a graphic image of the cross-shaped direction switch 62 consisting of four keys that show four directions (up, down, right, left) and are located on the periphery of the circle 60, and the joystick graphic 53 located at the center of the circle 60.

The circle 60 can be presented on the screen in translucent form, so that its image does not obscure the content image on the screen. The user can also be allowed to set the transparency coefficient, taking into account, for example, the content type or other factors. A similar on-screen circle in the second combined GUI area 56 and in the button input area 44 gives the impression of a single group of keys. In addition, the d-pad and the four operation buttons are preferably painted in monotone colors, so that priority is given to the colors of the content image on the screen.

The four keys of the direction switch graphic 62 are associated in advance with corresponding rectangular recognition areas 64a, 64b, 64c, and 64d of a specified size, which surround the graphics of the corresponding keys. In addition, the joystick graphic 53 is associated with a recognition area 68 located at the center of the circle 60. By defining the shapes of the recognition areas 64a, 64b, 64c, and 64d as shapes that can be specified by a mathematical expression, such as a triangle, a circle, and others besides the rectangle, these areas can easily be associated with the key images regardless of the resolution of the display and/or the touch panel.

Further, by making the recognition areas 64a, 64b, 64c, and 64d large enough that they include the surroundings of the graphic images of the corresponding keys, an operation can be recognized even when the touch point of the finger deviates slightly from a particular key; at the same time, by having the areas 64a, 64b, 64c, and 64d partially intersect, recognition in adjacent areas can be achieved. These intersecting parts of the areas 64a, 64b, 64c, and 64d are linked with the four diagonal directions, which are intermediate between the four directions up, down, right, and left. This doubles the number of directions that can be entered, in contrast to the case where only part of each key is defined as a recognition area, giving the user the ability to specify direction commands at many levels.
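The overlapping rectangles can be sketched directly: each direction key gets a rectangle sized so that neighbouring rectangles intersect, and a touch landing in an intersection yields both directions, i.e. a diagonal. The coordinates below are illustrative assumptions.

```python
# Sketch of the overlapping rectangular recognition areas 64a-64d of
# Figure 5. Rectangle coordinates (left, top, right, bottom) are
# placeholder assumptions chosen so that adjacent rectangles overlap.

RECTS = {
    "up":    (60, 0, 140, 90),
    "down":  (60, 110, 140, 200),
    "left":  (0, 60, 90, 140),
    "right": (110, 60, 200, 140),
}

def hit_directions(point):
    """Return the set of directions whose recognition rectangle contains point."""
    x, y = point
    return {name for name, (l, t, r, b) in RECTS.items()
            if l <= x <= r and t <= y <= b}

print(sorted(hit_directions((100, 30))))   # ['up']
print(sorted(hit_directions((75, 75))))    # ['left', 'up'] -> up-left diagonal
```

A result with two members is one of the four diagonals, so eight directions are available even though only four keys are drawn.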

For the direction switch input area 42, the recognition areas corresponding to the cross-shaped direction switch are determined as shown in Figure 5. However, the recognition area used while the user continues to touch the touch panel continuously, which is the criterion for switching from the direction switch input area 42 back to the first combined GUI area 52, is determined separately, for example, as a predefined concentric circle whose radius is equal to or greater than the radius of the circle 60. This prevents switching back to the first combined GUI area 52 against the intention of the user, or switching to the joystick input area 58, when the finger of a user entering information through the d-pad passes through the central part of the direction switch input area 42. When the point of contact leaves this recognition area, the direction switch input area 42 switches back to the first combined GUI area 52.

Figure 6 illustrates the image presented in the joystick input area 58 and the way of working with the joystick. As described above, the joystick input area 58 is displayed during the period when the user slides or rests a finger on the touch panel after first touching, for example, the recognition area 68 of the joystick graphic 53 of the first combined GUI area 52. In this process, as shown in the left part of Figure 6, an indicator 70 (for example, a circle) is reproduced at the place on the area touched by the user's finger. Preferably, the indicator 70 has a size and shape such that it is not hidden by the finger. In addition, the indicator 70 is more easily understood when, for example, its peripheral area glows, or when it is slightly shifted so as to track the movement of the touch point, or when its motion path is faintly displayed.

The indicator 70 moves in accordance with the movement of the touch point of the finger 76 on the touch panel. In the joystick input area 58, a circle 72 is also reproduced, having the same radius and located in the same position as the circles shown in the first combined GUI area 52, the second combined GUI area 56, the direction switch input area 42, and the button input area 44. Because the joystick graphic is located at the central parts of the first combined GUI area 52 and the second combined GUI area 56, the center of the circle is located at the point where the user first touched to begin joystick entry. Next, as shown on the right of Figure 6, a recognition area 79 is defined as a concentric circle whose radius is equal to or greater than the radius of the circle 72. If the point of contact moves out of the recognition area 79, this area switches back to the original first combined GUI area 52 or second combined GUI area 56.

As follows from Figures 5 and 6, the recognition area 79, which accepts joystick input in the joystick input area 58, is a concentric enlargement of the recognition area 68 that switches the first combined GUI area 52 to the joystick input area 58. The same applies to the relationship between the recognition area that switches to joystick input in the second combined GUI area 56 and the recognition area after the switch.

The circle 72 may be reproduced on the screen in translucent form or may not be reproduced at all. In the joystick input area 58, the coordinates of the point of contact 78 are read continuously at preset intervals, and the direction vector from the center of the circle 74 of the recognition area 79 to the point of contact 78 is determined as the set of input values at each moment of time. The intervals are made shorter than, for example, the frame output interval of the display 14.

The operation data conversion unit 22 stores information that relates the direction of the vector from the center of the circle 74 to the point of contact 78 to the tilt direction of a real joystick, and the distance from the center 74 to the point of contact 78 to the tilt amount of a real joystick. By referring to this information, the operation data conversion unit 22 converts the direction vector at each moment of time into the tilt direction and tilt amount of a real joystick and sends information about this transformation to the content processing unit 24. This gives the content processing unit 24 the opportunity to perform processing in the same way as when input is performed through a real joystick. In another variant, the direction or distance can be used directly for processing the content.

If the point of contact reaches the border of the circle 72, the obtained input value is defined as the input value of a maximally tilted joystick. If the point of contact is outside the circle 72 while remaining within the recognition area 79, only the direction from the center of the circle 74 toward the point of contact is obtained, and the distance is assumed to be limited to the radius of the circle 72 regardless of the point of contact. The indicator 70 is then displayed on the screen as if it moves along the edge of the circle 72 in the direction of the contact point. In this way, the joystick input area 58, formed on a flat surface, can be implemented as an input means that differs little in manner of working and ease of use from a real joystick. In addition, by suppressing the on-screen output of the circle 72 or displaying the circle 72 translucently, the impact on the content image on the screen can be minimized while an arbitrary direction and arbitrary magnitude can still be entered.
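The joystick computation described above, a direction vector from the circle center 74 to the contact point 78 whose magnitude saturates at the radius of the circle 72, can be sketched as follows. The radius value is an illustrative assumption.

```python
# Sketch of the joystick input computed in area 58: unit direction plus
# a magnitude clamped to the radius of circle 72. CIRCLE_RADIUS is a
# placeholder assumption.
import math

CIRCLE_RADIUS = 80.0   # assumed radius of circle 72

def joystick_input(center, contact):
    """Return ((ux, uy), magnitude): unit direction from center 74 to
    contact point 78, and tilt amount normalized to [0, 1]."""
    dx = contact[0] - center[0]
    dy = contact[1] - center[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0), 0.0
    # Contacts beyond the circle edge keep their direction but are
    # treated as a maximally tilted joystick, as described above.
    magnitude = min(dist, CIRCLE_RADIUS) / CIRCLE_RADIUS
    return (dx / dist, dy / dist), magnitude

direction, mag = joystick_input((100, 100), (180, 100))
print(direction, mag)   # (1.0, 0.0) 1.0
```

Sampling this function at intervals shorter than the display frame interval, as the text specifies, gives the content processing unit a fresh direction and tilt amount every frame.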

Figure 7 shows a modified example of the GUI screen image. The GUI screen image 50f of the information processing device 10 in Figure 7, similarly to the GUI screen image 50a shown in Figure 4, contains the first combined GUI area 52 at the bottom left of the screen image and the second combined GUI area 56 at the bottom right. The GUI screen image 50f also contains, in the upper left corner of the screen, an L1/L2 button input area 80, and at the top right of the screen, an R1/R2 button input area 82. The L1/L2 button input area 80 is formed by buttons corresponding to the L1 button and the L2 button shown in Figure 1. The R1/R2 button input area 82 is formed by buttons corresponding to the R1 button and the R2 button shown in Figure 1.

In this example, it is assumed that the user operates the GUI with the fingers of the left and right hands 84 while holding the information processing device 10 with the left and right hands 84, as shown in Figure 7. Therefore, by placing the first combined GUI area 52 and the second combined GUI area 56 at the bottom left and bottom right of the screen, respectively, these areas can be manipulated with the fingers.

When several GUIs are placed in this way, the user can operate the information processing device 10 without strain while holding the device and, moreover, can operate in two or more of the four areas simultaneously. The L1/L2 button input area 80 and the R1/R2 button input area 82 are established in the form of sectors of a circle, as shown in Figure 7; the central angles of the respective sectors are the right angles of the two upper corners of the screen, and each sector is divided at its center so that the two sectors formed at each of the two corners of the screen distinguish, respectively, the L1 button from the L2 button and the R1 button from the R2 button.

When the user holds the information processing device 10 as shown in Figure 7, the lower phalanx of the index finger 84 is usually located opposite the middle finger so as to form a shape gripping the body of the information processing device 10. Therefore, the area in which the index fingers can freely interact with the touch screen has the form of a sector of a circle, formed by bending the upper parts of the index finger from its joint. By defining the shape of the L1/L2 button input area 80 and the shape of the R1/R2 button input area 82 as sectors of a circle, the user can tap the L1/L2 buttons and the R1/R2 buttons, distinguishing between them simply by the inclination of the fingers. By increasing the number of regions having the form of sectors, more than two buttons may be formed.
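The sector-shaped corner buttons can be sketched as an angle test: the touch angle measured from the screen corner decides which button was hit. The radius, split angle, and the assignment of L1 to the top edge are illustrative assumptions.

```python
# Hedged sketch of the quarter-circle sector region 80 for the L1/L2
# buttons at the top-left corner (Figure 7). SECTOR_RADIUS, SPLIT_DEG,
# and the L1/L2 assignment are placeholder assumptions.
import math

SECTOR_RADIUS = 120   # assumed extent of the sector from the corner
SPLIT_DEG = 45        # assumed angle dividing L1 from L2 within 90 degrees

def l_button_hit(point, corner=(0, 0)):
    """Return 'L1', 'L2', or None for a touch near the top-left corner.
    Screen coordinates: x grows rightward, y grows downward."""
    dx = point[0] - corner[0]
    dy = point[1] - corner[1]
    if dx < 0 or dy < 0 or math.hypot(dx, dy) > SECTOR_RADIUS:
        return None
    angle = math.degrees(math.atan2(dy, dx))   # 0 along top edge, 90 down left side
    return "L1" if angle < SPLIT_DEG else "L2"

print(l_button_hit((100, 20)))    # L1 (near the top edge)
print(l_button_hit((20, 100)))    # L2 (near the left side)
print(l_button_hit((300, 300)))   # None (outside the sector)
```

Unequal split angles, as suggested below for matching finger bending, would amount to choosing a SPLIT_DEG other than 45.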

The angles into which the original corner angle is divided need not be equal. For example, assuming that the smaller the bending angle of the finger, the easier manipulation with that finger will be, the sector angle of the button at the top edge of the screen can be set small, and the sector angle of the button closer to the left or right side of the screen can be made larger. The shape and placement of the buttons can likewise be configured to best match the range of motion of the fingers.

Figure 8 shows a modified example of the L1/L2 button input area. Although the L1/L2/L3 button input area 86 shown in Figure 8 contains, as an example, three buttons, that is, an L1 button 88a, an L2 button 88b, and an L3 button 88c, the number of buttons is not limited to this amount. On the one hand, the shape of each button is set independently of the others; on the other hand, the buttons are located so as to form an arc. Although in Figure 8 the shape of each button is a circle, this shape may also be, for example, a rectangle. Even when the buttons take the most general shapes, the arc-shaped placement allows the user to manipulate them without strain, with the manipulation varying according to the angle of the index finger. This example also shows that the farther a button's position is from the top edge of the screen and the closer it is to the left side, the greater the distance that can be set between that button and the adjacent button. The same is true, mirrored, for buttons such as the R1/R2 buttons shown in Figure 7.

Figure 9 shows another example of placing the GUI screen image and the content image on the screen. In the embodiment described above, the GUI image was presented as an on-screen display over the content image. In the example shown in Figure 9, the information processing device 10 is oriented with the long side vertical, and the content image 100 and the GUI image 102 are displayed in different fields. In this case, the information processing device 10 can be configured in the same way as shown in Figure 3, and the image synthesis unit 34 renders and synthesizes the content image and the GUI image in separate, predefined areas, respectively. The different types of GUI work in the same way as described above.

Although in this case the size of the area reproducing the content image 100 becomes smaller, the GUI image does not obstruct the content image 100, and thus the direction switch input area, the button input area, the joystick input area, and other areas can be reproduced on the display together without creating serious inconvenience. In addition, in this case the way of holding the device and/or the fingers used to manipulate the data entry fields are likely to differ from the case described above. Therefore, the L1 button, L2 button, R1 button, and R2 button need not take the form of a sector of a circle, but can be organized, for example, in the form of an arc.

Whether the information processing device 10 is used with the long side oriented vertically or horizontally can be decided by the user in accordance with the content type, the game, and other considerations. Figure 10 presents an example of a configuration image for setting the GUI output mode. The configuration image GUI 90 is set up so that, whether before the start of content processing or in the middle of content processing, the user can call up the configuration image GUI 90 at any time by tapping a preset button on the GUI, whereupon the configuration image GUI 90 is reproduced as an addition to the display screen. Alternatively, a menu screen may be called up first, and the configuration image GUI 90 may then be called up by selection from this menu.

The configuration GUI image 90 contains a field 92 that displays the configuration contents, a transparency setting bar 94 and ok/cancel (confirm/cancel) buttons 96. The configuration contents field 92 shows settings such as: the orientation of the information processing device 10; the mode determining whether the joystick input area 58 is switched with the directional pad input area 42 and/or the button input area 44, or whether the joystick input area 58 is displayed permanently; whether the circles forming the directional pad input area 42 and/or the button input area 44 are rendered translucently or not, as well as the colour of the circles, and so on. The transparency setting bar 94 defines the transparency coefficient of a circle when that circle is rendered translucently. The ok/cancel buttons 96 allow the user to confirm the configuration details or to cancel the configuration changes.
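Purely as an illustration (none of these names or defaults come from the patent), the configuration state managed through field 92, the transparency bar 94 and the ok/cancel buttons 96 might be modelled as:

```python
from dataclasses import dataclass

@dataclass
class GuiConfig:
    # Hypothetical sketch of the settings shown in field 92
    orientation: str = "landscape"     # device held with long side horizontal
    joystick_switch_mode: bool = True  # joystick area 58 switches with areas 42/44
    translucent_circles: bool = True   # draw input-area circles semi-transparently
    alpha: float = 0.5                 # transparency coefficient set via bar 94

    def confirm(self, pending: "GuiConfig") -> "GuiConfig":
        # "ok" applies the pending changes; "cancel" would discard them instead
        return pending

current = GuiConfig()
pending = GuiConfig(orientation="portrait", alpha=0.3)
current = current.confirm(pending)   # user touched "ok" among buttons 96
```

The point of the sketch is only that the configuration screen edits a pending copy of the settings, which becomes effective on "ok" and is discarded on "cancel".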

In the example shown in Figure 10, the configuration contents field 92 indicates that the information processing device 10 is used with its long side oriented horizontally, that the mode in which the joystick input area 58 is switched with the directional pad input area 42 and/or the button input area 44 is in use, and that the circles included in the GUIs are translucent.

When the user wants to change a given setting, he or she calls up the configuration GUI 90 on the screen and touches one of the direction command buttons 98 shown for each item in the configuration contents field 92, whereupon the currently displayed configuration is switched to another possible configuration. When the desired configuration is displayed, the user confirms it by touching the "ok" button among the ok/cancel buttons 96. When the circles are to be rendered translucently, the user adjusts the transparency as appropriate on the transparency setting bar 94.
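The translucent rendering governed by the transparency coefficient can be pictured as ordinary per-pixel alpha blending of the GUI circle over the content image. This sketch (function name and values are illustrative, not from the patent) shows the arithmetic:

```python
def blend(gui_rgb, content_rgb, alpha):
    """Per-channel alpha blend of a translucent GUI pixel over the
    content image beneath it. alpha comes from the transparency
    bar 94: 0.0 = GUI invisible, 1.0 = GUI fully opaque."""
    return tuple(round(alpha * g + (1.0 - alpha) * c)
                 for g, c in zip(gui_rgb, content_rgb))

# A mid-grey GUI circle blended at 50% over white content:
pixel = blend((128, 128, 128), (255, 255, 255), 0.5)
```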

By thus allowing the user to easily change the GUI display mode, the information processing device can be operated in an optimal environment in accordance with the content type, the stage of the game and the preferences of individual users. For example, even when the device is held with its long side oriented horizontally, the directional pad input area 42, the button input area 44 and the joystick input area 58 can easily be displayed together, or the L1/L2 button input area 80 and the R1/R2 button input area 82 can easily be rendered as rectangles.

According to the implementation example described above, the GUI image is rendered as an on-screen display on the display of the information processing device. The GUI rendered in this way is a two-dimensional representation of the directional cross, buttons of various types, joystick and similar components that were formed in three dimensions on a conventional remote-control device. Thus even a user accustomed to manipulating a three-dimensional controller can operate the device by the same means and with the same ease.

The image of each GUI is placed at a corner of the screen, so the area of the output content image overlapped by the GUI is kept small. In addition, by forming a combined GUI area and switching to an individual GUI depending on which GUI's graphic image contains the point at which the user's contact starts, multiple GUIs share the same recognition area. Further, a GUI that is normally not displayed can easily be called up when needed. Owing to these features, the content image and the GUI image can naturally be placed together on the screen.
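The combined-GUI behaviour described here — several GUIs sharing one recognition area, with the first contact point selecting the GUI the area switches to, and the end of contact switching back — can be sketched roughly as follows (all names and regions are assumptions made for illustration, not taken from the patent):

```python
def hit(region, point):
    # Axis-aligned rectangle test; real graphic images could be any shape
    (x0, y0, x1, y1) = region
    x, y = point
    return x0 <= x <= x1 and y0 <= y <= y1

class CombinedGui:
    def __init__(self, parts):
        # parts maps GUI names to the regions of the graphic images
        # whose union forms the combined GUI's recognition area
        self.parts = parts
        self.active = None            # which GUI the area has switched to

    def touch_start(self, point):
        # Identify the GUI whose graphic contains the first contact point
        for name, region in self.parts.items():
            if hit(region, point):
                self.active = name
                return name
        return None

    def touch_end(self):
        # Ending the continuous contact switches back to the combined GUI
        self.active = None

combined = CombinedGui({"dpad": (0, 0, 50, 50), "buttons": (50, 0, 100, 50)})
combined.touch_start((10, 10))   # contact begins inside the D-pad graphic
```

After `touch_start`, all subsequent contact points in the shared area are interpreted by the selected GUI until the contact ends.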

Further, placing several GUIs at the lower left, lower right, upper right and upper left of the display screen, in accordance with the positions of the fingers holding the information processing device, allows natural manipulation using the thumbs and/or index fingers of both hands. Additionally, by forming the several buttons located at the top left and top right (which will in all probability be operated by the index fingers) as contiguous areas shaped as circular sectors, or by arranging the several buttons along an arc that takes the range of motion of the index finger into account, the user can touch the buttons reliably and without strain.
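One plausible way (purely illustrative; the patent does not prescribe an algorithm) to decide which sector-shaped button a touch falls on is to convert the contact point into an angle around the corner from which the buttons fan out:

```python
import math

def sector_button(touch, corner, sectors):
    """sectors: list of (name, start_deg, end_deg) measured from the corner.
    Returns the name of the sector containing the touch, or None."""
    dx = touch[0] - corner[0]
    dy = touch[1] - corner[1]
    angle = math.degrees(math.atan2(dy, dx)) % 360.0
    for name, start, end in sectors:
        if start <= angle < end:
            return name
    return None

# Two hypothetical quarter-circle sectors fanning out from the top-left
# corner (0, 0), with y growing downward as on a touch panel:
buttons = [("L1", 0.0, 45.0), ("L2", 45.0, 90.0)]
pressed = sector_button((30, 10), (0, 0), buttons)
```

A radius check against the distance from the corner could be added to bound the sector outward; it is omitted here for brevity.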

By allowing the user to set the orientation of the information processing device, whether or not to enable certain GUIs, and whether or not to render parts of the GUI image translucently, a working environment consistent with the content type, user preferences and similar factors can easily be realised.

The above explanation was based on implementation examples of the invention. These examples are illustrations only, and it will be obvious to specialists in the art that various modifications of their elements and processes are possible, and that such modifications also fall within the essence and scope of the invention.

1. An input device comprising: a GUI image generation unit that generates a GUI (graphical user interface) image; a display that displays the GUI image generated by said GUI image generation unit; a touch panel that covers the screen of said display and detects the position at which a user touches said display; and an operation information conversion unit that identifies the content of the operation performed by the user on the basis of the correspondence between the contact point detected by said touch panel and the displayed GUI image; wherein said GUI image generation unit creates a combined GUI area in said GUI image, said combined GUI area uniting several GUIs by means of a combined graphic image composed of at least parts of the graphic images of the several GUIs; when the user first touches said combined GUI, said operation information conversion unit identifies, among the several GUIs united in the combined GUI, the one GUI whose corresponding graphic image contains the point at which the contact starts; and said GUI image generation unit allows the multiple GUIs to share the same recognition area on said touch panel by switching said combined GUI to the one GUI identified by said operation information conversion unit.

2. The input device according to claim 1, characterised in that said several GUIs united in said combined GUI include a GUI that receives an analogue value corresponding to said contact point, and said combined GUI creates a recognition area for switching to the GUI receiving this analogue value that is smaller than the recognition area of the GUI receiving said analogue value after said combined GUI has been switched.

3. The input device according to claim 2, characterised in that said several GUIs united in said combined GUI also include a GUI that switches to the on or off state a function assigned to said contact point, by contact or by termination of the contact.

4. The input device according to claim 1, wherein, when the continuous contact with the one GUI to which said combined GUI has been switched ends, said GUI image generation unit switches that GUI back to said combined GUI.

5. The input device according to claim 1, characterised in that said several GUIs united in said combined GUI include: an analogue input GUI that accepts an analogue value in accordance with said contact point, and an on/off input GUI that switches to the on or off state a function assigned to the contact point, by contact or by termination of the contact.

7. The input device according to claim 6, wherein said operation information conversion unit successively acquires the direction and distance from the point at which said contact starts to the contact point at the current moment, and sets the acquired values as input values of direction and magnitude during the period in which the one GUI to which said combined GUI has been switched is said pseudo-joystick GUI.

8. The input device according to claim 1, characterised in that the display reproduces the GUI image generated by said GUI image generation unit as an on-screen display on the output image generated as a result of the information processing performed in the information processing device to which said input device is connected.

9. The input device according to claim 1, wherein said GUI image generation unit creates several of said combined GUI areas in the same screen image.

10. The input device according to claim 6, wherein said GUI image generation unit renders an indicator representing the point touched by the user at the current moment, during the period in which the one GUI to which said combined GUI has been switched is said pseudo-joystick GUI.

11. The input device according to claim 5, wherein said GUI image generation unit: when switching from said combined GUI to said on/off input GUI, retains the graphic image of the switch to said analogue input GUI that was rendered in said combined GUI; and, when switching from said combined GUI to said analogue input GUI, hides the graphic image of the switch to said on/off input GUI that was rendered in said combined GUI.

12. An information processing device comprising: a GUI image generation unit that generates a GUI (graphical user interface) image; a processing unit that performs information processing in accordance with a user operation corresponding to said GUI; a display that displays the GUI image generated by said GUI image generation unit together with the output image generated as a result of the information processing performed by said information processing device; a touch panel that covers said display and detects the position at which a user touches said display; and an operation information conversion unit that identifies the content of the operation performed by the user on the basis of the correspondence between the contact point detected by said touch panel and the displayed GUI image; wherein said GUI image generation unit creates a combined GUI area in said GUI image, said combined GUI area uniting several GUIs by means of a combined graphic image composed of at least parts of the graphic images of the several GUIs; when the user first touches said combined GUI, said operation information conversion unit identifies, among said several GUIs united in the combined GUI, the one GUI whose corresponding graphic image contains the point at which the contact starts; and said GUI image generation unit allows the multiple GUIs to share the same recognition area on said touch panel by switching said combined GUI to the one GUI identified by said operation information conversion unit.

13. A method of displaying a graphical user interface (GUI), comprising: a step of generating a GUI (graphical user interface) image; a step of displaying said GUI image on a display screen over the output image generated as a result of information processing; a step of detecting the position at which the user touches a touch panel covering said display; and a step of identifying the content of the operation performed by the user on the basis of the correspondence between the detected contact point and the GUI image on the screen; wherein, in the step of generating said GUI image, a combined GUI area is created in said GUI image, said combined GUI area uniting several GUIs by means of a combined graphic image composed of at least parts of the graphic images of the several GUIs; when the user first touches said combined GUI, in the step of identifying the content of the operation, the one GUI whose corresponding graphic image contains the point at which the contact starts is identified among the several GUIs united in said combined GUI; and, in the step of generating said GUI image, the multiple GUIs are allowed to share the same recognition area on the touch panel by switching said combined GUI to the one identified GUI.

14. A machine-readable medium containing a computer program that is executed on a computer and comprises: a function of generating a GUI (graphical user interface) image; a function of displaying said GUI image on the screen over the output image generated as a result of information processing; and a function of identifying the content of the operation performed by the user on the basis of the correspondence between the GUI image on the screen and the contact point detected on the touch panel covering said display; wherein the function of generating said GUI image creates a combined GUI area in said GUI image, said combined GUI area uniting several GUIs by means of a combined graphic image composed of at least parts of the graphic images of the several GUIs; when the user first touches said combined GUI, the function of identifying the content of the operation identifies, among the several GUIs united in said combined GUI, the one GUI whose corresponding graphic image contains the point at which the contact starts; and the function of generating said GUI image allows the multiple GUIs to share the same recognition area on the touch panel by switching said combined GUI to the one identified GUI.

15. An input device comprising: a display located in the housing of an information processing device and integrated with it; and a touch panel covering the display screen; the input device converting information on the contact point of an index finger or thumb, detected by the touch panel, into an operation performed by the user, and allowing the information processing device to process the operation; the display showing: when the user holds the housing with its long side oriented vertically, the output image generated by the information processing performed by the information processing device and the image of a combined GUI (graphical user interface) on the display screen, and, when the user holds the housing with its long side oriented horizontally, the image of the combined GUI as an on-screen display over the output image; wherein all the GUIs included in the image of the combined GUI are arranged in accordance with the positions of the index fingers and thumbs of the user when the user holds the housing from both sides so that the index fingers and thumbs are on the front surface of the device and the middle fingers support the housing from below.
