RussianPatents.com

Virtual haptic panel. RU patent 2505848.

IPC classes for Russian patent RU 2505848 (Virtual haptic panel):

G06F3/0488 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements (typewriters B41J; conversion of physical variables F15B0005000000, G01; image acquisition G06T0001000000, G06T0009000000; coding, decoding or code conversion, in general H03M; transmission of digital information H04L)

FIELD: physics.

SUBSTANCE: a system part and a touch screen part are simultaneously presented on a touch screen. An operating system (OS) user interface (UI) is presented in the system part, wherein the mouse cursor is presented as part of the OS. A virtual touch panel is presented in said touch screen part to control the mouse cursor, wherein the virtual touch panel includes a region for controlling movements of the mouse cursor. Touches are received within said touch screen part. One or more rules are used to convert the touches into one or more mouse actions. The mouse actions are used to control the mouse cursor, and said one or more rules are also used to process touches which begin from the inner part of said touch screen part and, by maintaining contact with the touch screen, continue into said system part.

EFFECT: controlling a mouse cursor on a touch screen using a touch panel built into the touch screen.

18 cl, 4 dwg

 

Prior art

The mouse is a ubiquitous input tool that is well understood by many people using modern computing devices. For decades, the mouse has remained one of the most common input mechanisms for computers. People are quick to recognize mouse cursors on the screens of computing devices and, as a rule, know how to use them. Whether the device is a mobile phone, laptop, personal computer, or tablet, the mouse cursor remains extremely popular for interacting with graphical user interfaces (GUIs).

In many situations, a conventional mouse is cumbersome. In general, it requires a separate hand-held device, which in turn needs a flat surface to slide on. As laptops have become more and more popular, the touch panel has moved directly onto the computing device. This frees the user from having to use a separate input device to interact with the computer, but a touch panel takes up space on the computing device and is limited in size and scope. Modern touch panels give the user only a small area in which to move a finger or stylus, somewhat complicating control of the mouse cursor on much larger display devices.

Summary of the invention

This "Summary of the invention" section presents a selection of concepts in a simplified form that are further described in the detailed description below. It is not intended to identify key features or essential features of the claimed invention, nor is it intended to be used as an aid in determining the scope of the claimed invention.

One aspect of the invention is directed to the simultaneous presentation of a virtual haptic (touch) panel and the graphical user interface (GUI) of an operating system (OS) on the same display device. The user can interact with the OS using the virtual touch panel implemented within the touch-screen portion of the display device. The user's touches are read by the touch-screen portion and digitized by a digitizer. Touch-input software converts the touch packets into data packets appropriate to the screen resolution of the display device, or of the portion of the display device presenting the OS. Gesture-recognition software applies rules to the converted packets to determine which actions the user's touches imply. Ultimately, an application that manages the mouse cursor manipulates the cursor in accordance with those mouse actions.
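The coordinate conversion described above can be sketched as follows. This is a minimal illustration, not the patented implementation; the function name, the digitizer resolution, and the screen resolution are all assumptions.

```python
def scale_touch_to_screen(touch_x, touch_y,
                          digitizer_w, digitizer_h,
                          screen_w, screen_h):
    """Map a raw digitizer coordinate onto the OS screen resolution.

    Illustrative only: the patent does not specify an API, and the
    resolutions below are hypothetical.
    """
    sx = touch_x * screen_w / digitizer_w
    sy = touch_y * screen_h / digitizer_h
    return round(sx), round(sy)

# A point in the middle of a 4096x4096 digitizer maps to the middle
# of a 1920x1080 display.
print(scale_touch_to_screen(2048, 2048, 4096, 4096, 1920, 1080))  # → (960, 540)
```

A real implementation would also clamp out-of-range values and account for the offset of the OS portion within the display.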

Another aspect of the invention is directed to a GUI on a single display device that simultaneously presents a virtual input device and a representation of the OS. The user can interact with the OS using the virtual input device.

Brief description of drawings

The present invention is described in more detail below with reference to the accompanying drawings, in which:

Figure 1 depicts a block diagram of an illustrative operating environment used in implementing the present invention;

Figure 2 depicts a schematic of an illustrative GUI on a device with a touch screen, in accordance with an embodiment of the invention;

Figure 3 depicts a schematic of a device with a touch screen configured to present a virtual touch panel, in accordance with an embodiment of the invention; and

Figure 4 depicts a flow diagram illustrating the stages at which a virtual touch panel is presented and managed, in accordance with an embodiment of the invention.

Detailed description of embodiments of the invention

The subject matter of the invention is described in this document with specificity to meet statutory requirements. However, the description itself is not intended to limit the scope of the present invention. Rather, it is contemplated that the claimed subject matter may be embodied in other ways, to include different steps or combinations of steps similar to those described in this document, in conjunction with other present or future technologies. Moreover, although the term "block" may be used herein to refer to various elements of the methods employed, the term should not be interpreted as implying any particular order among or between the various steps disclosed herein.

The embodiments described in this document are directed to a virtual haptic (touch) panel presented on a touch screen. Using the virtual touch panel, the user can control a mouse cursor on the screen of a computing device. Interaction with the virtual touch panel generates input signals that the touch screen captures and that gesture-recognition software translates into mouse actions. The OS interprets the mouse signals as if they had been received from a normal mouse or touch panel, thus providing touch-panel functionality via a virtual representation of a touch panel.

Although this document describes a virtual touch panel, embodiments are not limited thereto. Instead, the embodiments fully contemplate virtual implementations of other input devices. For example, a virtual trackball, virtual scroll wheel, virtual joystick mouse, or other virtual version of an input device may be presented in some embodiments. For clarity, the following discussion describes only the virtual touch panel.

Even though the embodiments described in this document control the mouse cursor through the virtual touch panel, the embodiments may also be combined with other touch-screen features. In particular, a virtual touch panel can be shown and provide a tool for controlling the mouse cursor while the rest of the screen also remains sensitive to touches. For example, users can use the virtual touch panel in the touch-screen portion of the display device (described below) to control the mouse, and can also control the mouse cursor by touching the cursor presented in the system portion of the display device (also described below).

Having briefly presented an overview of the embodiments described in this document, an illustrative computing device is described below. Referring first to Figure 1, an illustrative operating environment for implementing the present invention is shown and designated generally as computing device 100. Computing device 100 is but one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should computing device 100 be interpreted as having any dependency or requirement relating to any one of the depicted components or any combination thereof. In one embodiment, computing device 100 is a conventional computer (e.g., a personal computer or laptop).

One embodiment of the invention can be described in the general context of computer code or machine-usable instructions, including computer-executable instructions such as program modules executed by a computer or other machine. Generally, program modules, including routines, programs, objects, components, data structures, etc., refer to code that performs particular tasks or implements particular abstract data types. The embodiments described in this document may be practiced in a variety of system configurations, including portable devices, consumer electronics, general-purpose computers, more specialized computing devices, etc. The embodiments described in this document may also be practiced in distributed computing environments in which tasks are performed by remote-processing devices linked through a communications network.

Typically, computing device 100 includes a variety of computer-readable media. By way of example, and not limitation, computer-readable media may comprise RAM, ROM, EEPROM, flash memory or other memory technologies, CD-ROM, DVD or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or similar tangible media configured to store data and/or instructions relevant to the embodiments described herein.

Storage device 112 includes computer storage media in the form of volatile and/or non-volatile memory. The storage device may be removable, non-removable, or a combination thereof. Illustrative hardware devices include solid-state memory, hard drives, cache memory, optical drives, etc. Computing device 100 includes one or more processors that read data from various entities such as storage device 112 or I/O components 120. Presentation component(s) 116 present data to a user or another device. Illustrative presentation components include a display device, loudspeaker, printing component, vibrating component, etc.

I/O ports 118 allow computing device 100 to be logically coupled to other devices, including I/O components 120, some of which may be built in. Illustrative components include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, etc.

The components described above in relation to computing device 100 may also be included in a mobile device. As used in this document, a mobile device refers to any type of mobile phone, portable device, personal digital assistant (PDA), smartphone, BlackBerry, digital camera, or other mobile device (except a laptop) capable of wireless communication. One skilled in the art will also appreciate that mobile devices include a processor and computer storage media for performing various functions. The embodiments described in this document refer to both a computing device and a mobile device. In embodiments, computing devices are simply devices that run applications, images of which are captured by the camera of a mobile device.

Computing device 100 includes a touch screen, which one skilled in the art will recognize as a display device that can detect the location of touches within a display area. Some embodiments include a single device in which designated portions of the display device receive touch-screen inputs (referred to in this document as "touches"). In other embodiments, the entire display area is capable of receiving user touches, such as from a finger or stylus. Touch screens can be implemented by adding a resistive, capacitive, infrared, or similar panel to a normal screen of a computing device, such as a liquid crystal display (LCD), LED, organic LED (OLED), etc. Alternatively, touch screens may incorporate strain-gauge configurations, or optical imaging, dispersive-signal, surface-acoustic-wave, or other technologies for capturing touches. The above lists are not exhaustive, and as will be clear to one skilled in the art, many other panels and technologies for capturing touches may be employed.

Figure 2 is a schematic showing an illustrative GUI 200 on a device with a touch screen, in accordance with an embodiment of the invention. GUI 200 comprises two portions of the display device: system portion 202 and touch-screen portion 204. In particular, system portion 202 presents a GUI of an OS, such as Microsoft Windows. Touches in touch-screen portion 204 are translated into actions of mouse cursor 206 within system portion 202. In one embodiment, the whole screen, including both system portion 202 and touch-screen portion 204, may be part of a touch screen configured to ignore touches in system portion 202 and handle touches in touch-screen portion 204.

In an alternative embodiment, touches in system portion 202 are not ignored; rather, such touches are processed, allowing the user to directly manipulate the mouse cursor. For example, a user can touch the mouse cursor in the system portion and, by sliding the finger that is touching it, drag the mouse cursor to any place in system portion 202. The mouse cursor then follows the finger until the finger is lifted. Such an embodiment would therefore handle touches directly in system portion 202 in addition to touches in touch-screen portion 204.
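The routing of touches between the two portions can be illustrated with a small sketch. The layout rectangle and all names here are hypothetical; the patent does not fix a concrete geometry.

```python
# Hypothetical layout: the virtual touch panel (portion 204) occupies the
# bottom strip of a 1280x800 screen; the rest is system portion 202.
PANEL_RECT = (0, 600, 1280, 800)   # left, top, right, bottom (assumed)

def route_touch(x, y):
    """Return which GUI portion should handle a touch at (x, y)."""
    left, top, right, bottom = PANEL_RECT
    if left <= x < right and top <= y < bottom:
        return "touch_screen_portion"   # forwarded to the virtual touch panel
    return "system_portion"             # direct cursor manipulation (alt. embodiment)

print(route_touch(640, 700))  # a touch inside the panel strip
print(route_touch(640, 100))  # a touch in the system portion
```

In the first embodiment described above, a touch routed to the system portion would simply be ignored instead of manipulating the cursor directly.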

Software in system portion 202 of the display device can be interacted with using a mouse or other input devices. In one embodiment, system portion 202 presents a GUI of an OS such as Windows®, Windows Mobile, MacOS, Linux, or the like. Using the mouse, the user can interact with software applications such as web browser 214.

Touch-screen portion 204 includes touch pad 208, left key 210, and right key 212. Touch pad 208 works like the normal touch area of a touch panel and allows the user to operate it in the same way. Using touch pad 208, the user can move mouse cursor 206 in any particular direction by moving a finger or stylus in that direction. More sophisticated touches (for example, drag, release, hover, multi-touch, etc.) can also be registered through touches of the touch pad and/or, in particular, the keys. For example, a user can move a finger down the leftmost part of touch area 208 to indicate a scroll down. Or the user can tap twice lightly on touch pad 208 to indicate a click of the left mouse key. In addition, touches of left key 210 and right key 212 designate clicks of the left and right mouse keys respectively. There are also other touches that will be obvious to those skilled in the art. Furthermore, other input devices (for example, a trackball, scroll wheel, etc.) may alternatively employ a variety of actions that can easily be handled by the gesture-recognition software described in this document.
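A toy version of the gesture-recognition rules described above might look as follows. The thresholds, the event format, and the rule set itself are illustrative assumptions; a real recognizer would handle many more gestures (double-tap, hover, multi-touch, and so on).

```python
def classify_touch(events, panel_width, edge_frac=0.1):
    """Classify a touch sequence into a mouse action (illustrative only).

    events: list of (kind, x, y) tuples, kind in {"down", "move", "up"}.
    A near-stationary touch is a left click; a downward drag along the
    left edge is a scroll; any other drag moves the cursor.
    """
    xs = [x for _, x, _ in events]
    ys = [y for _, _, y in events]
    dx, dy = xs[-1] - xs[0], ys[-1] - ys[0]
    moved = abs(dx) > 5 or abs(dy) > 5          # 5 px = assumed tap tolerance
    if not moved:
        return "left_click"                     # a light tap
    if xs[0] < panel_width * edge_frac and dy > 0:
        return "scroll_down"                    # downward drag on the left edge
    return ("move_cursor", dx, dy)              # ordinary drag moves the cursor

print(classify_touch([("down", 500, 100), ("move", 560, 130), ("up", 560, 130)], 1000))
```

Each returned action would then be handed to the application that manages the mouse cursor, as described in the summary above.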

Embodiments are not limited to the configuration depicted in Figure 2. For example, additional keys may be displayed, or the touch-screen portion may be located in another part of GUI 200. In addition, touch-screen portion 204 can be divided into different sections, and the different sections can occupy particular parts of GUI 200.

In some embodiments, the boundary between touch-screen portion 204 and system portion 202 can be adjusted depending on user interaction with the system. In an embodiment alternative to the one described above, touches that originate in touch-screen portion 204 and carry over into system portion 202 are processed in full. For example, such an embodiment keeps tracking a finger dragged from touch-screen portion 204 into system portion 202. Rather than stopping the movement of cursor 206 when the finger crosses the outer boundary of touch-screen portion 204, cursor 206 continues moving in the direction of the finger's movement until a specified event occurs, for example, until the user stops moving. In other words, such an embodiment does not limit a touch to touch-screen portion 204 if the touch carries over into the system portion. Furthermore, in one embodiment, every touch that starts within touch-screen portion 204 and continues beyond it produces touches that are treated just as if they were within touch-screen portion 204.
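The border-crossing behavior can be sketched as below: a drag that starts inside the virtual touch panel keeps producing cursor deltas even after the finger crosses into the system portion, ending only at lift-off. All names and the panel rectangle are assumptions for illustration.

```python
def track_drag(points, panel_rect):
    """Accumulate cursor deltas for one drag (illustrative only).

    points: successive finger positions, the first of which must lie
    inside the virtual touch panel. Deltas are NOT clipped at the panel
    border, so the drag keeps driving the cursor after it crosses into
    the system portion; it ends only when the finger is lifted.
    """
    left, top, right, bottom = panel_rect
    x0, y0 = points[0]
    assert left <= x0 < right and top <= y0 < bottom, "drag must start on the panel"
    deltas = []
    for (xa, ya), (xb, yb) in zip(points, points[1:]):
        deltas.append((xb - xa, yb - ya))   # no clipping at the panel border
    return deltas

# Finger starts on the panel strip (y >= 600) and crosses into the system
# portion (y < 600) while the cursor keeps moving.
print(track_drag([(100, 650), (100, 610), (100, 560)], (0, 600, 1280, 800)))
```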

Figure 3 is a diagram of a computing device 100 with a touch screen configured to present a virtual touchpad, in accordance with an embodiment of the invention. It should be noted that Figure 3 illustrates only one possible implementation. Also, for clarity, many components of the computing device 100 are not depicted. In practice, the computing device 100 includes a processor and computer-readable media to support the software mentioned in this document. In fact, in some implementations, touch packets 318, mouse messages 320, or mouse actions 322 can be cached for quick retrieval.

The display device 302 is a single screen equipped with a touch screen. In one embodiment, software on the computing device 100 presents two different portions of the GUI on the display device 302. These portions are depicted in Figure 2 and were referred to above as the system portion 304 and the touch-screen portion 306. The system portion 304 displays an interactive view of the operating system (OS), thereby providing access to application software. As an example, the system portion 304 can be thought of as the part of the display device 302 that displays a version of Microsoft Windows®, Windows Mobile, Linux, MacOS, or the like. Within the system portion 304, the user can interact with application software by manipulating the mouse cursor. In addition, the touch-screen portion 306 displays the virtual touchpad, which the user can use to control the mouse cursor. The virtual touchpad accepts touches 316 from the user (for example, by finger or stylus) and translates the touches 316 into commands to move the mouse cursor in the system portion 304. In short, the user touches the touchpad in the touch-screen portion 306 to control the mouse pointer in the system portion 304.

The user can enter various touches 316 in the touch-screen portion 306. For example, the user can move a finger across the touch-screen portion 306 in one direction to instruct the cursor to move in that direction. The user can lightly tap a finger or stylus on the right, left, middle, or other virtual button to indicate a press of that button. The buttons can also have a "sticky" feature, whereby one action (for example, a quick downward flick over the button) holds the button down until a release action is registered (for example, a single light tap on the held button). Of course, various other touches 316 can also be accepted by the touch-screen portion 306.

Touches 316 received by the touch-screen portion 306 pass through a digitizer 308. In one embodiment, the digitizer 308 includes a touch-screen controller that detects the touches 316 received by the touch-screen portion 306 and converts the touches 316 into their digital equivalents. By way of example and not limitation, the digitizer 308 can be configured to detect changes in current, voltage, resistance, capacitance, or infrared light caused by the touches 316. The digitizer 308 converts these changes into touch packets 318.

Touch packets 318 (commonly called "pen-and-touch packets") include various information relevant to the touches 316, such as x/y coordinates, pressure, size, direction, or the like. In addition, the packets may also include information related to the touch-screen hardware of the display device 302, such as the dimensions of the touch-screen portion 306 (for example, two inches by two inches, 200 pixels, etc.).
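The fields of a touch packet 318 listed above could be modeled as a simple record. The dataclass and its field names are assumptions for illustration; the patent does not define a concrete data format.

```python
# Illustrative sketch of the kind of fields a touch packet 318 might
# carry (x/y coordinates, pressure, size, direction, pad dimensions).
# The structure and names are hypothetical.
from dataclasses import dataclass


@dataclass
class TouchPacket:
    x: int            # x coordinate within the touchpad portion
    y: int            # y coordinate within the touchpad portion
    pressure: float   # normalized contact pressure
    size: float       # contact area
    direction: float  # movement direction in degrees
    pad_width: int    # touchpad portion width, e.g. 200 pixels
    pad_height: int   # touchpad portion height


pkt = TouchPacket(x=120, y=80, pressure=0.6, size=1.2,
                  direction=90.0, pad_width=200, pad_height=200)
assert pkt.x == 120 and pkt.pad_width == 200
```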

Touch-input software 310 converts the touch packets 318 to fit the system portion 304 of the screen. To this end, the touch-input software 310 translates the information provided in the touch packets 318 into its equivalents on the system portion 304. The touch-input software 310 can include any type of shell program, such as WISPTIS in Microsoft Windows. In operation, the touch-input software 310 takes the information in the touch packets 318 and transforms it to fit the screen size, resolution, or pixel count of the display device 302.

In operation, the touch-input software 310 converts the information in the touch packets 318 from the display size or resolution of the touch-screen portion 306 into the screen size and resolution associated with the main screen of the display device 302. For example, the touch-screen portion 306 can be two inches wide by two inches long, while the display device 302 can be ten inches wide by ten inches long. In one embodiment, to convert the touch packets 318, the touch-input software 310 multiplies the distance a finger moved in the touch-screen portion 306 by a factor of five. In addition, the speed of the touch can also be translated to specify the speed at which the mouse cursor moves.
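The two-inch-to-ten-inch conversion above amounts to a simple scale factor. A minimal sketch, using the patent's own example numbers but hypothetical function names:

```python
# Illustrative sketch: a two-inch touchpad mapped onto a ten-inch
# display scales finger travel by a factor of five; speed can be scaled
# the same way. Names are hypothetical.

PAD_INCHES = 2.0
DISPLAY_INCHES = 10.0
SCALE = DISPLAY_INCHES / PAD_INCHES  # 5.0, as in the example above


def to_display(dx, dy):
    """Scale a finger movement (in touchpad units) to display units."""
    return dx * SCALE, dy * SCALE


assert to_display(1.0, 0.5) == (5.0, 2.5)
```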

The touch-sensing capability of the display device 302 can be more precise than the display resolution of the display device 302. For example, the display device 302 may include a computer screen capable of illuminating 200 lines of pixels, and a touch-screen layer (e.g., capacitive, resistive, infrared) with more than 10,000 sensing lines at defined intervals. In this example, detections made through the digitizer 308 are converted into their display-device equivalents in the system portion 304, thus allowing the touches 316 to be reproduced in the system portion 304.

Once the touch packets 318 have been converted by the touch-input software 310 to fit the display device, or the display area of the system portion 304, the converted packets 320 are passed through gesture-recognition software 312 to determine what actions were requested by the touches 316. In one embodiment, the gesture-recognition software 312 implements a state machine in which mouse actions 322 are determined on the basis of various rules. The rules implemented by the gesture-recognition software 312 can include any condition associated with a mouse action. For example, if the converted packets 320 indicate detected movement in a particular direction at a particular speed, the gesture-recognition software 312 determines that mouse movement in that direction and at that speed is intended, and a mouse action 322 is created accordingly. Rules can be configured for almost any type of mouse action, such as hover, drag, multi-touch, etc.
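Rule-based recognition of this kind can be sketched as a list of (condition, action) pairs tried in order. The specific rules, thresholds, and packet fields below are assumptions for illustration, not the patent's actual rule set.

```python
# Illustrative sketch (hypothetical rules): each rule is a predicate
# over a converted packet paired with the mouse action it produces.

def make_rules():
    return [
        (lambda p: p["touches"] > 1, "multi-touch"),
        (lambda p: p["speed"] > 0 and p["direction"] is not None, "move"),
        (lambda p: p["speed"] == 0, "hover"),
    ]


def recognize(packet, rules):
    """Return the action of the first rule whose condition matches."""
    for condition, action in rules:
        if condition(packet):
            return action
    return None


rules = make_rules()
assert recognize({"speed": 3.0, "direction": 90, "touches": 1}, rules) == "move"
assert recognize({"speed": 0, "direction": None, "touches": 1}, rules) == "hover"
```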

Mouse actions 322 can be linear or nonlinear transformations of the touches 316. Linear transformations translate the touches 316 directly into the system portion 304, accounting for the difference in size between the touch-screen portion 306 and the display area of the display device 302. In other words, the speed of a touch 316 is translated into its equivalent in the system portion 304. A nonlinear transformation relates to a touch that is not translated into the system portion directly; rather, the touch is amplified or otherwise manipulated on the basis of rules. For example, the speed at which a finger is moved across the touch-screen portion 306 can be increased if it moves continuously in one direction for a particular time. In other words, sustaining a movement at a particular speed can increase the speed of the mouse cursor's movement. Rules can be set for the gesture-recognition software 312 to account for both linear and nonlinear motion.
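The linear-versus-nonlinear distinction above can be shown in one small function: the linear part applies the size ratio, and a nonlinear rule boosts speed once movement has continued in one direction long enough. The threshold and gain values here are invented for illustration.

```python
# Illustrative sketch (assumed thresholds): linear scaling plus a
# nonlinear boost after sustained one-direction movement.

SCALE = 5.0        # touchpad-to-display size ratio (the example above)
BOOST_AFTER = 0.5  # seconds of sustained one-direction movement
BOOST_GAIN = 2.0   # extra multiplier once boosted


def cursor_speed(touch_speed, sustained_seconds):
    speed = touch_speed * SCALE           # linear transformation
    if sustained_seconds >= BOOST_AFTER:  # nonlinear rule
        speed *= BOOST_GAIN
    return speed


assert cursor_speed(1.0, 0.1) == 5.0   # short movement: linear only
assert cursor_speed(1.0, 1.0) == 10.0  # sustained movement: boosted
```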

The rules of the gesture-recognition software 312 can also account for gestures with multiple touches. For example, an object can be dragged by holding a touch on the left virtual mouse button while a finger moves on the virtual touchpad. To account for such actions, as well as other multi-touch actions, a rule can be configured and executed by the gesture-recognition software 312.
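The multi-touch drag just described, holding the left virtual button while a second finger moves, can be sketched as below. The event model and names are hypothetical.

```python
# Illustrative sketch: map a (button state, finger movement) pair to a
# mouse action; a held button plus movement yields a drag.

def interpret(button_held, finger_delta):
    """Return a hypothetical mouse action for one input sample."""
    if button_held and finger_delta != (0, 0):
        return ("drag", finger_delta)
    if finger_delta != (0, 0):
        return ("move", finger_delta)
    return ("idle", (0, 0))


assert interpret(True, (4, -2)) == ("drag", (4, -2))
assert interpret(False, (4, -2)) == ("move", (4, -2))
```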

Mouse actions 322 are passed to a GUI-management application 314 for presentation. The GUI-management application 314 is a shell capable of interpreting mouse actions and performing them with the mouse cursor. In operation, the GUI-management application 314 controls the mouse cursor in the system portion 304. In one embodiment, the GUI-management application 314 is the explorer.exe application program in the Windows® OS. Alternative embodiments may use different mouse-cursor management applications in other OSes.

In one embodiment, the gesture-recognition software 312 contains rules for processing touch packets 318 that originate within the touch-screen portion 306, but not touch packets 318 that originate within the system portion 304. Thus, if the finger moves outside the boundaries of the virtual touchpad, the touch packets 318 outside the touch-screen portion 306 are not processed, which effectively stops the mouse movement. In an alternative embodiment, however, mouse messages that originate inside the touch-screen portion 306 and extend into the system portion 304 are processed in full. In this embodiment, a touch that starts on the virtual touchpad and moves into the OS display area generates mouse actions 322 for continuous movement of the mouse cursor until the touch ends outside the system portion 304. Therefore, using touch-screen capabilities supported across the entire display device 302, the user need not be limited to the virtual touchpad.

Figure 4 is a flow diagram 400 representing the stages at which a virtual touchpad and an operating-system GUI are presented and managed on the same display device, in accordance with an embodiment of the invention. Initially, a single display device with a touch screen simultaneously presents a virtual touchpad and an operating system, as indicated at stage 402. In one embodiment, only the OS GUI is presented until the user selects a fixed-function button or a soft key to display the virtual touchpad. In another embodiment, the virtual touchpad is presented without any user interaction.

The user can touch the virtual touchpad to interact with the OS, for example to move the mouse cursor, as shown at stage 404. The user's touches are read through a digitizer, which creates touch packets, as indicated at stage 406. Touch packets are digital representations of the user's touches. Because the touch screen may register touches at one precision while the display device presents information at another, the touch packets are converted to fit the display configuration of the display device with the touch screen, as shown at stage 408. For example, touch packets can be converted to fit the specific screen size or resolution of the display device with the touch screen, or to fit the system portion displayed simultaneously with the virtual touchpad.

As stated at stage 410, rules are applied to the converted packets to determine which mouse actions were intended by the touches. These rules can cover actions that the user enters on the virtual touchpad, such as mouse movement, button press and release, drag, hover, multi-touch, or the like. Once determined, the mouse actions are passed to an application program (for example, explorer.exe in Windows®), which then manipulates the operating-system GUI accordingly.
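The stages above (touch, packet, converted packet, mouse action) form a short pipeline, sketched end to end below. All names are hypothetical; the scale factor reuses the patent's two-inch-to-ten-inch example.

```python
# Illustrative end-to-end sketch of stages 406-410, with assumed names.

SCALE = 5.0  # touchpad-to-display ratio from the earlier example


def digitize(x, y):                  # stage 406: touch -> touch packet
    return {"x": x, "y": y}


def convert(packet):                 # stage 408: fit to display configuration
    return {"x": packet["x"] * SCALE, "y": packet["y"] * SCALE}


def apply_rules(converted):          # stage 410: converted packet -> action
    return ("move-cursor-to", converted["x"], converted["y"])


action = apply_rules(convert(digitize(10, 20)))
assert action == ("move-cursor-to", 50.0, 100.0)
```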

Although the subject matter of the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. For example, sampling intervals and sampling periods other than those described in this document may also be covered by the scope of the claims.

1. One or more computer-readable media having computer-executable instructions embodied thereon for performing a method of simultaneously displaying a virtual touchpad and controlling a mouse cursor based on touches on the touchpad, the method comprising the stages at which: a system portion and a touch-screen portion are both presented on a touch screen; a user interface (UI) of an operating system (OS) is presented in the system portion, a mouse cursor being displayed as part of the OS; a virtual touchpad is presented in said touch-screen portion for controlling the mouse cursor, the virtual touchpad including a region for controlling the movements of the mouse cursor; touches are received within said touch-screen portion; one or more rules are applied to translate the touches into one or more mouse actions; the mouse actions are used to control the mouse cursor; and said one or more rules are applied to process touches that begin within said touch-screen portion and, while remaining in contact with the touch screen, continue into said system portion.

2. The media of claim 1, wherein the method further comprises applying a touch digitizer to convert the touches into touch packets.

3. The media of claim 2, wherein the touch packets indicate at least one of x/y coordinates, direction, and speed associated with the touches.

4. The media of claim 1, wherein the method further comprises presenting one or more virtual buttons on the touch screen.

5. The media of claim 1, wherein the method further comprises receiving touches within said system portion.

6. The media of claim 1, wherein the method further comprises: receiving a second touch in the system portion; and controlling the mouse cursor on the basis of the second touch.

7. The media of claim 1, wherein said one or more rules do not process touches that begin outside of the touch-screen portion.

8. The media of claim 1, wherein the virtual touchpad is displayed only when the user presses a button.

9. The media of claim 1, wherein the touch screen supports multiple touches.

11. The graphical user interface (GUI) of claim 10, wherein the virtual touchpad includes one or more virtual mouse buttons.

12. The graphical user interface (GUI) of claim 10, wherein the touch is made by the user's finger or a stylus.

13. The graphical user interface (GUI) of claim 10, wherein the touches are used to control the mouse pointer in the system portion of the display.

14. A method of presenting a virtual touchpad on a display device so that a user can interact with a graphical user interface (GUI) of an operating system (OS), comprising the stages at which: a system portion and a touch-screen portion are both presented on the display device, the system portion presenting the GUI of said OS and the touch-screen portion presenting the virtual touchpad in a representation of a virtualized touchpad mouse that includes a region for receiving touches to control the movements of the mouse cursor; one or more touches are received on the virtual touchpad; one or more touches are received that begin within said touch-screen portion and, maintaining contact with the display device, continue into the system portion; the one or more touches are translated into touch packets that indicate an x/y direction; the touch packets are converted into converted packets that take into account the screen size associated with the system portion; one or more mouse actions are determined on the basis of the converted packets; and the GUI of said OS is manipulated on the basis of the mouse actions.

15. The method of claim 14, further comprising caching the converted packets.

16. The method of claim 15, wherein a touch is made by the user's finger or a stylus.

17. The method of claim 15, wherein manipulating the GUI of said OS further comprises manipulating the mouse cursor.

18. The method of claim 17, wherein the mouse cursor is controlled through an application program.

 
