Virtual haptic panel. RU patent 2505848.
FIELD: physics.

SUBSTANCE: a system part and a touch screen part are simultaneously presented on a touch screen. The user interface (UI) of the operating system (OS) is presented in the system part, wherein the mouse cursor is presented as part of the OS. A virtual touch panel is presented in said touch screen part to control the mouse cursor, wherein the virtual touch panel includes a region for controlling movements of the mouse cursor. Touches are received within said touch screen part. One or more rules are used to convert the touches into one or more mouse actions. The mouse actions are used to control the mouse cursor, and said one or more rules are used to process touches which begin inside said touch screen part and, while maintaining contact with the touch screen, continue into said system part.

EFFECT: controlling a mouse cursor on a touch screen using a touch panel built into the touch screen.

18 cl, 4 dwg
Prior art

The mouse is a ubiquitous input tool that is well understood by most people using modern computing devices. For decades the mouse has remained one of the most common input mechanisms for computers. People quickly recognize mouse cursors on the screens of computing devices and, as a rule, know how to use them. Whether the device is a mobile phone, a laptop, a personal computer, or a tablet, the mouse cursor remains extremely popular for interacting with graphical user interfaces (GUIs).

In many situations, however, a conventional mouse is cumbersome. In general, it is a separate hand-held device that requires a flat surface to slide on. As laptops have become more and more popular, the touchpad has moved directly onto the computing device. This frees the user from having to use a separate input device to interact with the computer, but a touchpad takes up space on a computing device and is limited to a specific size and area. Modern touchpads give the user only a small space in which to move a finger or stylus, which makes it somewhat difficult to control the mouse cursor on display devices of much larger size.

The essence of the invention

This summary presents a list of concepts in a simplified form that are further described in the detailed description of the invention below. It is not intended to identify key features or essential features of the claimed invention, nor is it intended for use as an aid in determining the scope of the claimed invention.

One aspect of the invention is directed to the simultaneous presentation of a virtual haptic (touch) panel and the graphical user interface (GUI) of an operating system (OS) on the same display device. The user can interact with the OS using a virtual touchpad implemented within a touch screen part of the display device. The user's touches are read by the touch screen part and digitized by a digitizer. Touch input software converts the touch packets into data packets relevant to the screen resolution of the display device, or of the part of the display device presenting the OS. Gesture recognition software applies rules to the converted packets to determine what actions were intended by the user's touches. Finally, an application that manages the mouse cursor manipulates the mouse cursor in accordance with the mouse actions.

Another aspect of the invention is directed to a GUI on a single display device that simultaneously presents a virtual input device and a representation of the OS. The user can interact with the OS using the virtual input device.

Brief description of drawings

The present invention is described in more detail below with reference to the accompanying drawings, in which:

Figure 1 depicts a block diagram of an illustrative operating environment used in implementing the present invention;

Figure 2 depicts a diagram of an illustrative GUI on a device with a touch screen, in accordance with an embodiment of the invention;

Figure 3 depicts a diagram of a device with a touch screen configured to present a virtual touchpad, in accordance with an embodiment of the invention; and

Figure 4 depicts a flow diagram illustrating the stages at which a virtual touchpad is presented and managed, in accordance with an embodiment of the invention.
Detailed description of the invention

The subject matter of the invention is described in this document with the specificity required by law. However, the description itself is not intended to limit the scope of the present invention. Rather, it is contemplated that the claimed subject matter might also be embodied in other ways, to include different stages or combinations of stages similar to the ones described in this document, in conjunction with other present or future technologies. Moreover, although the term "block" may be used here to denote different elements of the methods employed, the term should not be interpreted as implying any particular order among or between the various stages disclosed in this document.

The embodiments described in this document are directed to a virtual touchpad presented on a touch screen. Using the virtual touchpad, the user can control the mouse cursor on the screen of a computing device. Interaction with the virtual touchpad generates touch screen input signals that are captured and translated into mouse actions by gesture recognition software. The OS interprets the mouse signals as if they had been received from a physical mouse or touchpad, thus providing touchpad functionality through a virtual representation of a touchpad.

Although this document describes a virtual touchpad, embodiments are not limited to it. Instead, embodiments fully contemplate virtual versions of other input devices. For example, a virtual trackball, a virtual scroll wheel, a virtual joystick mouse, or another virtual version of an input device may be presented in some embodiments. For the sake of clarity, the description below deals exclusively with the virtual touchpad.

Even though the embodiments described in this document are directed to controlling the mouse cursor through the virtual touchpad, they can also be combined with other touch screen capabilities. In particular, the virtual touchpad can be displayed and provide a tool for controlling the mouse cursor while the rest of the computing screen also remains sensitive to touches. For example, users can use the virtual touchpad in the touch screen part of a display device (described below) to control the mouse, and also control the mouse cursor by touching the cursor presented in the system part of the display device (also described below).

Having briefly given an overview of the embodiments described in this document, an illustrative computing device is described below. Referring initially to Figure 1, an illustrative operating environment for implementing the present invention is shown and designated generally as computing device 100. Computing device 100 is only one example of a suitable computing environment and is not intended to suggest any limitation as to the scope of use or functionality of the invention. Neither should computing device 100 be interpreted as having any dependency or requirement relating to any one of the depicted components or any combination thereof. In one embodiment, computing device 100 is a conventional computer (e.g., a personal computer or laptop).
Embodiments of the invention may be described in the general context of computer code or machine-usable instructions, including computer-executable instructions such as program modules executed by a computer or other machine. Generally, program modules, including routines, programs, objects, components, data structures, and the like, refer to code that performs particular tasks or implements particular abstract data types. The embodiments described in this document may be practiced in a variety of system configurations, including portable devices, consumer electronics, general-purpose computers, more specialized computing devices, and the like. The embodiments described in this document may also be practiced in distributed computing environments, in which tasks are performed by remote processing devices linked through a communications network.

As a rule, computing device 100 includes a variety of computer-readable media. By way of example and not limitation, computer-readable media may comprise RAM, ROM, EEPROM, flash memory or other memory technologies, CD-ROM, DVD or other optical or holographic media, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or similar tangible media configured to store data and/or instructions related to the embodiments described herein.

Storage device 112 includes computer storage media in the form of volatile and/or non-volatile memory. The storage device may be removable, non-removable, or a combination thereof. Illustrative hardware devices include solid-state memory, hard drives, cache memory, optical drives, and the like. Computing device 100 includes one or more processors that read data from various entities such as storage device 112 or I/O components 120. Presentation component(s) 116 present data to a user or another device. Illustrative presentation components include a display device, loudspeaker, printing component, vibration component, and the like. I/O ports 118 allow computing device 100 to be logically coupled to other devices, including I/O components 120, some of which may be built in. Illustrative I/O components include a microphone, joystick, game pad, satellite dish, scanner, printer, wireless device, and the like.

The components described above in relation to computing device 100 may also be included in a mobile device. As used in this document, a mobile device refers to any type of mobile phone, portable device, personal digital assistant (PDA), smartphone, BlackBerry, digital camera, or other mobile device (except a laptop) capable of wireless communication. One skilled in the art will also appreciate that mobile devices will include a processor and computer storage media to perform various functions. The embodiments described in this document apply to both computing devices and mobile devices. In some embodiments, computing devices are simply devices that run an application whose image is captured by the camera of a mobile device.

Computing device 100 includes a touch screen, which one skilled in the art will recognize as a display device that can detect the location of touches within a display area. Some embodiments will include a single device with parts of the display device intended for receiving user inputs (referred to in this document as "touches") through the touch screen.
In other embodiments, the entire display area is configured to receive user touches, such as from a finger or a stylus. Touch screens can be implemented by adding a resistive, capacitive, infrared, or similar panel to a conventional screen of a computing device, such as a liquid crystal display (LCD), LED, organic LED (OLED), or the like. Alternatively, touch screens can incorporate strain gauges, optical imaging, dispersive signal, surface acoustic wave, or other technology for capturing touches. The above lists are not exhaustive, and, as will be clear to one skilled in the art, many other panels and technologies for capturing touches may be applied.

Figure 2 is a diagram showing an illustrative GUI 200 on a device with a touch screen, in accordance with an embodiment of the invention. GUI 200 contains two parts of the display device: a system part 202 and a touch screen part 204. In particular, the system part 202 presents the GUI of an OS, such as Microsoft Windows®. Touches in the touch screen part 204 are translated into actions of the mouse cursor 206 within the system part 202. In one embodiment, the whole screen, including both the system part 202 and the touch screen part 204, may belong to a touch screen configured to ignore touches in the system part 202 and handle touches in the touch screen part 204. In an alternative embodiment, touches in the system part 202 are not ignored; rather, those touches are processed to allow the user to manipulate the mouse cursor directly. For example, a user can touch the mouse cursor in the system part and, sliding the finger while keeping it on the mouse cursor, drag the cursor to any place in the system part 202. The mouse cursor then follows the finger until the finger is lifted. Such an embodiment would therefore handle touches directly in the system part 202 in addition to touches in the touch screen part 204.

The software presented in the system part 202 of the display device can be interacted with using a mouse or other input devices. In one embodiment, the system part 202 presents the GUI of an OS such as Windows®, Windows Mobile, MacOS, Linux, or the like. Using the mouse, the user can interact with software applications such as a web browser 214.

The touch screen part 204 includes a touch pad 208, a left key 210, and a right key 212. The touch pad 208 resembles the touch-sensitive area of a conventional touchpad and allows the user to operate it in the same way. Using the touch pad 208, the user can move the mouse cursor 206 in any particular direction by moving a finger or stylus in that direction. More sophisticated touches (for example, drag, release, hover, multi-touch, etc.) can also be registered through touches of the touch pad and/or, in particular, of the keys. For example, a user can move a finger down the leftmost strip of the touch pad 208 to indicate scrolling down. Or the user can lightly tap twice on the touch pad 208 to indicate a click of the left mouse button. In addition, touches of the left key 210 and the right key 212 indicate clicks of the left and right mouse buttons, respectively. Various other touches, which will be obvious to those skilled in the art, are also possible. Moreover, other input devices (for example, a trackball, scroll wheel, etc.) may alternatively use a variety of actions that can easily be handled by the gesture recognition software described in this document.
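By way of illustration only, the following Python sketch shows how a layout like that of Figure 2 might be hit-tested to decide which component of the virtual touchpad a touch falls on. The region coordinates, names, and return values are assumptions chosen for the example; the patent does not fix any particular geometry.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Rect:
        x: int
        y: int
        w: int
        h: int

        def contains(self, px: int, py: int) -> bool:
            return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

    # Hypothetical layout of the touch screen part 204: touch pad 208 above,
    # left key 210 and right key 212 below it (coordinates are invented).
    TOUCH_PAD = Rect(0, 0, 200, 150)
    LEFT_KEY = Rect(0, 150, 100, 50)
    RIGHT_KEY = Rect(100, 150, 100, 50)

    def classify_touch(px: int, py: int) -> str:
        """Map a touch point in the touch screen part onto a touchpad component."""
        if TOUCH_PAD.contains(px, py):
            return "pad"           # cursor movement, taps, scroll strip, ...
        if LEFT_KEY.contains(px, py):
            return "left_button"   # translated into a left mouse click
        if RIGHT_KEY.contains(px, py):
            return "right_button"  # translated into a right mouse click
        return "outside"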
Embodiments are not limited to the configuration shown in Figure 2. For example, additional keys can be displayed, or the touch screen part can be located in another part of GUI 200. In addition, the touch screen part 204 can be divided into different sections, and different sections can occupy particular parts of GUI 200. In some embodiments, the boundary between the touch screen part 204 and the system part 202 can be adjusted depending on the user's interaction with the system.

In an embodiment alternative to the one described above, touches originating in the touch screen part 204 and carried into the system part 202 are processed completely. For example, such an embodiment keeps tracking a finger dragged from the touch screen part 204 into the system part 202. Rather than stopping the movement of the cursor 206 when the finger crosses the outer border of the touch screen part 204, the cursor 206 continues to move in the direction of the finger's movement until a specified event occurs, for example, until the user stops moving the finger. In other words, such an embodiment does not limit touches to the touch screen part 204 if the touch is carried over into the system part. Moreover, in one embodiment, every touch starting within the touch screen part 204 and continuing beyond it results in touches that are processed just as if they occurred within the touch screen part 204.
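A minimal Python sketch of the boundary rule just described follows. The event format, the in_pad predicate, and the move_cursor callback are illustrative assumptions rather than interfaces defined by the patent; the point is only that a contact is tracked if it begins inside the virtual touchpad region and then keeps driving the cursor even after it crosses into the system part.

    def process_touch_stream(events, in_pad, move_cursor):
        """events: iterable of (kind, x, y), where kind is 'down', 'move', or 'up'.
        in_pad(x, y): True if the point lies inside the virtual touchpad region.
        move_cursor(dx, dy): callback applying a cursor displacement."""
        tracking = False
        last = None
        for kind, x, y in events:
            if kind == "down":
                tracking = in_pad(x, y)   # the touch must BEGIN inside the touchpad
                last = (x, y)
            elif kind == "move" and tracking and last is not None:
                dx, dy = x - last[0], y - last[1]
                move_cursor(dx, dy)       # still processed outside the region
                last = (x, y)
            elif kind == "up":
                tracking = False          # lifting the finger ends the gesture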
Figure 3 is a diagram of a computing device 100 with a touch screen configured to present a virtual touchpad, in accordance with an embodiment of the invention. It should be noted that Figure 3 illustrates only one embodiment. Also, for the sake of clarity, many components of computing device 100 are not depicted. Indeed, computing device 100 includes a processor and computer storage media to support the software mentioned in this document. In fact, in some embodiments, touch packets 318, converted packets 320, or mouse actions 322 can be cached for quick retrieval.

Display device 302 is a single-screen display with touch screen capabilities. In one embodiment, software on computing device 100 presents two different GUI parts on display device 302. These parts are depicted in Figure 2 and are referred to above as the system part 304 and the touch screen part 306. The system part 304 displays an interactive representation of the operating system (OS), thus providing access to software applications. As an example, the system part 304 can be thought of as the part of display device 302 that displays a version of Microsoft Windows®, Windows Mobile, Linux, MacOS, or the like. Within the system part 304, the user can interact with software applications by manipulating the mouse cursor. In addition, the touch screen part 306 displays the virtual touchpad, which the user can use to control the mouse cursor. The virtual touchpad receives touches 316 from the user (for example, by finger or stylus) and translates the touches 316 into commands to move the mouse cursor in the system part 304. In short, the user touches the touchpad in the touch screen part 306 to control the mouse cursor in the system part 304.

The user can enter various touches 316 in the touch screen part 306. For example, the user can move a finger across the touch screen part 306 in one direction to instruct the cursor to move in that direction. The user can lightly tap a finger or stylus on the right, left, middle, or other keys to indicate pressing one of those keys. These keys can also have a "sticky" feature, whereby an action (for example, quickly tapping down on a key to indicate a keystroke) keeps the key held down until a release action is registered (for example, a single light tap on the pressed key). Of course, various other touches 316 may also be received by the touch screen part 306.

Touches 316 received by the touch screen part 306 pass through a digitizer 308. In one embodiment, digitizer 308 includes a touch screen controller that detects touches 316 received by the touch screen part 306 and converts the touches 316 into their digital equivalents. As an example and not a limitation, digitizer 308 can be configured to detect changes in current, voltage, resistance, capacitance, or infrared light caused by touches 316. Digitizer 308 converts the changes into touch packets 318. Touch packets 318 (commonly called "pen and touch packets") include various information related to the touches 316, such as x/y coordinates, pressure, size, direction, and the like. In addition, the packets may also include information related to the touch screen capabilities of display device 302, such as the dimensions of the touch screen part 306 (for example, two inches by two inches, 200 pixels, etc.).

Touch input software 310 converts the touch packets 318 to fit the system part 304 of the screen. To this end, touch input software 310 translates the information provided in the touch packets 318 into its equivalents in the system part 304. Touch input software 310 can be any type of shell program, such as WISPTIS in Microsoft Windows®. In operation, touch input software 310 takes the information in the touch packets 318 and transforms it to fit the screen size, resolution, or number of pixels of display device 302. In particular, touch input software 310 converts the information in the touch packets 318 from the display size or resolution of the touch screen part 306 into the screen size and resolution associated with the main screen of display device 302. For example, the touch screen part 306 may be two inches wide by two inches long, while display device 302 may be ten inches wide by ten inches long. In one embodiment, to convert the touch packets 318, touch input software 310 multiplies the distance moved by a finger in the touch screen part 306 by a factor of five. In addition, the speed of a touch can also be translated to specify the speed at which the mouse cursor moves.

The touch-capturing capabilities of display device 302 can be more precise than the display resolution of display device 302. For example, display device 302 may include a computer screen configured to light 200 lines of pixels, and a touch screen layer (e.g., capacitive, resistive, infrared) with more than 10,000 lines at defined intervals. In this example, detections made by digitizer 308 are converted into their display-device equivalents in the system part 304, thus allowing touches 316 to be reproduced in the system part 304.
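As a worked example of this conversion step, the following Python sketch uses the numbers given above (a two-inch touchpad region driving a ten-inch display, i.e. a scale factor of five, and a digitizer with 10,000 lines over 200 pixel lines). The function names and defaults are illustrative assumptions, not interfaces from the patent.

    def convert_delta(dx, dy, pad_size_in=2.0, screen_size_in=10.0):
        """Scale a finger displacement on the virtual touchpad into an
        equivalent cursor displacement in the system part. With a 2-inch
        pad and a 10-inch display the factor is 5, as in the text."""
        factor = screen_size_in / pad_size_in
        return dx * factor, dy * factor

    def digitizer_to_pixels(line, digitizer_lines=10_000, pixel_lines=200):
        """Map a digitizer line index onto a display pixel line; the digitizer
        may resolve far more lines than the screen can light."""
        return int(line * pixel_lines / digitizer_lines)

    # A half-inch stroke on the pad moves the cursor 2.5 inches on the display.
    assert convert_delta(0.5, 0.0) == (2.5, 0.0)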
Once the touch packets 318 have been converted by touch input software 310 for presentation on the display device, or within the display area of the system part 304, the converted packets 320 are passed through gesture recognition software 312 to determine what actions were indicated by the touches 316. In one embodiment, gesture recognition software 312 implements a state machine in which mouse actions 322 are determined on the basis of various rules.

The rules implemented by gesture recognition software 312 may include any condition associated with a mouse action. For example, if the converted packets 320 indicate detected movement in a particular direction at a particular speed, gesture recognition software 312 determines that movement of the mouse in a certain direction at a certain speed is intended and, accordingly, a mouse action 322 is created. Rules can be configured for almost any type of mouse action, such as hover, drag, multiple touches, and so on.

Mouse actions 322 can be linear or nonlinear transformations of the touches 316. Linear transformations are direct translations of the touches 316 into the system part 304, accounting for the difference in size between the touch screen part 306 and the display area of display device 302. In other words, the speed of a touch 316 is translated into its equivalent in the system part 304. Nonlinear transformations relate to touches that are not translated into the system part directly; rather, the touches are amplified or otherwise manipulated on the basis of rules. For example, the speed with which a finger moves across the touch screen part 306 can be increased if it keeps moving in one direction for a particular time. In other words, holding the virtual joystick at a particular speed can increase the speed of the mouse cursor's movement. Rules can be set for gesture recognition software 312 to account for both linear and nonlinear motion.

The rules of gesture recognition software 312 may also account for gestures with multiple touches. For example, dragging an object can be performed by holding a touch on the left virtual mouse key while a finger moves on the virtual touchpad. A rule can be configured and executed by gesture recognition software 312 to account for such actions, as well as other actions with multiple touches. A sketch of such a rule-driven recognizer is given below.

Mouse actions 322 are passed to the GUI management application 314 for presentation. The GUI management application 314 is a shell configured to interpret mouse actions and perform actions with the mouse cursor. In operation, the GUI management application 314 controls the mouse cursor in the system part 304. In one embodiment, the GUI management application 314 is the application program explorer.exe in the Windows® OS. Alternative embodiments may contain different applications managing mouse cursors in other OSs.

In one embodiment, gesture recognition software 312 contains rules for processing touch packets 318 within the touch screen part 306, but not touch packets 318 within the system part 304. Thus, if the finger moves outside the borders of the touchpad, the touch packets 318 outside the touch screen part 306 are not processed, which effectively stops the movement of the mouse. In an alternative embodiment, however, mouse messages originating inside the touch screen part 306 and extending into the system part 304 are processed completely. In this embodiment, a touch starting on the virtual touchpad and moving into the OS display area generates mouse actions 322 for continuous movement of the mouse cursor until the touch ends in the system part 304. Therefore, using the touch screen properties supported across the entire display device 302, the user need not be limited to the virtual touchpad.
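The following Python sketch illustrates such a rule-driven recognizer. The MouseAction fields, the run-length threshold, and the gain value are invented for the example; the patent specifies only that rules map converted packets to mouse actions, including nonlinear amplification of sustained motion.

    from dataclasses import dataclass

    @dataclass
    class MouseAction:
        kind: str        # "move" here; clicks, drags, etc. would be other kinds
        dx: float = 0.0
        dy: float = 0.0

    def recognize(converted_packets, run_threshold=10, gain=1.5):
        """Apply simple rules to converted packets (objects with .x and .y).

        Linear rule: each displacement becomes a cursor move of the same size.
        Nonlinear rule: once motion has continued in the same horizontal
        direction for run_threshold packets, displacements are amplified."""
        actions = []
        last = None
        run = 0
        prev_sign = 0
        for pkt in converted_packets:
            if last is not None:
                dx, dy = pkt.x - last.x, pkt.y - last.y
                sign = (dx > 0) - (dx < 0)
                run = run + 1 if sign != 0 and sign == prev_sign else 0
                prev_sign = sign
                factor = gain if run >= run_threshold else 1.0  # nonlinear boost
                actions.append(MouseAction("move", dx * factor, dy * factor))
            last = pkt
        return actions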
Figure 4 is a flow diagram 400 representing the stages at which a virtual touchpad and an operating system GUI are presented and managed on the same display device, in accordance with an embodiment of the invention. Initially, a single display device with a touch screen simultaneously presents a virtual touchpad and the operating system, as indicated at stage 402. In one embodiment, only the OS GUI is presented until the user selects a hard key or a soft key to display the virtual touchpad. In another embodiment, the virtual touchpad is presented without any user interaction.

The user can touch the virtual touchpad to interact with the OS, for example to move the mouse cursor, as indicated at stage 404. The user's touches are read by a digitizer, which creates touch packets, as indicated at stage 406. The touch packets are digital representations of the user's touches. Since the touch screen may register touches at one precision while the display device presents information at another, the touch packets are converted to fit the display configuration of the display device with the touch screen, as indicated at stage 408. For example, the touch packets can be converted to fit the particular screen size or resolution of the display device with the touch screen, or to fit the system part displayed simultaneously with the virtual touchpad. As indicated at stage 410, rules are applied to the converted packets to determine which mouse actions were indicated by the touches. These rules can cover actions that the user enters on the virtual touchpad, such as mouse movement, key press and key release, drag, hover, multiple touches, or the like. Once determined, the mouse actions are passed to an application program (for example, explorer.exe in Windows®), which then manipulates the operating system GUI accordingly.
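Tying the stages together, a compact Python sketch of the whole pipeline might look as follows. Every callable here is a hypothetical stand-in for one of the components described above (digitizer, touch input software, gesture recognition software, GUI management application), not an API named in the patent.

    def virtual_touchpad_loop(read_touches, digitize, convert, recognize,
                              apply_to_gui):
        """One pass through the Figure 4 pipeline: touches in, cursor updates out.

        read_touches: stage 404, raw touches received on the virtual touchpad.
        digitize:     stage 406, each touch becomes a touch packet.
        convert:      stage 408, packets rescaled to the system part's display.
        recognize:    stage 410, rules turn converted packets into mouse actions.
        apply_to_gui: the shell (e.g. explorer.exe) applying each mouse action."""
        raw = read_touches()
        packets = [digitize(t) for t in raw]
        converted = [convert(p) for p in packets]
        for action in recognize(converted):
            apply_to_gui(action)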
Although the subject matter of the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the claimed subject matter is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims. For example, sampling intervals and sampling periods other than those described in this document can also be covered by the scope of the claims.

Claims

1. One or more computer-readable media having computer-executable instructions embodied thereon for performing a method of simultaneously displaying a virtual touchpad and a mouse cursor controlled on the basis of touches on the touchpad, containing the stages at which: a system part and a touch screen part are simultaneously presented on a touch screen; a user interface (UI) of an operating system (OS) is presented in the system part, wherein the mouse cursor is presented as part of the OS; a virtual touchpad is presented in said touch screen part to control the mouse cursor, wherein the virtual touchpad includes a region for controlling movements of the mouse cursor; touches are received within said touch screen part; one or more rules are used to translate the touches into one or more mouse actions; the mouse actions are used to control the mouse cursor; and said one or more rules are used to process touches which begin inside said touch screen part and, while maintaining contact with the touch screen, continue into said system part.

2. The media of claim 1, further comprising using a touch digitizer to convert the touches into touch packets.

3. The media of claim 2, wherein the touch packets indicate at least one of the x/y coordinates, direction, and speed associated with the touches.

4. The media of claim 1, further comprising presenting one or more virtual keys on the touch screen.

5. The media of claim 1, further comprising receiving touches within said system part.

6. The media of claim 1, further comprising: receiving a second touch in the system part; and controlling the mouse cursor on the basis of the second touch.

7. The media of claim 1, wherein said one or more rules do not process touches beginning outside said touch screen part.

8. The media of claim 1, wherein the virtual touchpad is presented only when the user presses a key.

9. The media of claim 1, wherein the touch screen supports multiple simultaneous touches.

11. The graphical user interface (GUI) of claim 10, wherein the virtual touchpad includes one or more virtual mouse keys.

12. The graphical user interface (GUI) of claim 10, wherein a touch is made by the user's finger or a stylus.

13. The graphical user interface (GUI) of claim 10, wherein the touches are used to control the mouse cursor in the system part of the display.

14. A method of presenting a virtual touchpad on a display device so that a user can interact with a graphical representation of a user interface (GUI) of an operating system (OS), containing the stages at which: a system part and a touch screen part are simultaneously presented on the display device, the system part presenting the GUI of said OS and the touch screen part presenting a virtual touchpad in a representation indicating a virtualized touchpad mouse, which includes a region for receiving touches to control movements of the mouse cursor; one or more touches are received on the virtual touchpad; one or more touches are received which begin inside said touch screen part and, while maintaining contact with the display device, continue into the system part; the one or more touches are translated into touch packets which indicate an x/y direction; the touch packets are converted into converted packets taking into account the screen size associated with the system part; one or more mouse actions are determined on the basis of the converted packets; and the GUI of said OS is manipulated on the basis of the mouse actions.
15. The method of claim 14, further comprising caching the converted packets.

16. The method of claim 15, wherein a touch is made by the user's finger or a stylus.

17. The method of claim 15, wherein manipulating the GUI of said OS further comprises manipulating the mouse cursor.

18. The method of claim 17, wherein the mouse cursor is controlled by an application program.