Touch Input Foundation
On computers with a touch digitizer, another means of input exists in addition to the mouse and the keyboard. On these machines, one or more fingers can be used to execute commands and to operate the controls in an application's user interface. For example, a push button can be operated by performing a so-called tap gesture, which means tapping a finger at the appropriate position on the screen.
The system recognizes a set of pre-defined gestures which are outlined in the chapter "Touch gestures".
Whenever one of the pre-defined gestures is performed by the user, the application is notified using a dedicated event. Code can be written by the application developer which implements the behavior or command corresponding to the gesture, and which is executed automatically upon reception of the event. The chapter "Processing touch gestures" contains further details about the interfaces and protocols associated with processing touch input.
Touch gestures
The system recognizes a number of pre-defined gestures which can be used to operate the controls or execute commands in an application. The following table lists the touch gestures which are supported.
Gesture | Description
---|---
Tap | A one-finger gesture used to select an item. Equivalent to and reported as a mouse button left-click.
Double-Tap | Used to activate an object's default action. Equivalent to and reported as a mouse button double-click.
Pan | Used to move objects on the screen or to scroll within a list of items. This is a one- or two-finger gesture.
Rotate | Used to rotate objects on the screen. This is a two-finger gesture.
Zoom | Used to change an object's size. This is a two-finger gesture.
Two-Finger Tap | A tap gesture performed with two fingers.
Press-and-Tap | Executed by first selecting an object with one finger and then performing a tap with a second finger. Configurable as the equivalent of a mouse button right-click on some platforms, in which case the press-and-tap gesture is reported as a right-click.
Press-and-Hold | A tap gesture with the finger left on the screen for a certain period of time. Equivalent to and reported as a mouse button right-click under Windows 8.
Processing touch gestures
The following interfaces are used with touch input:

Interface | Type | Description
---|---|---
:inputMode | Member var. | Controls touch behavior for a UI element
:gesture, :gesture() | Callback | Notifies the application about touch gestures
IsTouchEnabled() | Function | Determines system capabilities
Setting up the application for processing touch events generally involves the following steps:
1. Enabling touch events on the respective UI element
2. Configuring the touch sub-system
3. Writing an event handler which implements the desired behavior
The following paragraphs outline each of these steps in turn.
Enabling touch events
By default, the system automatically performs touch-processing for certain UI elements in order to implement platform compatible behavior. For this reason, standard UI elements such as list boxes and push buttons can be operated using touch gestures without having to write additional code.
In certain cases, however, this default behavior may have to be altered, or new behavior may have to be implemented for a custom control. In these cases, touch events must be explicitly enabled. This is done by assigning the constant XBP_INPUTMODE_TOUCHGESTURES to the :inputMode member variable of the respective Xbase Part. Afterwards, the system generates an xbeP_Gesture event whenever a touch gesture is performed by the user involving the respective Xbase Part. More information about handling xbeP_Gesture is provided in subsequent paragraphs.
By default, the constant XBP_INPUTMODE_SYSTEMDEFAULT is contained in the member variable :inputMode of an Xbase Part after it is created. In this mode, platform compatible behavior is active. In some cases, this behavior is implemented in the corresponding Xbase Part class (using Gesture events). However, often the way a certain element reacts to touch gestures is due to behavior which is intrinsic to the underlying operating system control.
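The following is a minimal sketch of how touch gesture events might be enabled for a push button. The dialog object (oDlg) as well as the button's position, size and caption are illustrative assumptions; only :inputMode, XBP_INPUTMODE_TOUCHGESTURES and IsTouchEnabled() are taken from this documentation.

```
// Create a push button as a child of an existing dialog (oDlg is
// assumed to exist); position, size and caption are placeholders
oButton := XbpPushButton():new( oDlg:drawingArea,, {10,10}, {100,30} )
oButton:caption := "OK"

IF IsTouchEnabled()
   // Request xbeP_Gesture events instead of the platform
   // default touch behavior
   oButton:inputMode := XBP_INPUTMODE_TOUCHGESTURES
ENDIF

oButton:create()
```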
Configuring the touch sub-system
The system allows for much finer control of the touch feature than just enabling touch gesture events for a UI element. To this end, additional constants are defined in the file xbp.ch which can be assigned to the :inputMode instance variable. This makes it possible to disable individual touch gestures, for example, so that xbeP_Gesture events are generated only for those touch gestures the application is actually prepared to handle. In addition, constants exist for configuring certain properties of the touch sub-system. See the documentation on the :inputMode member variable for further details on the various options which can be configured via this instance variable.
Writing an event handler
If touch gesture events are enabled for one of its UI elements, the application gets notified whenever one of the pre-defined gestures is executed. This occurs via a dedicated event (xbeP_Gesture) which is generated by the system.
In order to implement behavior for a certain touch gesture, a suitable event handler must be implemented by the application developer. This handler can be coded as a code block, or it can be implemented in a method of a user-defined Xbase Part class. If the event handler is coded as a code block, it must be assigned to the :gesture callback slot of the Xbase Part for which the behavior is to be implemented.
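A handler coded as a code block could be set up as sketched below. The parameter names (mp1, mp2, obj) follow the usual Xbase++ callback convention, and MyGestureHandler is a hypothetical application-defined function; the exact contents of the event parameters are described in the xbeP_Gesture documentation.

```
// Assign a code block to the :gesture callback slot; it is
// evaluated whenever an xbeP_Gesture event occurs for oButton
oButton:gesture := {| mp1, mp2, obj | MyGestureHandler( mp1, mp2, obj ) }

FUNCTION MyGestureHandler( mp1, mp2, obj )
   // mp1/mp2 carry gesture-specific information, obj is the
   // Xbase Part that received the gesture; implement the
   // desired behavior here
RETURN NIL
```

Alternatively, a user-defined Xbase Part class can override the :gesture() callback method to implement the same behavior.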
Note that for compatibility reasons, the gestures which correspond to a left-click or a right-click with the mouse are reported as mouse events to the application. No gesture event is generated in this case. This means that the tap gesture can be handled in the same way as a mouse click, for example. In fact, no additional code needs to be written if the application already contains code handling mouse events.
This is demonstrated in the example ..\basics\touch\gestures.prg in the Xbase++ sample collection. Here, a combination of event handlers for mouse and touch gesture events is used for manipulating an image. Although intended primarily for touch input, the aforementioned compatibility features nevertheless allow the image to be dragged around with the mouse.
A touch gesture being performed by the user may result in a stream of xbeP_Gesture events being sent to the application, each corresponding to a certain state within the overall gesture. The speed with which each event is processed directly affects how responsive the front-end of the application feels to the user. For this reason, xbeP_Gesture events are sent synchronously to the application. This is done by executing the :handleEvent() method of the respective Xbase Part directly. This means that the xbeP_Gesture event code will never be retrieved by the function AppEvent(), because the system bypasses the application's event queue.
In Xbase++, Xbase Parts are created and maintained by a special system thread called the UI thread. Because of their synchronous nature, the application-defined event handlers for processing xbeP_Gesture are executed in that thread.
Gesture event routing
When a touch gesture is performed by the user, the gesture events generated for the gesture are first routed to the UI element the finger is positioned over. If a touch gesture is performed over a group of elements which are stacked upon each other, the corresponding events are first sent to the element which is in the foreground (at the top of the z-order). This behavior is similar to the way mouse events are processed by the system.
Unlike regular computers, which tend to be operated with a mouse and keyboard, on a touch-enabled device the finger often is the sole means of input. Also, touch gestures tend to have far less precision than the strict point-and-click methodology employed using a mouse. For this reason, touch gesture events are handled in a slightly different way. Whereas a mouse click occurs (and hence is handled) only on a single object, a gesture event travels upwards in the parent/child hierarchy. If a pan gesture is performed over a push button, for example, the corresponding xbeP_Gesture events will cause the code block in the push button's :gesture callback slot as well as its :gesture() callback method to be executed. Afterwards, the gesture event will be routed to the parent object and so on. This kind of event delegation makes it easier to implement behavior for compound objects such as a browse, and also simplifies using gestures for triggering application-wide commands.
In some cases this scheme may cause problems, however, because it introduces the risk of triggering event handling code several times for a single event. In this case, the default processing normally performed by the system must be disabled to prevent gesture events from being routed to the parent object.
Touch input is not supported in console (VIO) applications. However, the XbpCrt class which is used as the console window in Hybrid Mode applications features all of the touch-related interfaces described in the previous chapters.
If touch input is to be supported in a character mode application, the application must be linked as a GUI application. Also, an XbpCrt window must be used as the console window, which is the default established by the standard AppSys() procedure provided in the appsys.prg file. Finally, SetMouse(.T.) must be called to enable mouse/non-keyboard events.
Once the character mode application has been set up correctly, touch gestures can be processed using the (XbpCrt) console window's :gesture callback slot. In addition, certain gestures automatically generate standard events. A tap gesture is reported as a regular mouse click, for example, which ensures text-mode commands such as AChoice() operate as expected, even if the application does not contain handling code for this specific gesture.
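The setup described above could look roughly as follows. This is a sketch only: the exact parameters passed to XbpCrt():new(), the window title and the hypothetical HandleGesture() function are illustrative assumptions, while SetMouse(.T.), SetAppWindow() and the :gesture callback slot are the interfaces named in this documentation.

```
// Hedged sketch of an AppSys() procedure for a hybrid mode
// application; replaces the default AppSys() from appsys.prg
PROCEDURE AppSys()
   LOCAL oCrt

   // Create an XbpCrt window to serve as the console window
   // (position, row/column count and title are placeholders)
   oCrt := XbpCrt():new( ,, {0,0}, 25, 80, "Touch Demo" )
   oCrt:create()
   SetAppWindow( oCrt )

   // Enable mouse/non-keyboard events so touch input is reported
   SetMouse( .T. )

   // Process gesture events via the console window's callback slot;
   // HandleGesture() is an application-defined function
   oCrt:gesture := {| mp1, mp2, obj | HandleGesture( mp1, mp2, obj ) }
RETURN
```

Remember that the application must also be linked as a GUI application for this to work.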