Programming Guide

Touch Input Foundation

On computers with a touch digitizer, another means of input exists in addition to the mouse and the keyboard. On these machines, one or more fingers can be used to execute commands and to operate the controls in an application's user interface. For example, a push button can be operated by performing a so-called tap gesture, which means tapping a finger at the appropriate position on the screen.

The system recognizes a set of pre-defined gestures which are outlined in the chapter "Touch gestures".

Whenever the user performs one of the pre-defined gestures, the application is notified via a dedicated event. The application developer can write code which implements the behavior or command corresponding to the gesture, and which is executed automatically upon reception of the event. The chapter "Processing touch gestures" contains further details about the interfaces and protocols associated with processing touch input.

Touch input is supported on machines running Windows 7 or newer.

Touch gestures

The system recognizes a number of pre-defined gestures which can be used to operate the controls or execute commands in an application. The following table lists the touch gestures which are supported.

Gesture          Description
Tap              A one-finger gesture used to select an item. Equivalent to and reported as a left mouse-button click.
Double-Tap       Activates an object's default action. Equivalent to and reported as a mouse-button double-click.
Pan              Moves objects on the screen or scrolls within a list of items. A one- or two-finger gesture.
Rotate           Rotates objects on the screen. A two-finger gesture.
Zoom             Changes an object's size. A two-finger gesture.
Two-Finger Tap   A tap gesture performed with two fingers.
Press-and-Tap    Performed by first touching an object with one finger, then tapping with a second finger. On some platforms, this gesture is configurable as the equivalent of a mouse-button right-click; in that case, it is reported as a right-click.
Press-and-Hold   A tap gesture with the finger left on the screen for a certain period of time. Equivalent to and reported as a mouse-button right-click under Windows 8.

The Xbase Part library currently recognizes two touch points. This means that two fingers can be used simultaneously for executing gestures.


Processing touch gestures

The following interfaces are used with touch input:

Interfaces for processing touch events
Interface            Type        Description
:inputMode           Member var. Controls touch behavior for a UI element
:gesture, :gesture() Event       Notifies the application about touch gestures
IsTouchEnabled()     Function    Determines system capabilities

Setting up the application for processing touch events generally involves the following steps:

1. Enabling touch events on the respective UI element

2. Configuring the touch sub-system

3. Writing an event handler which implements the desired behavior

The following paragraphs outline each of these steps in turn.

Enabling touch events

By default, the system automatically performs touch processing for certain UI elements in order to implement platform-compatible behavior. For this reason, standard UI elements such as list boxes and push buttons can be operated using touch gestures without having to write additional code.

In certain cases, however, this default behavior may have to be altered, or new behavior may have to be implemented for a custom control. In these cases, touch events must be explicitly enabled. This is done by assigning the constant XBP_INPUTMODE_TOUCHGESTURES to the :inputMode member variable of the respective Xbase Part. Afterwards, the system generates an xbeP_Gesture event whenever the user performs a touch gesture involving that Xbase Part. More information about handling xbeP_Gesture is provided in subsequent paragraphs.
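As a sketch, explicitly enabling gesture events might look as follows. The constant and the :inputMode member variable are described above; the XbpStatic element and the parent dialog <oDlg> are purely illustrative:

```
// Illustrative sketch: explicitly enable touch gesture events for
// an Xbase Part. <oDlg> is assumed to be an existing XbpDialog.
#include "Xbp.ch"

oStatic := XbpStatic():new( oDlg:drawingArea, , {10,10}, {200,200} )
oStatic:create()

// From now on, the system generates xbeP_Gesture events whenever
// the user performs a touch gesture over this element
oStatic:inputMode := XBP_INPUTMODE_TOUCHGESTURES
```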

Applications can use the IsTouchEnabled() environment function to determine whether a touch digitizer is installed in the computer, and whether touch input is enabled in the operating system.
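For example, an application might enable gesture events only when touch input is actually available. This is a sketch; <oStatic> stands for any Xbase Part created by the application:

```
// Enable touch gestures only if a touch digitizer is installed
// and touch input is enabled in the operating system
IF IsTouchEnabled()
   oStatic:inputMode := XBP_INPUTMODE_TOUCHGESTURES
ENDIF
```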

By default, the member variable :inputMode of an Xbase Part contains the constant XBP_INPUTMODE_SYSTEMDEFAULT after the part is created. In this mode, platform-compatible behavior is active. In some cases, this behavior is implemented in the corresponding Xbase Part class (using Gesture events). Often, however, the way a certain element reacts to touch gestures is due to behavior intrinsic to the underlying operating system control.

Applications should not make assumptions about whether the system generates touch events for a particular UI element as part of the default processing. Instead, applications should use the :inputMode member variable to explicitly enable touch events.

Configuring the touch sub-system

The system allows much finer control over the touch feature than just enabling touch gesture events for a UI element. For this purpose, additional constants are defined which can be assigned to the :inputMode instance variable. This makes it possible to disable individual touch gestures, for example, so that xbeP_Gesture events are generated only for those touch gestures the application is actually prepared to handle. In addition, constants exist for configuring certain properties of the touch sub-system. See the documentation on the :inputMode member variable for further details on the various options which can be configured via this instance variable.

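A sketch of a finer-grained configuration is shown below. The XBP_INPUTMODEGF_NOLEGACYSUPPORT constant is documented in the section on gesture event routing; the way the constants are combined (via +) is an assumption here, so consult the :inputMode documentation for the authoritative list of options and how to combine them:

```
// Illustrative sketch: enable gesture events and, in addition,
// suppress the default routing of gesture events to the parent
// object. Combining the constants with "+" is an assumption.
oStatic:inputMode := XBP_INPUTMODE_TOUCHGESTURES + ;
                     XBP_INPUTMODEGF_NOLEGACYSUPPORT
```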

Writing an event handler

If touch gesture events are enabled for one of an application's UI elements, the application is notified whenever one of the pre-defined gestures is executed. This occurs via a dedicated event (xbeP_Gesture) which is generated by the system.

In order to implement behavior for a certain touch gesture, a suitable event handler must be implemented by the application developer. This handler can be coded as a code block, or it can be implemented as a method of a user-defined Xbase Part class. If the event handler is coded as a code block, it must be assigned to the :gesture callback slot of the Xbase Part for which the behavior is to be implemented.

Note that for compatibility reasons, the gestures which correspond to a left-click or a right-click with the mouse are reported as mouse events to the application. No gesture event is generated in this case. This means that the tap gesture can be handled in the same way as a mouse click, for example. In fact, no additional code needs to be written if the application already contains code handling mouse events.

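The following sketch attaches a handler via the :gesture callback slot. The parameters passed to the code block follow the usual Xbase++ callback convention (the message parameters and the object); their exact contents for xbeP_Gesture are an assumption here, so consult the xbeP_Gesture documentation before relying on them:

```
// Illustrative sketch: route gesture events to an
// application-defined handler function
oStatic:inputMode := XBP_INPUTMODE_TOUCHGESTURES
oStatic:gesture   := {| mp1, mp2, obj | HandleGesture( mp1, mp2, obj ) }

FUNCTION HandleGesture( mp1, mp2, obj )
   // Implement the desired behavior here, e.g. scroll or
   // rotate the contents displayed by <obj>
RETURN NIL
```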

A touch gesture being performed by the user may result in a stream of xbeP_Gesture events being sent to the application, each corresponding to a certain state within the overall gesture. The speed with which each event is processed directly affects how responsive the front-end of the application feels to the user. For this reason, xbeP_Gesture events are sent synchronously to the application. This is done by executing the :handleEvent() method of the respective Xbase Part directly. This means that the xbeP_Gesture event code will never be retrieved by the function AppEvent(), because the system bypasses the application's event queue.

In Xbase++, Xbase Parts are created and maintained by a special system thread called the UI thread. Because of their synchronous nature, the application-defined event handlers for processing xbeP_Gesture are executed in that thread.

Applications must assume that all thread-local settings are at their default values when processing xbeP_Gesture events.

Gesture event routing

When a touch gesture is performed by the user, the gesture events generated for the gesture are first routed to the UI element the finger is positioned over. If a touch gesture is performed over a group of elements which are stacked upon each other, the corresponding events are first sent to the element in the foreground (at the top of the z-order). This behavior is similar to the way mouse events are processed by the system.

Unlike regular computers, which tend to be operated with a mouse and keyboard, a touch-enabled device often has the finger as its sole means of input. Also, touch gestures are far less precise than the strict point-and-click methodology employed with a mouse. For this reason, touch gesture events are handled in a slightly different way. Whereas a mouse click occurs (and hence is handled) only on a single object, a gesture event travels upwards in the parent/child hierarchy. If a pan gesture is performed over a push button, for example, the corresponding xbeP_Gesture events cause both the code block in the push button's :gesture callback slot and its :gesture() callback method to be executed. Afterwards, the gesture event is routed to the parent object, and so on. This kind of event delegation makes it easier to implement behavior for compound objects such as a browse, and also simplifies using gestures to trigger application-wide commands.

In some cases this scheme may cause problems, however, because it introduces the risk of triggering event handling code several times for a single event. In this case, the default processing normally performed by the system must be disabled to prevent gesture events from being routed to the parent object.

The system default event processing can be disabled via the XBP_INPUTMODEGF_NOLEGACYSUPPORT constant. See the documentation on the :inputMode member variable for further information.

Touch input in character mode applications

Touch input is not supported in console (VIO) applications. However, the XbpCrt class which is used as the console window in Hybrid Mode applications features all of the touch-related interfaces described in the previous chapters.

A Hybrid Mode application is a character mode application which is linked using the /PM:PM linker switch. In this case, a special console window is used which allows using text-mode commands in a graphics mode (GUI) application.

If touch input is to be supported in a character mode application, the application must be linked as a GUI application. In addition, an XbpCrt window must be used as the console window; this is the default established by the standard AppSys() procedure provided in the appsys.prg file. Finally, SetMouse(.T.) must be called to enable mouse/non-keyboard events.

Once the character mode application has been set up correctly, touch gestures can be processed using the (XbpCrt) console window's :gesture callback slot. Also, certain gestures are used to simulate standard events. A tap gesture is reported as a regular mouse click, which ensures text-mode commands such as AChoice() continue to operate as expected.

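The setup described above can be sketched as follows for a Hybrid Mode application linked with /PM:PM. SetAppWindow() returns the XbpCrt console window created by the default AppSys(); the handler function is a hypothetical name:

```
// Illustrative sketch: enable touch gestures in a Hybrid Mode
// (character mode) application
PROCEDURE Main()
   // The XbpCrt console window created by AppSys()
   LOCAL oCrt := SetAppWindow()

   // Enable mouse/non-keyboard events
   SetMouse( .T. )

   // Enable gesture events and attach a handler; tap gestures
   // are still reported as mouse clicks, so text-mode commands
   // such as AChoice() continue to work unchanged
   oCrt:inputMode := XBP_INPUTMODE_TOUCHGESTURES
   oCrt:gesture   := {| mp1, mp2, obj | HandleGesture( mp1, mp2, obj ) }

   // ... text-mode user interface code ...
RETURN
```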
