Programming Guide

Touch input Foundation

On computers with a touch digitizer, another means of input exists in addition to the mouse and the keyboard. On these machines, one or more fingers can be used to execute commands and to operate the controls in an application's user interface. For example, a push button can be operated by performing a so-called tap gesture, which means tapping a finger at the appropriate position on the screen.

The system recognizes a set of pre-defined gestures which are outlined in the chapter "Touch gestures".

Whenever the user performs one of the pre-defined gestures, the application is notified via a dedicated event. The application developer can write code which implements the behavior or command corresponding to the gesture, and which is executed automatically upon reception of the event. The chapter "Processing touch gestures" contains further details about the interfaces and protocols associated with processing touch input.

Touch input is supported on machines running Windows 7 or newer.

Touch gestures

The system recognizes a number of pre-defined gestures which can be used to operate the controls or execute commands in an application. The following table lists the touch gestures which are supported.

Gestures
Gesture          Description
Tap              A one-finger gesture used to select an item. Equivalent to and reported as a mouse left-click.
Double-Tap       Used to activate an object's default action. Equivalent to and reported as a mouse double-click.
Pan              Used to move objects on the screen or to scroll within a list of items. This is a one- or two-finger gesture.
Rotate           Used to rotate objects on the screen. This is a two-finger gesture.
Zoom             Used to change an object's size. This is a two-finger gesture.
Two-Finger Tap   A tap gesture performed with two fingers.
Press-and-Tap    Executed by first selecting an object with one finger, then performing a tap with a second finger. Configurable as the equivalent of a mouse right-click on some platforms, in which case a press-and-tap gesture is reported as a right-click.
Press-and-Hold   A tap gesture with the finger being left on the screen for a certain period of time. Equivalent to and reported as a mouse right-click under Windows 8.

The Xbase Part library currently recognizes two touch points, which means that two fingers can be used simultaneously for executing gestures.


Processing touch gestures

The following interfaces are used with touch input:

Interfaces for processing touch events
Interface             Type         Description
:inputMode            Member var.  Controls touch behavior for a UI element
:gesture, :gesture()  Event        Notifies the application about touch gestures
IsTouchEnabled()      Function     Determines system capabilities

Setting up the application for processing touch events generally involves the following steps:

1. Enabling touch events on the respective UI element

2. Configuring the touch sub-system

3. Writing an event handler which implements the desired behavior

The following paragraphs outline each of these steps in turn.

Enabling touch events

By default, the system automatically performs touch-processing for certain UI elements in order to implement platform compatible behavior. For this reason, standard UI elements such as list boxes and push buttons can be operated using touch gestures without having to write additional code.

In certain cases, however, this default behavior may have to be altered, or new behavior may have to be implemented for a custom control. In these cases, touch events must be explicitly enabled. This is done by assigning the constant XBP_INPUTMODE_TOUCHGESTURES to the :inputMode member variable of the respective Xbase Part. Afterwards, the system generates an xbeP_Gesture event whenever a touch gesture is performed by the user involving the respective Xbase Part. More information about handling xbeP_Gesture is provided in subsequent paragraphs.

Applications can use the IsTouchEnabled() environment function to determine whether a touch digitizer is installed in the computer, and whether touch input is enabled in the operating system.
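As a sketch, an application might combine the capability check with the explicit enabling of touch events. Here, oXbp stands for any existing Xbase Part, such as a dialog's drawing area:

```
// Enable touch gesture events for the UI element only if a
// touch digitizer is installed and touch input is enabled
// in the operating system
IF IsTouchEnabled()
   oXbp:inputMode := XBP_INPUTMODE_TOUCHGESTURES
ENDIF
```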

By default, the member variable :inputMode of an Xbase Part contains the constant XBP_INPUTMODE_SYSTEMDEFAULT after the Xbase Part is created. In this mode, platform compatible behavior is active. In some cases, this behavior is implemented in the corresponding Xbase Part class (using Gesture events). Often, however, the way a certain element reacts to touch gestures is due to behavior which is intrinsic to the underlying operating system control.

Applications should not make assumptions on whether the system generates touch events for a particular UI element as part of the default processing. Instead, applications should use the :inputMode member variable to explicitly enable touch events.

Configuring the touch sub-system

The system allows for much finer control of the touch feature than just enabling touch gesture events for a UI element. For this purpose, additional constants are defined in the file xbp.ch which can be assigned to the :inputMode instance variable. This makes it possible to disable individual touch gestures, for example, so that xbeP_Gesture events are generated only for those touch gestures the application is actually prepared to handle. In addition, constants exist for configuring certain properties of the touch sub-system. See the documentation on the :inputMode member variable for further details on the various options which can be configured via this instance variable.

// Example for setting up the touch system for a dialog window: 
// the touch system is switched on and configured to report all 
// gestures. Pen flicks, a set of pre-defined gestures for 
// initiating clipboard operations or navigation, as well as 
// visual feedback for the pen buttons are disabled. These 
// features are intended to aid in document-centric interaction 
// scenarios, but may disrupt the user's input flow in an 
// application using manifold gestures. The code is an excerpt 
// from an application using touch input for manipulating an 
// image. See the example ..\basics\touch\gestures.prg in the 
// sample collection. 
oDlg := XbpDialog():new( ,, {10,30}, {640,400} ) 
oDlg:drawingArea:inputMode := XBP_INPUTMODE_TOUCHGESTURES +; 
                              XBP_INPUTMODE_NOPENFLICKS   +; 
                              XBP_INPUTMODE_NOPENBUTTONFEEDBACK 

Writing an event handler

If touch gesture events are enabled for one of its UI elements, the application gets notified whenever one of the pre-defined gestures is executed. This occurs via a dedicated event (xbeP_Gesture) which is generated by the system.

In order to implement behavior for a certain touch gesture, a suitable event handler must be implemented by the application developer. This handler can be coded as a code block, or it can be implemented in a method of a user-defined Xbase Part class. If the event handler is coded as a code block, it must be assigned to the :gesture callback slot of the Xbase Part for which the behavior is to be implemented.
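As an illustrative sketch, an event handler implemented as a method of a user-defined Xbase Part class might look as follows. The class and its base class are hypothetical; the callback method's parameters mirror those passed to the :gesture code block:

```
// Hypothetical custom control implementing gesture behavior
// in an overridden :gesture() callback method
CLASS MyCanvas FROM XbpStatic
   EXPORTED:
      METHOD gesture
ENDCLASS

METHOD MyCanvas:gesture( nId, aInfo )
   // Implement the reaction to the gesture here, e.g.
   // pan or zoom the contents of the control
RETURN self
```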

Note that for compatibility reasons, the gestures which correspond to a left-click or a right-click with the mouse are reported as mouse events to the application. No gesture event is generated in this case. This means that the tap gesture can be handled in the same way as a mouse click, for example. In fact, no additional code needs to be written if the application already contains code handling mouse events.

This is demonstrated in the example ..\basics\touch\gestures.prg in the Xbase++ sample collection. Here, a combination of event handlers for mouse and touch gesture events is used for manipulating an image. Although intended primarily for touch input, the aforementioned compatibility features nevertheless allow the image to be dragged around with the mouse. The following event handlers are registered in the sample application:

// :gesture code block for processing touch input. 
// Gestures such as zoom or pan change the position 
// or size of the image 
oDlg:drawingArea:gesture := {|mp1,mp2| HandleGesture(oDlg, oPS, mp1, mp2)} 

// Codeblocks for tracking mouse pointer movement and 
// button presses. These are required for simulating 
// pan operations with the left mouse button 
oDlg:drawingArea:motion := {|mp1,mp2| HandleMotion(oDlg, oPS, mp1, mp2)} 
oDlg:drawingArea:lbDown := {|mp1,mp2| aMousePos:=AClone(mp1), lMouseDown:=.T.} 
oDlg:drawingArea:lbUp   := {|mp1,mp2| lMouseDown:=.F.} 

A touch gesture performed by the user may result in a stream of xbeP_Gesture events being sent to the application, each corresponding to a certain state within the overall gesture. The speed with which each event is processed directly affects how responsive the front-end of the application feels to the user. For this reason, xbeP_Gesture events are sent synchronously to the application. This is done by executing the :handleEvent() method of the respective Xbase Part directly. This means that the xbeP_Gesture event code will never be retrieved by the function AppEvent(), because the system bypasses the application's event queue.

In Xbase++, Xbase Parts are created and maintained by a special system thread called the UI thread. Because of their synchronous nature, the application-defined event handlers for processing xbeP_Gesture are executed in that thread.

Applications must assume all thread-local settings to be at their default values when processing xbeP_Gesture events.
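One consequence of the synchronous delivery is that the handler itself should be kept lightweight. A common pattern, also used by the character mode example later in this chapter, is to translate the gesture into a custom user event which is then processed in the application's own thread. The event ID below is illustrative:

```
#include "appevent.ch"

#define xbeP_MyEvent (xbeP_User+1)

// Keep the synchronously executed :gesture code block short:
// post a custom event carrying the gesture information and
// process it later in the thread that runs the application's
// event loop
oXbp:gesture := {|nId,aInfo| PostAppEvent( xbeP_MyEvent, nId, aInfo, oXbp )}
```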

Gesture event routing

When a touch gesture is performed by the user, the gesture events generated for the gesture are first routed to the UI element the finger is positioned over. If a touch gesture is performed over a group of elements which are stacked upon each other, the corresponding events are first sent to the element which is in the foreground (at the top of the z-order). This behavior is similar to the way mouse events are processed by the system.

Unlike regular computers, which tend to be operated with a mouse and keyboard, on a touch-enabled device the finger often is the sole means of input. Also, touch gestures tend to have far less precision than the strict point-and-click methodology employed with a mouse. For this reason, touch gesture events are handled in a slightly different way. Whereas a mouse click occurs (and hence is handled) only on a single object, a gesture event travels upwards in the parent/child hierarchy. For example, if a pan gesture is performed over a push button, the corresponding xbeP_Gesture events will cause the code block in the push button's :gesture callback slot as well as its :gesture() callback method to be executed. Afterwards, the gesture event is routed to the parent object, and so on. This kind of event delegation makes it easier to implement behavior for compound objects such as a browse, and also simplifies using gestures for triggering application-wide commands.

In some cases this scheme may cause problems, however, because it introduces the risk of triggering event handling code several times for a single event. In this case, the default processing normally performed by the system must be disabled to prevent gesture events from being routed to the parent object.

The system default event processing can be disabled via the XBP_INPUTMODEGF_NOLEGACYSUPPORT constant. See the documentation on the :inputMode member variable for further information.
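Assuming oXbp is the Xbase Part which should handle the gesture exclusively, the setup might be sketched like this:

```
// Process gesture events in this Xbase Part only and prevent
// them from being routed on to the parent object
oXbp:inputMode := XBP_INPUTMODE_TOUCHGESTURES +;
                  XBP_INPUTMODEGF_NOLEGACYSUPPORT
```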

Touch input in character mode applications

Touch input is not supported in console (VIO) applications. However, the XbpCrt class which is used as the console window in Hybrid Mode applications features all of the touch-related interfaces described in the previous chapters.

A Hybrid Mode application is a character mode application which is linked using the /PM:PM linker switch. In this case, a special console window is used which allows using text-mode commands in a graphics mode (GUI) application.

If touch input is to be supported in a character mode application, the application must be linked as a GUI application. Also, an XbpCrt window must be used as the console window, which is the default established by the standard AppSys() procedure provided in the appsys.prg file. Finally, SetMouse(.T.) must be called to enable mouse/non-keyboard events.

Once the character mode application has been set up correctly, touch gestures can be processed using the (XbpCrt) console window's :gesture callback slot. In addition, certain gestures automatically generate standard events. A tap gesture is reported as a regular mouse click, for example, which ensures that text-mode commands such as AChoice() operate as expected, even if the application does not contain handling code for this specific gesture.

// Example showing how to use touch and mouse wheel events 
// for supporting scrolling lines in a MEMOEDIT(). The 
// example must be called with the name of the file to 
// edit. 
#include "Inkey.ch" 
#include "xbp.ch" 
#include "appevent.ch" 

#define xbeP_ScrollUpDown   (xbeP_User+1) 

// Global variables for storing the scrollable screen area 
// of the MEMOEDIT() as well as a carry-over of fractional 
// lines accumulating during small pans 
STATIC aScrollRect  := NIL 
STATIC nLineFraction:= 0 

PROCEDURE Main( cFile ) 
   IF Empty(cFile) 
      ? "Please specify a file for editing." 
      QUIT 
   ENDIF 

   SET TYPEAHEAD TO 100 

   // Switch on mouse- and gesture-support. Disable the auto- 
   // marking feature of the console window (XbpCrt) so 
// that it doesn't interfere when dragging the mouse or 
   // the finger 
   SetMouse( .T. ) 
   SetAppWindow():autoMark := .F. 

   // Set up the touch system for the console window (XbpCrt): 
   // in this example, we only support vertical pans and switch 
   // off other gestures for simplicity 
   SetAppWindow():inputMode := XBP_INPUTMODE_TOUCHGESTURES           + ; 
                               XBP_INPUTMODEGF_NOZOOMGESTURE         + ; 
                               XBP_INPUTMODEGF_NOROTATEGESTURE       + ; 
                               XBP_INPUTMODEGF_NOTWOFINGERTAPGESTURE + ; 
                               XBP_INPUTMODEGF_NOPRESSANDTAPGESTURE  + ; 
                               XBP_INPUTMODEGF_NOONEFINGERPAN_HORZ 

   // Register the event handlers for handling touch gestures and 
   // the mouse wheel 
   SetAppWindow():gesture   := {|nId,aInfo| HandleGesture(nId, aInfo, SetAppWindow())} 
   SetAppWindow():wheel     := {|aPos,aWheel| HandleWheel(aPos, aWheel, SetAppWindow())} 

   // Register a handler for a custom event which performs the 
   // scrolling in the MEMOEDIT(). Since touch events arrive in the 
   // UI thread, the operations performed in the handler are 
   // restricted to this thread. Posting this custom event allows 
   // us to handle scrolling in the same thread that hosts the 
   // MEMOEDIT() 
   SetAppEvent( xbeP_ScrollUpDown, {|nRows| ScrollUpDown(nRows)} ) 

   ? FileEdit( cFile, 2, 2, 22, 76 ) 
RETURN 

* Edit function for text files 
FUNCTION FileEdit( cFilename, nTop, nLeft, nBottom, nRight ) 
   LOCAL cString  := MemoRead( cFilename ) 
   LOCAL cScreen  := SaveScreen( nTop, nLeft, nBottom, nRight ) 

   DispBox( nTop, nLeft, nBottom, nRight, 2 ) 
   @ nTop, nLeft+1 SAY PadC(cFileName, nRight-nLeft-1, Chr(205) ) 

   SET SCOREBOARD OFF 
   aScrollRect := {nTop, nLeft, nBottom, nRight} 
   cString := MemoEdit( cString    , ; 
                        nTop   + 1 , ; 
                        nLeft  + 1 , ; 
                        nBottom- 1 , ; 
                        nRight - 1 , ; 
                        .T. ) 
   aScrollRect := NIL 
   SET SCOREBOARD ON 
   RestScreen( nTop, nLeft, nBottom, nRight, cScreen ) 
RETURN MemoWrit( cFilename, cString ) 


* Event handler for gesture events 
PROCEDURE HandleGesture( nId, aInfo, oCrt ) 
 LOCAL nCharY 
 LOCAL nRows 

   UNUSED( nId ) // Only pan gestures supported here! 

   IF Empty(aScrollRect) 
      // No area to scroll, ignore gesture 
      RETURN 
   ENDIF 

   IF aInfo[XBP_GESTUREINFO_STATE] == XBP_GESTURESTATE_BEGIN 
      // First event for a gesture; initialize counters etc. 
      nLineFraction := 0 
   ENDIF 

   // Compute the number of lines to scroll from the 
   // distance the finger was moved while panning. 
   // If the distance already warrants scrolling a 
   // line, post a scroll event. Catch fractional 
   // lines and have them accumulate during small 
   // pans. 
   nCharY := oCrt:fontHeight 
   nRows  := aInfo[XBP_GESTUREINFO_POS][2] - aInfo[XBP_GESTUREINFO_LASTPOS][2] 
   nRows  := (nRows + (nLineFraction*nCharY)) / nCharY 

   IF Int(nRows) != 0 
      PostAppEvent( xbeP_ScrollUpDown, Int(nRows),, oCrt ) 
   ENDIF 

   nLineFraction := nRows - Int( nRows ) 
RETURN 


* Event handler for mouse wheel events 
PROCEDURE HandleWheel( aPos, aWheel, oCrt ) 
 LOCAL nLines 

   // Use line-wise scrolling: perform scrolling only for 
   // full lines, ignore wheel events for partial lines. 
   // Use the same custom scroll event we use for handling 
   // pans. 
   nLines := aWheel[3] 

   IF nLines != 0 
      PostAppEvent( xbeP_ScrollUpDown, nLines *-1,, oCrt ) 
   ENDIF 
RETURN 


* Event handler for custom scroll event. This handler 
* uses KEYBOARD to insert K_UP/K_DOWN events into the 
* event queue. Consequently, this procedure must be 
* executed in the same thread that hosts the MEMOEDIT(). 
PROCEDURE ScrollUpDown( nRows ) 
 LOCAL nRow := Row() 
 LOCAL nCol := Col() 
 LOCAL nSkip 
 LOCAL cStr 

   // Skip up/down sufficient times to have the MEMOEDIT() 
   // scroll the display as requested. Skip back to the 
   // original cursor position to keep the cursor in 
   // place as best as we can. 
   IF nRows > 0 
     nSkip := aScrollRect[3] - nRow 
     cStr  := Replicate(Chr(K_DOWN), nSkip + nRows -1) + Replicate(Chr(K_UP), nSkip-1) 
   ELSE 
     nSkip := nRow - aScrollRect[1] 
     cStr  := Replicate(Chr(K_UP), nSkip + (Abs(nRows) -1)) + Replicate(Chr(K_DOWN), nSkip-1) 
   ENDIF 

   KEYBOARD (cStr) 
RETURN 

