
Touch Device Handling v2.2

Customer Documentation: Neonode® Touch Sensor Module User's Guide

Introduction

When connecting the Touch Sensor Module over USB HID, it is recognized by the operating system as a touchscreen digitizer. In most operating systems, a "touch" is recognized on the UP event of a tracked touch. In practice, this means that the end user has to both "press" and "release" their finger (or other touch object) before a "touch" is performed. This is similar to a tapping motion, where the touch is recognized when the touch object exits the Touch Active Area (TAA).

When using the Touch Sensor Module as an in-air solution for contactless touch, the application usually gives a more tactile response if it reacts on the DOWN event of a touch instead of the UP event. This means that a touch is performed as soon as the end user positions a touch object inside the TAA. This article shows how this can be achieved by binding buttons to the DOWN event in a WPF application example.


Touch Screen Digitizer and Mouse

For the application to handle input from the Touch Sensor Module, it needs to be able to read input from both a touchscreen digitizer and a mouse. The sensor module first sends touch input to the host system, which then translates it to mouse input. It is therefore important that the application can handle both input devices, as well as separate the input devices from one another. Please refer to the section WPF Touch Device Handling Application Example for an MVVM application example using WPF.

How to bind a touch event to trigger a button command may differ depending on your framework. The principle is usually the same as in this WPF example, but the syntax differs.


A "touch" has 4 significant events called, TouchEnter, TouchDown, TouchLeave, and TouchUp that is used to describe a TOUCH event. The wanted touch events (i.e. TouchDown) could then be handled in the application to trigger a button, for example.

To get a better understanding of the different touch events, please consider the following example. 


Example

Imagine four buttons positioned in a 2x2 pattern, called UpperLeft, UpperRight, LowerLeft, and LowerRight. Each button can be configured to be fired on one or multiple touch events.

Step-by-step walkthrough of the touch events in the application

  1. Start: No touches exist anywhere.
  2. An object (finger) is detected within UpperLeft. (i.e. you put your finger inside the UpperLeft button).
    1. The TouchDown event is fired inside UpperLeft (i.e. the handler is called, if it is registered to handle the TouchDown case).
    2. The TouchEnter event is fired inside UpperLeft.
  3. The object is moved from UpperLeft to UpperRight.
    1. The TouchLeave event is fired inside UpperLeft. Note that TouchUp is NOT fired, as we are still tracking the object!
    2. The TouchEnter event is fired inside UpperRight. Note that TouchDown is NOT fired, as this is NOT a new object!
  4. The object is moved from UpperRight to LowerRight.
    1. The TouchLeave event is fired inside UpperRight.
    2. The TouchEnter event is fired inside LowerRight.
  5. The object is removed from the LowerRight area (i.e. you remove your finger).
    1. The TouchLeave event is fired in the LowerRight area.
    2. The TouchUp event is fired in the LowerRight area.
  6. No objects are tracked anymore. A minimal code sketch that logs this sequence is shown below.
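
The following sketch logs the event sequence above, assuming the four buttons are defined in XAML with the names UpperLeft, UpperRight, LowerLeft, and LowerRight (the logging itself is just for illustration):

using System.Diagnostics;
using System.Windows;
using System.Windows.Controls;

public partial class MainWindow : Window
{
    public MainWindow()
    {
        InitializeComponent();
        HookTouchLogging(UpperLeft, UpperRight, LowerLeft, LowerRight);
    }

    // Subscribe to all four touch events on each button and log which
    // event fired inside which button area.
    private void HookTouchLogging(params Button[] buttons)
    {
        foreach (var button in buttons)
        {
            button.TouchDown  += (s, e) => Log((Button)s, "TouchDown");
            button.TouchEnter += (s, e) => Log((Button)s, "TouchEnter");
            button.TouchLeave += (s, e) => Log((Button)s, "TouchLeave");
            button.TouchUp    += (s, e) => Log((Button)s, "TouchUp");
        }
    }

    private void Log(Button button, string eventName) =>
        Debug.WriteLine($"{eventName} fired inside {button.Name}");
}

Moving a finger from UpperLeft to UpperRight then prints TouchDown and TouchEnter inside UpperLeft, followed by TouchLeave inside UpperLeft and TouchEnter inside UpperRight, matching steps 2 and 3 above.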

WPF Touch Device Handling Application Example

System Requirements

  • Windows 8 or higher
  • Visual Studio 2019
  • .NET Core

Download the example WPF application here.

Introduction

The WPF Touch Device Handling application both reads and separates touch events from mouse events. The application consists of six buttons that trigger on different touch or mouse events. A text in the lower left corner indicates when a button has been fired, and by what event. The application can be used to test the effects of the different touch events. The application code, created in the WPF framework, is available for reference.

To use a Command to handle touches, the NuGet package "Microsoft.Xaml.Behaviors.Wpf" must be added; this has already been done in this project.
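
For reference, the triggers from this package can also be attached from code-behind. The following is a minimal sketch; touchEnterButton and viewModel are assumed names, and the example application does the equivalent in XAML (see the Binding section below):

using Microsoft.Xaml.Behaviors;
using System.Windows.Controls;

public static class TouchBindingHelper
{
    // Attach an EventTrigger that invokes a view model command whenever
    // the TouchEnter event fires on the given button.
    public static void BindTouchEnter(Button touchEnterButton, MainViewModel viewModel)
    {
        // Fully qualified to avoid a clash with System.Windows.EventTrigger.
        var trigger = new Microsoft.Xaml.Behaviors.EventTrigger { EventName = "TouchEnter" };
        trigger.Actions.Add(new InvokeCommandAction { Command = viewModel.TouchEnterButtonCommand });
        Interaction.GetTriggers(touchEnterButton).Add(trigger);
    }
}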

Event Button Description

In the application, each of the six buttons represents an area that reacts to different touch and mouse events; for example, the "Touch Enter" button is triggered by the TouchEnter event.

Run the Application

  1. Download the example application.

  2. Open the solution in Visual Studio 2019.

    1. After opening the solution for the first time, build it once, or it will show the error: "The name "MainViewModel" does not exist in the namespace "clr-namespace:WpfTouchDemo.ViewModels;assembly=WpfTouchDemo.ViewModels"".
      After the first build, the name is resolved.

  3. Connect the Touch Sensor Module to the computer.
  4. Run the program (F5).
    1. When you run the program, the application starts and the six buttons appear.
    2. Maximize the application window.
  5. Test pressing the buttons using both a mouse device and the Touch Sensor Module.
    1. Read the latest triggered event response in the lower left corner.


Binding

The input events are bound in the following two files:

  • MainWindow.xaml - Make sure to bind the button commands correctly here.
  • MainViewModel.cs - All code that manages button presses is here.

The command for a trigger event, for example TouchEnterButtonCommand, is created in the view model as follows.

Section from MainViewModel.cs

// Bind TouchEnterButtonCommand
this.TouchEnterButtonCommand = new Command(
    (parameter) => { this.SelectedButtonLabel = "Touch Enter."; },
    () => true);


TouchEnterButtonCommand is then bound to the "Touch Enter" button in MainWindow.xaml, for example with an EventTrigger from the Microsoft.Xaml.Behaviors package (the exact markup is included in the example application):

Section from MainWindow.xaml

<!-- Touch Enter handling using Command. -->
<i:Interaction.Triggers>
    <i:EventTrigger EventName="TouchEnter">
        <i:InvokeCommandAction Command="{Binding TouchEnterButtonCommand}" PassEventArgsToCommand="True" />
    </i:EventTrigger>
</i:Interaction.Triggers>
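
The Command type used above is the example application's own ICommand implementation. A minimal sketch of such a class (a standard relay-command pattern; the actual implementation ships with the example application and may differ) could look like this:

using System;
using System.Windows.Input;

// Minimal relay-style ICommand: runs a delegate when executed.
public class Command : ICommand
{
    private readonly Action<object> execute;
    private readonly Func<bool> canExecute;

    public Command(Action<object> execute, Func<bool> canExecute)
    {
        this.execute = execute;
        this.canExecute = canExecute;
    }

    // Let WPF re-query CanExecute when input state changes.
    public event EventHandler CanExecuteChanged
    {
        add => CommandManager.RequerySuggested += value;
        remove => CommandManager.RequerySuggested -= value;
    }

    public bool CanExecute(object parameter) => this.canExecute();

    public void Execute(object parameter) => this.execute(parameter);
}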


Touch Evaluation

As described above, the Touch Sensor Module sends both touch events and mouse events (translated from the touch events). The application should, however, only process one of them. If this is not taken into consideration, the application will most likely react to both inputs.

Mouse events can come either from a mouse device or be promoted from a touch device. If the application reacts upon the "fake" (promoted) mouse events, there might be a small delay, which is why only the first input event should be read. The first event in this case is always a touch event.

The application distinguishes touch events from the translated touch events (mouse events) by checking whether the events were originally sent from a touchscreen digitizer.

The following code from the WPF example application determines whether the application should read mouse or touch data.

Section from MainViewModel.cs
switch (parameter)
{
    case TouchEventArgs eventArgs:
        // This is a Touch Event.
        this.SelectedButtonLabel = "Touch Down in Touch Down Area.";
        break;
    case MouseEventArgs eventArgs:
        // This is a Mouse Event. What we don't know is if it is promoted
        // from a Touch Event or from a "real mouse".
        if (eventArgs.StylusDevice == null)
        {
            // This is from a mouse, i.e. not promoted from a Touch Event.
            // We ignore it if it was promoted, or we get two clicks.
            this.SelectedButtonLabel = "Left Mouse Button Down in Touch Down Area.";
        }
        break;
}
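
The StylusDevice check works because a WPF mouse event that has been promoted from touch input carries a reference to the originating device in MouseEventArgs.StylusDevice, whereas events from a physical mouse leave it null. This makes it a convenient way to ignore the promoted duplicates without keeping any extra state.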

I2C communication

Similarly





