Making WPF button’s Pressed state work with touch input

The short story:

In order for a WPF Button control to trigger the Pressed VisualState when operated by finger on a touch screen (just as it does for a mouse click), it seems to be necessary to completely deactivate the press-and-hold right mouse button emulation by setting the Stylus.IsPressAndHoldEnabled attached property to false.

The long story:

After all those articles covering Windows Phone and WinRT apps, it’s time for something different. We’re currently working on a WPF project that is intended to run in fullscreen mode and be operated by finger touch on a touch screen. Since the application does not feature any complex touch gestures but is controlled solely through buttons that can be pressed, we decided to refrain from referencing a dedicated touch handling framework and instead let each button press be handled as if it were a simple mouse click.

To signal to the user that a button is pressed when touched with a finger, we implemented visual states on all buttons in the application: in the Pressed VisualState, a button changes its color to look like a physical button that is currently held down:

<Style x:Key="MyButtonStyle" TargetType="{x:Type Button}">
	<Setter Property="Template">
		<Setter.Value>
			<ControlTemplate TargetType="Button">
				<Border x:Name="ButtonBorder" Background="LightGray" BorderBrush="DarkBlue" BorderThickness="3" CornerRadius="5">
					<VisualStateManager.VisualStateGroups>
						<VisualStateGroup x:Name="CommonStates">
							<VisualState x:Name="Normal"/>
							<VisualState x:Name="MouseOver"/>
							<VisualState x:Name="Disabled"/>
							<VisualState x:Name="Pressed">
								<Storyboard>
									<ObjectAnimationUsingKeyFrames Storyboard.TargetName="ButtonBorder" Storyboard.TargetProperty="Background">
										<DiscreteObjectKeyFrame KeyTime="0">
											<DiscreteObjectKeyFrame.Value>
												<SolidColorBrush Color="DarkGray" />
											</DiscreteObjectKeyFrame.Value>
										</DiscreteObjectKeyFrame>
									</ObjectAnimationUsingKeyFrames>
								</Storyboard>
							</VisualState>
						</VisualStateGroup>
					</VisualStateManager.VisualStateGroups>
					<ContentPresenter Margin="20" />
				</Border>
			</ControlTemplate>
		</Setter.Value>
	</Setter>
</Style>
The application was developed and initially tested using a common monitor and a mouse. When we connected the machine to a touch screen monitor to thoroughly test the first few screens, it turned out that the color change on pressed buttons – which had previously worked with mouse clicks – was gone: the buttons could still be operated by finger touch (and the command connected with them was executed), but the animation defined for the Pressed state was ignored.

The solution:

There are two things to distinguish here:

  • The button’s Click event is handled irrespective of the input device: both mouse clicks and finger touches are forwarded to the same event handler and treated in the same way. This explains why the business logic connected with the button’s command is executed successfully, independently of the visual state change.
  • The VisualState changes, in contrast, are triggered by dedicated mouse events: the Pressed state, for example, is entered when the button’s MouseLeftButtonDown event occurs. A finger touch, however, is subject to press-and-hold right-click emulation (when you tap the screen and do not release the finger instantly, Windows interprets this as a right-click operation) – so the MouseLeftButtonDown event never fires, and the Pressed state is never reached!

Fortunately, this can be fixed easily by completely disabling the right-click emulation, either for an individual button or for the whole window or page, by setting the Stylus.IsPressAndHoldEnabled attached property to false:

<Style x:Key="MyButtonStyle" TargetType="{x:Type Button}">
	<Setter Property="Stylus.IsPressAndHoldEnabled" Value="False"/>
	<!-- Template setter as defined above -->
</Style>
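
Alternatively, the attached property can be set once on the containing window, so that all controls inside inherit the setting. A minimal sketch (the MyApp.MainWindow class name and the fullscreen window attributes are just placeholders for illustration, not part of our actual project):

<Window x:Class="MyApp.MainWindow"
		xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
		xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
		Stylus.IsPressAndHoldEnabled="False"
		WindowStyle="None" WindowState="Maximized">
	<!-- Press-and-hold emulation is disabled for the whole window,
	     so every button below it reaches the Pressed state on touch. -->
	<Button Style="{StaticResource MyButtonStyle}" Content="Press me" />
</Window>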