gestures. This Menu function should be presented unless hidden together with other navigation functions.
Android device implementations with support for the Assist action [Resources, 30] MUST make this accessible with a single action (e.g. tap, double-click, or gesture) when other navigation keys are visible, and are STRONGLY RECOMMENDED to use a long-press on the Home button or software key as the single action.
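From the application side, the Assist action surfaces as the `Intent.ACTION_ASSIST` intent, which is what the device's single-action gesture dispatches. A minimal sketch (the class name here is illustrative) of checking whether the current device exposes an assist handler and invoking it:

```java
import android.app.Activity;
import android.content.Intent;
import android.content.pm.PackageManager;

// Sketch: fire the Assist action if some component on the device handles it.
public class AssistLauncher {
    public static boolean launchAssist(Activity activity) {
        Intent assist = new Intent(Intent.ACTION_ASSIST);
        PackageManager pm = activity.getPackageManager();
        if (assist.resolveActivity(pm) != null) {
            activity.startActivity(assist);
            return true; // a handler was found and started
        }
        return false; // no assist handler on this device
    }
}
```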
Device implementations MAY use a distinct portion of the screen to display the navigation keys, but if so, MUST meet these requirements:

- Device implementation navigation keys MUST use a distinct portion of the screen, not available to applications, and MUST NOT obscure or otherwise interfere with the portion of the screen available to applications.
- Device implementations MUST make available a portion of the display to applications that meets the requirements defined in section 7.1.1.
- Device implementations MUST display the navigation keys when applications do not specify a system UI mode, or specify SYSTEM_UI_FLAG_VISIBLE.
- Device implementations MUST present the navigation keys in an unobtrusive "low profile" (e.g. dimmed) mode when applications specify SYSTEM_UI_FLAG_LOW_PROFILE.
- Device implementations MUST hide the navigation keys when applications specify SYSTEM_UI_FLAG_HIDE_NAVIGATION.
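For applications, these modes correspond to the system UI visibility flags on `View`. A minimal sketch using the real flag constants (the class and method names wrapping them are illustrative):

```java
import android.view.View;

// Sketch: requesting each navigation-key mode described above.
// On devices with on-screen navigation keys, the platform reacts
// to these flags as the requirements specify.
public class NavBarModes {
    public static void showKeys(View decorView) {
        // Default mode: navigation keys fully visible.
        decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_VISIBLE);
    }

    public static void dimKeys(View decorView) {
        // "Low profile": keys remain, but dimmed and unobtrusive.
        decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_LOW_PROFILE);
    }

    public static void hideKeys(View decorView) {
        // Keys hidden until the user next interacts with the screen.
        decorView.setSystemUiVisibility(View.SYSTEM_UI_FLAG_HIDE_NAVIGATION);
    }
}
```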
7.2.4. Touchscreen Input
Android Handhelds and Watch Devices MUST support touchscreen input.
Device implementations SHOULD have a pointer input system of some kind (either mouse-like or
touch). However, if a device implementation does not support a pointer input system, it MUST NOT
report the android.hardware.touchscreen or android.hardware.faketouch feature constant. Device
implementations that do include a pointer input system:
- SHOULD support fully independently tracked pointers, if the device input system supports multiple pointers.
- MUST report the value of android.content.res.Configuration.touchscreen [Resources, 85] corresponding to the type of the specific touchscreen on the device.
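An application can read this reported value at runtime. A minimal sketch (the constants are real `Configuration` fields; the wrapper class is illustrative):

```java
import android.content.Context;
import android.content.res.Configuration;

// Sketch: inspect the touchscreen type the device implementation reports.
public class TouchscreenInfo {
    public static String describe(Context context) {
        int type = context.getResources().getConfiguration().touchscreen;
        switch (type) {
            case Configuration.TOUCHSCREEN_FINGER:
                return "finger-operated touchscreen";
            case Configuration.TOUCHSCREEN_NOTOUCH:
                return "no touchscreen";
            default:
                return "other/undefined touchscreen type";
        }
    }
}
```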
Android includes support for a variety of touchscreens, touch pads, and fake touch input devices.
Touchscreen-based device implementations are associated with a display [Resources, 86] such that
the user has the impression of directly manipulating items on screen. Since the user is directly
touching the screen, the system does not require any additional affordances to indicate the objects
being manipulated. In contrast, a fake touch interface provides a user input system that approximates
a subset of touchscreen capabilities. For example, a mouse or remote control that drives an on-screen
cursor approximates touch, but requires the user to first point or focus then click. Numerous input
devices like the mouse, trackpad, gyro-based air mouse, gyro-pointer, joystick, and multi-touch
trackpad can support fake touch interactions. Android includes the feature constant
android.hardware.faketouch, which corresponds to a high-fidelity non-touch (pointer-based) input
device such as a mouse or trackpad that can adequately emulate touch-based input (including basic
gesture support), and indicates that the device supports an emulated subset of touchscreen
functionality. Device implementations that declare the fake touch feature MUST meet the fake touch requirements in section 7.2.5.
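These feature declarations are visible to applications through `PackageManager`. A minimal sketch using the real `FEATURE_TOUCHSCREEN` and `FEATURE_FAKETOUCH` constants (the wrapper class is illustrative):

```java
import android.content.Context;
import android.content.pm.PackageManager;

// Sketch: query which touch-related features a device declares.
// Per the requirements, any device reporting FEATURE_TOUCHSCREEN
// must also report FEATURE_FAKETOUCH.
public class TouchFeatures {
    public static boolean hasRealTouch(Context context) {
        return context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_TOUCHSCREEN);
    }

    public static boolean hasFakeTouch(Context context) {
        return context.getPackageManager()
                .hasSystemFeature(PackageManager.FEATURE_FAKETOUCH);
    }
}
```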
Device implementations MUST report the correct feature corresponding to the type of input used.
Device implementations that include a touchscreen (single-touch or better) MUST report the platform
feature constant android.hardware.touchscreen. Device implementations that report the platform
feature constant android.hardware.touchscreen MUST also report the platform feature constant
android.hardware.faketouch. Device implementations that do not include a touchscreen (and rely on a