This part of the User Experience Design Ontology defines characteristics of the computing platform that can influence the quality of the interactive experience. The computing platform includes the hardware device as well as the software framework in which applications can be built.
Author(s): Coomans, Marc
Publisher: Agfa Healthcare
The URI of this ontology is: http://www.agfa.com/w3c/2009/uxd-1-0-platform/
In the remainder of this document, we assume that you import this vocabulary into your own document by associating the prefix "ux-p" with the above URI: xmlns:ux-p="http://www.agfa.com/w3c/2009/uxd-1-0-platform#". The concepts of this vocabulary can then be referenced with CURIEs (compact URIs) of the form ux-p:{resource name}.
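As an illustration, a host document could declare the prefix and then reference a concept with a CURIE. This is a hypothetical RDFa fragment: the surrounding markup and the subject identifier "#myDevice" are assumptions for the example, not part of this vocabulary.

```xml
<!-- Hypothetical RDFa host fragment: declares the ux-p prefix,
     then types a resource with a CURIE from this vocabulary. -->
<div xmlns:ux-p="http://www.agfa.com/w3c/2009/uxd-1-0-platform#">
  <span about="#myDevice" typeof="ux-p:TouchPhone">
    My touch-screen phone
  </span>
</div>
```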
Base classes:
- ux-p:PlatformCharacteristic

Device Category (ux-p:DeviceCategory)
- ux-p:Phone
- ux-p:TouchPhone: mobile phone with a touchscreen
- ux-p:Tablet
- ux-p:Laptop
- ux-p:Kiosk
- ux-p:Desktop: desktop computer
- ux-p:TouchDesktop: desktop computer with a touch monitor
- ux-p:Table
- ux-p:Wall
- ux-p:AllDeviceCategories: use this category value to indicate that something is not limited to any specific device category (including any device kinds that are not explicitly listed above).
Input modality (ux-p:InputModality): a sensor through which the device can receive input from the user.
- ux-p:Keyboard: a typewriter-style keyboard that mostly contains alphabetical keys
- ux-p:NumericKeypad: a keypad that mostly contains number keys (unlike a keyboard)
- ux-p:MousePointer: a graphical pointer on the screen that can be controlled by moving a pointing device such as a mouse, trackball, touchpad, or stylus, or even by finger touch.
- ux-p:DeviceMotionSensor: sensors that detect the motion in 3D space of the device itself, or of an object connected to the device (e.g. using accelerometers, gyroscopes, or orientation sensors).
- ux-p:BodyMotionSensor: sensors that detect the motion of the user's body.
- ux-p:VoiceInput: voice/speech recognition capabilities.
- ux-p:TouchScreen: a touch-sensing screen that accepts input by touch of fingers or a stylus.
  - ux-p:SingleTouch: a touch-sensing screen that recognizes a single point of contact at a time.
  - ux-p:MultiTouch: a touch-sensing screen that recognizes the presence of two or more points of contact.
  - ux-p:MultiPersonTouch: a touch-sensing screen that recognizes the presence of points of contact from multiple persons at the same time.
  - ux-p:TouchGestureEnabled: a touch-sensing screen that supports multiple touch gestures. A touch gesture is a motion used to interact with a touch screen interface; it may involve a single point of contact (e.g. tap, press, drag, flick) or multiple points of contact (e.g. pinch, squeeze).
  - ux-p:TabOnly: a touch-sensing screen that only supports the simple "tap" and, optionally, the "double tap" gesture.
- ux-p:NotMousePointerEnabled
- ux-p:NotVoiceEnabled
- ux-p:NotTouchEnabled
- ux-p:NotMotionEnabled
Output modality (ux-p:OutputModality): a sense or platform component through which the user can receive output from the device.
- ux-p:VisualOutput
- ux-p:Screen
- ux-p:SoundOutput
- ux-p:Sounds
- ux-p:SpeechOutput
- ux-p:TactileOutput
- ux-p:Vibration
Physical Screen Size Category (ux-p:PhysScreenSizeCat)
Possible values:
- ux-p:PhysScreenSizeVerySmall
- ux-p:PhysScreenSizeSmall
- ux-p:PhysScreenSizeMedium
- ux-p:PhysScreenSizeLarge
- ux-p:PhysScreenSizeVeryLarge
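A platform can carry several of the above characteristics at once. As a closing sketch (hypothetical: the subject URI under example.org is invented for illustration; only the ux-p class names come from this vocabulary), a multi-touch kiosk with a large screen could be described in plain RDF/XML by typing one resource with several characteristic classes:

```xml
<!-- Hypothetical RDF/XML description: one platform resource typed
     with a device category, an input modality, and a screen size. -->
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:ux-p="http://www.agfa.com/w3c/2009/uxd-1-0-platform#">
  <rdf:Description rdf:about="http://example.org/platforms/kiosk-01">
    <rdf:type rdf:resource="http://www.agfa.com/w3c/2009/uxd-1-0-platform#Kiosk"/>
    <rdf:type rdf:resource="http://www.agfa.com/w3c/2009/uxd-1-0-platform#MultiTouch"/>
    <rdf:type rdf:resource="http://www.agfa.com/w3c/2009/uxd-1-0-platform#PhysScreenSizeLarge"/>
  </rdf:Description>
</rdf:RDF>
```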