Overview

Following mainframe and personal computing, we not only carry multiple intelligent systems with us, but also encounter them in our environment. Human-computer interaction has to keep up with this paradigm shift towards Ubiquitous Computing. This dissertation presents new methods to interact with the myriad of spatially distributed components in a way that is not only efficient, but also provides a good user experience. The challenge lies in the inherent complexity caused by the multitude of different systems, their virtual connections and spatial relationships. A simple scenario from the smart home domain is the creation of a light scene by adjusting multiple colored lamps in a room. It is important to re-establish user control by partitioning or presenting this complexity in a manageable way. According to the current state of the art, almost every aspect of human action is influenced by the spatial relationships in our physical reality. In this work, we investigated how spatial interfaces must be constructed to improve interaction with intelligent environments. The work follows an exploratory approach: we created several interactive prototypes, which were iteratively refined and evaluated in formative and summative user studies with many participants. The prototypes are inspired by related areas that are already concerned with interaction in three-dimensional space. Possible candidates are found in the mixed reality continuum, which includes Augmented Reality (AR) and Virtual Reality (VR), but the design also considered concepts derived from interaction with large display walls and prior work in the area of intelligent environments. In addition to the empirical results, we documented the design process and provide helpful tools and best practices for the development of these interactions. The contributions of this dissertation fall into the following three aspects:

Spatial Action: The user can select a device for further control by pointing at it from a distance. We designed and developed an interactive prototype based on the Kinect depth sensor to demonstrate that consumer-grade hardware is sufficient to integrate spatial concepts into the interface. The necessary model of the environment is generated intuitively with the same gestures. We could show that spatial interaction has a higher initial effort than display-based interfaces, but that it scales much better with an increasing number of devices in the environment.
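To make the pointing mechanism concrete, the following is a minimal sketch of ray-cast device selection: a pointing ray is derived from two tracked skeleton joints (for example elbow and hand, as reported by a depth sensor such as the Kinect), and the device whose stored position lies closest to that ray is selected. The device positions, joint inputs and angular threshold are illustrative assumptions, not the concrete implementation described in the dissertation.

```python
import numpy as np

# Hypothetical device model: in the dissertation the positions come from the
# gesture-based environment model; these coordinates are made up for the sketch.
DEVICES = {
    "ceiling_lamp": np.array([1.2, 2.1, 3.0]),
    "floor_lamp":   np.array([-0.8, 0.4, 2.5]),
    "thermostat":   np.array([2.0, 1.3, 4.2]),
}

def select_device(elbow, hand, devices=DEVICES, max_angle_deg=10.0):
    """Return the device closest to the pointing ray, or None.

    elbow, hand: 3D joint positions (e.g. from a Kinect skeleton).
    The ray starts at the hand and points along the elbow-to-hand direction.
    """
    direction = hand - elbow
    direction = direction / np.linalg.norm(direction)

    best_name, best_angle = None, max_angle_deg
    for name, position in devices.items():
        to_device = position - hand
        dist = np.linalg.norm(to_device)
        if dist == 0:
            continue
        # Angle between the pointing ray and the direction to the device.
        cos_angle = np.clip(np.dot(direction, to_device / dist), -1.0, 1.0)
        angle = np.degrees(np.arccos(cos_angle))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

# Example: a hand pose pointing roughly towards the ceiling lamp.
print(select_device(np.array([0.0, 1.0, 0.0]), np.array([0.3, 1.3, 0.7])))
```

In such a scheme the angular threshold trades off selection robustness against ambiguity when devices are close together, which is one reason the scalability comparison with display-based interfaces above is non-trivial.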
Spatial Perception: Spatial relationships can only be an effective part of the interaction if they are perceived correctly by the user. The first system is concerned with hand-held augmented reality, where depth perception suffers because reality is reduced to a two-dimensional display. With the help of a stereoscopic camera and a matching display, we were able to compensate for this in challenging scenes. In contrast, the second system augments the visual with a tactile perception of the environment. Head-worn modules, consisting of ultrasonic sensors and servo motors, measure the distance to objects in the room and communicate this information via pressure applied to the skin.

Spatial Interaction using Augmented Reality: In addition to simple actions in space, we studied two hand-held augmented reality applications for intelligent environments. One use case was the interactive analysis of sensor data on-site, provided by an increasing number of sources in digitized industrial production. By including the data in the live picture of the machine, we provided the users with important context about the sensor. The second application is concerned with visual programming of automation rules in the smart home. Causal relations could be defined by drawing connections on the augmented view on the tablet.

Full Product Details

Author: Matthias Berning
Publisher: Createspace Independent Publishing Platform
Imprint: Createspace Independent Publishing Platform
Dimensions: Width: 15.20cm, Height: 1.30cm, Length: 22.90cm
Weight: 0.381kg
ISBN: 9781539959779
ISBN 10: 1539959775
Pages: 202
Publication Date: 14 April 2017
Audience: General/trade, General
Format: Paperback
Publisher's Status: Active
Availability: Available To Order