Direct Mobile Interaction with Contents on Large Displays - Master Thesis 2012

, Master of Creative Media Technology, HIT Lab AU, School of Computing & Information Systems

Supervisor: (HIT Lab AU, School of Computing and Information Systems)


Abstract

As wall-size displays become increasingly common, they require a direct and intuitive interface for manipulating their contents. Such an interface should support user mobility, allow multi-user collaboration and offer tactile feedback. We propose Direct Mobile Interaction (DMI), a spatially-aware approach to interacting with contents on large displays by manipulating the objects shown on the mobile device's screen, which are spatially related to the real world through mobile AR. In this thesis, we first examined the basic technical characteristics of large displays, mobile interaction, direct manipulation and spatially-aware interfaces. In the interaction design, we compared input styles and input characteristics to understand the interactions that DMI can facilitate, and compared different input devices to determine which is most suitable for DMI. We then described the methodology and development process, implementing applications to demonstrate the proposed interface. Lastly, we demonstrated that the proposed DMI offers distinct interaction and collaboration features, effectively extending the interaction space of mobile devices into the large display space and enabling a natural user interface for contents on large displays.
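The abstract does not include implementation details, so the following is only an illustrative Java sketch, under assumed names and geometry, of one way a spatially-aware mapping could work: a ray from the tracked mobile device through the touched point is intersected with the plane of the large display to find the content being manipulated. Nothing here is taken from the thesis itself.

```java
// Hypothetical sketch: map a touch on a tracked mobile device onto a wall display.
// Assumes the device pose (ray origin + direction through the touched pixel) is
// already known from mobile AR tracking; all names here are illustrative.
public final class DmiSketch {

    /** Minimal immutable 3D vector. */
    record Vec3(double x, double y, double z) {
        Vec3 sub(Vec3 o)     { return new Vec3(x - o.x, y - o.y, z - o.z); }
        Vec3 add(Vec3 o)     { return new Vec3(x + o.x, y + o.y, z + o.z); }
        Vec3 scale(double s) { return new Vec3(x * s, y * s, z * s); }
        double dot(Vec3 o)   { return x * o.x + y * o.y + z * o.z; }
    }

    /**
     * Intersects a ray (from the device, through the touched pixel) with the
     * display plane defined by a point on the wall and its normal.
     * Returns null if the ray is parallel to, or points away from, the wall.
     */
    static Vec3 rayToDisplay(Vec3 rayOrigin, Vec3 rayDir, Vec3 planePoint, Vec3 planeNormal) {
        double denom = rayDir.dot(planeNormal);
        if (Math.abs(denom) < 1e-9) return null;               // parallel to the wall
        double t = planePoint.sub(rayOrigin).dot(planeNormal) / denom;
        return t < 0 ? null : rayOrigin.add(rayDir.scale(t));  // point on the wall
    }

    public static void main(String[] args) {
        // Device one metre in front of a wall at z = 0, looking straight at it.
        Vec3 hit = rayToDisplay(new Vec3(0.3, 1.2, 1.0), new Vec3(0, 0, -1),
                                new Vec3(0, 0, 0), new Vec3(0, 0, 1));
        System.out.println("Touch lands on the wall at: " + hit);
    }
}
```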



Encouraging Understanding of Natural Resources using Emergent Technology - PhD Dissertation 2011-present

, PhD candidate, HIT Lab AU, School of Computing & Information Systems


Supervisors: Prof. Paddy Nixon, (HIT Lab AU, School of Computing and Information Systems)


Research Statement


Natural resources sustain all life on the planet. They are the fundamental elements of nature: water, air, sunlight, plants, animals, and minerals. We use and consume resources to live, work, and play. They are as essential to our well-being as we are to theirs, and in this sense humans are the custodians of the environment. As custodians we need to ensure that our actions do not negatively affect the environment. We do not know what we do not know; therefore negative environmental impacts can be reduced through appropriately targeted, focused education. This means education focused upon the most significant environmental dynamics, as identified by natural resource experts. This education should be targeted at the general population of adults, simply because adults make decisions that have a greater impact upon the environment than children do. To be most effective, the education needs to be usable, engaging and suited to a range of learning styles.


Tangible user interfaces (TUIs) are an ideal building block for a delivery mechanism for learning about the interconnectedness of natural resources because they meet two essential criteria: they offer the most diverse educational potential, and they can present natural resource information in all of its varied forms (reports, database tables, infographics, and spatial information).


Greater knowledge and understanding of natural resources provides benefits by reducing the detrimental impacts caused by a lack of understanding. This research aims to raise adults' understanding of natural resources through a user interface specifically designed to incorporate kinesthetic learning styles as well as the more traditional learning styles supported by current systems. Catering for the additional kinesthetic learning style should increase the uptake of information and broaden the system's appeal, thereby improving understanding among a greater proportion of adults than current systems do.



‘Hands-On’ Engagement – A Mobile Approach to Collaborative Discussion of Digital Museum Artefacts - PhD Dissertation 2011-present

, PhD candidate, HIT Lab AU, School of Computing & Information Systems


Supervisors: Prof. Paddy Nixon, (School of Computing and Information Systems), (HIT Lab AU, School of Computing and Information Systems)


Research Statement

In an ideal world, physical museum artefacts could be touched, handled, examined and passed between interested viewers by hand. Unfortunately, this is not always possible – artefacts may be too fragile to handle or pass around, might be behind display cases which make certain viewing angles difficult or impossible, or groups of people with mutual interests in objects may not even be in the same location. This can be problematic when attempting to explain or make sense of the physical properties of artefacts.


To address these problems, we propose that direct manipulation of 3D content based on real-world interaction metaphors can help collaborators (both co- and remotely located) to construct personal and mutual physical and spatial awareness of artefacts, while networked communication and collaboration allow for ideas and knowledge to be exchanged and shared in the remote context.


Our approach makes use of tablets as a hands-on interaction technique, giving the user the feeling that exploring 3D virtual representations of artefacts is a manual experience. Through user studies with a prototype application, we hope to show that our approach helps engage users with museum content and in the collaborative discussions it inspires, facilitating learning for groups who would not normally have the chance to handle artefacts together.
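The research statement leaves the networking layer unspecified; purely as an illustration of the kind of message that could keep remotely located collaborators' views of an artefact in sync, the Java sketch below encodes and decodes a small orientation update. The field names and text format are assumptions, not the project's actual protocol.

```java
// Illustrative sketch (not the project's actual protocol) of a small update
// message that could keep a shared artefact's orientation in sync between
// co-located and remote tablets; field names and the format are assumptions.
public final class ArtefactSync {

    /** Orientation update for one artefact, from one participant. */
    record OrientationUpdate(String artefactId, String sender,
                             double yawDeg, double pitchDeg, double rollDeg) {

        /** Encodes the update as a single line, e.g. for sending over a socket. */
        String encode() {
            return String.join(",", artefactId, sender,
                    Double.toString(yawDeg), Double.toString(pitchDeg), Double.toString(rollDeg));
        }

        /** Parses a line produced by encode(). */
        static OrientationUpdate decode(String line) {
            String[] f = line.split(",");
            return new OrientationUpdate(f[0], f[1],
                    Double.parseDouble(f[2]), Double.parseDouble(f[3]), Double.parseDouble(f[4]));
        }
    }

    public static void main(String[] args) {
        // A remote participant rotates the artefact; the update is sent and re-applied locally.
        String wire = new OrientationUpdate("amphora-07", "remote-visitor", 35.0, 10.0, 0.0).encode();
        System.out.println("Received update: " + OrientationUpdate.decode(wire));
    }
}
```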



Augmented Reality in the Spatial Sciences: the role of spatially enabled mobile devices - Honours Thesis 2012

, School of Geography and Environmental Sciences


Supervisors: (School of Geography & Environmental Studies), (HIT Lab AU, School of Computing and Information Systems), (School of Geography & Environmental Studies).


Abstract:

Augmented Reality (AR) refers to technology that, in real time, combines virtual information with a real-world experience, with the intention of enhancing the user's experience in a way that would not otherwise be achievable. This thesis focuses on the visual augmentation aspect of AR, in particular investigating the application of AR to the spatial sciences. The scope and impact of the spatial sciences industry is assessed in order to demonstrate the potential importance of augmented reality within the industry and across the sectors that it services. Current developments in spatial positioning are explored in order to assess their potential impact on augmented reality devices that rely on position and orientation sensors.

A key issue for implementing AR within the spatial sciences industry is the capacity to reliably and quantitatively manage spatial accuracy. A simulation is developed to model the propagation of random errors when integrating the 3D coordinates of virtual objects into a scene that is being imaged through the camera of a portable, spatially enabled device.

A simple AR application is developed on an Android tablet to demonstrate the capabilities of an AR system that integrates 3D virtual targets into a photographed scene. This demonstration indicates that the developed simulation model accurately replicates the random errors evident in a propagated solution, whilst highlighting the potential limitations of currently available consumer devices.
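The abstract describes a simulation of how random position and orientation errors propagate into the on-screen placement of virtual objects. The following Java sketch illustrates the general idea with a Monte Carlo loop over a simple pinhole camera model; the noise levels, focal length and geometry are assumptions for illustration, not values from the thesis.

```java
import java.util.Random;

// Illustrative Monte Carlo sketch of the kind of error-propagation simulation the
// abstract describes: random noise on a device's position and heading is pushed
// through a simple pinhole projection to see how much a virtual target scatters
// on screen. All numeric values and the camera model are assumptions.
public final class ArErrorSimulation {

    public static void main(String[] args) {
        Random rng = new Random(42);
        int trials = 10_000;
        double focalPx = 1000.0;                        // assumed focal length in pixels
        double posSigmaM = 0.5;                         // assumed position error std dev (metres)
        double headingSigmaRad = Math.toRadians(1.0);   // assumed heading error std dev

        // True target 20 m north of the true camera position, at the same height.
        double targetX = 0.0, targetZ = 20.0;

        double sumSq = 0.0;
        for (int i = 0; i < trials; i++) {
            // Perturbed camera position and heading for this trial.
            double camX = rng.nextGaussian() * posSigmaM;
            double camZ = rng.nextGaussian() * posSigmaM;
            double heading = rng.nextGaussian() * headingSigmaRad;

            // Target in the perturbed camera frame (rotate by the heading error, then translate).
            double dx = targetX - camX, dz = targetZ - camZ;
            double xCam = Math.cos(heading) * dx - Math.sin(heading) * dz;
            double zCam = Math.sin(heading) * dx + Math.cos(heading) * dz;

            // Pinhole projection; the error-free projection of the target is at pixel 0.
            double u = focalPx * xCam / zCam;
            sumSq += u * u;
        }
        System.out.printf("RMS horizontal pixel error over %d trials: %.1f px%n",
                trials, Math.sqrt(sumSq / trials));
    }
}
```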


Interactive Surface on Enhancement of Physical Interaction of Tangible Objects - Master Thesis 2012

, Master of Creative Media Technology, HIT Lab AU, School of Computing & Information Systems


Supervisor: (HIT Lab AU, School of Computing and Information Systems)


Abstract

A tangible user interface (TUI) allows users to manipulate physical objects (tangible objects) to directly control virtual objects. In a TUI, however, virtual objects can also enhance physical interactions, that is, the interactions between physical objects. In this thesis, by studying the restrictions on physical interaction on an interactive surface, we divided the physical interactions suited to an interactive surface into two main types: direct physical interaction and indirect physical interaction. Based on an understanding of the interactive surface's characteristics, we found that the interactive surface has powerful multimedia features, including visual and auditory output. Moreover, because the interactive surface is itself a computer, it can also apply artificial intelligence to interact with users intelligently. In our application on Microsoft Surface 2.0, we implemented a typical example of direct physical interaction, building an electric circuit, with intelligent guidance from the interactive surface to help users create the correct physical circuit. The enhancement also includes hint and warning animations for different operations on the circuit, notes on basic circuit knowledge, and a real-time circuit diagram for comparing the circuit being built against the target circuit diagram. Based on this application, we demonstrated that an interactive surface can enhance physical interaction.
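As a rough illustration of the real-time comparison between the circuit being built and the target circuit diagram, the Java sketch below reduces both circuits to sets of undirected terminal-to-terminal connections and reports the differences that could drive hints or warnings. The component names and connection model are hypothetical, not taken from the Microsoft Surface 2.0 application.

```java
import java.util.Set;
import java.util.TreeSet;

// Hypothetical sketch of the "compare the circuit being built with the target
// diagram" step: both circuits are reduced to sets of undirected connections
// between component terminals, and the difference drives hints or warnings.
public final class CircuitCheck {

    /** Order-independent key for an undirected connection between two terminals. */
    static String link(String a, String b) {
        return a.compareTo(b) <= 0 ? a + "--" + b : b + "--" + a;
    }

    public static void main(String[] args) {
        Set<String> target = Set.of(
                link("battery+", "switch.in"),
                link("switch.out", "bulb.a"),
                link("bulb.b", "battery-"));

        // Connections detected so far from tagged objects on the surface.
        Set<String> built = Set.of(
                link("battery+", "switch.in"),
                link("switch.out", "battery-"));   // bulb not wired in yet

        Set<String> missing = new TreeSet<>(target);
        missing.removeAll(built);
        Set<String> extra = new TreeSet<>(built);
        extra.removeAll(target);

        System.out.println("Missing connections (show as hints): " + missing);
        System.out.println("Wrong connections (show as warnings): " + extra);
    }
}
```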



Efficient Interaction for Google Earth 3D Navigation with a Large Display System - Master Thesis 2010

, Master of Computing, HIT Lab AU, School of Computing & Information Systems


Supervisor: (HIT Lab AU, School of Computing and Information Systems)


Abstract


Until now, we have not had a set of suitable 3D navigation interaction techniques capable of providing an experience similar to real-world travel for a user in a very large display environment. In this thesis, we studied the potential benefits of an input/navigation control device with more than 2-DOF for 3D navigation interaction in a large display system. From user testing, we found that interaction with an input device with more than 2-DOF (our 4-DOF input device, derived from a commercial 6-DOF optical marker) is effective, intuitive, and natural. In particular, there was a large improvement in intuitiveness and naturalness when the device was compared with a conventional mouse and a 3D mouse. These outcomes correlate directly with our device's support for direct manipulation, which encouraged users to perform physical navigation such as body and head movement. As a result, 70 percent of the participants in our user testing preferred our input device over the conventional mouse and the 3D mouse.
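The abstract does not detail how the 4-DOF input is derived from the 6-DOF optical marker; the Java sketch below shows one plausible reduction, keeping the three translations and yaw while discarding roll and pitch, purely as an assumed illustration rather than the thesis implementation.

```java
// Illustrative sketch (not the thesis implementation) of deriving a 4-DOF
// navigation input from a 6-DOF tracked pose: keep the three translations and
// yaw, discard pitch and roll, and map the result to camera motion per frame.
public final class FourDofNavigation {

    /** 6-DOF pose reported by an optical tracker (positions in metres, angles in radians). */
    record Pose(double x, double y, double z, double roll, double pitch, double yaw) {}

    /** 4-DOF navigation command: translation plus heading change. */
    record NavCommand(double dx, double dy, double dz, double dYaw) {}

    /** Difference between two poses, ignoring roll and pitch. */
    static NavCommand toNavCommand(Pose previous, Pose current, double gain) {
        return new NavCommand(
                gain * (current.x() - previous.x()),
                gain * (current.y() - previous.y()),
                gain * (current.z() - previous.z()),
                gain * (current.yaw() - previous.yaw()));
    }

    public static void main(String[] args) {
        Pose before = new Pose(0.00, 1.20, 0.50, 0.0, 0.1, 0.00);
        Pose after  = new Pose(0.05, 1.20, 0.45, 0.2, 0.1, 0.03); // roll change is ignored
        System.out.println(toNavCommand(before, after, 10.0));
    }
}
```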


Direct Gestural Interaction with Geographical Information Systems - PhD Dissertation 2009-present

, PhD candidate, HIT Lab AU, School of Computing & Information Systems


Supervisors: (HIT Lab AU, School of Computing and Information Systems), Dr. Arko Lucieer (School of Geography & Environmental Studies)


This project is the work of Simon Stannus as a part of his PhD research. It is used to test users' interaction with three-dimensional geographical data in a large-screen display environment.


The project's software is written in Java using the World Wind Java SDK developed by NASA and is designed around the rear-projection screens and ARTrack infrared tracking system of the HIT Lab's VisionSpace. The system is capable of displaying three-dimensional geographical data on a virtual Earth in large-screen stereo 3D. In addition, the user's head position is tracked (via retro-reflective markers on the 3D glasses they wear) and used to update the 3D rendering in such a way that the user can "walk through" the data. The user also wears a pair of gloves, likewise tracked by the ARTrack system; when the user pinches, a signal is sent back to the system over Bluetooth via electronics attached to the gloves.
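As an illustration only, the Java sketch below shows how a per-frame pinch flag received from the glove electronics might be combined with the tracked hand position into discrete grab and release events; the class and method names are assumptions and do not reflect the actual VisionSpace code.

```java
// Hypothetical sketch of turning a glove pinch signal plus the tracked hand
// position into grab/release events; names and the event model are assumptions.
public final class PinchGrabTracker {

    record Vec3(double x, double y, double z) {}

    interface GrabListener {
        void grabStarted(Vec3 handPosition);
        void grabEnded(Vec3 handPosition);
    }

    private boolean pinching = false;
    private final GrabListener listener;

    PinchGrabTracker(GrabListener listener) { this.listener = listener; }

    /** Called once per tracker frame with the latest hand position and pinch flag. */
    void update(Vec3 handPosition, boolean pinchSignal) {
        if (pinchSignal && !pinching) listener.grabStarted(handPosition);   // pinch began
        if (!pinchSignal && pinching) listener.grabEnded(handPosition);     // pinch released
        pinching = pinchSignal;
    }

    public static void main(String[] args) {
        PinchGrabTracker tracker = new PinchGrabTracker(new GrabListener() {
            public void grabStarted(Vec3 p) { System.out.println("Grab at " + p); }
            public void grabEnded(Vec3 p)   { System.out.println("Release at " + p); }
        });
        tracker.update(new Vec3(0.1, 1.4, 0.3), true);   // pinch detected over Bluetooth
        tracker.update(new Vec3(0.2, 1.4, 0.3), true);   // still pinching, no new event
        tracker.update(new Vec3(0.2, 1.4, 0.3), false);  // released
    }
}
```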


These signals and display feedback allow the user to view a virtual representation of their hands in front of them and to navigate by "grabbing" one or two points in the virtual environment and moving their hands to transform the virtual world. The software also allows users to view and adjust aerial imagery and to create and edit three-dimensional paths in a similarly natural manner using the gloves. In addition, the system supports different gesture input techniques and completely different input devices (a Logitech MX Air mouse and a 3Dconnexion SpaceExplorer 3D mouse). These capabilities allow the system to be used to evaluate the effectiveness of the natural gesture approach in comparison to existing approaches.
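To make the "grabbing one or two points" idea concrete, here is a minimal Java sketch of the underlying transform: one grabbed point translates the world with the hand, while two grabbed points additionally give a uniform scale about their midpoint (rotation is omitted for brevity, and the sketch works in a single horizontal plane). This is an assumed simplification, not the project's actual navigation code.

```java
// Minimal sketch of the "grab one or two points and move them" idea: with one
// hand the world translates with the hand; with two hands the change in hand
// separation gives a uniform scale about the midpoint. Rotation is omitted.
public final class TwoHandGrab {

    record Vec2(double x, double y) {
        Vec2 sub(Vec2 o)     { return new Vec2(x - o.x, y - o.y); }
        Vec2 add(Vec2 o)     { return new Vec2(x + o.x, y + o.y); }
        Vec2 scale(double s) { return new Vec2(x * s, y * s); }
        double dist(Vec2 o)  { return Math.hypot(x - o.x, y - o.y); }
        Vec2 mid(Vec2 o)     { return new Vec2((x + o.x) / 2, (y + o.y) / 2); }
    }

    /** Translation that keeps a single grabbed world point under the moving hand. */
    static Vec2 oneHandTranslation(Vec2 handStart, Vec2 handNow) {
        return handNow.sub(handStart);
    }

    /** Maps a world point through the two-hand grab: scale about the old midpoint, then translate. */
    static Vec2 twoHandTransform(Vec2 p, Vec2 aStart, Vec2 bStart, Vec2 aNow, Vec2 bNow) {
        double scale = aNow.dist(bNow) / aStart.dist(bStart);
        Vec2 oldMid = aStart.mid(bStart);
        Vec2 newMid = aNow.mid(bNow);
        return p.sub(oldMid).scale(scale).add(newMid);
    }

    public static void main(String[] args) {
        // Hands move apart symmetrically: the world scales up by 2x around the midpoint.
        Vec2 moved = twoHandTransform(new Vec2(1, 0),
                new Vec2(-1, 0), new Vec2(1, 0),
                new Vec2(-2, 0), new Vec2(2, 0));
        System.out.println("Grabbed point moves to: " + moved); // expected (2.0, 0.0)
    }
}
```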