Collaborative Virtual Environments

Selected Abstracts


Behaviour-based multiplayer collaborative interaction management

COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 1 2006
Qingping Lin
Abstract A collaborative virtual environment (CVE) allows geographically dispersed users to interact with each other and with objects in a common virtual environment via network connections. One successful application of CVEs is the multiplayer online role-playing game. To support massive interaction among virtual entities in a large-scale CVE, and to maintain a consistent view of that interaction among users under limited network bandwidth, an efficient collaborative interaction management method is required. In this paper, we propose a behaviour-based interaction management framework for supporting multiplayer role-playing CVE applications. It incorporates a two-tiered architecture comprising high-level role-behaviour-based interaction management and low-level message routing. At the high level, interaction management is achieved by enabling interactions based on collaborative behaviour definitions. At the low level, message routing controls interactions according to the run-time status of the interactive entities. A Collaborative Behaviour Description Language is designed as a scripting interface through which application developers define the collaborative behaviours of interactive entities and the simulation logic/game rules of a CVE. We demonstrate and evaluate the performance of the proposed framework through a prototype system and simulations. Copyright © 2006 John Wiley & Sons, Ltd. [source]
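The abstract does not specify the Collaborative Behaviour Description Language itself, but the two-tier idea it describes can be sketched as follows: high-level behaviour definitions declare which roles may interact, and a low-level router delivers messages only when a definition enables them and the entities' run-time status permits. All class and method names below are illustrative, not from the paper.

```python
# Hypothetical sketch of two-tier behaviour-based interaction management.

class BehaviourRule:
    """High level: a collaborative behaviour enabling interaction
    between two entity roles (e.g. a 'healer' may 'heal' a 'fighter')."""
    def __init__(self, actor_role, action, target_role):
        self.actor_role, self.action, self.target_role = actor_role, action, target_role

    def enables(self, actor, action, target):
        return (actor.role == self.actor_role and action == self.action
                and target.role == self.target_role)

class Entity:
    def __init__(self, name, role, active=True):
        self.name, self.role, self.active = name, role, active
        self.inbox = []

class MessageRouter:
    """Low level: routes an interaction message only when a behaviour
    rule enables it and both entities are active at run time."""
    def __init__(self, rules):
        self.rules = rules

    def route(self, actor, action, target, payload):
        if not (actor.active and target.active):
            return False                      # run-time status gate
        if any(r.enables(actor, action, target) for r in self.rules):
            target.inbox.append((actor.name, action, payload))
            return True
        return False                          # no behaviour definition enables this

rules = [BehaviourRule("healer", "heal", "fighter")]
router = MessageRouter(rules)
alice, bob = Entity("alice", "healer"), Entity("bob", "fighter")
router.route(alice, "heal", bob, {"hp": 20})  # delivered: rule matches
router.route(bob, "heal", alice, {"hp": 20})  # dropped: no matching rule
```

Filtering at the behaviour level means messages that no definition enables never reach the network, which is one plausible reading of how the framework conserves bandwidth.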


Myriad: scalable VR via peer-to-peer connectivity, PC clustering, and transient inconsistency

COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 1 2007
Benjamin Schaeffer
Abstract Distributed scene graphs are important in virtual reality, both in collaborative virtual environments and in cluster rendering. Modern scalable visualization systems have high local throughput, but collaborative virtual environments (VEs) over a wide-area network (WAN) share data at much lower rates, which complicates the use of a single scene graph across the whole application. Myriad is an extension of the Syzygy VR toolkit in which individual scene graphs form a peer-to-peer network. Myriad connections filter scene graph updates and create flexible relationships between nodes of the scene graph. Myriad's sharing is fine-grained: which properties of individual scene graph nodes to share is specified dynamically (in C++ or Python). Myriad permits transient inconsistency, relaxing resource requirements in collaborative VEs. A test application, WorldWideCrowd, demonstrates collaborative prototyping of a 300-avatar crowd animation viewed on two PC-cluster displays and edited on low-powered laptops and desktops, including over a WAN. We have further used the framework to facilitate collaborative educational experiences and as a vehicle for undergraduates to experiment with shared virtual worlds. Copyright © 2006 John Wiley & Sons, Ltd. [source]
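The abstract's notion of fine-grained, filtered sharing with transient inconsistency can be illustrated with a minimal sketch: a peer connection forwards only the node properties named in its filter, so unshared properties are free to diverge between peers. The names below are hypothetical, not the Syzygy/Myriad API.

```python
# Hypothetical sketch of filtered, per-property scene graph sharing
# between peers, in the spirit of Myriad's connection filters.

class PeerConnection:
    """A link to a remote peer's scene graph that forwards only the
    node properties named in its filter; everything else stays local,
    so peers may be transiently (or permanently) inconsistent."""
    def __init__(self, remote_scene, shared_props):
        self.remote_scene = remote_scene      # remote peer's node -> props dict
        self.shared_props = shared_props      # set of property names to share

    def send_update(self, node_name, prop, value):
        if prop not in self.shared_props:
            return False                      # filtered out: not replicated
        self.remote_scene.setdefault(node_name, {})[prop] = value
        return True

local = {"avatar1": {"transform": (0, 0, 0), "debug_label": "me"}}
remote = {}
conn = PeerConnection(remote, shared_props={"transform"})
conn.send_update("avatar1", "transform", (1, 2, 3))  # replicated to the peer
conn.send_update("avatar1", "debug_label", "me")     # kept local only
```

Sharing only what a given connection needs (here, just `transform`) is one way a low-bandwidth WAN peer can participate in the same world as a high-throughput cluster display.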


The AS interactive project: single-user and collaborative virtual environments for people with high-functioning autistic spectrum disorders

COMPUTER ANIMATION AND VIRTUAL WORLDS (PREV: JNL OF VISUALISATION & COMPUTER ANIMATION), Issue 5 2003
Anja Rutten
Abstract The AS Interactive Project aimed to assess the potential of single-user and collaborative virtual environments to support the learning and enhancement of social skills in people with high-functioning autistic spectrum disorders. The project had two distinct phases of research: Phase I focused mainly on development and design using user-centred principles; Phase II was concerned with implementing the design feedback, making further improvements, and conducting evaluation studies of the virtual environments developed. This paper describes the research process, summarizes the results of the project, and briefly outlines plans for future research. Copyright © 2003 John Wiley & Sons, Ltd. [source]


Eye gaze in virtual environments: evaluating the need and initial work on implementation

CONCURRENCY AND COMPUTATION: PRACTICE & EXPERIENCE, Issue 11 2009
Norman Murray
Abstract For efficient collaboration between participants, eye gaze is seen as critical for interaction. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it under round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking people's eyes and representing their movement on their avatars, the line of gaze can be faithfully reproduced rather than approximated. This paper presents the results of initial work that tested whether the focus of gaze could be gauged more accurately when tracked eye movement was added to the head movement of an avatar observed in an immersive VE. An experiment was conducted to assess the difference between users' abilities to judge which objects an avatar is looking at when only head movements are displayed, the eyes remaining static, and when both eye gaze and head movement are displayed. The results show that eye gaze is of vital importance to subjects' correctly identifying what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following the positive results of the experiment. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques developed to convert the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This will be used in the creation of an immersive collaborative virtual environment supporting eye gaze and in its ongoing experiments. Copyright © 2009 John Wiley & Sons, Ltd. [source]
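The core geometric step the abstract describes, combining tracked head and eye orientation into one avatar gaze and judging which object it falls on, can be sketched as follows. This is a simplified illustration (eye rotation treated as additive yaw/pitch offsets on the head, a small-angle simplification), not the paper's implementation; all function names are hypothetical.

```python
# Hypothetical sketch: combine tracked head and eye rotations into a
# world-space gaze direction, then pick the gazed-at object by angle.
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def yaw_pitch_to_dir(yaw, pitch):
    """Convert yaw/pitch (radians) to a unit direction vector,
    with -z as the 'straight ahead' axis."""
    return normalize((math.sin(yaw) * math.cos(pitch),
                      math.sin(pitch),
                      -math.cos(yaw) * math.cos(pitch)))

def gaze_direction(head_yaw, head_pitch, eye_yaw, eye_pitch):
    # Eye rotation is tracked relative to the head, so (as a
    # small-angle simplification) the angles are simply added.
    return yaw_pitch_to_dir(head_yaw + eye_yaw, head_pitch + eye_pitch)

def gazed_object(eye_pos, gaze_dir, objects):
    """Return the object whose direction from the eye best aligns
    with the gaze (largest cosine similarity)."""
    best, best_cos = None, -1.0
    for name, pos in objects.items():
        to_obj = normalize(tuple(p - e for p, e in zip(pos, eye_pos)))
        cos = sum(a * b for a, b in zip(gaze_dir, to_obj))
        if cos > best_cos:
            best, best_cos = name, cos
    return best
```

With static eyes (`eye_yaw = eye_pitch = 0`) the gaze collapses to the head direction, which is exactly the head-only condition the experiment compares against.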


Transformed Social Interaction, Augmented Gaze, and Social Influence in Immersive Virtual Environments

HUMAN COMMUNICATION RESEARCH, Issue 4 2005
Jeremy N. Bailenson
Immersive collaborative virtual environments (CVEs) are simulations in which geographically separated individuals interact in a shared, three-dimensional, digital space using immersive virtual environment technology. Unlike videoconference technology, which transmits direct video streams, immersive CVEs accurately track the movements of interactants and render them nearly simultaneously (i.e., in real time) onto avatars, three-dimensional digital representations of the interactants. Nonverbal behaviors of interactants can be rendered veridically or transformed strategically (i.e., rendered nonveridically). This research examined augmented gaze, a transformation in which a given interactant's actual head movements are replaced by an algorithm that renders his or her gaze directly at multiple interactants simultaneously, such that each of the others perceives that the transformed interactant is gazing only at him or her. In the current study, a presenter read a persuasive passage to two listeners under various transformed gaze conditions, including augmented gaze. Results showed that women agreed with the persuasive message more during augmented gaze than in the other gaze conditions. Men recalled more verbal information from the passage than women. Implications for theories of social interaction and computer-mediated communication are discussed. [source]
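The augmented-gaze transformation works because each listener's client renders its own view of the shared world: a sketch of the idea, under the assumption that the presenter's head yaw is simply overridden per viewer (all names hypothetical), is:

```python
# Hypothetical sketch of augmented gaze: each listener's client renders
# the presenter's head aimed directly at that listener, so every
# listener perceives the presenter as gazing only at him or her.
import math

def yaw_toward(presenter_pos, listener_pos):
    """Yaw (radians) that faces the listener, with -z as straight ahead."""
    dx = listener_pos[0] - presenter_pos[0]
    dz = listener_pos[2] - presenter_pos[2]
    return math.atan2(dx, -dz)

def rendered_head_yaw(tracked_yaw, presenter_pos, listener_pos, augmented):
    if augmented:
        return yaw_toward(presenter_pos, listener_pos)  # transformed (nonveridical)
    return tracked_yaw                                  # veridical rendering

presenter = (0.0, 1.7, 0.0)
listeners = {"a": (-1.0, 1.7, -2.0), "b": (1.0, 1.7, -2.0)}
# Under augmented gaze, the two clients render different head orientations
# from the same tracked data:
views = {name: rendered_head_yaw(0.0, presenter, pos, augmented=True)
         for name, pos in listeners.items()}
```

The key design point is that no single "true" rendering is broadcast; the transformation is applied per receiver, which is impossible with a direct video stream.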