
Papers

 

The following are some of the academic papers published by our lab. Download the full list here.


 

Gilbert, J. K., & Reiner, M. (2000). 
Thought experiments in science education: potential and current realization. 
International Journal of Science Education.

Thought Experiments (TEs) have a long history of use in science. It is suggested that, if science education is to be related as clearly as possible to science, then TEs must play an appropriate part. The relationship between TEs and experiments (Es) is explored. A typology of TEs is presented with examples drawn from the history of physics. The potential uses of the various types of TEs in bringing about conceptual development and as a complement to conventional practical work are addressed. The analysis of three typical school and higher-education level physics textbooks shows that the potentials identified are, at present, not realised. Indeed, elements of TE design were found to be integrated with other pedagogic devices into what we have termed ‘thought simulations’ (TSs). In these, the behaviour of a phenomenon was illustrated rather than predicted and tested, theory was assumed and embedded rather than being tentative and emergent, and the outcome was assumed rather than being anticipated.

Slater, M., Frisoli, A., Tecchia, F., Guger, C., Lotto, B., Steed, A., Pfurtscheller, G., et al. (2007).
Understanding and realizing presence in the Presenccia project.
IEEE Computer Graphics and Applications, 27(4), 90-93.

People who experience an immersive VR system usually report feeling as if they were really in the displayed virtual situation, and can often be observed behaving in accordance with that feeling, even though they know that they’re not actually there. Researchers refer to this feeling as “presence” in virtual environments, yet the term has come to have many uses and meanings, all of which evolved from the notion of telepresence in teleoperator systems. Currently, many applications, such as those that use VR in psychotherapy, rely on presence for their success. Many empirical studies have attempted to determine the impact of various technological factors—such as field-of-view, frame rate, stereo, and head tracking—on how much presence VR participants reportedly experience. Typically, researchers assess this using questionnaires. However, because there’s no scientific theory that explains how or why presence occurs, application builders have no scientifically grounded guidelines on which they can build presence-inducing virtual environments.

Reiner, M., & Gelfeld, T. M. (2014). 
Estimating mental workload through event-related fluctuations of pupil area during a task in a virtual world.
International Journal of Psychophysiology, 93(1), 38-44.

Monitoring mental load for optimal performance has become increasingly central with the recently evolving need to cope with exponentially increasing amounts of data. This paper describes a non-intrusive, objective method to estimate mental workload in an immersive virtual reality system through analysis of the frequencies of pupil fluctuations. We tested how mental workload changes with the number of task repetitions, with the level of predictability of the task, and with prior experience in a predictable task when performing an unpredictable one. Two measures were used to calculate mental workload: the ratio of the low-frequency to high-frequency components of pupil fluctuations, and the high-frequency component alone, both extracted from the power spectral density of pupil fluctuations. Results show that mental workload decreases with the number of repetitions, creating a mode in which the brain acts as an automatic controller. Automaticity during training occurs only after a minimal number of repetitions; once achieved, it results in further improvements in the performance of unpredictable motor tasks following training on a predictable task. These results indicate that automaticity is a central component in the transfer of skills from highly predictable to less predictable motor tasks. Our results suggest a method potentially applicable to brain-computer interface systems that adapt to human mental workload and provide intelligent automated support for enhanced performance.
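For readers curious how such an index can be computed, the following is a minimal sketch, not the authors' code: it estimates the power spectral density of a pupil-area trace with Welch's method and takes the ratio of low- to high-frequency power. The band edges, sampling rate, and function names here are illustrative assumptions rather than values taken from the paper.

```python
# Minimal sketch (illustrative, not the paper's implementation): a workload
# index from the ratio of low- to high-frequency power of pupil-area fluctuations.
# Band edges and sampling rate are assumptions chosen for the example.
import numpy as np
from scipy.signal import welch

def lf_hf_workload_index(pupil_area, fs, lf_band=(0.05, 0.5), hf_band=(0.5, 2.0)):
    """Return the LF/HF power ratio of a pupil-area time series sampled at fs Hz."""
    # Remove the mean and estimate the power spectral density with Welch's method.
    freqs, psd = welch(pupil_area - np.mean(pupil_area), fs=fs,
                       nperseg=min(len(pupil_area), 512))
    lf_mask = (freqs >= lf_band[0]) & (freqs < lf_band[1])
    hf_mask = (freqs >= hf_band[0]) & (freqs < hf_band[1])
    lf_power = np.trapz(psd[lf_mask], freqs[lf_mask])  # integrate PSD over each band
    hf_power = np.trapz(psd[hf_mask], freqs[hf_mask])
    return lf_power / hf_power

# Usage with synthetic data: 60 s of pupil area sampled at 60 Hz.
fs = 60.0
t = np.arange(0, 60, 1 / fs)
pupil = 10 + 0.3 * np.sin(2 * np.pi * 0.2 * t) + 0.05 * np.random.default_rng(0).standard_normal(t.size)
print(lf_hf_workload_index(pupil, fs))
```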

Neuper, C., Scherer, R., Reiner, M., & Pfurtscheller, G. (2005).
Imagery of motor actions: Differential effects of kinesthetic and visual-motor mode of imagery in single-trial EEG.
Cognitive Brain Research, 25(3), 668-677.

Single-trial motor imagery classification is an integral part of a number of brain-computer interface (BCI) systems. The possible significance of the kind of imagery, involving either kinesthetic or visual representations of actions, was addressed using the following experimental conditions: kinesthetic motor imagery (MIK), visual-motor imagery (MIV), motor execution (ME) and observation of movement (OOM). Based on multi-channel EEG recordings in 14 right-handed participants, we applied a learning classifier, distinction-sensitive learning vector quantization (DSLVQ), to identify relevant features (i.e., frequency bands, electrode sites) for recognition of the respective mental states. For ME and OOM, the overall classification accuracies were about 80%. The rates obtained for MIK (67%) were better than the results for MIV (56%). Moreover, the focus of activity during kinesthetic imagery was found close to the sensorimotor hand area, whereas visual-motor imagery did not reveal a clear spatial pattern. Consequently, to improve motor-imagery-based BCI control, user training should emphasize kinesthetic experiences instead of visual representations of actions.
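As a rough illustration of what single-trial classification from EEG band power involves, the sketch below uses scikit-learn's linear discriminant analysis as a stand-in for DSLVQ, which has no standard library implementation; the frequency bands, channel count, and data shapes are assumptions made for the example, and the synthetic data will naturally classify at chance level.

```python
# Rough sketch (illustrative only): single-trial classification of motor imagery
# from EEG band-power features. LDA stands in for the DSLVQ classifier used in
# the paper; bands, channels, and shapes are assumptions for this example.
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def band_power_features(trials, fs, bands=((8, 12), (16, 24))):
    """trials: (n_trials, n_channels, n_samples) -> (n_trials, n_channels * n_bands)."""
    feats = []
    for trial in trials:
        freqs, psd = welch(trial, fs=fs, nperseg=fs, axis=-1)
        row = []
        for lo, hi in bands:
            idx = (freqs >= lo) & (freqs < hi)
            row.extend(psd[:, idx].mean(axis=-1))  # mean power per channel in this band
        feats.append(row)
    return np.asarray(feats)

# Synthetic stand-in data: 40 trials, 3 channels (e.g. over C3, Cz, C4), 2 s at 250 Hz.
fs = 250
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((40, 3, 2 * fs))
y = np.repeat([0, 1], 20)  # e.g. kinesthetic vs. visual-motor imagery labels

X = band_power_features(X_raw, fs)
clf = LinearDiscriminantAnalysis()
print(cross_val_score(clf, X, y, cv=5).mean())  # ~chance accuracy on random data
```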

Reiner, M., Korsnes, M., Glover, G., Hugdahl, K., & Feldmann, M. (2011).
Seeing shapes and hearing textures: Two neural categories of touch.
The Open Neuroscience Journal, 5, 8-15.

Touching for shape recognition has been shown to activate occipital areas in addition to somatosensory areas. In this study we asked whether this combination of somatosensory and other sensory processing areas also exists in other kinds of touch recognition. In particular, does touch for texture-roughness matching activate other sensory processing areas apart from somatosensory areas? We addressed this question with functional magnetic resonance imaging (fMRI), using wooden abstract stimulus objects whose shape or texture was to be identified. The participants judged whether pairs of objects had the same shape or the same texture. We found that the activated brain areas for texture and shape matching have similar underlying structures, a combination of the primary motor area and somatosensory areas. Areas associated with object-shape processing were activated between stimuli during shape matching and not during texture-roughness matching, while auditory areas were activated during encoding of texture and not of shape stimuli. Matching of textures also involves left BA47, an area associated with retrieval of relational information. We suggest that texture roughness is recognized in a framework of ordering. Left-lateralized activations favoring texture might reflect semantic processing associated with grading roughness quantitatively, as opposed to the more qualitative distinctions between shapes.
