
Vision Science and Technology in Human Computer Interaction Systems

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 31 July 2024 | Viewed by 10018

Special Issue Editors


Guest Editor
Department of Informatics, Bioengineering, Robotics and Systems Engineering, University of Genoa, 16145 Genoa, Italy
Interests: natural and artificial vision perception; neuromorphic computing; perception and action; translational vision science

Guest Editor
Department of Informatics, Bioengineering, Robotics and Systems Engineering, University of Genoa, 16145 Genoa, Italy
Interests: neuromorphic computing; neurosensory engineering; VR neural rehabilitation; cognitive neuroscience

Special Issue Information

Dear Colleagues,

Vision is the predominant human sense in terms of accuracy and reliability. Through it we extract contactless information from the outer world, build cognitive models, develop spatial relationships, learn situations, and even exploit our eyes to communicate intentions or goals with our interaction counterparts in shared workspaces. On the artificial side, theoretical and computational advances in vision technology have reached levels of reliability sufficient to enable fast and efficient human–machine interactions and natural, intuitive bidirectional communication. Current interaction systems have greatly benefited from the inclusion of advanced sensing and computing capabilities adapted to how humans perceive and interact with the real world. This has challenged the traditional device-centric view of human–computer interaction by moving beyond the conventional concepts of input and output. Human–machine interaction has now reached a level of maturity that allows it to go one step further: to extend the boundaries of human–machine engagement from the focus of attention to the periphery of our senses, fulfilling the true interaction potential in a shared space.

The goal of this Special Issue is to collect contributions that demonstrate how evidence from vision science coupled with vision-based technology can extend, adapt, and smooth users’ experience of HCI (e.g., by including multiuser or social interaction, covert attention, and prediction).

Contributions on real-world applications, novel and nonconventional vision sensors, and HCI based on human perception models are encouraged.

Topics relevant to this Special Issue include, but are not limited to, the following:

  • RGB-D cameras and time-of-flight (ToF) systems.
  • Nonconventional vision sensors, such as event-based cameras.
  • Virtual and augmented reality.
  • Algorithms and techniques for scene understanding, action understanding, and situation awareness.
  • Gesture recognition.
  • Ambient intelligence.
  • Emotion detection and simulation.
  • Models of human perception.
  • Anticipatory user interfaces based on human models.
  • Affective and social signaling.
  • Human–technology symbiosis.

Dr. Silvio P. Sabatini
Dr. Andrea Canessa
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • vision-based sensing
  • perceptual rendering
  • human-centered design
  • cognitive ergonomics
  • evaluating interactive systems

Published Papers (4 papers)


Research


28 pages, 16347 KiB  
Article
Visual Attention and Emotion Analysis Based on Qualitative Assessment and Eyetracking Metrics—The Perception of a Video Game Trailer
by Eva Villegas, Elisabet Fonts, Marta Fernández and Sara Fernández-Guinea
Sensors 2023, 23(23), 9573; https://doi.org/10.3390/s23239573 - 02 Dec 2023
Cited by 1 | Viewed by 1151
Abstract
Video game trailers are very useful tools for attracting potential players. This research focuses on analyzing the emotions that arise while viewing video game trailers and the link between these emotions, storytelling, and visual attention. The methodology consisted of a three-step task test with potential users: the first step was to identify their perception of indie games; the second was to use an eyetracking device (gaze plot, heat map, and fixation points) to capture fixation points (attention), viewing patterns, and non-visible areas; the third was to interview users about their impressions and administer questionnaires on emotions related to the trailer's storytelling and expectations. The results show an effective assessment of visual attention together with visualization patterns, non-visible areas that may affect game expectations, fixation points linked to very specific emotions, and perceived narratives based on the gaze plot. The innovative mixed methodological approach made it possible to obtain relevant data on the link between the emotions perceived by the user and the areas of attention captured with the device. The proposed methodology enables developers to understand the strengths and weaknesses of the information being conveyed so that they can tailor the trailer to the expectations of potential players.
(This article belongs to the Special Issue Vision Science and Technology in Human Computer Interaction Systems)
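The fixation metrics this study relies on (fixation points, gaze plots) are typically derived from raw gaze samples by a fixation-detection step. Below is a minimal sketch of a dispersion-threshold (I-DT) detector in Python; it is not the authors' pipeline, and the thresholds and synthetic trace are illustrative assumptions.

```python
# Minimal dispersion-threshold (I-DT) fixation detector: a sketch, not the
# authors' method. Thresholds and the sample trace are illustrative.

def _dispersion(window):
    """Dispersion of a gaze window: (max x - min x) + (max y - min y)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(samples, max_dispersion=30.0, min_samples=5):
    """samples: list of (x, y) gaze points at a fixed sampling rate.
    Returns a list of (start_index, end_index, centroid) fixations."""
    fixations = []
    start = 0
    while start + min_samples <= len(samples):
        end = start + min_samples
        if _dispersion(samples[start:end]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while end < len(samples) and _dispersion(samples[start:end + 1]) <= max_dispersion:
                end += 1
            xs = [p[0] for p in samples[start:end]]
            ys = [p[1] for p in samples[start:end]]
            fixations.append((start, end - 1,
                              (sum(xs) / len(xs), sum(ys) / len(ys))))
            start = end
        else:
            start += 1
    return fixations

# Synthetic trace: a fixation near (100, 100), a saccade, then one near (400, 300).
trace = [(100 + i % 3, 100 + i % 2) for i in range(10)] + \
        [(200, 180), (300, 250)] + \
        [(400 + i % 2, 300 + i % 3) for i in range(10)]
print(detect_fixations(trace))
```

On the synthetic trace, the detector recovers the two stable clusters as fixations and skips the saccade samples between them.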

25 pages, 3079 KiB  
Article
Binocular Rivalry Impact on Macroblock-Loss Error Concealment for Stereoscopic 3D Video Transmission
by Md Mehedi Hasan, Md. Azam Hossain, Naif Alotaibi, John F. Arnold and AKM Azad
Sensors 2023, 23(7), 3604; https://doi.org/10.3390/s23073604 - 30 Mar 2023
Cited by 1 | Viewed by 1410
Abstract
Three-dimensional video services delivered through wireless communication channels face numerous challenges due to the limitations of both the transmission channel's bandwidth and the receiving devices. Adverse channel conditions, delays, or jitter can result in bit errors and packet losses, which can alter the appearance of stereoscopic 3D (S3D) video. When the two eyes perceive dissimilar patterns, the brain cannot fuse them into a stable composite pattern, and each pattern tries to dominate by suppressing the other, producing a psychovisual sensation called binocular rivalry. As a result, otherwise undetectable changes cause irritating flickering effects, leading to visual discomforts such as eye strain, headache, nausea, and weariness. This study addresses the observer's quality of experience (QoE) by analyzing the impact of binocular rivalry on macroblock (MB) losses in a frame and their error propagation due to predictive frame encoding in stereoscopic video transmission systems. The Joint Test Model (JM) reference software, recommended by the International Telecommunication Union (ITU), was used to simulate the processing of the experimental videos. Existing error-concealment techniques were then applied to the contiguous lost MBs for a variety of transmission impairments. To validate the authenticity of the simulated packet-loss environment, several objective evaluations were carried out. Standard numbers of subjects were then engaged in subjective testing of common 3D video sequences. The results were statistically examined using a standard Student's t-test, allowing the impact of binocular rivalry to be compared with that of a non-rivalry error condition. The major goal is to ensure error-free video communication by minimizing the negative impacts of binocular rivalry and boosting the ability to efficiently integrate 3D video material, improving viewers' overall QoE.
(This article belongs to the Special Issue Vision Science and Technology in Human Computer Interaction Systems)
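The subjective comparison this study describes hinges on a paired Student's t-test across subjects. As a hedged illustration (the scores below are hypothetical, not data from the paper), the paired statistic can be computed directly:

```python
# Paired Student's t-test on per-subject quality scores under rivalry vs.
# non-rivalry error conditions. All scores here are hypothetical examples.
import math

def paired_t(a, b):
    """Return the paired t statistic and degrees of freedom for samples a, b."""
    n = len(a)
    diffs = [x - y for x, y in zip(a, b)]
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)
    return t, n - 1

# Hypothetical mean opinion scores (1-5) for the same 8 subjects.
rivalry     = [2.1, 2.4, 1.9, 2.8, 2.2, 2.5, 2.0, 2.3]
non_rivalry = [3.4, 3.1, 3.0, 3.6, 3.2, 3.5, 2.9, 3.3]
t_stat, dof = paired_t(rivalry, non_rivalry)
print(t_stat, dof)  # a large negative t indicates lower QoE under rivalry
```

The same test is available as `scipy.stats.ttest_rel`, which additionally returns the p-value for the computed statistic.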

15 pages, 46397 KiB  
Article
An Augmented Reality Serious Game for Learning Intelligent Wheelchair Control: Comparing Configuration and Tracking Methods
by Rafael Maio, Bernardo Marques, João Alves, Beatriz Sousa Santos, Paulo Dias and Nuno Lau
Sensors 2022, 22(20), 7788; https://doi.org/10.3390/s22207788 - 13 Oct 2022
Cited by 2 | Viewed by 1671
Abstract
This work proposes an augmented reality serious game (ARSG) for supporting individuals with motor disabilities in controlling robotic wheelchairs. A racing track was used as the game narrative; it included restriction areas, static and dynamic virtual objects, obstacles, and signs. Experiencing the game required a prior configuration of the environment, made through a smartphone or a computer. Furthermore, a visualization tool was developed to exhibit user performance while using the ARSG. Two user studies were conducted with 10 and 20 participants, respectively, to compare (1) how different devices support configuring the ARSG, and (2) different tracking capabilities, i.e., methods used to place virtual content in the real-world environment while the user interacts with the game and controls the wheelchair in physical space: C1, motion tracking using cloud anchors; C2, offline motion tracking. Results suggest that configuring the environment with the computer is more efficient and accurate, whereas the smartphone is more engaging. In addition, condition C1 stood out as more accurate and robust, while condition C2 appeared to be easier to use.
(This article belongs to the Special Issue Vision Science and Technology in Human Computer Interaction Systems)

Review


14 pages, 1083 KiB  
Review
An Overview of Wearable Haptic Technologies and Their Performance in Virtual Object Exploration
by Myla van Wegen, Just L. Herder, Rolf Adelsberger, Manuela Pastore-Wapp, Erwin E. H. van Wegen, Stephan Bohlhalter, Tobias Nef, Paul Krack and Tim Vanbellingen
Sensors 2023, 23(3), 1563; https://doi.org/10.3390/s23031563 - 01 Feb 2023
Cited by 6 | Viewed by 5103
Abstract
We often interact with our environment through manual handling of objects and exploration of their properties. Object properties (OPs) such as texture, stiffness, size, shape, temperature, weight, and orientation provide the information necessary to successfully perform interactions, and the human haptic perception system plays a key role in this. As virtual reality (VR) has become a growing field of interest with many applications, adding haptic feedback to virtual experiences is another step towards more realistic virtual interactions. However, integrating haptics in a realistic manner requires complex technological solutions and actual user testing in virtual environments (VEs) for verification. This review provides a comprehensive overview of recent wearable haptic devices (HDs), categorized by the OP exploration for which they have been verified in a VE. We found 13 studies that specifically addressed user testing of wearable HDs in healthy subjects. We map and discuss the different technological solutions for different OP explorations, which are useful for the design of future haptic object interactions in VR, and provide recommendations for future work.
(This article belongs to the Special Issue Vision Science and Technology in Human Computer Interaction Systems)
