Application of Intelligent Human-Computer Interaction

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: 20 November 2024 | Viewed by 662

Special Issue Editors


Guest Editor
Department of Computer Engineering, Hongik University, Seoul 04066, Republic of Korea
Interests: virtual reality; haptics; human-computer interaction; human-robot interaction

Guest Editor
Center for Intelligent & Interactive Robotics, Korea Institute of Science and Technology, Seoul 02792, Republic of Korea
Interests: human-computer interaction; human-robot interaction; virtual reality

Special Issue Information

Dear Colleagues,

We live in an era of computers, with which people interact all day, both at work and at home. As a result, the significance of Human–Computer Interaction (HCI) has grown greater than ever. Meanwhile, the last decade has also brought a technological revolution in machine intelligence and artificial intelligence. Intelligent Human–Computer Interaction (IHCI) has emerged from this background, encompassing studies on HCI, artificial intelligence, machine intelligence, computer vision, and signal processing.

This Special Issue is devoted to IHCI methods. The authors are encouraged to submit original research articles on HCI and machine intelligence on topics including, but not limited to, the following:

  • Interaction Design;
  • Intelligent Interfaces;
  • Augmented/Virtual/Extended Reality;
  • Machine learning studies in HCI;
  • AI studies in HCI;
  • Haptics;
  • Human–Robot Interaction;

Dr. Jaeyoung Park
Dr. Jung-Min Park
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website and then proceeding to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • interaction design
  • intelligent interfaces
  • augmented/virtual/extended reality
  • machine learning studies in HCI
  • AI studies in HCI
  • haptics
  • human–robot interaction

Published Papers (1 paper)


Research

24 pages, 9262 KiB  
Article
Automated Image-Based User Interface Color Theme Generation
by Primož Weingerl
Appl. Sci. 2024, 14(7), 2850; https://doi.org/10.3390/app14072850 - 28 Mar 2024
Viewed by 373
Abstract
Color plays an essential role in the design of user interfaces and significantly impacts the user experience. To create aesthetically pleasing and user-friendly interfaces, the colors of the user interface should be consistent with the images. The latter can be challenging to achieve, as images often have different colors and are often changed by editors or authors who do not have sufficient design knowledge. To solve this problem, we have developed a model that automatically determines the color theme of the user interface based on a given image. The model first extracts the most prominent colors from the image and then considers both aesthetic (color harmony and compatibility with the image) and usability aspects (color contrast, color diversity, and color strength). All color calculations are performed in the perceptually uniform color space CAM02-UCS. In addition, the model can be adapted to the user’s needs and requirements. To test the model, we implemented it in a web-based application in which the colors were automatically selected based on the featured image. The resulting color themes were then evaluated by the users, who were mainly professional designers. According to the results, the model generates color themes that are consistent with the image, aesthetic, and user-friendly. An important observation was also that color harmony can be achieved simply by using the most prominent colors of the image (regardless of their hue), suggesting that color harmony is strongly influenced by the context of use. The presented model holds significant practical importance as it can be utilized in various applications and tools. For instance, it can automatically choose a color theme for a user interface based on a particular image, such as a company logo or a product image. Moreover, it can dynamically adjust the colors of elements in real time based on the image that is visible simultaneously with the elements.
(This article belongs to the Special Issue Application of Intelligent Human-Computer Interaction)
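The abstract above describes a two-stage pipeline: extract the most prominent colors from an image, then score candidate themes on aesthetic and usability criteria in CAM02-UCS. The paper's own metrics are not reproduced here; as a rough, simplified illustration of the idea only, the sketch below substitutes plain frequency counting for prominent-color extraction and the standard WCAG contrast-ratio formula for the usability check. All function names and the toy image are hypothetical and not taken from the article.

```python
from collections import Counter

def prominent_colors(pixels, k=3):
    """Return the k most frequent colors in a list of (R, G, B) pixels."""
    return [color for color, _ in Counter(pixels).most_common(k)]

def relative_luminance(rgb):
    """WCAG relative luminance of an sRGB color with 0-255 components."""
    def linearize(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(c1, c2):
    """WCAG contrast ratio between two colors (ranges from 1.0 to 21.0)."""
    l1, l2 = sorted((relative_luminance(c1), relative_luminance(c2)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Toy "image": mostly dark blue, some white, a little red.
image = [(20, 30, 90)] * 70 + [(255, 255, 255)] * 25 + [(200, 40, 40)] * 5

theme = prominent_colors(image, k=2)
print("theme:", theme)
print("contrast:", round(contrast_ratio(theme[0], theme[1]), 2))
```

A real implementation along the paper's lines would cluster colors in a perceptually uniform space (CAM02-UCS) rather than count exact RGB triples, and would trade off harmony, diversity, and strength rather than a single contrast score, but the overall shape (extract, then score for readability) is the same.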
