Article

An Emotional Design Model for Future Smart Product Based on Grounded Theory

1 Academy of Art and Design, Tsinghua University, Beijing 100084, China
2 Central Academy of Fine Arts, Beijing 100105, China
3 School of Future Design, Beijing Normal University, Beijing 100091, China
4 College of Education, Zhejiang University of Technology, Hangzhou 310014, China
* Author to whom correspondence should be addressed.
Systems 2023, 11(7), 377; https://doi.org/10.3390/systems11070377
Submission received: 30 May 2023 / Revised: 13 July 2023 / Accepted: 21 July 2023 / Published: 23 July 2023
(This article belongs to the Special Issue Futures Thinking in Design Systems and Social Transformation)

Abstract

Recently, smart products have not only demonstrated greater functionality and technical capability but have also shown a trend towards emotional expression. Emotional design plays a crucial role in smart products: it not only influences users’ perception and evaluation of a product but also promotes collaborative communication between users and the product. In the future, the emotional design of smart products needs to be treated as a comprehensive design issue rather than one targeting a single element; it should consider factors such as design systems, values, business strategies, technical capabilities, design ethics, and cultural responsibilities. However, no design model currently combines these elements. At the same time, there are numerous emotional design practices for smart products from different perspectives, which provide an opportunity to build a comprehensive design model from a large number of design case studies. Therefore, this study employed a standardized grounded theory approach to investigate 80 smart products and conducted interviews with 12 designers to progressively code and generate a design model. Through the coding process, this research extracted 547 nodes and gradually formed 10 categories, ultimately resulting in a design model comprising 5 sequential steps: user requirements, concept definition, design ideation, design implementation, and evaluation. The model is applicable to most current and future emotional design issues in smart products.

1. Introduction

In recent years, progress in artificial intelligence technology has led to the increasing maturity of smart products in terms of recognition, identification, and expression. These products not only have the capability to handle a wide range of complex tasks but also possess the ability to recognize user emotions and engage in more natural communication through various modes of expression [1]. This points to a future in which smart products provide users with richer emotional experiences, a prospect that is attracting growing attention from design researchers [2]. Several studies have provided evidence that smart products capable of expressing emotions are perceived as more likable and trustworthy within human teams [3,4]. This capability enhances mutual understanding and fosters trust among team members [5,6,7]. Early in the 21st century, research on emotional design proposed that emotions would become an integral part of machine design in the future [8]. Nowadays, with the advancement of artificial intelligence technology, design researchers are facing a wider range of complex research opportunities within this field [9,10,11,12,13].
The user’s emotional experience with a product is subjective and arises from product interaction. Desmet and Hekkert established a framework of product experience with three components: aesthetic experience, experience of meaning, and emotional experience [14]. Norman also proposed three aspects of emotional design: visceral (perceptual), behavioral (expectation), and reflective (intellectual) [15]. Lin effectively combined cultural and emotional design, supporting Norman’s arguments [16]. Scherer emphasized the connection of emotions to external or internal events, which enables the design of emotional experiences [17]. Many studies have utilized these theories to create emotional experiences in smart product design; examples include setting a gender for voice assistants [18,19] and designing dynamically changing light colors for smart speakers [9,10,13]. Although much research has been conducted on the emotional design of smart products, there is not yet a design model that encompasses all of the important aspects. Furthermore, some theories that attempt to discuss emotional design systems for smart products are primarily based on the designer’s perspective [8,20], and there is currently a lack of empirical research based on a large number of existing product design cases.
As discussed above, because the emotional design of smart products continues to grow in complexity, it should be regarded as a comprehensive and systematic issue in the future [20]. It is not a single, isolated design issue; rather, it is relevant to and coherent with user needs, values, business strategies, technical capabilities, design ethics, and cultural responsibility [21]. In other domains, it has already been shown that emotional design can be summarized into systematic design models [16], and we believe the same can be done for smart products. Fortunately, the emotional design of smart products has been applied in a number of cases, allowing us to develop design models more objectively and effectively by qualitatively analyzing the large amount of available information and opinions [22]. Therefore, the goal of our study is to propose a model for the emotional design of future smart products based on the analysis of a large amount of data using a grounded theory approach.

2. Literature Review

2.1. Examples of Emotional Design for Smart Products

The concept of emotional design has been widely adopted in both industry and academia. A few examples are displaying emoticons to represent a product’s emotional state [23,24,25], influencing a user’s perception of a product through the gender of its voice [26,27,28,29], and selecting a product’s colors based on the correlation between emotion and color [10,30,31]. Even though all of the above address emotional design in smart products, they do not address it systematically.
In recent years, some design studies have attempted to systematize smart products’ emotional design. Examples include using anthropomorphism as a core module [32,33] and giving smart products their own personalities as design cues. Further examples include the dynamic change of light to simulate product breathing [34] and the design of behavioral actions for a product [35,36]. There are also examples of two-way interactions in which the user’s behavior affects the product’s emotional presentation [37,38].
The majority of these design examples focus on the emotional presentation and interaction with smart products. Few integrate other elements, such as business strategy, technical competence, and cultural awareness.

2.2. Various Perspectives on the Emotional Design for Smart Products

The emotional design of smart products is an interdisciplinary topic that can be discussed from the perspectives of different disciplines such as computer science, design, psychology, and sociology. Norman, a design psychologist, has stated that when carrying out emotional design, the challenge is not only to express emotions through design elements but also to make them communicate with humans through a system that meets the needs of the machine [8]. It is critical to understand that emotion is not just a surface expression; rather, it is a central organizational structure that facilitates the integration of the product’s many modules [39].
Researchers from various disciplines have presented a variety of perspectives on this topic. From Picard’s perspective, technological capability is crucial for products to recognize, understand, and express emotions [40]. According to Fogg’s work on persuasive technology, products should be designed around human social cues if they are to be viewed as individuals with emotions [41]. Meanwhile, Chapman believes that when making emotional designs, users must become collaborators and provide ideas to the designer [2].
As a result, we suggest that all of the above points are valid and should be taken into account in a comprehensive manner when developing models for the emotional design of future smart products. As part of the process of developing a model, a variety of perspectives from different disciplines can be utilized as references to improve the reliability of our design models.

2.3. Models Related to Smart Product Design

Current research on product design models falls into two main categories: models that focus on analysis and models that focus on practice. Analysis-oriented design models, such as design thinking models [32,33,34,35,36,37,38,39,40,41,42,43,44], analyze the design from the designer’s perspective. In general, these models are based on a macro design perspective and can be used to analyze almost any design problem; however, they do not provide effective, specific guidance on particular design issues. A design model more closely related to this study is the emotion-based design analysis model [8,20,45,46], which proposes designing in terms of the emotional needs of users but fails to describe other important aspects of a product, including technology, business, and management.
Practice-focused design models can be classified into two categories: design practice and development practice. Design practice models [47] describe design processes for specific topics, such as creative interface design [48], air conditioner appearance design [49], and metric product designs [50]. Most of these models are derived from design teams’ experience, so they are subjective in nature. Development practice models, on the other hand, focus mainly on the product production cycle [51,52,53] and its technical realization [54] but do not consider the role design plays during this period.
This study aims to provide an objective and reliable design model for smart products’ emotional design. Rather than focusing on the designer’s point of view, we formulate our model by analyzing and summarizing a wide range of product information and user interviews, which allows us to consider non-design aspects such as technology and business. The resulting model is therefore useful both for design analysis and as a guide to emotional product design practice.

2.4. Selection and Comparison of Research Methods

To conduct this study, we investigated and filtered candidate research methods. We first examined common research methodologies used in design, such as Kansei engineering [55] and research through design [56], which generate design models based on a summary of design teams’ experience. Second, we considered qualitative research methods, such as observational methods [57], interpretive phenomenological analysis [58], and grounded theory [59].
According to Smith and Morrow, a theoretical model represents a reasonable assumption about and simplification of data [60], so we believe that grounded theory is a better fit for this study. Through the analysis of large amounts of raw data and step-by-step abstraction of those data, new theories or models may be constructed. As discussed earlier, there is presently no complete model of emotional design for smart products. Standardized grounded theory can therefore be used to generate valid design models from raw data when little similar reference information is available.
Thus, this study applied grounded theory to develop a theoretical model for the emotional design of smart products. In addition to providing researchers with a reliable design process, this model will also facilitate the production, development, management, and marketing of smart products, incorporating emotional elements into the product lifecycle.

3. Research Methodology

The objective of this study is to develop a systematic model for the emotional design of smart products. This paper aims to integrate various perspectives through qualitative research, including the analysis of case studies and user interviews to draw reliable conclusions.

3.1. Study Design

To build an emotional design model for smart products, grounded theory [59] was chosen. Grounded theory allowed us to build the model without being constrained by existing theories [61]. We used the standardized grounded theory coding steps [62], which include a series of processes such as data collection, data analysis, theory coding, and theory saturation testing, as shown in Figure 1.

3.2. Research Participants

The coding was conducted jointly by three Master of Design students. At the end of each coding session, two other Master of Design students were invited to join a consistency discussion from a third-party perspective to reduce errors caused by subjectivity [63]. For the data collection phase, a total of five postgraduate design students were invited to collaborate on gathering smart product data, again to reduce subjectivity.

3.3. Data Collection

To encompass the perspectives of both designers and users on the emotional design of smart products [15], this study used two sources of information: first, product descriptions and user reviews; and second, semi-structured in-depth interviews. For data collection, we followed the principles of purposeful sampling [64] and used an intensity sampling approach [65], which refers to selecting the samples that are most relevant and informative for the study’s objectives. Because intensity sampling is a subjective procedure, five students were invited to participate in the data collection process together.

3.3.1. Product Introduction and Users’ Comments

We invited five design postgraduates to collect product data and user evaluations that met the following criteria: 1. The smart product possesses the capability to recognize, define, and express emotions, and this is explicitly described in pertinent materials such as development documents, promotional web pages, and advertising videos; 2. The product had been released and used by real users and was not just a conceptual design; and 3. User evaluations were confirmed to be based on real experiences rather than hypothetical ones, as determined through purchase details, photos, videos, and other means. After collecting data from various sources, we employed triangulation, which involves cross-referencing data from multiple perspectives to filter them [66]; for example, the team collaborated to determine validity by comparing user evaluations with product descriptions. We collected 77 smart product profiles and their user reviews from 10 different countries as initial data. Each product case contains at least 10 valid introduction statements and at least 5 valid user reviews.

3.3.2. Interviews

Using semi-structured in-depth interviews, we collected data on the multiple meanings and perceptions of actions and events [67]. We interviewed five men and five women with experience in smart product design. Four of them were corporate employees and six were graduate design students.
Each interview was divided into two parts, conducted face-to-face in a quiet room, and lasted about 30 min. In the first part, we sought to understand the interviewee’s experiences with and viewpoints on smart products; in the second part, we invited the participant to imagine creating a new design for an existing smart product that can understand users and express emotions.

3.4. Data Analysis

We conducted a two-part coding process using NVivo: substantive coding and theoretical coding. Substantive coding includes open, axial, and selective coding. Theoretical coding involves analyzing and comparing the coding results with existing research to generate new theory. We referred to studies on different types of affective phenomena [14,17] to conduct the coding work. To reduce individual coder subjectivity, a consensual qualitative research (CQR) approach was chosen for the data analysis process [68]. According to the consensus principle [63], the coding was performed by three graduate students in the design field. Additionally, two other postgraduate design students were invited to discuss the results after the coding process had been completed.

3.4.1. Substantive Coding

During our substantive coding process, we began with open coding. The text was examined sentence by sentence, and keywords were identified from the original sources and coded accordingly. A total of 547 nodes were generated, including “nerdy”, “voice gender”, “trustworthiness”, “color scheme”, “situational awareness”, and so on. Afterward, we performed axial coding, analyzing the associations between these nodes to produce 85 core elements, including design philosophy, appearance, personality, and anthropomorphic behavior, among others. Finally, selective coding was conducted to integrate these core elements into 10 core emotional design categories, such as user emotional needs and product persona concepts.
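The three coding levels form a simple containment hierarchy: open-coding nodes are grouped into axial codes, which are in turn grouped into selective categories. The sketch below, written in TypeScript purely for illustration, shows one way this hierarchy could be represented and how the per-category node counts reported in Table 1 (e.g., 41 open codes under “User’s Emotional Needs”) can be recomputed from it. The names and counts are taken from Table 1; the data structure and helper function are hypothetical examples, not part of the study’s NVivo workflow.

```typescript
// Illustrative sketch of the grounded-theory coding hierarchy used in this study:
// open-coding nodes roll up into axial codes, which roll up into selective categories.

interface AxialCode {
  name: string;
  openCodeCount: number; // number of open-coding nodes grouped under this axial code
}

interface SelectiveCategory {
  name: string;
  axialCodes: AxialCode[];
}

// Example data taken from the first row of Table 1.
const userEmotionalNeeds: SelectiveCategory = {
  name: "User's Emotional Needs",
  axialCodes: [
    { name: "Demand Expectations", openCodeCount: 4 },
    { name: "Emotional Expectations", openCodeCount: 4 },
    { name: "User Preference", openCodeCount: 4 },
    { name: "Demand Adaptability", openCodeCount: 7 },
    { name: "Scenario Adaptability", openCodeCount: 7 },
    { name: "Group Category", openCodeCount: 4 },
    { name: "Social Concern", openCodeCount: 5 },
    { name: "Demand Category", openCodeCount: 6 },
  ],
};

// Hypothetical helper: recompute the sub-total of open codes for a selective category.
function openCodeSubtotal(category: SelectiveCategory): number {
  return category.axialCodes.reduce((sum, code) => sum + code.openCodeCount, 0);
}

console.log(openCodeSubtotal(userEmotionalNeeds)); // 41, matching the sub-total in Table 1
```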

3.4.2. Theoretical Coding

We developed an emotional design model for smart products by comparing the substantive coding categories with existing research. By comparing the Double Diamond Model [42], IDEO’s Design Process Model [69], and Stanford University’s Design Thinking Model [43], we identified clues for theoretical coding.
According to the Design Thinking Model, four substantive coding categories, such as “Design Elements”, can be grouped into design practices related to the Prototype stage. Similarly, “Role Construction” and “Character and Behavior” can be classified as “Emotional Attributes of Smart Products”, corresponding to the Ideate stage [70]. Additionally, the other core categories can be grouped into the Empathy, Define, and Test stages to formulate a complete design model framework.

3.4.3. Theoretical Saturation Test

As a saturation test, we added two interviews and three product cases. Both additional interviewees were male employees with experience in designing smart products. No new concepts or categories were generated, indicating that our emotional design model for smart products had largely reached saturation [71]. Additionally, we extracted 75 statements from the original data that described the emotional design thinking of smart products in order to confirm that the design thinking implied in the original data is reflected in our derived design model.

4. Results

4.1. Raw Data and Database

Data for substantive coding and the theoretical saturation test were collected by five graduate students in the field of design. As raw data, 80 product descriptions and user comments on smart products (the original 77 cases plus the 3 cases from the theoretical saturation test) were gathered from 10 different countries. We collected 971 emotion-relevant product introduction statements and 416 related user comments. We also interviewed 12 smart product designers (the original 10 interviewees plus the 2 from the theoretical saturation test), aged 20–30, 6 men and 6 women, 8 of whom were postgraduates in design and 4 of whom were company employees. In total, approximately 6 h and 11 min of interviews were recorded.
Open coding was used to summarize the above data, resulting in 547 nodes that represent a valid abstraction of the original material. These nodes cover a wide range of aspects related to the emotionalization of smart products, including “multi-modality”, “bionics”, “identity value”, “lightness”, “portability”, “minimalist appearance”, “ease of use for children”, and so on.

4.2. Coding Result

Table 1 presents the coding results. To begin with, the 547 open coding nodes were compared and generalized by 3 students independently, resulting in 63, 87, and 69 axial codes, respectively. Forty-seven of these items were essentially consistent, differing only in their descriptions. The remaining thirty-eight axial codes, which showed larger differences, were confirmed following consistency discussions. As a result, the total number of axial codes is 85. These codes include “Demand Expectations”, “Design Vision”, “Occupational Role”, “Material”, “Ease of Use”, “Operation Setting”, etc. To ensure that the axial codes are clear and encompass all aspects of the emotional design of smart products, two external design postgraduates were invited to participate in a two-hour consistency discussion.
After axial coding, we systematically analyzed and compared the codes and, at a theoretical level, identified the core categories. Selective coding followed the same workflow as axial coding, with three design students working collaboratively to produce ten core categories. The three students agreed on eight categories: “User Emotional Needs”, “Concepts for Emotional Design”, “Role Construction”, “Character and Behavior”, “Emotionally Relevant Functions”, “Technical Capabilities”, “Product Evaluations”, and “User Emotional Experience”. Initial disagreements regarding “Use and Configuration” and “Design Elements” were resolved after a consistency discussion. These ten categories represent the ten most critical aspects of smart products’ emotional design.

4.3. Category Definition and Model Presentation

A comparison with existing design models led to the development of the emotional design model for smart products. We referred to IDEO’s Design Process Model [69] and Stanford University’s Design Thinking Model [43] to further group the ten core categories into five classes: “User’s Emotional Needs”, “Concepts for Emotional Design”, “Emotional Attributes of Smart Products”, “The Practice of Smart Products’ Emotional Design”, and “Experience and Evaluation”. Figure 2 illustrates how these five classes are linked in a linear relationship to produce the emotional design model for smart products.

4.4. Theoretical Saturation Test

With the addition of two interviews and three product cases, our theoretical saturation test indicated that the model did not continue to generate new concepts or categories. In addition, we extracted 75 statements describing the emotional design thinking of smart products from the original data, enumerated their causal relationships, and represented them as a Sankey diagram using ECharts. The diagram in Figure 3 shows that the present model largely matches the design logic implied by the original data. As a result, our model for smart products’ emotional design is largely saturated [71].
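For readers who wish to reproduce this kind of visualization, the sketch below shows a minimal Apache ECharts Sankey configuration in TypeScript. The node names are drawn from the model’s categories for illustration only, and the link weights are invented placeholders; they are not the actual data behind Figure 3, and the element id used for mounting is likewise hypothetical.

```typescript
// Minimal sketch of a Sankey diagram in Apache ECharts, similar in spirit to Figure 3.
// Node names and link weights are illustrative placeholders, not the study's data.
import * as echarts from 'echarts';

const chart = echarts.init(document.getElementById('sankey') as HTMLDivElement);

chart.setOption({
  series: [
    {
      type: 'sankey',
      emphasis: { focus: 'adjacency' },
      // Nodes: one per coded category appearing in the causal statements.
      data: [
        { name: "User's Emotional Needs" },
        { name: 'Concepts for Emotional Design' },
        { name: 'Role Construction' },
        { name: 'Design Elements' },
        { name: 'User Emotional Experience' },
      ],
      // Links: how often one category was described as leading to another.
      links: [
        { source: "User's Emotional Needs", target: 'Concepts for Emotional Design', value: 12 },
        { source: 'Concepts for Emotional Design', target: 'Role Construction', value: 8 },
        { source: 'Role Construction', target: 'Design Elements', value: 6 },
        { source: 'Design Elements', target: 'User Emotional Experience', value: 9 },
      ],
    },
  ],
});
```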

4.5. Selective Coding Interpretation

In accordance with the standardized grounded theory coding process [62], this study summarized 10 categories through selective coding, as shown in Table 2. These 10 categories represent an abstraction and summary of the full range of emotional design for future smart products.
  • User’s Emotional Needs: This category involves user and scenario analysis. Examples include the selection of target users and target scenarios, as well as preferences derived from user research, such as emotional expectations. It also covers special needs arising from socio-cultural contexts. Based on this category, the real emotional needs of users are explained [72] and the core issues for emotional design are defined [47].
  • Concepts for Emotional Design: The design team identifies the core design concept by combining other information, such as cultural concepts and business strategies [73]. As an example, different design teams may propose concepts such as “emotion-relevant personalization” and “minimalism” for smart home products. It is important to define these design concepts in conjunction with social values, design ethics [74], business strategies, and design visions [75], which can be used to guide the design process in a variety of ways.
  • Role Construction: When brainstorming ideas for the emotional design of a smart product [76], designers often treat it as a living thing, and role construction plays an important part in this. Roles can be viewed in terms of occupational roles, interpersonal roles, and product roles, such as “doctor”, “friend”, or “toy”. According to the theory of role construction in social psychology, people subconsciously presuppose each other’s appearances and personalities when they interact, and role constructs also make it possible to predict the behavior of others. Therefore, defining the appropriate role for a product can help a design team generate more specific design ideas.
  • Character and Behavior: When design teams treat smart products as living beings for creative design, it is also important to consider their character and behavior. Designers often create personality types for smart products by imagining “behavioral habits”, “social attributes”, “anthropomorphic traits”, etc. [7,77,78]. In conjunction with the actual functionality and technical capabilities, designers can express the personality and behavior of a product in various ways.
  • Emotionally Relevant Functions: Design teams often start the design from a specific functionality perspective rather than from the experiential aspects. Two aspects determine the functionalities of a smart product: the existing features of the smart product itself and updated features derived from analysis of the users’ emotional needs. For example, for smart speakers, the information query is an existing feature, whereas the child-mode dialogue is a new feature designed specifically for children.
  • Design Elements: When faced with a specific product function, design teams should consider the selection of appropriate design elements [9,79]. “Software”, “hardware”, “appearance”, “materials”, etc., are all design elements necessary to achieve product function. Additionally, it is essential to consider the “shape”, “color”, “digital content”, “size”, etc. of these design elements.
  • Technical Capabilities: Technical capabilities are as significant as design elements, since the emotionality of smart products is enhanced by artificial intelligence technologies [80] such as “context awareness”, “emotion recognition”, and “motion control” [81]. To ensure that the emotion of the smart product can be successfully expressed by the product and experienced by the user, the design team should consider the technical capabilities required to implement each function.
  • Use and Configuration: The design team must decide how users interact with the product after defining design elements and technical capabilities [82,83]. An interaction between the user and the smart product can be defined by concepts such as “interaction methods”, “operation methods”, “hardware and software configurations”, “multi-end collaboration”, and “supporting services”, which enable the user to interact with the smart product and have an experience [11].
  • Product Evaluation: Product evaluations, such as “ease of use”, “proactivity”, and “inclusiveness” [85], can be obtained after a product or prototype has been released and tested by real users [84]. Based on these objective evaluations derived from experiments, the design team can reflect on and further improve the product after the smart product has been developed.
  • User Emotional Experience: In addition to objective evaluations, users also have subjective emotional experiences with smart products [78,86]. For example, users may develop a certain level of attachment and closeness to the product or have anthropomorphic interpretations of the product. Most of these emotional experiences can be used as a useful reference for improving product design in terms of emotion-relevant personalization and emotion-relevant customization.

4.6. Explanation of Theoretical Sampling

The above ten categories are further subdivided into five classes, namely “User’s Emotional Needs”, “Concepts for Emotional Design”, “Emotional Attributes of Smart Products”, “The Practice of Smart Products’ Emotional Design”, and “Experience and Evaluation”. The five classes can be interpreted in light of the Double Diamond Model [42] and can also be compared to Stanford University’s Design Thinking Model [43], which includes stages such as “Empathy”, “Define”, “Ideate”, “Prototype”, and “Test”.

4.6.1. “User’s Emotional Needs” to “Concepts for Emotional Design”

Design teams often use empathy methods, such as observation and engagement, to identify user needs [87]. Understanding users’ needs allows design teams to identify what users expect from a product in terms of emotions, as well as the dissatisfactions and emotional barriers that may exist with smart products [43].
On the basis of the emotional needs of users and the context in which the product is used, a number of emotional design concepts are proposed. These concepts provide the framework for the subsequent design and serve as the basis for developing a successful solution [73].
Here is an example based on the following initial data. A user is dissatisfied with the appearance of an existing smart product and requests a redesign that incorporates abstract design elements. This results in an emotional design concept, such as “minimalist design”.
“If the product looks like a real animal, it would be a bit creepy, and I’d like the appearance to be more abstract than realistic”.

4.6.2. Emotional Attributes of Smart Products

The emotional attributes of smart products are the result of divergent thinking from the design concepts, with the purpose of exploring a wide range of ideas about emotional design. In light of the designer’s intention to create an intelligent object, this phase consists of three factors: role, personality, and behavior [88]. These psychologically relevant attributes [89] are inextricably linked to the emotional experience of the user and can serve as a basis for motivating creative design.
For example, in the product description below, the smart product is described as a “friend” and given the character of “loyal”. In order to fulfill this emotional expectation, the design team designed a “regular update of the user’s desired content” function.
“For many years, it’s been your loyal friend and is constantly updated with interesting content you need”.

4.6.3. The Practice of Smart Products’ Emotional Design

The practice of smart products’ emotional design is the process of bringing ideas out of the mind and into the world. We begin this process by selecting specific product functions, either existing product functions that need to be improved or new product functions based on emotional needs. Following this, the design team considers how to implement this functionality and meet both technical and emotional requirements [9,79]. To meet these needs, it is necessary to take into account hardware, software, appearance, colors, materials, etc., as well as develop intelligent technological capabilities [80]. The final step is to define how the product may be used and configured so that its emotional design can be experienced by the user.
For example, as explained below, the design team proposed that a touch-sensitive function be developed to match the robot Ollie’s animal-like features. To fulfill this emotional function, it was necessary to equip the robot with a touch sensor and develop the corresponding sensing capability.
“The haptic sensor gives Ollie the ability to ‘sense contact’ and respond to touch”.

4.6.4. Experience and Evaluation

To evaluate a product and gather feedback on the solution [11], user experiments should be conducted after the product has been released. The evaluation results of a smart product’s emotional design include objective descriptions derived from experiments as well as subjective user experiences. As the final step in emotional design, this process not only allows the team to review the design but also provides a reliable direction for iterative improvement.
Here’s an example of a robot designed to have consistent behavior, so that the user can anticipate it and feel comfortable communicating with it. This type of positive user experience can be used in subsequent design iterations.
“QTrobot acts in a consistent and predictable way to prevent children from feeling overwhelmed”.

5. Discussion

5.1. Comparison with Existing Models

By comparing the Double Diamond Model [42] and the Design Thinking Model [43], this study performs a theoretical coding step. These two models can explain most design issues from a macro perspective, while our model focuses on a single design issue: the emotional design of future smart products. For example, in the Design Thinking Model, the Ideate stage primarily emphasizes team-based divergent thinking. In contrast, our model provides a more specific description of divergent thinking in terms of roles, personalities, and behaviors. Therefore, it is more useful as a reference for designers regarding this topic. Furthermore, this model differs from other theoretical models of emotional design, such as the Analytical Model of Emotional Design [8,45] and the Creative Interface Design Process Model [48]. They focus their work on analyzing user emotions and presenting specific design elements from the designer’s perspective. However, they fail to take into account numerous factors, including user needs, business strategies, social values, core functions, technical capabilities, etc.
It is important to note that when designing smart products, other factors may affect the development of the product, so taking a single perspective can lead to a failed outcome [90]. Consequently, the model proposed in this study encompasses all aspects of user research, conceptual design, technology development, production, and testing. These enable design teams to look at all aspects of the process. A number of similar models have been developed, including the product development process cycle management model [51,52,53] and the product development practice model [54]. Typically, they are developed from the perspective of product management and product development, while very few are interpreted from a design perspective, and even fewer are interpreted from an emotional perspective. While our design model encompasses the entire process of product design and production, its core theme is not production management but rather emotional design. Further, we propose to integrate psychological elements such as role, personality, and behavior into the emotional design process of smart products, thereby developing a new design model.

5.2. How to Apply This Model in Smart Product Design

The model presented in this study can be a valuable reference for the emotional design of smart products. However, it is recommended that design teams consider making suitable adjustments based on specific circumstances, taking into account other relevant models or theories. Design teams can use the model to analyze the emotional aspects of smart products. In addition to analyzing and improving existing products, it can also be used to design new smart products. An innovative design team has the potential to make meaningful improvements to an existing smart product by following each step in the process. It should be noted that this design model does not require strict adherence. Depending on the design issues at hand, it is possible to skip certain steps or categories. However, we do not recommend reversing the order. For instance, it is acceptable to bypass the “Concepts for Emotional Design” category and propose “Role Construction” directly from “User’s Emotional Needs”. However, it would be challenging to draw meaningful conclusions by considering “User’s Emotional Needs” from “Role Construction” in reverse. If a design problem is identified in the final category, users can revisit the beginning of the model to reassess the problem and redesign accordingly.
For the design of new smart products, this model should not be used alone but in conjunction with other smart product design process models. As this model focuses primarily on the emotional design of smart products, it cannot fully address non-emotional factors such as technical solutions, productivity, and management. When used alone, it may overlook other essential features of a smart product.

6. Conclusions

Emotional design for smart products is currently the subject of intense research, but it has not yet reached its full potential. Existing design theories in this area are too fragmented and subjective to provide guidance. We believe that a design model for this problem is needed to give design teams reliable theoretical guidance on the emotional design of future smart products. It is feasible to construct a reliable, objective, and systematic model by synthesizing fragmented design cases, design perspectives, and design theories. Accordingly, this study collected data through user interviews and case studies and developed a design model based on grounded theory.
As shown in Figure 2, the central contribution of this paper is the model of emotional design for future smart products. Using the grounded theory approach, we derived this design model from a large amount of data. The model aims to describe the emotional design of smart products as a reproducible and iterative process. Design teams can use it as a guide to incorporate emotional design into the development of smart products. Our goal is to expand emotional design from being a single design issue to becoming a comprehensive design system by integrating multiple aspects. Our expectation is that future design researchers will develop new tools and methods based on this model.
Another distinctive aspect of this study is the use of grounded theory in the modeling process. Often, this type of design analysis and design process modeling is constructed through case studies and observation. However, it is difficult to develop emotional design models for smart products in this way because of the lack of established design theory and the difficulty of prototyping. Our findings suggest that grounded theory can be used to develop theoretical models efficiently and fairly objectively in the absence of complete information.
This study also has certain limitations. First, the model is currently at the theoretical stage and has not been tested in practice; moreover, the 10 categories derived from the coding have not yet been thoroughly explored, and adjustments may be necessary during implementation. Second, all of our interviewees were Chinese; although the product cases were collected from 10 different countries, the results may still be biased by cultural differences and social values. Third, this study covers a wide range of scenarios and product types, including healthcare, home, and campus, as well as physical and digital products, but we did not consider whether these different product types differ at the theoretical level. All three issues will be important areas for future work.

Author Contributions

Conceptualization, C.C. and Z.F.; methodology, C.C. and L.X.; validation, C.C. and H.W.; formal analysis, C.C., W.W., Z.Y. and H.W.; investigation, W.W.; resources, W.W. and Z.Y.; data curation, C.C.; writing—original draft preparation, C.C. and Y.C.; writing—review and editing, C.C. and Z.F.; visualization, Y.C.; supervision, C.C. and Z.F.; project administration, C.C.; funding acquisition, Z.F. All authors have read and agreed to the published version of the manuscript.

Funding

This project was funded by Graduate Education Innovation Grants, 080-110208002, Academy of Art and Design, Tsinghua University.

Data Availability Statement

Data are available on request.

Conflicts of Interest

The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Ma, X. Towards Human-Engaged AI. In Proceedings of the 27th International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 13–19 July 2018; pp. 5682–5686. [Google Scholar]
  2. Chapman, J. Emotionally Durable Design; Routledge: Oxfordshire, UK, 2012; ISBN 978-1-136-56743-8. [Google Scholar]
  3. Brave, S.; Nass, C.; Hutchinson, K. Computers That Care: Investigating the Effects of Orientation of Emotion Exhibited by an Embodied Computer Agent. Int. J. Hum.-Comput. Stud. 2005, 62, 161–178. [Google Scholar] [CrossRef]
  4. Paschkewitz, J.; Patt, D. Can AI Make Your Job More Interesting? Issues Sci. Technol. 2020, 37, 74–78. [Google Scholar]
  5. Dalvandi, B. A Model of Empathy for Artificial Agent Teamwork. Ph.D. Thesis, University of Northern British Columbia, Prince George, BC, Canada, 2013. [Google Scholar]
  6. Luca, J.; Tarricone, P. Does Emotional Intelligence Affect Successful Teamwork? In Proceedings of the 18th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education, Melbourne, Australia, 9 December 2001; pp. 367–376. [Google Scholar]
  7. Afzal, S.; Dempsey, B.; D’Helon, C.; Mukhi, N.; Pribic, M.; Sickler, A.; Strong, P.; Vanchiswar, M.; Wilde, L. The Personality of AI Systems in Education: Experiences with the Watson Tutor, a One-on-One Virtual Tutoring System. Child. Educ. 2019, 95, 44–52. [Google Scholar] [CrossRef]
  8. Norman, D.A. Emotional Design: Why We Love (or Hate) Everyday Things; Basic Books: New York, NY, USA, 2004; ISBN 978-0-465-00417-1. [Google Scholar]
  9. Zabala, U.; Rodriguez, I.; Martínez-Otzeta, J.M.; Lazkano, E. Expressing Robot Personality through Talking Body Language. Appl. Sci. 2021, 11, 4639. [Google Scholar] [CrossRef]
  10. Nijdam, N.A. Mapping Emotion to Color. In Book Mapping Emotion to Color; University of Twente: Enschede, The Netherlands, 2009; pp. 2–9. [Google Scholar]
  11. Ma, J.; Feng, X.; Gong, Z.; Zhang, Q. The Design Definition and Research of In-Car Digital AI Assistant. J. Phys. Conf. Ser. 2021, 1802, 032096. [Google Scholar] [CrossRef]
  12. Liu, L.; Zhang, A.; Zhang, L.; Xu, J. Research on Emotional Design of Intelligent Sleep Products for the Elderly Based on Kano Model and Customer Satisfaction and Dissatisfaction Coefficients. In Proceedings of the 2021 2nd International Conference on Intelligent Design (ICID), Xi’an, China, 19 October 2021; pp. 528–531. [Google Scholar]
  13. Johnson, D.O.; Cuijpers, R.H.; van der Pol, D. Imitating Human Emotions with Artificial Facial Expressions. Int. J. Soc. Robot. 2013, 5, 503–513. [Google Scholar] [CrossRef]
  14. Desmet, P.; Hekkert, P. Framework of Product Experience. Int. J. Des. 2007, 1, 57–66. [Google Scholar]
  15. Norman, D.A.; Ortony, A. Designers and users: Two perspectives on emotion and design. In Proceedings of the Symposium on Foundations of Interaction Design, Ivrea, Italy, 12−13 November 2003; pp. 1–13. [Google Scholar]
  16. Lin, R.T. Transforming Taiwan aboriginal cultural features into modern product design: A case study of a cross-cultural product design model. Int. J. Des. 2007, 1, 45–53. [Google Scholar]
  17. Scherer, K.R. What are emotions? And how can they be measured? Soc. Sci. Inf. 2005, 44, 695–729. [Google Scholar] [CrossRef]
  18. Sandygulova, A.; O’Hare, G.M.P. Children’s Perception of Synthesized Voice: Robot’s Gender, Age and Accent. In Social Robotics; Tapus, A., André, E., Martin, J.-C., Ferland, F., Ammi, M., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2015; Volume 9388, pp. 594–602. ISBN 978-3-319-25553-8. [Google Scholar]
  19. Song, S.; Baba, J.; Nakanishi, J.; Yoshikawa, Y.; Ishiguro, H. Mind the Voice!: Effect of Robot Voice Pitch, Robot Voice Gender, and User Gender on User Perception of Teleoperated Robots. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25 April 2020; pp. 1–8. [Google Scholar]
  20. Lottridge, D.; Chignell, M.; Jovicic, A. Affective Interaction: Understanding, Evaluating, and Designing for Human Emotion. Rev. Hum. Factors Ergon. 2011, 7, 197–217. [Google Scholar] [CrossRef]
  21. Asada, M. Towards Artificial Empathy. Int. J. Soc. Robot. 2015, 7, 19–33. [Google Scholar] [CrossRef] [Green Version]
  22. Li, X.; Cai, S. Emotional Design for Intelligent Products Using Artificial Intelligence Technology. In Proceedings of the 2021 2nd International Conference on Intelligent Design (ICID), Xi’an, China, 19 October 2021; pp. 260–263. [Google Scholar]
  23. Zuo, X.; Yu, X.; Du, M.; Song, Q. Generating Consistent Multimodal Dialogue Responses with Emoji Context Model. In Proceedings of the 2022 5th International Conference on Artificial Intelligence and Big Data (ICAIBD), Chengdu, China, 27 May 2022; pp. 617–624. [Google Scholar]
  24. Beattie, A.; Edwards, A.P.; Edwards, C. A Bot and a Smile: Interpersonal Impressions of Chatbots and Humans Using Emoji in Computer-Mediated Communication. Commun. Stud. 2020, 71, 409–427. [Google Scholar] [CrossRef]
  25. Fadhil, A.; Schiavo, G.; Wang, Y.; Yilma, B.A. The Effect of Emojis When Interacting with Conversational Interface Assisted Health Coaching System. In Proceedings of the 12th EAI International Conference on Pervasive Computing Technologies for Healthcare, New York, NY, USA, 21 May 2018; pp. 378–383. [Google Scholar]
  26. Rogers, K.; Bryant, D.; Howard, A. Robot Gendering: Influences on Trust, Occupational Competency, and Preference of Robot over Human. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25 April 2020; pp. 1–7. [Google Scholar]
  27. Ye, H.; Jeong, H.; Zhong, W.; Bhatt, S.; Izzetoglu, K.; Ayaz, H.; Suri, R. The Effect of Anthropomorphization and Gender of a Robot on Human-Robot Interactions. In Advances in Neuroergonomics and Cognitive Engineering; Ayaz, H., Ed.; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2020; Volume 953, pp. 357–362. ISBN 978-3-030-20472-3. [Google Scholar]
  28. Yu, C.; Fu, C.; Chen, R.; Tapus, A. First Attempt of Gender-Free Speech Style Transfer for Genderless Robot. In Proceedings of the 2022 17th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Sapporo, Japan, 7 March 2022; pp. 1110–1113. [Google Scholar]
  29. Eyssel, F.; Kuchenbrandt, D.; Bobinger, S. ‘If You Sound Like Me, You Must Be More Human’: On the Interplay of Robot and User Features on Human—Robot Acceptance and Anthropomorphism. In Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12), New York, NY, USA, 5–8 March 2012; pp. 125–126. [Google Scholar]
  30. Ariyoshi, T.; Nakadai, K.; Tsujino, H. Effect of Facial Colors on Humanoids in Emotion Recognition Using Speech. In Proceedings of the RO-MAN 2004. 13th IEEE International Workshop on Robot and Human Interactive Communication (IEEE Catalog No.04TH8759), Kurashiki, Japan; 2004; pp. 59–64. [Google Scholar]
  31. Kim, M.G.; Lee, H.S.; Park, J.W.; Jo, S.H.; Chung, M.J. Determining Color and Blinking to Support Facial Expression of a Robot for Conveying Emotional Intensity. In Proceedings of the RO-MAN 2008—The 17th IEEE International Symposium on Robot and Human Interactive Communication, Munich, Germany, 1–3 August 2008; pp. 219–224. [Google Scholar]
  32. Duffy, B.R. Anthropomorphism and the Social Robot. Robot. Auton. Syst. 2003, 42, 177–190. [Google Scholar] [CrossRef]
  33. de Visser, E.J.; Monfort, S.S.; McKendrick, R.; Smith, M.A.B.; McKnight, P.E.; Krueger, F.; Parasuraman, R. Almost Human: Anthropomorphism Increases Trust Resilience in Cognitive Agents. J. Exp. Psychol. Appl. 2016, 22, 331–349. [Google Scholar] [CrossRef] [PubMed]
  34. Baraka, K.; Rosenthal, S.; Veloso, M. Enhancing Human Understanding of a Mobile Robot’s State and Actions Using Expressive Lights. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 652–657. [Google Scholar]
  35. Moflin. Available online: https://www.moflin.com (accessed on 5 January 2023).
  36. Emo Robot. Available online: https://comingsoon.higizmos.com/emo (accessed on 5 January 2023).
  37. MarsCat: A Bionic Cat, a Home Robot|Elephant Robotics. Available online: https://www.elephantrobotics.com/en/mars-en/ (accessed on 5 January 2023).
  38. Sony Aibo. Available online: https://us.aibo.com/ (accessed on 5 January 2023).
  39. Marsella, S.; Gratch, J. Modeling Coping Behavior in Virtual Humans: Don’t Worry, Be Happy. In Proceedings of the Second International Joint Conference on Autonomous Agents and Multiagent Systems—AAMAS ’03, Melbourne, Australia, 14–18 July 2003; p. 313. [Google Scholar]
  40. Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA; London, UK, 1997; ISBN 0-262-16170-2. [Google Scholar]
  41. Fogg, B.J. Persuasive Technology: Using Computers to Change What We Think and Do. Ubiquity 2002, 2002, 5. [Google Scholar] [CrossRef] [Green Version]
  42. Design Council a Study of the Design Process. Available online: https://www.designcouncil.org.uk/our-work/skills-learning/resources/11-lessons-managing-design-global-brands/ (accessed on 31 December 2022).
  43. Stanford University Design Thinking Bootleg. Available online: https://dschool.stanford.edu/resources/design-thinking-bootleg (accessed on 31 December 2022).
  44. IDEO Tools. Available online: https://www.ideo.org/tools (accessed on 5 January 2023).
  45. Francalanza, E.; Borg, J.; Fenech, A.; Farrugia, P. Emotional Product Design: Merging Industrial and Engineering Design Perspectives. Procedia CIRP 2019, 84, 124–129. [Google Scholar] [CrossRef]
  46. Zhao, T.; Zhu, T. Exploration of Product Design Emotion Based on Three-Level Theory of Emotional Design. In Human Interaction and Emerging Technologies; Ahram, T., Taiar, R., Colson, S., Choplin, A., Eds.; Advances in Intelligent Systems and Computing; Springer International Publishing: Cham, Switzerland, 2020; Volume 1018, pp. 169–175. ISBN 978-3-030-25628-9. [Google Scholar]
  47. Boisseau, E.; Bouchard, C.; Omhover, J. Towards a Model of the Open-Design Process: Using the Grounded Theory for Modelling Implicit Design Processes; Maier, A., Skec, S., Kim, H., Kokkolaras, M., Oehmen, J., Fadel, G., Salustri, F., VanDerLoos, M., Eds.; The Design Society: Vancouver, BC, Canada, 2017; pp. 121–130. [Google Scholar]
  48. Wang, Z.; He, W.P.; Zhang, D.H.; Cai, H.M.; Yu, S.H. Creative Design Research of Product Appearance Based on Human–Machine Interaction and Interface. J. Mater. Process. Technol. 2002, 129, 545–550. [Google Scholar] [CrossRef]
  49. Yi, Y.; Li, Y. Study on Grounded Theory-Based Product Design. In Proceedings of the 2021 3rd International Conference on Artificial Intelligence and Advanced Manufacture, Manchester, UK, 23 October 2021; pp. 42–46. [Google Scholar]
  50. Lu, R.; Feng, Y.; Zheng, H.; Tan, J. A Product Design Based on Interaction Design and Axiomatic Design Theory. Procedia CIRP 2016, 53, 125–129. [Google Scholar] [CrossRef] [Green Version]
  51. Niu, X.; Wang, M.; Qin, S. Product Design Lifecycle Information Model (PDLIM). Int. J. Adv. Manuf. Technol. 2022, 118, 2311–2337. [Google Scholar] [CrossRef]
  52. Tao, F.; Cheng, J.; Qi, Q.; Zhang, M.; Zhang, H.; Sui, F. Digital Twin-Driven Product Design, Manufacturing and Service with Big Data. Int. J. Adv. Manuf. Technol. 2018, 94, 3563–3576. [Google Scholar] [CrossRef]
  53. Kiritsis, D. Closed-Loop PLM for Intelligent Products in the Era of the Internet of Things. Comput.-Aided Des. 2011, 43, 479–501. [Google Scholar] [CrossRef]
  54. Landowska, A.; Szwoch, M.; Szwoch, W. Methodology of Affective Intervention Design for Intelligent Systems. Interact. Comput. 2016, 28, 737–759. [Google Scholar] [CrossRef]
  55. Hong, L.; Luo, L. Kansei Engineering Design; Tsinghua University Press: Beijing, China, 2015; ISBN 978-7-302-41213-7. [Google Scholar]
  56. Zimmerman, J.; Forlizzi, J.; Evenson, S. Research through Design as a Method for Interaction Design Research in HCI. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, 29 April 2007; pp. 493–502. [Google Scholar]
  57. Ciesielska, M.; Boström, K.W.; Öhlander, M. Observation Methods. In Qualitative Methodologies in Organization Studies; Ciesielska, M., Jemielniak, D., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 33–52. ISBN 978-3-319-65441-6. [Google Scholar]
58. Wojnar, D.M.; Swanson, K.M. Phenomenology: An Exploration. J. Holist. Nurs. 2007, 25, 172–180.
59. Glaser, B.; Strauss, A. Discovery of Grounded Theory: Strategies for Qualitative Research; Routledge: New York, NY, USA, 2017; ISBN 978-0-203-79320-6.
60. Smith, R.P.; Morrow, J.A. Product Development Process Modeling. Des. Stud. 1999, 20, 237–261.
61. Paillé, P. L’analyse Par Théorisation Ancrée. CRS 1994, 23, 147–181.
62. Corbin, J.; Strauss, A. Grounded Theory Research: Procedures, Canons, and Evaluative Criteria. Qual. Sociol. 1990, 13, 3–21.
63. Rihacek, T.; Danelova, E. The Journey of an Integrationist: A Grounded Theory Analysis. Psychotherapy 2016, 53, 78–89.
64. Patton, M.Q. Qualitative Research & Evaluation Methods; SAGE: Thousand Oaks, CA, USA, 2002; ISBN 978-0-7619-1971-1.
65. Chen, X. Qualitative Research in Social Sciences; Educational Science Publication House: Beijing, China, 2000; ISBN 978-7-5041-1926-1.
66. Farquhar, J.; Michels, N.; Robson, J. Triangulation in Industrial Qualitative Case Study Research: Widening the Scope. Ind. Mark. Manag. 2020, 87, 160–170.
67. Gubrium, J.F.; Holstein, J.A. Handbook of Interview Research: Context and Method; SAGE Publications: Thousand Oaks, CA, USA, 2001; ISBN 978-1-4833-6589-3.
68. Hill, C.E.; Thompson, B.J.; Williams, E.N. A Guide to Conducting Consensual Qualitative Research. Couns. Psychol. 1997, 25, 517–572.
69. Brown, T. Design Thinking. Harv. Bus. Rev. 2008, 86, 84.
70. Coughlan, P.; Suri, J.F.; Canales, K. Prototypes as (Design) Tools for Behavioral and Organizational Change: A Design-Based Approach to Help Organizations Change Work Behaviors. J. Appl. Behav. Sci. 2007, 43, 122–134.
71. Charmaz, K. A Constructivist Grounded Theory Analysis of Losing and Regaining a Valued Self. In Five Ways of Doing Qualitative Analysis: Phenomenological Psychology, Grounded Theory, Discourse Analysis, Narrative Research, and Intuitive Inquiry; Guilford Press: New York, NY, USA, 2011; pp. 165–204; ISBN 978-1-60918-142-0.
72. Black, A. Empathic Design: User Focused Strategies for Innovation. Proc. New Prod. Dev. 1998.
73. Wormald, P. Value Proposition for Designers—VP(d): A Tool for Strategic Innovation in New Product Development. Int. J. Bus. Environ. 2015, 7, 262.
74. Damm, L. Moral Machines: Teaching Robots Right from Wrong. Philos. Psychol. 2012, 25, 149–153.
75. d’Anjou, P. Toward an Horizon in Design Ethics. Sci. Eng. Ethics 2010, 16, 355–370.
76. Zong, Y.; GuangXin, W. Anthropomorphism: The Psychological Application in the Interaction between Human and Computer. Psychol. Tech. Appl. 2016, 4, 296–305.
77. LuxAI QT Robot. Available online: https://luxai.com/ (accessed on 31 October 2022).
78. Zhou, M.X.; Mark, G.; Li, J.; Yang, H. Trusting Virtual Agents: The Effect of Personality. ACM Trans. Interact. Intell. Syst. 2019, 9, 10.
79. Feldmaier, J.; Marmat, T.; Kuhn, J.; Diepold, K. Evaluation of a RGB-LED-Based Emotion Display for Affective Agents. arXiv 2016, arXiv:1612.07303.
80. Hegel, F.; Spexard, T.; Wrede, B.; Horstmann, G.; Vogt, T. Playing a Different Imitation Game: Interaction with an Empathic Android Robot. In Proceedings of the 2006 6th IEEE-RAS International Conference on Humanoid Robots, Genova, Italy, 4–6 December 2006; pp. 56–61.
81. Emi, T.; Hagiwara, M. Pose Generation System Expressing Feelings and State. Int. J. Affect. Eng. 2014, 13, 175–184.
82. Dragan, A.D.; Lee, K.C.T.; Srinivasa, S.S. Legibility and Predictability of Robot Motion. In Proceedings of the 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan, 3–6 March 2013; pp. 301–308.
83. Rowland, C.; Goodman, E.; Charlier, M.; Light, A.; Lui, A. Designing Connected Products: UX for the Consumer Internet of Things; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2015; ISBN 1-4493-7272-4.
84. Hollingsed, T.; Novick, D.G. Usability Inspection Methods after 15 Years of Research and Practice. In Proceedings of the 25th Annual ACM International Conference on Design of Communication—SIGDOC ’07, El Paso, TX, USA, 22–24 October 2007; p. 249.
85. Motti, V.G.; Caine, K. Human Factors Considerations in the Design of Wearable Devices. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2014, 58, 1820–1824.
86. Dang, J.; Liu, L. Robots Are Friends as Well as Foes: Ambivalent Attitudes toward Mindful and Mindless AI Robots in the United States and China. Comput. Hum. Behav. 2021, 115, 106612.
87. Dandavate, U.; Sanders, E.B.-N.; Stuart, S. Emotions Matter: User Empathy in the Product Development Process. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 1996, 40, 415–418.
88. van Allen, P. Prototyping Ways of Prototyping AI. Interactions 2018, 25, 46–51.
89. Franzoi, S. Social Psychology, 5th ed.; McGraw-Hill Humanities/Social Sciences/Languages: Boston, MA, USA, 2008; ISBN 978-0-07-337059-0.
90. Yang, Q.; Steinfeld, A.; Rosé, C.; Zimmerman, J. Re-Examining Whether, Why, and How Human-AI Interaction Is Uniquely Difficult to Design. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 21 April 2020; pp. 1–13.
Figure 1. The standardized grounded theory coding steps [58].
Figure 2. The theoretical coding result of grounded theory.
Figure 3. Causal relationships in the original data in relation to emotional design.
Table 1. The Coding Result of Grounded Theory.
Selective Coding | Axial Coding (Number of Open Codes) | Subtotal of Open Codes
User's Emotional Needs | Demand Expectations (4), Emotional Expectations (4), User Preference (4), Demand Adaptability (7), Scenario Adaptability (7), Group Category (4), Social Concern (5), Demand Category (6) | 41
Concepts for Emotional Design | The Interaction Concept (9), Design Vision (8), Design Concept (10), Design Ethics (6), Social Value (6), Business Strategy (6), Product Value (4), Pricing (2), Goal (9) | 60
Role Construction | Occupational Role (9), Interpersonal Role (9), Product Identity (5), Smart Assistant Type (7) | 30
Character and Behavior | Personality (8), Social Attributes (10), Emotional Characteristics (9), Anthropomorphic Behavior (14), Anthropomorphic Design Features (11), Anthropomorphic Design Attributes (8), Anthropomorphic Interaction Design (11) | 71
Emotionally Relevant Functions | Home Function (5), Companion Function (5), Life Function (3), Leisure Function (4), Entertainment Function (4), Office Function (4), Information Function (5), Education Function (8), Communication Function (2), Transportation Function (5), Security Function (4), Health Function (6), Content Generation Function (3), Function Ecology (7), Technology Application (12) | 77
Design Elements | Hardware (6), Software (5), Digital Content (5), Appearance Features (9), Appearance Size (1), Appearance Shape (5), Appearance Style (4), Material (7), Color Style (5), Design Direction (8) | 55
Technical Capabilities | Iterative Ability (3), Recognition Ability (7), Perception Ability (5), Understanding Ability (7), Motor Ability (5), Intelligence Ability (9), Data Ability (5), Product Technology (7), Interaction Technology (7) | 55
Use and Configuration | Software Attribute (2), Hardware Configuration (4), Hardware Attribute (9), Multi-Terminal Collaboration (9), Operation Method (5), Operation Setting (6), Interaction Method (10), Supporting Service (6) | 51
Product Evaluations | Inclusion (5), Limitation (3), Diversity (10), Ease of Use (9), Proactivity (10), Immediacy (2), Product Features (11), Emotion-Relevant Personalization (4) | 54
User Emotional Experience | User Experience (15), Product Impression (Subjective) (12), Product Texture Evaluation (4), Value Evaluation (4), Anthropomorphic Experience (5), User Attitude (4), Emotional Feeling After Use (9) | 53
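As a quick consistency check on Table 1, the per-code counts in each row sum to the listed subtotals, and the subtotals sum to 547 open-coded nodes across the 10 selective categories. The short script below is a minimal illustrative sketch of that tally (our own, not part of the study's materials); the variable names are arbitrary.

```python
# Illustrative arithmetic check: reproduces the per-category subtotals in Table 1
# from the individual open-code counts and sums them across all categories.

open_code_counts = {
    "User's Emotional Needs":          [4, 4, 4, 7, 7, 4, 5, 6],                          # subtotal 41
    "Concepts for Emotional Design":   [9, 8, 10, 6, 6, 6, 4, 2, 9],                      # subtotal 60
    "Role Construction":               [9, 9, 5, 7],                                      # subtotal 30
    "Character and Behavior":          [8, 10, 9, 14, 11, 8, 11],                         # subtotal 71
    "Emotionally Relevant Functions":  [5, 5, 3, 4, 4, 4, 5, 8, 2, 5, 4, 6, 3, 7, 12],    # subtotal 77
    "Design Elements":                 [6, 5, 5, 9, 1, 5, 4, 7, 5, 8],                    # subtotal 55
    "Technical Capabilities":          [3, 7, 5, 7, 5, 9, 5, 7, 7],                       # subtotal 55
    "Use and Configuration":           [2, 4, 9, 9, 5, 6, 10, 6],                         # subtotal 51
    "Product Evaluations":             [5, 3, 10, 9, 10, 2, 11, 4],                       # subtotal 54
    "User Emotional Experience":       [15, 12, 4, 4, 5, 4, 9],                           # subtotal 53
}

for category, counts in open_code_counts.items():
    print(f"{category}: {sum(counts)} open codes")

total = sum(sum(counts) for counts in open_code_counts.values())
print(f"Total open-coded nodes across {len(open_code_counts)} categories: {total}")  # prints 547
```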
Table 2. The Selective Coding Explanations and Original Data Excerpts.
Selective Coding | Explanation | Original Data Excerpts (Axial Codes)
User's Emotional Needs | The emotional requirements or desires that a user has in relation to a product or service. | "I usually just stand in front of him and stop him, just to embarrass him, and then wait to see how he reacts." (Emotional Expectations); "The traditional Chinese pentatonic scale is used to design the sound effect system, making Duer more suitable for Chinese families." (Social Concern)
Concepts for Emotional Design | The principles or ideas that form the basis of a smart product's emotional design. | "Through its minimalist animal-like design and colorful color matching, it awakens people's imagination of freedom." (Design Concept); "My grandfather is in a nursing home. I hope Tombot brings him solace, because he looks so much like the only Pomeranian he remembers: Stewie." (Social Value)
Role Construction | The role of a smart product in human society. | "An artificial-intelligence singer based on an AI deep-learning algorithm that automatically generates music content. At present, there are many supernova singers, such as Xiaobing, He Chang, Chen Shuiruo, and Chen Ziyu." (Occupational Role); "MarsCat is the world's first bionic pet cat developed by Elephant Robotics, a robot pet that brings you comfort and surprise." (Interpersonal Role)
Character and Behavior | The personality traits and behaviors of a smart product. | "The humorous personality that Siri was once proud of has also been overtaken by Google." (Personality); "In a DIY teddy-bear shop, there is a manual step of putting a heart inside, and warming the heart before putting it in." (Anthropomorphic Design Features)
Emotionally Relevant Functions | The specific functions of a smart product that relate to the user's emotional experience. | "It can also notify family members when an elderly user encounters an emergency and assist the elderly in video calls." (Health Function); "Neons can help with goal-oriented tasks and also personalize tasks that require a human touch." (Entertainment Function)
Design Elements | The physical or digital elements that constitute the appearance, functionality, and behavior of smart products. | "Romibo is also equipped with many sensors, including light sensors and acceleration sensors, which can control its trajectory so that the robot can automatically avoid obstacles in front of it." (Hardware); "Among the round or square speaker shapes, Libratone's smart Bluetooth speaker has successfully attracted attention with its cute bird appearance." (Appearance Features)
Technical Capabilities | The specific skills, knowledge, or resources necessary to perform a task or function. | "DuerOS has human language capabilities. It can understand human intentions and communicate with people in natural language." (Understanding Ability); "Moxi follows orders and rules when system data reveals certain changes in patients." (Data Ability)
Use and Configuration | How the smart product is used, and the process of preparing it for use. | "When you hold your iPhone close to HomePod mini, you can take immediate control without unlocking your iPhone." (Multi-Terminal Collaboration); "In terms of emotional interaction, the robot has 41 types of dynamically designed expressions based on 24 emotions." (Interaction Method)
Product Evaluations | Evaluations of smart products derived from user experiments. | "This reflects Apple's longstanding commitment to diversity and inclusion: products and services designed to better reflect the diversity of the world we live in." (Diversity, Inclusion); "Proactively recommend the functions and information currently needed by users; proactively communicate with users to enhance understanding." (Proactivity)
User Emotional Experience | The emotional experience of a user while interacting with a smart product or service. | "Very cool, but I wish they had protections against malicious remote takeover. A robot might pick up a pistol and kill you in your sleep." (User Attitude); "The sound of the broadcast is not blunt or coquettish, and it is comfortable to listen to." (User Experience)
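Table 2 is, in effect, a small codebook: each selective code is paired with a definition and example excerpts tagged with axial codes. For readers who want to organize similar material for their own analysis, the sketch below shows one possible plain-data representation; the class and field names, and the sample entry, are our own assumptions and are not drawn from the study's tooling.

```python
# Illustrative codebook entry linking a selective code to its definition and
# example excerpts tagged with axial codes (structure and names are assumptions).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Excerpt:
    text: str        # quotation from the original data (case description or interview)
    axial_code: str  # the axial code the excerpt was assigned to

@dataclass
class CodebookEntry:
    selective_code: str
    explanation: str
    excerpts: List[Excerpt] = field(default_factory=list)

# Example entry mirroring the "Role Construction" row of Table 2.
role_construction = CodebookEntry(
    selective_code="Role Construction",
    explanation="The role of a smart product in human society.",
    excerpts=[
        Excerpt(
            text="MarsCat is the world's first bionic pet cat developed by Elephant Robotics, "
                 "a robot pet that brings you comfort and surprise.",
            axial_code="Interpersonal Role",
        ),
    ],
)

print(role_construction.selective_code, "-", len(role_construction.excerpts), "tagged excerpt(s)")
```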