Article

A Change of Paradigm for the Design and Reliability Testing of Touch-Based Cabin Controls on the Seats of Self-Driving Cars

1 Instituto de Telecomunicações, 3030-290 Coimbra, Portugal
2 Department of Electrical and Computer Engineering, University of Coimbra, 3030-290 Coimbra, Portugal
3 CIE Plasfil, Zona Industrial Da Gala, Lote 6, 3090-380 Figueira da Foz, Portugal
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Electronics 2022, 11(1), 21; https://doi.org/10.3390/electronics11010021
Submission received: 13 November 2021 / Revised: 11 December 2021 / Accepted: 13 December 2021 / Published: 22 December 2021
(This article belongs to the Special Issue Autonomous Vehicles Technological Trends)

Abstract

The current design paradigm of car cabin components assumes seats aligned with the driving direction. All passengers are aligned with the driver who, until recently, was the only person in charge of controlling the vehicle. The new paradigm of self-driving cars eliminates several of those requirements, releasing the driver from control duties and creating new opportunities for entertaining the passengers during the trip. This creates the need for controls that are closer to each user, namely on the seat. This work proposes the use of low-cost capacitive touch sensors for controlling car functions such as multimedia, seat orientation, door windows, and others. We have reached a functional proof of concept, demonstrated for several cabin functionalities. The proposed concept can be adopted by current car manufacturers without changing the automobile construction pipeline. It is flexible and can accommodate a variety of new functionalities, mostly software-based, added by the manufacturer or customized by the end-user. Moreover, the newly proposed technology uses a smaller number of plastic parts to produce the component, which implies savings in production cost and energy while increasing the life cycle of the component.

1. Introduction

Recent advances in self-driving cars are expected to translate into a significant number of new vehicles circulating under this new paradigm in the coming years [1,2]. Many works state that by 2050 self-driving cars will dominate, which creates new opportunities but also new challenges [3]. In this context, the human driver will likely be released from driving functions, offloading that responsibility to the machine and artificial intelligence algorithms [4,5]. Once relieved from driving duties [6], the driver will benefit from new services inside the car cabin, including those related to multimedia and communications, such as games, work meetings, movies, music and Internet browsing, to name a few. Furthermore, the clear separation between passengers that exists today, imposed by the transmission tunnel on the cockpit’s floor, may no longer exist. In this context, a brand differentiator may well consist of the interior design of the car cabin, namely the seat arrangement and orientation, as well as the floor of the car cabin [7,8].
Therefore, the driver’s seat, as well as the seats of the other passengers, no longer needs to be aligned with the moving direction. Moreover, the seats should be able to rotate on their axis instead of simply sliding back and forth to adjust leg positioning as they do today.
As a consequence, the passengers will no longer be aligned with the driving direction, possibly turning to face each other, which creates the need to add control buttons that operate general functionalities from their seats, as opposed to having these control buttons on the doors and central panel.
The development of this technology will imply redesigning the seats to incorporate these controls. Today’s standard for implementing functional controls in the automotive industry is to build buttons as mechanical devices composed of several plastic polymer parts [9]. The current standard presents several disadvantages as compared to robust alternatives that can provide the same functionality within a single plastic part.
Touch-based technologies [10] can pave the way towards the integration of sensors coupled to a single plastic part attached to the seat. In this case, the touch sensors lying below the plastic part will be able to sense the interaction of the passenger to control a certain functionality. This is important as it reduces assembly time and increases the Mean Time Between Failures (MTBF). Moreover, it allows the same part design to be incorporated into many models of the brand, since the different functionalities are controlled by software.
In this work, we developed the proposed technology and designed a new car seat that incorporates a viable design for controlling the opening and closing of car side windows and seat positioning. We were able to develop and test the technology on a real car used in the market and also on a new seat with an innovative design able to be incorporated in self-driving cars.
This article presents the following contributions:
  • Performance analysis of capacitive touch sensors for the automotive industry;
  • Integration of microelectronics to control capacitive sensors with injected polymer plastic to be coupled to the seat;
  • Development and implementation of a testing and reliability assessment system, by automatically applying and monitoring thousands of touch interactions per plastic part;
  • Design of a futuristic car seat with integrated controls;
  • Development, implementation and testing of a functional prototype to control the opening and closing of car side windows and the position of the car seats.
This work is also part of the ‘Collective Efficiency Strategies’, totally aligned with the ‘Mobinov—Automobile Cluster Association’, which identified as one of its main goals “to contribute to making Portugal a reference in research, innovation, design, development, manufacture and testing of products and services of the automotive industry” and “strengthen the competitiveness of a fundamental sector of the economy, promoting an increase in exportation” [9].
This article is structured as follows. Section 2 describes the materials and methods used, analyzing in depth the capacitive touch sensor technology selected for the current context, and validating its use over thousands of cycles of operation. Section 3 addresses implementation and results. It analyzes the design of the newly developed car seat and the methods used to integrate injectable plastic with capacitive touch sensors in it. The discussion and future research directions in Section 4 close the article.

2. Materials and Methods

Distinct sensing technologies, such as inductive, infrared, ultrasound, resistive and capacitive-based ones, were analyzed and tested in depth [9], and their characteristics compared. The decision for the most appropriate technology to use in this context fell upon the capacitive touch sensors depicted in Figure 1a,b. The comparison criteria included ease of use, reliability, cost, ease of integration and design footprint [11,12].

2.1. Touch Sensors for Activating Functions in Polymers

Capacitive touch sensing is a low-cost and low-complexity technology [13] ubiquitous in today’s devices. Thus, in a way, there is already a precedent on how people expect these interfaces to work and how to interact with them [14]. Several capacitive sensors (four examples are depicted in Figure 2) were tested with plastic samples of varying thickness and composition, as exhaustively described in Subsection 3.1 of [9]. The sensors tested offered two different designs and two different channel counts. The experimental results were successful to the extent that the touch detection rate nearly reached 100% for every sensor tested, as presented in Table 2.
Figure 2. The figure depicts the capacitive sensors (a–d) tested for the current design. The electrical properties and datasheets for each sensor can be found in Table 1. The effectiveness tests performed on these sensors are described in Table 2.
Table 1. Electrical characteristics of the tested sensors in Figure 2.

Sensor # | Int. Circuit | Pad | Type | Datasheet
Sensor 1 | TTP223-B | Circular | 1 Channel | https://datasheet.lcsc.com/szlcsc/TTP223-BA6_C80757.pdf (accessed on 13 July 2020)
Sensor 2 | TTP223N-B | Squared | 1 Channel | https://datasheet.lcsc.com/szlcsc/TTP223-BA6_C80757.pdf (accessed on 13 July 2020)
Sensor 3 | TTP224 | Squared | 4 Channels | https://download.mikroe.com/documents/datasheets/ttp224.pdf (accessed on 13 July 2020)
Sensor 4 | 0401-8224 | Circular | 4 Channels | Clone TTP224
Table 2. Effectiveness tests for sensors 1–4 depicted in Figure 2.

Sensor | Polymer | Description | Thickness (mm) | Test Number | Hits | Effectiveness | Assessment
Sensor 1 | PC ABS | Taroblend 66 | 2.5 | 30 | 30 | 100.00% | Approved
Sensor 1 | PA6 GF30 | Ultramid B3E2G6 | 2.5 | 30 | 30 | 100.00% | Approved
Sensor 1 | PP TD20 | Hostacom TRC 352N | 2.5 | 30 | 29 | 96.67% | Approved
Sensor 1 | PP | Sabic PHC3181 | 2.5 | 30 | 29 | 96.67% | Approved
Sensor 1 | PA6 | Badamid B70S Natur | 2.5 | 30 | 29 | 96.67% | Approved
Sensor 2 | PC ABS | Taroblend 66 | 2.5 | 30 | 30 | 100.00% | Approved
Sensor 2 | PA6 GF30 | Ultramid B3E2G6 | 2.5 | 30 | 30 | 100.00% | Approved
Sensor 2 | PP TD20 | Hostacom TRC 352N | 2.5 | 30 | 30 | 100.00% | Approved
Sensor 2 | PP | Sabic PHC3181 | 2.5 | 30 | 30 | 100.00% | Approved
Sensor 2 | PA6 | Badamid B70S Natur | 2.5 | 30 | 29 | 96.67% | Approved
Sensor 3 | PC ABS | Taroblend 66 | 1.8 | 30 | 29 | 96.67% | Approved
Sensor 3 | PA6 GF30 | Ultramid B3E2G6 | 1.8 | 30 | 30 | 100.00% | Approved
Sensor 3 | PP TD20 | Hostacom TRC 352N | 1.8 | 30 | 29 | 96.67% | Approved
Sensor 3 | PP | Sabic PHC3181 | 1.8 | 30 | 30 | 100.00% | Approved
Sensor 3 | PA6 | Badamid B70S Natur | 1.8 | 30 | 30 | 100.00% | Approved
Sensor 4 | PC ABS | Taroblend 66 | 1.8 | 30 | 30 | 100.00% | Approved
Sensor 4 | PA6 GF30 | Ultramid B3E2G6 | 1.8 | 30 | 30 | 100.00% | Approved
Sensor 4 | PP TD20 | Hostacom TRC 352N | 1.8 | 30 | 30 | 100.00% | Approved
Sensor 4 | PP | Sabic PHC3181 | 1.8 | 30 | 29 | 96.67% | Approved
Sensor 4 | PA6 | Badamid B70S Natur | 1.8 | 30 | 30 | 100.00% | Approved
After validating the technology and method, in order to reduce the cost and the design footprint, it was decided to implement the sensors directly on the Printed Circuit Board (PCB) (see Figure 3). These capacitive sensors work based on a method known as self-capacitance [15], where each sensor is read using a single input of the system. Self-capacitance touch sensors use a single sensor electrode to measure the apparent capacitance between the electrode and the ground of the touch sensor. This method offers good immunity to noise induced by neighbouring sensors and circuitry.
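For illustration, the detection logic behind a single self-capacitance channel can be sketched as follows: a slowly adapting baseline tracks the idle reading of the electrode, and a touch is reported when the latest raw reading exceeds that baseline by more than a threshold. The short Python sketch below illustrates only this logic; the threshold, the tracking factor and the synthetic raw counts are assumptions made for illustration and are not values taken from the actual firmware or the touch peripheral.

# Minimal sketch of self-capacitance touch detection logic (illustrative only;
# thresholds and raw counts are assumed, not taken from the prototype firmware).
TOUCH_THRESHOLD = 40     # counts above the baseline interpreted as a touch (assumed)
BASELINE_ALPHA = 0.01    # slow baseline tracking factor (assumed)

class SelfCapChannel:
    def __init__(self, initial_counts):
        self.baseline = float(initial_counts)

    def update(self, raw_counts):
        """Return True if the channel is currently touched."""
        delta = raw_counts - self.baseline
        touched = delta > TOUCH_THRESHOLD
        if not touched:
            # Track slow environmental drift only while the pad is idle, so a
            # resting finger is not absorbed into the baseline.
            self.baseline += BASELINE_ALPHA * delta
        return touched

# Example with synthetic raw counts standing in for the hardware readings:
channel = SelfCapChannel(initial_counts=500)
print([channel.update(raw) for raw in (502, 501, 560, 565, 503)])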
To build a proof-of-concept design using this sensor, a microcontroller is required for reading sensing data and acting accordingly. A microcontroller consists of an embedded processor with auxiliary peripherals, I/O and electronics to interface with external sensors and other hardware. The microcontroller necessary for this prototype should ease the job of integrating all the technology by having peripherals to interact with capacitive sensors and potentially interact with an automobile’s Electronic Control Unit (ECU) [16], as depicted in Figure 4.
With these criteria in mind, the ATSAMC21J18A microcontroller from Microchip was adopted. It features a 32-bit processor architecture, a processor speed of 48 MHz, 32 KB of RAM, 264 KB of non-volatile (Flash) storage memory and 52 general-purpose input/output (GPIO) pins. It also features two important peripherals needed to ease the integration: a Peripheral Touch Controller (PTC) and a Controller Area Network (CAN) [17]. The PTC peripheral makes the process of sampling and validating the capacitive touch sensors more reliable and robust as all of these steps are performed automatically and periodically by the hardware itself. The CAN peripheral [18] was planned as a viable means of integrating this prototype with the automobile’s ECU since it implements a standard bus of communications used by the automobile industry. The microcontroller development board and the developed prototype are depicted in Figure 4 and Figure 5, respectively.
After prototyping this first version and validating the integration method, further design changes were implemented in order to reduce cost and footprint even more. Whereas the “self-capacitance” [15] design explained previously requires one GPIO of the microcontroller per sensor, the number of GPIOs needed can be further reduced using a design called “mutual-capacitance” [19]. With this alternative approach, if the sensors are arranged with their electrical connections in a matrix-like array, only M + N GPIOs are required on the microcontroller to address M × N sensing nodes, M being the number of rows and N the number of columns.
Compared against the previous version of the prototype, where 19 GPIOs were needed in order to read the 19 sensors using the “self-capacitance” design, only 10 GPIOs are required here, using a configuration of 3 rows and 7 columns. This design improvement allowed for a greater reduction in cost by making it viable to select a cheaper yet sufficiently powerful microcontroller, at the cost of a slightly more complex PCB design (with a negligible effect on the final cost). The microcontroller chosen was the ATTINY3217 [20], which allows a saving of nearly 71% of the cost when compared to the previously chosen microcontroller.
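As a quick illustration of the wiring arithmetic described above, the sketch below compares the pin counts of the two approaches for the 3 × 7 matrix used in the prototype and mimics the drive-row/read-column order of a matrix scan; it does not model the analogue measurement itself, and the scan routine is a simplified assumption rather than the firmware actually used.

# GPIO count comparison between self- and mutual-capacitance layouts (illustrative).
ROWS, COLS = 3, 7          # matrix configuration used in the prototype
SENSORS = 19               # number of sensors actually needed

self_cap_pins = SENSORS            # one GPIO per sensor
mutual_cap_pins = ROWS + COLS      # one GPIO per row plus one per column

print(f"Self-capacitance:   {self_cap_pins} GPIOs for {SENSORS} sensors")
print(f"Mutual-capacitance: {mutual_cap_pins} GPIOs for up to {ROWS * COLS} sensing nodes")

# A matrix scan visits each node by driving one row and reading every column.
def scan_order(rows, cols):
    for r in range(rows):          # drive row r
        for c in range(cols):      # read column c while row r is driven
            yield (r, c)           # node (r, c) is measured here

print(len(list(scan_order(ROWS, COLS))), "nodes measured per full pass")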

2.2. Touch Surface Backlighting

Another challenge tackled was the need for adequate backlighting of the touch surface. Backlighting can be used to guide the user to the touch control, to illuminate pictograms depicting the control function, or both.
Integrating backlighting poses a technical challenge as the light needs to be concentric with the touch sensor but cannot interfere with the touch sensing, neither by constraining the surface area for the sensor employment nor by inducing noise on the sensor line. The employed method consisted of drilling a hole in the middle of the sensor that is large enough to allow proper dispersion of the light on the control pictograms. The LEDs were assembled on the underside of the PCB, but shining upwards as suggested in Figure 6.
This approach allows the LED not to interfere electrically with the sensor while properly illuminating the control surface.
After carefully iterating over several drilling diameters, a proper dispersion was achieved using 3 mm, as depicted in Figure 7, and this solution was employed in the final prototype. The backlighting effect is illustrated on the intermediate prototype shown in Figure 8.

2.3. Validation and Reliability

The testing of such a piece of hardware can be performed manually or automated with machinery. Testing manually is a time-intensive task and—albeit closer to the real usage of the equipment—requires at least one person to perform the tests by hand and keep track of the success and failure rates. Furthermore, there is no way to ensure that the tester performs every test in the same way, with the same motions and adequate finger pressure. Automating the testing with machinery allows faster tests and a greater degree of accuracy in the motions required. In addition, performing tests faster allows a greater number of test samples to be acquired in the same time-frame, thus making the whole validation more reliable. Therefore, whenever possible, tests should be automated. An adequate machine for performing this task is a 3-axis Computer Numerical Control (CNC) system [21]. These machines have three axes of independent movement and are ubiquitous in several industries (e.g., from 3D printers to assembly lines). They work by having a coupled computer follow a precise script that tells the machine which movements to perform. The adopted setup uses an industrial 3-axis CNC machine [21] with a capacitive “finger” probe (please see Figure 9c) attached to the gripper, as depicted in Figure 9d.
To automate the testing procedure, we used the available digital input lines on the CNC controller board so that the equipment under test could signal the CNC whenever a probe touch was detected. All of this was orchestrated using a G-Code script, G-Code being the standard scripting language used to program a CNC machine to perform a set of movements. The source code is listed below and the corresponding flowchart is shown in Figure 10.
The algorithm employed in our G-Code script can be summarized in a few simple steps:
  • Move the probe to the initial position (2 cm above the sensor);
  • Move the probe down 2 cm (touching the sensor);
  • Wait for a digital signal on the CNC controller’s input, warning that a touch was detected;
  • If the signal comes within 5 s, register this iteration as a success; otherwise, mark as a failure;
  • Move the probe up 2 cm (no longer touching the sensor);
  • Repeat the process from the second step.
#<loop_count> = 0
#<timeout> = 0
#<timeout_limit> = 50    ;tenths of second
#<successes> = 0
#<misses> = 0
#<t_start> = datetime[]
(logopen,Teste_Touch_Tranca.log)
G00 X338 Y188 Z-430
o100 repeat [10000]    ;repeat ten thousand times
    #<s> = [datetime[] - #<t_start>]
    (print,#<s>, Iteration #<loop_count,0>, Successes: #<successes,0>, Misses: #<misses,0>)
    (log,#<s>, Iteration #<loop_count,0>, Successes: #<successes,0>, Misses: #<misses,0>)
    G00 Z-430    ;move to start position
    o200 while [[#<timeout> LE #<timeout_limit>] AND [#<_hw_input> EQ 1]] ;wait for previous touch release
     G04 P0.1
     #<timeout> = [#<timeout> + 1]
    o200 endwhile
    o300 if [#<timeout> EQ #<timeout_limit>]
     (print,Touch release timeout)
     (log,Touch release timeout)
    o300 endif
    #<timeout> = 0
    G00 Z-440    ;move to touch position
    o400 while [[#<timeout> LE #<timeout_limit>] AND [#<_hw_input> EQ 0]] ;wait for touch signal or timeout
     G04 P0.1
     #<timeout> = [#<timeout> + 1]
    o400 endwhile
    o500 if [[#<timeout> LE #<timeout_limit>] AND [#<_hw_input> EQ 1]] ;check whether the touch was detected before the timeout
     #<timeout> = [#<timeout> / 10]
     (print,Touch signal received after #<timeout,1>s)
     (log,Touch signal received after #<timeout,1>s)
     #<successes> = [#<successes> + 1]    ;increment touch success counter
    o500 else
     (print,Touch signal timeout)
     (log,Touch signal timeout)
     #<misses> = [#<misses> + 1]    ;increment touch miss counter
    o500 endif
    #<timeout> = 0
    #<loop_count> = [#<loop_count> + 1]    ;increment iteration counter
o100 endrepeat
#<t_duration> = [datetime[] - #<t_start>]
G00 Z-420
(print,Took #<t_duration> seconds doing #<loop_count,0> iterations)
(print,#<successes,0> touches detected and #<misses,0> touches missed)
(log,Took #<t_duration> seconds doing #<loop_count,0> iterations)
(log,#<successes,0> touches detected and #<misses,0> touches missed)
(logclose)
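The log file produced by the script can be summarized offline to obtain the success and failure counts and the average response time reported later in this section. The following is a minimal parsing sketch; it assumes the log lines follow the (log,...) messages of the script above (“Touch signal received after …s”, “Touch signal timeout”), and the exact formatting emitted by a given CNC controller may differ.

# Minimal summary of the touch test log (illustrative; log format assumed
# from the (log,...) statements in the G-Code script above).
import re
import sys

def summarize(path):
    successes = misses = 0
    response_times = []
    with open(path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if "Touch signal timeout" in line:
                misses += 1
            elif "Touch signal received after" in line:
                successes += 1
                match = re.search(r"after\s*([\d.]+)\s*s", line)
                if match:
                    response_times.append(float(match.group(1)))
    total = successes + misses
    if total == 0:
        print("No touch iterations found in", path)
        return
    print(f"Iterations: {total}, successes: {successes}, misses: {misses}")
    print(f"Failure rate: {100.0 * misses / total:.2f}%")
    if response_times:
        avg = sum(response_times) / len(response_times)
        print(f"Average response time: {avg:.3f} s")

if __name__ == "__main__":
    summarize(sys.argv[1] if len(sys.argv) > 1 else "Teste_Touch_Tranca.log")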
The complete definition of the testing methodology allowed each one of the 19 slider and on–off sensors on the PCB to be tested and validated in the CNC structure developed, illustrated in Figure 11. Each test comprised 10,000 iterations and ran for 2 to 3 h. This allowed all the sensors to be tested within a week.
All tests reported more than 99% reliability, with the worst performing sensor achieving 99.45% reliability (see Figure 12 and Table 3) and the best performing one achieving 100% (please refer to Figure 13 and Table 4).
Figure 12 is a good example of a phenomenon observed where errors tend to show up later during the tests. This might be explained by taking into consideration that the integration system calibrates the capacitive sensor baseline over time and, because each one of the 10,000 iterations is performed as quickly as possible (approximately 1.1 s in the worst case scenario), the integration system might not have enough time to reliably calibrate back to where the baseline was before, resulting in this accumulated error over time. In a real usage scenario, no touch sensor or any other button is used thousands of times in a row during a single trip, nor as frequently as in these tests. Considering the strict test conditions to which these sensors were submitted, and that the tests show reliability values above 99%, it can be concluded that they are appropriate and reliable.
Another interesting aspect is that the worst performing sensors are positioned, within the PCB layout, near noise sources such as the power supply circuitry and communication data lines. Therefore, future designs should take care to properly isolate the sensor lines from such circuitry, either by properly employing ground planes, by changing the routing of the data and sensor lines, or by simply changing the layout of the components. Sensors positioned far from these noise sources show close to no errors, as illustrated for the sensors portrayed in Figure 13, Figure 14 and Figure 15 and the corresponding Table 4, Table 5 and Table 6.
Finally, one aspect of the technology that deserves consideration is what happens if a sensor becomes out of control. Two cases can be assumed: (1) the sensor starts sending undesired control commands, or (2) the sensor freezes and stops working. For both cases, there are approaches that must be foreseen. First, if there is a malfunction of the sensing mechanism, then it may need repair assistance, as happens with current systems and cars. However, if the system systematically (or sporadically) assumes an erratic behavior after a certain number of utilizations, then a solution that incorporates a periodic reset of the electrical reference values of the touch system can easily be implemented. In any case, it must be noted that in the thousands of tests conducted with the CNC-based testing system developed, we never experienced a situation where a sensor became out of control. These tests, which exercise the developed system thousands of times to assess its response capability and robustness, registered from 0 (zero) to a maximum of 55 failures per 10,000 touches performed, which means the system is robust and reliable (less than 0.55% of failures detected in the worst case scenario, and zero in the best one). Moreover, the newly proposed touch-based system will require less maintenance and will be more robust to the presence of water and humidity (e.g., resulting from rain coming through an open door window) inside the car cabin and near the control buttons, a problem that has been known and described by manufacturers for decades.
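The periodic reset mentioned above can be implemented as a small supervision routine around the touch driver, as in the following sketch: activations are counted and the touch reference values are re-initialized either after a configurable number of uses or when a channel reports a continuous touch for an implausibly long time (a frozen sensor). The callback name and both thresholds are hypothetical placeholders, not values taken from the prototype firmware.

# Sketch of a periodic reset of the touch reference values (illustrative only;
# recalibrate_references and both thresholds are hypothetical placeholders).
import time

RESET_AFTER_ACTIVATIONS = 1000   # assumed maintenance interval (number of touches)
STUCK_TOUCH_SECONDS = 30.0       # a touch held longer than this is treated as stuck

class TouchSupervisor:
    def __init__(self, recalibrate_references):
        self.recalibrate = recalibrate_references   # callback into the touch driver
        self.activations = 0
        self.touch_started = None

    def on_sample(self, touched):
        now = time.monotonic()
        if touched:
            if self.touch_started is None:
                self.touch_started = now
                self.activations += 1
            elif now - self.touch_started > STUCK_TOUCH_SECONDS:
                self.recalibrate()          # sensor appears frozen: reset the references
                self.touch_started = None
        else:
            self.touch_started = None
        if self.activations >= RESET_AFTER_ACTIVATIONS:
            self.recalibrate()              # routine periodic reset after many uses
            self.activations = 0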

2.4. Comparing New and Current Paradigms

The integration tests were developed on a mass-produced automobile (please see Section 4 of [9]). The original vehicle’s button assembly responsible for controlling the doors’ windows and side mirrors is comprised of 26 discrete components. Amongst these components there are parts made of plastic, rubber and metal, as well as PCBs with circuitry, as shown in Figure 16. Each one of these components has its own production and/or assembly line which, all together, contribute to a complex sourcing and assembly process that produces a part with many moving components. Moreover, each component’s fabrication process needs to comply with physical tolerances that may lead to faults. Furthermore, hand assembly is often required and, thus, the process is not error-free. All this contributes to a part that often results in waste during fabrication and, due to the accumulated tolerances, may exhibit a short MTBF, which produces high maintenance and replacement costs.
The developed prototype has no moving parts, as it is made up of a polymer plastic cover and two PCBs. Both the polymer plastic component and the PCBs with their respective circuitry have a fully automated and mature fabrication process that does not require human intervention. Thus, there is little room for tolerance error propagation along the process. This results in a larger MTBF which, in turn, represents lower maintenance and replacement costs.
Of particular interest is the fact that this part is fully programmable and can communicate with the automobile using industry-standard protocols. Therefore, it is suitable to be mass-produced and adapted to several automobiles, or even to different applications or environments such as aircraft seats, office seating, etc.

3. Implementation and Results

The first prototype was developed to run a proof of concept with a mass-produced automobile from 2018 [9]. In addition, the monitor-supported prototype seat (depicted in Figure 17b) was built to house the SPaC (Smart Plastic Cover) part so that it could be presented to customers and used to disseminate the new technology. This seat was conceived, developed and built by the company AlmaDesign [22].
In Figure 17a we can see the prototype seat with the integrated SPaC part. The location of the SPaC part in an autonomous vehicle environment of the future will allow the passenger to access different commands for controlling functions regardless of the position of the seat inside the vehicle. In the current case study, these functions are: positioning of the seat, opening and closing of the windows, and control of the rear-view mirrors. In case it becomes necessary to change/add functions, the developed technology has flexibility to allow further development and scaled integration.
This implementation led to the development of a touch system with two PCBs interconnected by a flat-cable, as the curved geometry of the plastic part dictated by the aesthetics conceived by the design team did not allow the electronic system to be implemented using only a single PCB. This is depicted in more detail in Figure 5. There is a possibility of using a flexible PCB, which would allow the PCB to adapt to complex contours of the control surface. However, this would raise costs significantly and consequently this possibility was discarded.
Figure 17a,b illustrates the seat developed in the context of this change of paradigm, capable of meeting the requisites of futuristic vehicles. The new touch-based technology developed adapts to the design of novel seats that integrate functionalities of the car cabin. In fact, as self-driving cars assume more relevance in a global context, it is expected that the passengers will no longer have to travel aligned with the driving direction. This implies that they may not be able to reach all parts of the cabin in order to control some functionalities.
Therefore, the proposed buttons were designed to be incorporated into the seat, bringing the car functionalities to it. They can also be applied in other parts of the vehicle. Moreover, the incorporated functionalities can control the seat positioning and movement, as well as the window closest to that passenger. Figure 18a,b details the controls for window opening and rear mirror positioning. Rear seats, which in many cases may not move, can include advanced multimedia controls that are typically accessible only to front-seat users.
Another aspect to consider is that functionalities unknown at this point may have to be included in the near future to control novel features of self-driving vehicles which do not exist in current automobiles. The proposed technology makes room for a variety of human–machine interfaces, including the use of other types of sensors which have never been tried before on a single vehicle. These should allow the passengers of self-driving vehicles to better enjoy the travelling experience towards their destination.
Since the prototype seat was developed with aesthetic concerns in mind, a demo screen was developed to emulate the operation of the previously described car functionalities (shown in Figure 17b). This allows the seat to be demonstrated at conferences, fairs or to any other interested parties while proving the touch-based control concept for the car seat. The demo screen shows animations depicting some of the functionalities idealized for the seat control panel. These functionalities are: seat rotation (as illustrated in Figure 19), seat height adjustment, seat front and back sliding, seat reclination, and front and back door window opening and closing.

4. Discussion

The current paradigm of car seats aligned with the driving direction is about to change [23]. This may imply that passengers no longer have access to the usual controls on the front panel or on the doors inside the car cabin [24], thus creating the need to incorporate new control functionalities near the seats’ surfaces.

4.1. Conclusions of This Study

In this paper, we address this change of paradigm by proposing new low-cost and easy-to-implement touch-based controls for car cabin functionalities. The sensors are based on capacitive (self- and mutual-capacitance) technology coupled to a low-cost microcontroller that connects to the automobile’s ECU [16]. To this end, we have developed the necessary electronics and performed thousands of tests on real plastic injected components to assess reliability and robustness. We were able to verify that, for the thousands of tests performed on each touch-based sensor, the maximum failure rate did not exceed 0.55%, under a test regime far more demanding than real-life utilization on a vehicle: in conventional use, no sensor is used thousands of times consecutively without a reset. This is even more significant as these sensors can be integrated with car seats in order to allow the passenger control over functionalities that would otherwise no longer be accessible when the seat changes position, a possibility that may become a reality when the passenger becomes free from driving duties.
The main contributions of this paper can be summarized below:
  • Innovative touch-based technology for controlling functionalities inside the car cabin;
  • Automatic reliability testing and validation of the touch-based sensors over thousands of cycles;
  • Novel and appealing design of the cabin of self-driving cars;
  • Moving control buttons and features to other parts of the cabin instead of the doors and central panel;
  • More natural and intuitive human-machine interaction similar to the one used in mobile phones;
  • Scalable and flexible addition of new functionalities via software technology;
  • Retro-compatible technology;
  • Supported by versatile communication protocols;
  • The same low-cost solution can be incorporated into cars of distinct segments.

4.2. Future Research Directions

There is an active discussion [25] about the attention that touch-based technology requires from the driver and passengers. It is not clear whether the feedback obtained by pressing a touch sensor is enough to give the user the sense that the action has been completed.
As a future research direction, there is ample debate on whether haptic feedback [26] will have to be incorporated into touch-based controls [27]. The haptic effect caused by the pressure of a mechanical button needs to be emulated for the new vehicle control interfaces to act more naturally and give the user the perception that the action has been launched.
Another aspect with indirect implications for this paper regards the ongoing evolution of deep neural networks and the challenges still to be faced [28] before self-driving cars can be massively adopted. In particular, several aspects of autonomous vehicles still have to undergo strict assessment concerning functional safety [29]. Furthermore, the use of deep learning for predicting decisions [30] based on data captured from cameras [31], and their association with other metrics such as positioning, velocity of the car, traffic or the presence of pedestrians nearby, will have to be validated for many more thousands of kilometers to come.

Author Contributions

Conceptualization, T.C., C.A., P.S., J.S., C.R., R.L., R.P., F.M., R.M., G.T. and G.F.; methodology, C.A., T.C., P.S., J.S., G.T. and G.F.; software, C.A. and T.C.; validation, C.A., T.C. and G.F.; formal analysis, P.S. and G.F.; investigation, C.A., T.C., P.S., J.S. and G.F.; resources, P.S., G.T. and G.F.; data curation, P.S. and J.S.; writing—original draft preparation, C.A., T.C. and G.F.; writing—review and editing, P.S., C.R., R.L., F.M. and G.F.; visualization, J.S., R.P., R.M.; supervision, P.S. and G.F.; project administration, P.S.; funding acquisition, P.S. and G.T. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been financially supported by project SPaC (POCI-01-0247-FEDER-038379), co-financed by the European Community Fund FEDER through POCI—Programa Operacional Competitividade e Internacionalização. It has also been financially supported by Instituto de Telecomunicações and Fundação para a Ciência e a Tecnologia under grants UIDB/50008/2020 and UIDP/50008/2020.

Acknowledgments

The authors would like to acknowledge the support provided by CJR Motors and Leiribéria.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
SPaC    Smart Plastic Cover
LCD     Liquid-Crystal Display
ADC     Analog-to-Digital Converter
DAC     Digital-to-Analog Converter
CAN     Controller Area Network
LIN     Local Interconnect Network
PCB     Printed Circuit Board
ECU     Electronic Control Unit
MTBF    Mean Time Between Failures
OEM     Original Equipment Manufacturer
MDPI    Multidisciplinary Digital Publishing Institute
FEDER   Fundo Europeu de Desenvolvimento Regional (Portugal)

References

  1. Ji, K.; Orsag, M.; Han, K. Lane-Merging Strategy for a Self-Driving Car in Dense Traffic Using the Stackelberg Game Approach. Electronics 2021, 10, 894. [Google Scholar] [CrossRef]
  2. Park, M.; Kim, H.; Park, S. A Convolutional Neural Network-Based End-to-End Self-Driving Using LiDAR and Camera Fusion: Analysis Perspectives in a Real-World Environment. Electronics 2021, 10, 2608. [Google Scholar] [CrossRef]
  3. Talpes, E.; Sarma, D.D.; Venkataramanan, G.; Bannon, P.; McGee, B.; Floering, B.; Jalote, A.; Hsiong, C.; Arora, S.; Gorti, A. Compute solution for Tesla’s full self-driving computer. IEEE Micro 2020, 40, 25–35. [Google Scholar] [CrossRef]
  4. Driverless Cars Are Coming—A Paradigm Shift. Available online: https://www.computer.org/publications/tech-news/neal-notes/driverless-cars-are-coming-a-paradigm-shift (accessed on 17 August 2020).
  5. Eliot, L. AI Self-Driving Cars Consonance: Practical Advances in Artificial Intelligence and Machine Learning; LBE Press Publishing: New York, NY, USA, 2020. [Google Scholar]
  6. Morales-Alvarez, W.; Sipele, O.; Léberon, R.; Tadjine, H.H.; Olaverri-Monreal, C. Automated Driving: A Literature Review of the Take over Request in Conditional Automation. Electronics 2020, 9, 2087. [Google Scholar] [CrossRef]
  7. Autonomous Vehicles Will Create a Radical Paradigm Shift in Vehicle Design. Available online: https://www.automotive-fleet.com/159798/autonomous-vehicles-will-create-a-radical-paradigm-shift-in-vehicle-design (accessed on 12 June 2020).
  8. Krenicky, T.; Ruzbarsky, J. Alternative Concept of the Virtual Car Display Design Reflecting Onset of the Industry 4.0 into Automotive. In Proceedings of the IEEE 22nd International Conference on Intelligent Engineering Systems (INES), Las Palmas de Gran Canaria, Spain, 21–23 June 2018; pp. 407–412. [Google Scholar]
  9. Alves, C.; Custódio, T.; Silva, P.; Silva, J.; Rodrigues, C.; Lourenço, R.; Pessoa, R.; Moreira, F.; Marques, R.; Tomé, G.; et al. smartPlastic: Innovative Touch-Based Human-Vehicle Interface Sensors for the Automotive Industry. Electronics 2021, 10, 1223. [Google Scholar] [CrossRef]
  10. Kouba, S.; Dickson, N. Intuitive Touch Technologies—Semiconductor-Based Electronic Components and their Integration. Auto Tech Rev. 2016, 5, 38–43. [Google Scholar] [CrossRef]
  11. Rodríguez-Machorro, J.C.; Ríos-Osorio, H.; Águila-Rodríguez, G.; Herrera-Aguilar, I.; González-Sánchez, B.E. Development of capacitive touch interfaces. In Proceedings of the International Conference on Electronics, Communications and Computers (CONIELECOMP), Cholula, Mexico, 22–24 February 2017; pp. 1–8. [Google Scholar]
  12. Park, J.K.; Lee, C.J.; Kim, J.T. Analysis of multi-level simultaneous driving technique for capacitive touch sensors. Sensors 2017, 17, 2016. [Google Scholar] [CrossRef] [PubMed]
  13. Brasseur, G. Design rules for robust capacitive sensors. IEEE Trans. Instrum. Meas. 2003, 52, 1261–1265. [Google Scholar] [CrossRef]
  14. Kim, J.; Song, W.; Jung, S.; Kim, Y.; Park, W.; You, B.; Park, K. Capacitive Heart-Rate Sensing on Touch Screen Panel with Laterally Interspaced Electrodes. Sensors 2020, 14, 3986. [Google Scholar] [CrossRef] [PubMed]
  15. Capacitive Touch Sensor Design Guide. Available online: http://ww1.microchip.com/downloads/en/Appnotes/Capacitive-Touch-Sensor-Design-Guide-DS00002934-B.pdf (accessed on 13 July 2020).
  16. Seo, S.-H.; Park, J.-H.; Hwang, S.-H.; Jeon, J.W. 3-D Car Simulator for Testing ECU Embedded Systems. In Proceedings of the SICE-ICASE International Joint Conference, Busan, Korea, 18–21 October 2006; pp. 550–554. [Google Scholar]
  17. Coanda, H.-G.; Ilie, G. Design and implementation of an embedded system for data transfer in a car using CAN protocol. In Proceedings of the 12th International Conference on Electronics, Computers and Artificial Intelligence (ECAI), Bucharest, Romania, 25–27 June 2020; pp. 1–4. [Google Scholar]
  18. Elshaer, A.M.; Elrakaiby, M.M.; Harb, M.E. Autonomous Car Implementation Based on CAN Bus Protocol for IoT Applications. In Proceedings of the 13th International Conference on Computer Engineering and Systems (ICCES), Cairo, Egypt, 18–19 December 2018; pp. 275–278. [Google Scholar]
  19. Design with Surface Sensors for Touch Sensing Applications on MCUs. Available online: https://www.st.com/resource/en/application_note/dm00087990-design-with-surface-sensors-for-touch-sensing-applications-on-mcus-stmicroelectronics.pdf (accessed on 10 October 2019).
  20. ATtiny3216/3217–Datasheet. Available online: https://ww1.microchip.com/downloads/en/DeviceDoc/ATtiny3216-17-DataSheet-DS40002205A.pdf (accessed on 4 February 2021).
  21. Del Guerra, M.; Coelho, R.T. Development of a low cost Touch Trigger Probe for CNC Lathes. J. Mater. Process. Technol. 2006, 179, 117–123. [Google Scholar] [CrossRef]
  22. AlmaDesign: Managing Process and Culture in a Design Studio. Available online: https://www.almadesign.pt/studio (accessed on 17 December 2019).
  23. Fleming, B. Advances in Automotive Electronics [Automotive Electronics]. IEEE Veh. Technol. Mag. 2015, 10, 4–11. [Google Scholar] [CrossRef]
  24. The Future of the Car: A Paradigm Shift of the Century. Available online: https://autocrypt.io/future-of-car-paradigm-shift-of-the-century/ (accessed on 7 June 2021).
  25. Liu, S.; Li, L.; Tang, J.; Wu, S.; Gaudiot, J.-L. Creating Autonomous Vehicle Systems, 2nd ed.; Morgan & Claypool Publishers: Ulverston, UK, 2020. [Google Scholar]
  26. Hannaford, B.; Okamura, A.M. Haptics. In Springer Handbook of Robotics; Springer: Cham, Switzerland, 2016; pp. 1063–1084. [Google Scholar]
  27. Miedl, F.; Tille, T. 3-D surface-integrated touch-sensor system for automotive HMI applications. IEEE/ASME Trans. Mechatron. 2015, 21, 787–794. [Google Scholar] [CrossRef]
  28. Rao, Q.; Frtunikj, J. Deep Learning for Self-Driving Cars: Chances and Challenges. In Proceedings of the IEEE/ACM 1st International Workshop on Software Engineering for AI in Autonomous Systems (SEFAIAS), Gothenburg, Sweden, 28 May 2018; pp. 35–38. [Google Scholar]
  29. Xu, J.; Howard, A. How much do you Trust your Self-Driving Car? Exploring Human-Robot Trust in High-Risk Scenarios. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020; pp. 4273–4280. [Google Scholar] [CrossRef]
  30. Gilpin, L.H. Anticipatory Thinking: A Testing and Representation Challenge for Self-Driving Cars. In Proceedings of the 55th Annual Conference on Information Sciences and Systems (CISS), Baltimore, MD, USA, 24–26 March 2021; pp. 1–2. [Google Scholar] [CrossRef]
  31. Mihalea, A.; Samoilescu, R.; Nica, A.C.; Trăscău, M.; Sorici, A.; Florea, A.M. End-to-end models for self-driving cars on UPB campus roads. In Proceedings of the IEEE 15th International Conference on Intelligent Computer Communication and Processing (ICCP), Cluj-Napoca, Romania, 5–7 September 2019; pp. 35–40. [Google Scholar] [CrossRef]
Figure 1. Touch-based technology. (a) General illustration that resembles the use of mobile phones and (b) technical detail of the touch surface.
Figure 3. First prototype of a touch sensor with self capacitance technology. The resulting touch sensor design is shown in (a). The bottom PCB is shown in (b). The PCB assembled on a plastic part is depicted in (c).
Figure 4. PCB and plastic part connected to the sensors and the ATSAMC21J18A microcontroller development board.
Figure 5. PCB with sensors based on a mutual-capacitance design assembled on a plastic part.
Figure 6. Illustration of how to couple an LED backlight to a capacitive touch sensor. The hole through which the LED light scatters has a diameter of 3 mm.
Figure 7. PCB with LEDs shining from behind the sensors and the measurement of the light’s hotspot and dispersion diameters.
Figure 8. Prototype with backlight testing of two different LED colors.
Figure 9. The figure depicts the testing environment specifically designed and developed from scratch to validate the touch-based prototype. (a) Shows the structure of the CNC. (b) Depicts the CNC controller board with digital inputs. (c) Illustrates the “finger” probes that emulate a human touch. The gripper in (d) was custom-designed and 3D-printed to allow attaching the probe to the moving part of the CNC structure.
Figure 10. Flowchart of the CNC test script corresponding to the G-Code source code listed in Section 2.3.
Figure 11. (a) Testing of a slider-type sensor. (b) Testing an on–off sensor.
Figure 12. Sensor presenting the worst performance after 10,000 iterations.
Figure 13. Sensor presenting the best performance after 10,000 iterations.
Figure 14. Sensor presenting a typical performance after 10,000 iterations.
Figure 15. Sensor presenting another typical performance after 10,000 iterations.
Figure 16. Example of the number of components and complexity of (a) side mirror’s and (b) windows’ control buttons.
Figure 17. (a) Button-controlled seating prototype developed for controlling car cabin functionalities. (b) Demo screen developed for emulating the car’s windows and seat positioning functionalities.
Figure 18. (a) Front and (b) lateral details of the developed car seats for controlling functionalities inside the cabin.
Figure 19. Driver’s seat rotation animation as shown on the demo screen.
Table 3. Statistical data regarding the worst-performing sensor.

Number of tests | 10,000
Test duration (s) | 10,842
Average response time (s) | 0.138
Failure count | 55
Failure rate | 0.55%
Failure distribution, Q1 | 1
Failure distribution, Q2 | 1
Failure distribution, Q3 | 21
Failure distribution, Q4 | 22
Table 4. Statistical data regarding the best performing sensor.

Number of tests | 10,000
Test duration (s) | 6909
Average response time (s) | 0.18
Failure count | 0
Failure rate | 0%
Failure distribution, Q1 | 0
Failure distribution, Q2 | 0
Failure distribution, Q3 | 0
Failure distribution, Q4 | 0
Table 5. Statistical data regarding the sensor in Figure 14.

Number of tests | 10,000
Test duration (s) | 6820
Average response time (s) | 0.172
Failure count | 3
Failure rate | 0.03%
Failure distribution, Q1 | 0
Failure distribution, Q2 | 0
Failure distribution, Q3 | 2
Failure distribution, Q4 | 1
Table 6. Statistical data regarding the sensor in Figure 15.

Number of tests | 10,000
Test duration (s) | 10,625
Average response time (s) | 0.139
Failure count | 7
Failure rate | 0.07%
Failure distribution, Q1 | 1
Failure distribution, Q2 | 1
Failure distribution, Q3 | 3
Failure distribution, Q4 | 2
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

