Article

Ranking Sustainable Smart City Indicators Using Combined Content Analysis and Analytic Hierarchy Process Techniques

College of Architecture and Planning, Imam Abdulrahman Bin Faisal University, P.O. Box 1982, Dammam 31441, Saudi Arabia
Smart Cities 2023, 6(5), 2883-2909; https://doi.org/10.3390/smartcities6050129
Submission received: 22 August 2023 / Revised: 25 September 2023 / Accepted: 12 October 2023 / Published: 19 October 2023

Abstract

Sustainable Smart Cities have a significant potential to ensure equal access to public services, achieve sustainability and governance transparency, improve livability, and anticipate and mitigate increasingly changing threats. This study aims at prioritizing a core set of Sustainable Smart City (SSC) indicators using a combined methodology: (a) Content Analysis and (b) Analytic Hierarchy Process. The study’s contribution is that it successfully developed a more robust ranking of the above-mentioned set of indicators by combining AHP and co-occurrence analyses. The final combined ranking is intended to serve as a Decision Support Tool to streamline the decision-making process and help decision-makers prioritize dimensions to measure, achieve, or monitor actions when they cannot be undertaken simultaneously in contexts of economic recessions, financial constraints, and resource mobilization challenges. The findings draw attention to the need for considering the concept of SSCs through the prism of interconnecting the various current technology-driven “smart silos” under an inclusive umbrella that focuses on the combinations and connectedness to achieve a systemic approach to sustainability and smartness that none of those single areas can achieve in isolation. The results also revealed an interesting paradox, which relegated the Technology and ICT dimension to the bottom of the ranking, contrary to the widespread consensus and opinion, opening an opportunity for discussion among peers.

1. Introduction

Cities, as dynamic living centers and hubs of production and consumption, play an essential role in the global economy [1], are major contributors to the world’s GDP [2], and serve as arenas for political and social changes. Effectively managed cities can benefit residents by generating economies of scale through shared amenities [3], fostering health and well-being. Cities not only leverage industries’ creativity and help firms to innovate [4], but also serve as cultural incubators that determine livability and attract, retain, and nurture the segment of the creative labor force needed to succeed in the new economy. Cities facilitate access to cultural activities, are perfect places for social interaction and fertile grounds for diversity and people’s proximity [5], as well as motivators for innovation and job creation through the flows of new ideas. Humans’ imagination of their future has traditionally focused on urban environments because most of this future has always occurred within the city [6]. The unprecedented growth of cities since the industrial revolution combined with the United Nations’ prediction that an estimated 68% of the world population will live within urban areas by 2050 [6], will require major urban infrastructure developments to provide adequate services and cope with the needs of urban dwellers. Cities also have no option but to adapt to the increasingly warming and resource-deprived world. From traffic congestion to energy usage and obsolete infrastructure, the major potential of Smart Cities (SCs) is their ability to help solve a wide array of issues that “traditional cities” face but struggle to solve. Through digital channels, they can guarantee equal access to public services for all citizens and achieve transparency, sustainability, and improved livability. The SC concept also refers to cities’ ability to anticipate and mitigate threats using intrinsic intelligence in their information systems [7]. Thus, for a city, being smart would be a way to respond smartly to the growing and ever-changing challenges threatening its operation and sustainability. An SC aims to improve resource use efficiency, reduce environmental impacts, increase social inclusion, and foster economic development. However, the need for achieving both sustainability and smartness suggests that focusing on the extended concept of Sustainable Smart City (SSC) will better match the expectations of city administrators and users.
To achieve urban sustainable futures, planners and decision-makers have a panoply of indicators at their disposal, and the need to prioritize one set of indicators over the others appears to be more urgent than anticipated in achieving successful SSCs. It can become critical in contexts of complexity and uncertainty, financial constraints, or resource mobilization challenges. Prioritizing indicators is also vital for improving productivity and time management in many instances where the decision-making processes face the dilemma of deciding which set of indicators needs to take priority.
This study aims to develop a set of prioritized indicators for SSCs using a combination of two methods: (1) Content Analysis of selected related literature, and (2) expert judgments elicited through the Analytic Hierarchy Process (AHP) technique to prioritize the selected dimensions. The priorities resulting from the AHP analysis are then combined with the weights determined from the content analysis of the literature review to derive a final ranking of the selected SSC dimensions. This ranking is intended primarily as a Decision Support Tool to address challenges and difficulties that may arise when deciding which indicators to prioritize as metrics for deploying or assessing SSC achievements. The practical significance of this study can be seen in the assistance it can provide decision-makers seeking prioritized action plans when it is difficult to implement many SSC initiatives or activities simultaneously. By exploring and combining two ways to rank the set of SSC dimensions, the analysis will examine the effectiveness of using an AHP-based method in investigating the priority/importance of SSC dimensions. Soliciting experts’ judgments on SSC indicators offers valuable output for decision-makers, practitioners, and stakeholders working with SSC assessment. Similarly, scrutinizing the literature to investigate the assessment tools that have been acknowledged to be highly beneficial in simplifying the understanding of complicated urban issues [8] can help identify the appropriate sets of indicators to achieve efficient decision-making processes to support governance, city management, as well as measurement and monitoring of progress towards identified targets of sustainability and smartness [8,9,10,11,12].
According to the scholarly literature, significant research efforts have been undertaken on many aspects of the assessment tools. Nevertheless, an equally substantial gap can be identified: providing a robust prioritized set of SSC indicators to streamline the decision-making process remains a relatively unexplored facet. The purpose of this research is precisely to fill that gap.
The study is organized into five sections, including the introduction (Section 1) and the conclusion (Section 5). Section 2, Literature Review, explores the origins of the “hybrid” concept of SSC as a solution to the deficiencies of the two preceding concepts and the body of tools developed worldwide as metrics for assessing sustainability and smartness. Section 3, Methods and Data Collection, explains how the two methods, Content Analysis and the AHP technique, were carried out. The criteria and strategy for scanning the literature to identify the most commonly used indicators, as well as the process of obtaining the experts’ opinions to prioritize the previously identified indicators from the co-occurrence analysis, will be explained. This section also describes how the final combined ranking of the set of indicators will be achieved. Section 4, Findings and Discussion, will discuss the weights assigned to the SSC indicators based on their frequency of use in the literature, the priorities decided by the panel of experts for the same set of indicators, and the resulting combined ranking. Finally, the study will conclude with some research perspectives based on these findings.

2. Literature Review

2.1. From Sustainable Development and Smart City to Sustainable Smart City

At its inception, the paradigm of Sustainable Development (SD) emerged in response to the increasing environmental and social challenges created by the traditional economic development model. SD seeks to develop a sort of equilibrium between the development dimensions (environmental, economic, social, and cultural). The “World Commission on Environment and Development” (WCED) originated the most widely accepted definition of SD in its 1987 Brundtland report “Our Common Future” [13], stating that SD is the development that meets the needs of the present without compromising the ability of future generations to meet their own needs. Then, the concept evolved gradually, raising awareness of anthropogenic impacts on the environment and society, and creating a dynamic for saving the planet. This process culminated in 2015 with the initiative “Transforming Our World: The 2030 Agenda for Sustainable Development”, an action plan of 17 sustainable development goals (SDGs) to end poverty, protect the planet, and ensure peace and prosperity for all by 2030. SD relies on a set of principles that help to achieve a more sustainable world [14,15]: (1) intergenerational as well as intragenerational equity, (2) stakeholders’ effective participation in decision-making processes, (3) integration and a holistic approach to environmental, economic, social, and cultural dimensions, and (4) the principle of a cautious approach to minimize damage. Nevertheless, despite the brighter future it promises for humanity, the concept of SD has not been without controversy. As stated by Huovila et al. [16], the concept has been described as obsolete in today’s digitalized society. Also, Barkemeyer et al. [17] considered that urban SD has been incorporated into a business-oriented agenda that has weakened the concept.
On the other hand, in the past decade, especially since the adoption of the SDGs, approaches to urban issues have gradually converged on, and conveyed principles of SC that have come to dominate the scene among scholars, city planners, and practitioners. Cities’ visions, goals, and priorities for becoming smart may differ depending on local contexts; however, the use of ICT to improve urban operations and service delivery appears to be a common denominator. The concept of SC can be traced back as early as the 1960s and 1970s when the American “Community Analysis Bureau” was collecting data and directing services in Los Angeles using databases, aerial photography, and cluster analysis [18], but the term “Smart City” appeared a decade later, coined by the American author E. Diller [19]. In the 1990s, the first virtual digital city was created in Amsterdam (De Digital Stad (DDS)), providing online access to information and services, and in the 2000s, IBM and Cisco launched separate initiatives to promote SC solutions based on ICT infrastructure and data analytics [20].
Various definitions have been proposed as to what constitutes an SC, and they continue to (co-)exist throughout the urban sphere, feed the scholarly literature, and have relevance to the policy and practice of those who study or manage cities. IBM [18], the European Union [21], and Cisco [22] each has its own definition. However, and to a large extent, these definitions are ICT-oriented and refer to a city that uses data collected from various sources, devices and citizens, to monitor and manage urban systems and efficiently deliver urban services, relying on Information and Communication Technologies (ICTs) capabilities. The reference to ICT is also in line with early attempts to define the concept, such as Eger [23], Washburn et al. [24], Harrison et al. [25], Batty et al. [26], Lazaroiu and Roscia [27], Lombardi et al. [28], Bakıcı et al. [29], Mulligan and Olsson [30], and Townsend [31]. Other earlier studies, such as the work of Shapiro [32], had broadened the scope of smartness to include its outcomes, such as quality of life, while more recent research has complemented this focus by highlighting sustainability and services to the citizens [33,34,35,36,37,38,39], as did Caragliu et al. [40], when they underlined that for a city to be smart, investments in human and social capital, as well as traditional and modern ICT communication infrastructure, should support sustainable economic growth and improved quality of life, through participatory governance. Similarly, after reviewing the published literature on smart cities from 1993 to 2012, Cocchia [41] enumerated the key attributes of a smart city as (a) global, (b) intelligent, and (c) knowledge-based. Nevertheless, despite this profusion of definitions around the SC model, Hamman et al. [42] reported no clear consensus on what constitutes an SC.
Over time, the concept of SC has evolved, influenced by various historical milestones related mainly, but not only, to advances in visible or embedded technologies. Currently, a set of core principles serve as a guide to achieve SCs: (a) Citizen-centered process; (b) Innovation [40]; (c) Integration [43]; and (d) Sustainability. Technological advances can help foster long-term urban growth, and according to many scholars, the use of ICT to plan or manage cities is at the root of the SC model [44,45,46].
The SC model has also been criticized on several levels. Hollands [47] pointed out a competitive form of urban “entrepreneurialism” that compromises citizens’ participation in an SC model dominated by corporate and entrepreneurial governance. Furthermore, in a panel data analysis of 15 UK cities, Yigitcanlar and Kamruzzaman [48] found a lack of contribution of SCs in achieving tangible sustainable urban outcomes. Other authors, such as Patrão et al. [49], reported contrasts between the rise in the implementation of SC projects and a persistent lack of a common and acceptable definition for the SC concept, yet noticed a degree of consensus on the substantial role of technologies in shaping urban development. Earlier, Temenos [50] had criticized the idea of SCs as being a tool to legitimize growth-oriented policies, and according to Kambites [51], the concept has also been “abusively” adapted to boost neoliberalism’s attempts to advocate for the compatibility of economic growth with environmental protection. In many other circumstances, smart city projects have also been described as urban branding exercises intended to restore the image of declining cities at a time of growing citizen involvement and scrutiny of governmental action, as well as a way to attract foreign finance and corporate investments.
Despite these criticisms, the literature underscores the importance of attaining advanced levels of sustainability and smartness for the future of cities, and in response, the hybrid concept of SSC has arisen and gained prominence [39,52,53] to ensure that SC solutions are aligned with sustainability, and conversely, that sustainability considerations are in harmony with the needs of modern, highly digitalized cities. Over the years since the economic, environmental, and sociocultural concerns were set as a shared perspective for SD, digitalization and innovation have gained equal importance and visibility as core requirements of sustainable communities. After examining more than 100 definitions of smart cities, the International Telecommunication Union (ITU) proposed the concept of “Smart Sustainable Cities” [54]. It argued that an SSC would benefit from the use of ICT to achieve a better quality of life and increase its competitiveness and the effectiveness of urban services [55]. The conceptual difference between Smart Cities and Sustainable Cities would be that SCs use intelligent technologies to achieve sustainability [56].

2.2. Sustainable Smart City Indicators

Since the Rio Earth Summit in 1992, several assessment tools with different sets of indicators have been developed, relying on many data sets to monitor, analyze, and provide solutions to the challenges of the complex aspects of SD. Currently, this area is of immediate relevance to the pressing need to achieve the SDGs and is, thus, of particular interest for research and practice. On the other hand, factors such as differences in culture, the structure of the economic systems, and the nature of the administrative and political institutions, all contribute to local success. Consequently, the indicators to measure or assess sustainability and smartness would be influenced by local factors and might differ or be differently perceived from one community to another. Nonetheless, several measures considered core requirements in spheres such as economics, technology and innovation, governance, socio-cultural environments, quality of life, and infrastructure, remain critical across most cities and communities. This section summarizes the literature on the SSC indicators used as metrics tools. The following material, which investigated a large spectrum of these sets of indicators, has been portrayed chronologically from the most recent to the oldest for the sake of convenience.
Pandey and Albert [57] reviewed commonly used standards of community sustainability, proposed by several international evaluation systems, associations, special interest groups, and researchers. The authors debated the selection of indicators for building a system to measure and monitor a region’s progress. The study classified the indicators into the six factors suggested by the “Political, Economic, Social, Technological, Environmental, and Legal (PESTEL)” analysis. It noted an over-representation of indicators monitoring environmental factors (in older evaluation systems) against indicators across the PESTEL factors (in newer evaluation systems). The study highlighted, in particular, the gaps in measuring the progress in fields such as Governance, Technology, and Innovation of a city. Adopting a different point of view, Giffinger and Kramar [58] questioned “How might the use of indicators promote smart urban planning in a place-based approach?”. They examined the policy implications of the European SCs’ approach. They agreed with previous studies, such as Komninos and Mora [59] and Cocchia [41], that the literature was fragmented, resulting in “heterogeneous understandings [and definitions] of an SC”. In a critical analysis, Sharifi [60] examined 34 tools for SC assessment. The designated tools were gauged using a comprehensive multi-criteria approach. The study findings pointed to a lack of success of these tools in addressing criteria related to “stakeholder engagement, uncertainty management, interlinkages, and feasibility.” Moreover, the author noticed an imbalanced distribution of indicators favoring dimensions such as “mobility”, “economy”, and “environment” at the expense of “people” and “data”, which are otherwise crucial for effective SC deployment. Compared with others, the originality of this study is that it highlighted a major shortcoming of the analyzed tools: the limited use of modeling and scenario-making techniques in the face of uncertainty. Lombardi and Giordano [61] reviewed the UN-HABITAT’s Global Urban Indicators Database and identified a blend of quantitative indicators, qualitative checklists, and hybrid performance indicators. They also inventoried more than 150 systems of city indicators covering diverse geographical and thematic criteria, in addition to a variety of methods that aimed to evaluate sustainability in the built environment.
Similarly, Stratigea et al. [62] investigated some global approaches for assessing the performance of SCs. Their findings enabled the identification of a city-specific set of sustainability indicators. They perceived the selection of adequate indicator sets as an intriguing issue that has triggered confusion, prevented proper monitoring of urban sustainability projects, and even as a source of mistrust and opacity, leading to support for pre-defined policy directions and decisions. Using comparative analysis, Huovila et al. [16] developed a “taxonomy” to evaluate seven standards (including 413 indicators) for SSC. They evaluated each indicator against (a) five categories of urban sustainability and smartness, (b) ten domains/dimensions, and (c) five indicator types. They pointed out two predominant streams of indicators, (1) those aimed at evaluating the implementation of SC approaches, and (2) those designed for sustainability assessment, and also reported a scarcity of scientific literature on “standardized frameworks of city indicators”. However, this shortage appears to contrast with the wide range of indicator frameworks utilized to assess urban smartness and sustainability revealed by other studies, such as the earlier works of Albino et al. [10] and Sharifi and Murayama [63]. In an attempt to organize SC indicators in thematic areas, Petrova-Antonova and Ilieva [64] conducted a survey and examined a set of performance and sustainability indicators for smart cities. They considered the selection of adequate indicator sets as a challenging task.
Caird [65] introduced a different perspective for a typology of the indicators: (a) Assessment Models, such as the “Smart City Maturity”, the “Smart City Reference”, and the “European Smart Cities Ranking”; (b) Assessment Tools such as the “Smarter City Assessment”, the “Smart City Index Master Indicators” framework; and (c) Assessment Indexes such as the “Ericsson Networked Society City” and “Cities of Opportunity”. Anand et al. [66], in turn, identified a set of 20 sustainability indicators for designing SCs in developing countries and determined their relative importance using fuzzy and fuzzy-AHP methods. Albino et al. [10] presented other tools, such as the “Smart Ranking Systems” developed by the University of Vienna, the “Intelligent Community Forum’s Smart 21 communities”, the “Global Power City Index”, the “Smarter Cities Ranking”, the “World’s Smartest Cities”, the “IBM Smart City”, and the McKinsey Global Institute rankings.
Several other studies have also approached urban smartness and sustainability through an indicator-based lens for benchmarking global cities, as Phillis et al. [67] reported. The spectrum of these approaches includes the UN-HABITAT “City Prosperity Index (CPI)”, the “Cities in Motion Index (CIMI)” published by the IESE Business School, the “Global Power City Index” mentioned by Ichikawa et al. [68]; and the “Sustainable Assessment by Fuzzy Evaluation Index (SAFE)” ([67]). These studies aimed to benchmark global cities by employing up to 77 weighted indicators and different aggregation methods.
In a systematic literature review, Purnomo and Prabowo [69] reviewed 30 studies from four sources (Science Direct, ACM DL, IEEE Xplore, and Google Scholar). They identified six sustainability dimensions (18 indicators) as the most frequently used. Likewise, in a review of environmental assessment tools for sustainable urban design, Ameen et al. [70] scrutinized the well-known tools rating the sustainability of urban areas (Building Research Establishment Environmental Assessment Method for Communities (BREEAM), Leadership in Energy Environmental Design (LEED), LEED for Neighborhood Development (LEED-ND), and Comprehensive Assessment System for Built Environment Efficiency (CASBEE)), in addition to other sustainability assessment tools for urban design and development, such as the “Pearl Community Rating System (PCRS)” and the “Global Sustainability Assessment System (GSAS)”. They highlighted a discrepancy in the global sustainability assessment tools in local and international contexts. Castanheira and Bragança [71] studied how the sustainability assessment tool “SBToolPT-UP” has evolved from the building level to the broader built environment, providing an eloquent example of how the need to assess sustainability at the urban and regional levels has led to adaptations of tools initially designed for the building level. This example also demonstrates a scale shift in the approach to sustainability and illustrates the extension of the concept from a simple addition of many sustainable buildings scattered around the city to an SC as a whole, as well as how the emphasis of the sustainability assessment tools has extended from buildings to urban operations.
Regarding the significance of the factors considered by the assessment tools, the scientific discipline perspective appears to be an essential factor affecting the focus, which may range from the big data questions described by Laurini [72], to governance considerations, as reported by Meijer and Bolívar [73], and from issues related to technical innovations, as discussed by Schaffers et al. [74], to those associated with economy and sustainability, as stated by Caragliu et al. [40]. Nam and Pardo [75], have previously identified human, technical, and institutional factors as critical for SC deployment.
Most of this abundant literature is more suggestive of a focus on inventory, examining the significance of the addressed issues, investigating through an indicator-based lens for benchmarking cities, etc. Considerable research has been conducted on many aspects, but the objective of streamlining the decision-making process by prioritizing sets of indicators remains a relatively unexplored aspect.

3. Methods and Data Collection

This study employed a combined methodology to rank SSC indicators based on the results of the two methods. The goal was to provide a panel of experts, in an AHP analysis, the opportunity to validate (or refute) a ranking of pre-selected SSC indicators with the highest occurrence frequency in the relevant literature. Rather than relying solely on the content analysis results, the novel combination of the AHP and co-occurrence analysis was expected to produce a more robust ranking of the SSC dimensions than could be provided by either methodology alone. Figure 1 shows a flow chart of the method, while Section 3.1 and Section 3.2 explain the two phases in detail.

3.1. Content Analysis: Selection of the Most Used SSC Dimensions from the Literature Review

The commonly used sustainability indicators for SCs were identified and selected following a literature review based on the following criteria: (a) Google Scholar was used to broadly search for scholarly literature that included the keywords “sustainability”, “smart city”, “indicators”, “assessment tools”, and “Sustainable Smart City”; (b) The research/study should be published after 2015; and (c) the research/study must feature the following characteristics: (1) be open access, (2) include an explicit list of indicators, and (3) should not be city-specific, sector-specific (case study), or conducted through the prism of highly specialized topics, to avoid targeted selection of indicators.
The rationale behind criterion (b) is that the investigated literature showed an over-representation of indicators monitoring environmental factors in the older evaluation systems. Because 2015 marked a shift regarding sustainability, limiting the investigated literature to publications post-2015 was justified. Indeed, gaps in measuring the progress in fields such as Governance, Technology, and Innovation have been noticed [57], and many tools have also been focusing on the environmental facets of SD and resource management [61]. Clearly, until the advent of the 2030 agenda in 2015, the concept of SD suffered from a kind of “environmental overdose” in the sense of an over-representation of environmental concerns at the expense of equally important social aspects, such as participatory democracy, promotion of good governance, gender equality, and social equity. The concept of SSC has seen, during the past decade, a paradigm shift from an initially environment-driven focus (for sustainability) and an ICT-driven focus (for SCs) to adopting a more inclusive approach, in which the above-mentioned social dimensions are being recognized as vital [60]. The 17 focus areas of the SDGs expand concerns to more comprehensive global development pillars. By integrating broader issues and a more systemic approach, this global shift anticipates a strategic transformation towards increasing competitiveness and quality of life [39], approaching sustainability through the new prism of global climate change. There is a widespread consensus that the Millennium Development Goals (MDGs) failed to represent the complexity of global sustainable development, while the SDGs were developed with stronger socioeconomic aspects. This study assumes that studies published before 2015 would have been influenced by the limitations of the pre-2015 paradigm, even if they undeniably played a crucial role in the developments that have occurred.
A careful analysis of the complete list of dimensions points to some studies as being characterized by short lists of dimensions and the use of single words (two, at most) to designate them. On the other hand, other studies refer to many of the dimensions by using combined wordings, heterogeneous multiple appellations, and different expressions, thus creating terminological confusion that requires harmonization/normalization to eliminate composed designations, overlapping indicators, and duplications, as well as statements that are unclear or have comparable meanings. As a result, a new harmonized list of SSC dimensions was generated after appropriate adjustments. From the initial list of dimensions/indicators, a harmonized list was derived to achieve consistency by (a) simplifying some designations and (b) assigning one single appellation to areas with close or similar meanings. The final list is used to depict the most frequently mentioned dimensions.
The common indicators were highlighted using a content-based analysis (term co-occurrence) after eliminating redundant indicators that belonged to the same tool/system and combining similar ones. This harmonization procedure was mainly required for indicators within the “United for Smart Sustainable Cities (U4SSC)” initiative [76], and for tools examined by Giffinger and Kramar [58], Purnomo and Prabowo [69], and Guelzim et al. [77]. Examples of the combination and renaming of indicators or dimensions are provided in Figure 2 and Figure 3 below.
As the pre-selection procedure suggests, the designation, description, and number of dimensions were to be defined after finalizing the harmonization process and occurrence counting. For a dimension/indicator to be selected, it had to rank among the eight most commonly used indicators. The threshold of eight was determined according to the reasonable number of pairs described in Section 3.2, so as not to distract the experts with a crowded matrix.
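To illustrate the counting logic, the following Python sketch maps heterogeneous designations onto harmonized dimension names and tallies occurrences across studies, counting each dimension at most once per tool. The synonym mapping, function names, and study lists are illustrative assumptions, not the actual data or tooling used in this study.

from collections import Counter

# Hypothetical mapping from heterogeneous designations to harmonized dimension names
HARMONIZED = {
    "quality of life": "Living",
    "smart living": "Living",
    "natural environment": "Environment",
    "ecology": "Environment",
    "ict": "Technology and ICT",
    "digital infrastructure": "Technology and ICT",
}

def harmonize(term: str) -> str:
    # Map a raw designation to its harmonized dimension name (fallback: title case)
    return HARMONIZED.get(term.strip().lower(), term.strip().title())

def count_occurrences(studies: list[list[str]]) -> Counter:
    # Count each harmonized dimension at most once per study/tool, so redundant
    # indicators belonging to the same system are not double-counted
    counts = Counter()
    for study in studies:
        counts.update({harmonize(term) for term in study})
    return counts

# Illustrative input: each inner list holds the dimensions named by one study/tool
studies = [
    ["Quality of Life", "Ecology", "ICT", "Mobility"],
    ["Smart Living", "Natural Environment", "Governance"],
    ["Living", "Digital Infrastructure", "Economy"],
]
for dimension, frequency in count_occurrences(studies).most_common(8):
    print(dimension, frequency)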

3.2. Analytic Hierarchy Process (AHP) Method: Ranking the Selected Dimensions

According to Abba et al. [79] and Saaty and Kearns [80], AHP is an effective technique for evaluating quantitative and qualitative decision-making processes, specifically in complex unstructured decision problems. In this study, based on experts’ judgments, the application of AHP to determine the relative importance of SSC dimensions/themes was intended to rank eight pre-identified SSC dimensions.
To synthesize the priorities, the AHP method uses a comparative judgment of the statements or alternatives (in this case, selected SSC dimensions/themes). The data were collected through an expert-based priority pair-wise comparison.
The experts were requested to rate the eight selected dimensions, comparing one pair of statements at a time. A pre-final version of the comparison table was evaluated by two senior experts (pilot survey) for validation and feedback: (a) to ensure that the questions accurately reflected the topic under investigation and (b) to suggest improvements to the selected dimensions regarding terminology and content.
Regarding the size of the experts’ sample in AHP surveys, although it is generally admitted that a small sample size might adversely affect a study’s reliability, data analysis, and achievement of statistically robust results, there is also support for small sample sizes. According to Darko et al. [81], AHP has a major advantage over other Multi-Criteria Decision Making (MCDM) methods, which is its ability to handle small sample sizes without the need for statistically significant (large) sample sizes. The number of experts required to produce reliable results has been a subject of debate among researchers: Ishizaka and Labib [82] argued that several experts are preferred in an expert-based survey to ensure minimum bias in processing the experts’ judgments, whereas others, like Tavares et al. [83], considered that even one single qualified expert judgment could be accepted and considered as representative. Saaty and Özdemir [84] agreed that the opinion of a single expert who is highly knowledgeable and experienced in the topic examined can fulfill the requirements. Several studies have utilized a small number of respondents, and this study assumes that as long as the number of experts is higher than the average number shown in Table 1, the judgments are considered acceptable.
To determine the relative priority of the selected dimensions, this study relied on a pair-wise comparison matrix (PCM) based on the nine-point scale of relative importance developed by Saaty [92], as shown in Table 2.
To avoid the experts being distracted by a crowded matrix, which might lead to uncertainty of judgments, inconsistency, or arbitrary answers, the final set to be considered should not generate a high number of pairs. The following method determined the maximum number of dimensions (statements) that would guarantee a reasonable number of pairs.
The question of how many pairs can be made of a set of (n) indicators can be answered by:
\[ \binom{n}{2} = \frac{n!}{(n-2)!\,2!} \quad (1) \]
Given that: n! = n(n − 1)(n − 2)(n − 3) ⋯ (2)(1) = n(n − 1)(n − 2)!
and: 2! = (2) (1) = 2
Equation (1) can be simplified to Equation (2) below, which is the formula for the number of pairs (P) needed in “at least n” statements.
\[ P = \frac{n(n-1)(n-2)!}{(n-2)!\,2!} = \frac{n(n-1)}{2!} = \frac{n(n-1)}{2} \quad (2) \]
After consultations with peers, this study assumed that P ≤ 30 is a reasonable number of pairs that do not distract the experts. Then, using Equation (2), the number of dimensions (n) can be calculated by solving the polynomial equation of the second order:
\[ n^2 - n - 60 = 0 \]
For P = 30 = n(n − 1)/2, the number of areas/dimensions to be paired should be n = 8.26. Hence, the selected integer of eight dimensions is accepted and provides 28 pairs. P = 28 is this study’s reasoned, yet “subjective”, choice, assuming that this would not distract the experts. One could choose any other number of pairs using different arguments. However, using a high number of criteria for the sake of comprehensiveness may sometimes be neither desirable nor feasible [60], because a long list of indicators may pose difficulties at the time of data collection and may not necessarily be measurable. Table 3 shows that starting from nine statements (dimensions/indicators), an expert would have to handle 36 pairs or more, which can be easily distracting. As a result, the temptation to provide random answers would be strong.
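As a quick check of this arithmetic, the short Python sketch below (an illustration, not part of the study’s workflow) computes the pair count C(n, 2) and the positive root of n² − n − 60 = 0.

import math

def n_pairs(n: int) -> int:
    # Number of unordered pairs among n statements: C(n, 2) = n(n - 1) / 2
    return math.comb(n, 2)

# Positive root of n^2 - n - 60 = 0, i.e. the n for which n(n - 1)/2 = 30
n_exact = (1 + math.sqrt(1 + 8 * 30)) / 2
print(round(n_exact, 2))   # about 8.26, so the integer choice is n = 8
print(n_pairs(8))          # 28 pairs, as used in this study
print(n_pairs(9))          # 36 pairs, already at the assumed limit of distraction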

3.2.1. Aggregation of the Experts’ Priorities

Balogun et al. [93] and Şener et al. [94] have reported four main methods to combine experts’ judgments of a given set of statements into consensus priority matrices in the AHP technique: (1) vote or compromise; (2) separate models or players; (3) geometric/arithmetic mean method; and (4) consensus method. The present study selected the geometric mean method because it is reputed to be more appropriate and reliable for aggregating experts’ priorities than the other techniques, in addition to its ability to manage issues related to the reciprocity of judgments, according to Abba et al. [79] and Forman and Peniwati [95]. The geometric mean values of the experts’ priorities were determined using Equation (3), and the results were then introduced into the PCM to derive the local priorities.
\[ GM = \sqrt[n]{x_1 \cdot x_2 \cdot x_3 \cdots x_n} \quad (3) \]
  • where: n is the number of experts (respondents).
  • And: x is the decision value scored by each respondent.
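As a minimal illustration of Equation (3), the sketch below aggregates one pair-wise comparison across hypothetical experts with the geometric mean; the judgment values are invented for the example and are not the study’s data. The geometric mean also preserves the reciprocity of judgments, since the geometric mean of reciprocals equals the reciprocal of the geometric mean.

import math

def geometric_mean(judgments: list[float]) -> float:
    # Equation (3): nth root of the product of the n experts' judgments for one pair
    n = len(judgments)
    return math.prod(judgments) ** (1.0 / n)

# Illustrative judgments by five hypothetical experts on Saaty's scale (or reciprocals)
judgments = [3, 5, 1/3, 7, 1]
print(round(geometric_mean(judgments), 3))   # aggregated value entered into the PCM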

3.2.2. Calculation of AHP Weights Bounded by the Consistency Ratio Rule

The judgment matrix allows for computing the relative weights of the alternatives using the Eigenvector method, which provides a numerical value for consistency: the consistency index of the matrix (CI). CI is a metric that measures the difference between the maximum Eigenvalue (λmax) of the judgment matrix and the Eigenvalue (n) of a perfectly consistent matrix [96].
The judgments applied in the AHP models suffer from some limitations because they rely on the personal intuition of the surveyed experts and vary amongst them based on their perceptions, knowledge, expertise and experience, valuations, and biases [97,98]. All these factors affect the inferred judgments. In addition, variations/inconsistencies in human judgment can occur for a variety of reasons, such as disparities in participant backgrounds in group decision-making [99,100]. The nine-point scale has also been criticized for yielding results that do not meet established standards and consistency norms [101], supporting the need to frame the outcomes in MCDM or multi-attribute decision models. Most multi-objective approaches do not verify consistency [102], so the AHP technique stands out concerning its consistency check. The literature reports several methods and thresholds for judging the consistency of experts’ preferences or priorities, and these measures are useful tools to help validate true preferences [98,103]. On this matter, Saaty and Vargas established the concept of “acceptable inconsistency” based on a consistency ratio (CR) of 0.1 or less [104,105,106,107]. They even supported extending this ratio threshold when it is greater than 0.1 (indicating inconsistencies), stating that it should definitely be below 0.2. CR indicates the extent to which the experts’ judgments were consistent compared with a random comparison matrix (i.e., one whose judgments have been decided randomly or arbitrarily). If the condition CR ≤ 0.10 (10%) is met, the computed CR value is acceptable. Otherwise, the judgments are seen as highly random and need revision. Boucher et al. [108] noted that this is simply a guideline, and earlier, Islei and Lockett [109] argued that Saaty’s notion of consistency index/ratio provides a basic metric with limited statistical properties. Additionally, the consistency check has received criticisms for being, for example, hard to satisfy even in cases where the pair-wise comparison has been made in a reasonable, logical, and non-random manner [110]. Despite these criticisms, this study employed Saaty’s CR to verify the validity of the experts’ judgments because it has been widely used in previous studies.
In the present study, to determine the weight of each of the eight SSC dimensions, each pair was expressed using the judgment matrix represented by Equation (4) below:
\[ \text{Judgment Matrix} = (a_{ij}) = \begin{pmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,m} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,m} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m,1} & a_{m,2} & \cdots & a_{m,m} \end{pmatrix} \quad (4) \]
  • where: aij indicates the relative priority of elements ai to aj.
  • If aij is scored 9, 7, 5, 3, or 1, then aji receives the reciprocal, 1/9, 1/7, 1/5, 1/3, or 1/1 (=1).
Using the Approximate Eigenvector method, the weight coefficient for each decision element was calculated employing Equation (5):
\[ W_i = \frac{M_i}{\sum_{i=1}^{n} M_i} \quad (5) \]
where: \( M_i = \sqrt[n]{\prod_{j=1}^{n} a_{ij}} \), \( a_{ij} \) denotes the relative scores of element i to element j ranging from 1 to 9 and their reciprocal scores, and n represents the number of statements (dimensions).
With n being the dimension of the matrix, λmax designates the highest judgment’s Eigenvalue (Principal Eigenvalue), computed using Equation (6):
\[ \lambda_{\max} = \sum_{i=1}^{n} \frac{\sum_{j=1}^{n} a_{ij}\, w_j}{n\, w_i} \quad (6) \]
Then, the Consistency Index (CI) is obtained using Equation (7):
\[ CI = \frac{\lambda_{\max} - n}{n - 1} \quad (7) \]
Finally, the results were validated using the Consistency Ratio (CR), plotted using Equation (8):
\[ CR = \frac{CI}{RI} \quad (8) \]
where: RI is the Random Consistency Index provided by Table 4.
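The following Python sketch ties Equations (5) through (8) together using the approximate (geometric mean) eigenvector method. It is a minimal illustration: the 4 × 4 reciprocal matrix is invented, whereas the study’s actual matrix is 8 × 8 and aggregates the experts’ judgments; the Random Consistency Index values are the standard Saaty values (RI = 1.41 for n = 8, consistent with Table 4).

import math

# Standard Saaty Random Consistency Index values; RI = 1.41 for n = 8 (Table 4)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def ahp_weights(a: list[list[float]]) -> list[float]:
    # Equation (5): W_i = M_i / sum(M_i), where M_i is the geometric mean of row i
    n = len(a)
    m = [math.prod(row) ** (1.0 / n) for row in a]
    total = sum(m)
    return [mi / total for mi in m]

def consistency_ratio(a: list[list[float]], w: list[float]) -> float:
    # Equations (6)-(8): lambda_max, then CI, then CR
    n = len(a)
    lam_max = sum(sum(a[i][j] * w[j] for j in range(n)) / (n * w[i]) for i in range(n))
    ci = (lam_max - n) / (n - 1)
    return ci / RI[n]

# Illustrative 4 x 4 reciprocal judgment matrix (not the study's 8 x 8 expert matrix)
A = [
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
]
w = ahp_weights(A)
print([round(x, 3) for x in w])            # local priorities of the four elements
print(round(consistency_ratio(A, w), 3))   # acceptable if CR <= 0.10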

3.3. Combined Ranking Using a Relative Scoring System

To derive a combined ranking of the eight dimensions, each of the two rankings (co-occurrence ranking and AHP ranking) was converted to a scoring system (local score). Every dimension received a score ranging from 0 to 10 based on relative rankings [111,112]. For each ranking, the best-performing dimension (BPD) scored 10, while all other dimensions receive a score based on their performance (percentage score) compared with the BPD, using Equation (9):
\[ S_i = \frac{X_i \times 10}{X_{\max}} \quad (9) \]
where:
  • Si is the local score of the dimension “i” out of 10.
  • Xi is the score given by the study to the dimension “i”.
  • Xmax is the score granted by the study for the BPD.
The score for the BPD in the two studies (co-occurrence and AHP) was calculated using Equation (10) below. The equation shows that the BPD, with the maximum ranking, would necessarily receive a perfect score of 10.
\[ S_{BPD} = \frac{X_{BPD} \times 10}{X_{\max}} = \frac{X_{\max} \times 10}{X_{\max}} = 10 \quad (10) \]
The overall score (combined score), representing the rank of significance/importance for each dimension, is an arithmetic mean of the two individual scores (Si) generated from the combination of the co-occurrence analysis and the AHP method. This scoring system assumes that the two methods are equally important. Yet, a weighted mean could have been employed by assigning proper weight to each method based on agreed criteria. This study did not choose the weighted mean scoring, assuming that a consensus should support the attribution of different weights to the literature co-occurrence over the AHP analysis (and vice versa) to allow for weighted means. Both ranking methods have advantages that are difficult to gauge against one another unless in-depth studies are performed. Indeed, the content analysis was performed on literature handling assessment tools generated by respected organizations and administrations and analyzed by esteemed researchers. Similarly, in the AHP analysis, and to the best of the author’s knowledge, the judgment request was directed to a panel of experts selected for their knowledge and experience, and their priority decisions were reputed to guarantee a reliable importance-based ranking of the selected dimensions.
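As an illustration of Equations (9) and (10) and of the unweighted combination, the sketch below converts two sets of percentage weights into local scores and averages them. Only three of the eight dimensions are shown, with percentage values taken from the results reported in Section 4 (each subset includes the best-performing dimension of its ranking, so the local scores match the worked example given there); the function names are illustrative, not part of the study’s tooling.

def local_scores(weights: dict[str, float]) -> dict[str, float]:
    # Equation (9): S_i = X_i * 10 / X_max, so the best-performing dimension scores 10
    x_max = max(weights.values())
    return {dim: x * 10 / x_max for dim, x in weights.items()}

def combined_ranking(cooccurrence: dict[str, float], ahp: dict[str, float]):
    # Unweighted arithmetic mean of the two local scores, assuming equal importance
    s_cooc, s_ahp = local_scores(cooccurrence), local_scores(ahp)
    combined = {dim: (s_cooc[dim] + s_ahp[dim]) / 2 for dim in cooccurrence}
    return sorted(combined.items(), key=lambda item: item[1], reverse=True)

# Percentage weights for three of the eight dimensions, as reported in Section 4
cooc = {"Living": 24.31, "Environment": 15.97, "People and Society": 9.0}
ahp = {"Living": 13.4, "Environment": 12.75, "People and Society": 19.4}
for dimension, score in combined_ranking(cooc, ahp):
    print(dimension, round(score, 2))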

4. Findings and Discussion

This section introduces and discusses the priorities of the dimensions as outcomes of the abovementioned Content Analysis, described in Section 3.1, and the AHP method detailed in Section 3.2. The ultimate objective, as stated previously, was to develop a ranking of the selected set of SSC dimensions through a combination of the experts’ priorities and the literature co-occurrence results. First, it presents the most frequently used dimensions as the result derived from the eleven studies selected in advance from relevant literature published after 2015. Second, the results of the AHP experts’ judgments are presented and discussed with reference to the CR of the PCM, which aggregated the experts’ opinions. This section ends with discussing the final combined ranking of the selected dimensions, and the factors that might have affected the results, including judgments’ bias.

4.1. Ranking of Selected SSC Dimensions through Literature Content Analysis

Based on the eleven studies chosen, 79 dimensions, including 768 indicators, were reviewed to proceed to the content analysis process and identify the most commonly utilized dimensions/indicators. For example, Sharifi [60,113] examined 80 themes and 902 indicators before settling on a set of seven themes and 44 indicators, as shown in Table 5, row 3. This provides insight into the extensiveness of the set of dimensions/indicators analyzed.
As described in Section 3.1, the co-occurrence analysis conducted on the selected relevant literature identified eight major SSC dimensions/areas as the most frequently used in the various assessment tools developed globally. They were chosen because they demonstrated the highest frequency across the top eleven relevant studies (red box, Table 6). According to this study’s premise, the frequency of recurrence of the various dimensions indicates their significance: the more frequently used a dimension is, according to the weights shown in Table 6 and Figure 4, the more important it is for an SSC. These eight dimensions were presented to the panel of experts to proceed with the AHP analysis, as explained in Section 3.2.
Table 6 above shows that Huovila et al. [16] and UNECE [76] were the most comprehensive studies among the selected sets. They virtually covered all the nominated dimensions, with an evident focus from Huovila et al. on the Living dimension, followed by Environment, while Economy and Production and Infrastructure received equal consideration in third position. UNECE, however, pays equal attention to Technology and ICT, Infrastructure, Environment, and Living. The Nature dimension, as expected, appeared as an outsider neglected by all the tools examined in the set of studies, probably because it is commonly subsumed under Environment. If Nature were listed under Environment in the previously described harmonization process, the overall results of the content analysis would not change: the Environment dimension would have remained in second place, with an occurrence of 24 instead of the initial 23, by transferring the weight from the last column to the third column of Table 6.
Technology and ICT stands out for its low performance, ranking near the bottom of the list, as shown in Figure 4. However, because it registered the same number of occurrences as People and Society, one might question why both were relegated to the bottom of the list. Common belief would suggest that these two dimensions should sit at the top of the assessment tools’ lists, given the consensus on valuing human and social considerations to achieve well-being, and the conviction that technologies are crucial to guarantee high levels of efficiency in monitoring and speeding up SSC deployment.

4.2. Priorities of the Selected SSC Dimensions As a Result of the AHP Analysis

4.2.1. AHP Experts’ Profile

Considering the average number of experts used in previous studies (Table 1, Section 3.2), the PCM was sent to 18 experts believed to have relevant knowledge in the fields of Urban and Regional Planning, Building Engineering, Architecture, Landscape, or related disciplines. Given the object under investigation (cities), these disciplines were selected because they all deal, in one way or another, with cities or with their parts and components. In their thinking and practice, they shape the physical environment in which people live, yet cities are more than just the built environment: they stand as a representation of how people see themselves, as well as how they see the world. A response rate of 89% made it possible to compute 16 pair-wise comparison tables. Figure 5 depicts the characteristics of the respondents, of whom a large majority hold a Ph.D. and worked in academia.
Of this group of experts from various disciplinary and cultural backgrounds, more than two-thirds specialized in Urban Planning, two-thirds have more than ten years of experience, and two-thirds have very good or excellent knowledge of the SSC concept. These characteristics justify a high level of confidence in the experts’ ability to provide scientifically reliable judgments. Unfortunately, two experts from the private construction sector did not respond; their participation would have further diversified the sample.
Regarding the respondents’ geographical location, and as indicated in Section 3, a single qualified expert could meet the requirements. As a result, the experts’ geographical origin (location) was not a deciding element in confirming the AHP results: the research could have contacted a single knowledgeable person from a single country, and as long as the consistency ratio remained below 0.1, the results would have been acceptable.

4.2.2. Computing the Pair-Wise Comparison Priorities

To calculate the pair-wise comparison priorities, the PCM of Table 7, structured according to Equation (4), holds the relative weights assigned by the panel of experts. Equation (5) then yields the weight (priority) of each of the eight decision elements, as shown in the normalized priority vector in the one-column matrix at the far right (Table 8).
To indicate the degree of deviation (from consistency) in the experts’ judgments, the Principal Eigenvalue (λmax) was derived from the PCM (using Equation (6)) by performing a matrix multiplication of the PCM by the priority vector (Table 8).
The Eigenvector is then computed (Table 9), serving as a base to obtain the principal Eigenvalue from the average of this resulting vector:
λmax = AVERAGE (L1:L8) = AVERAGE (8.106 : 8.174) = 8.19096
Then, based on the Principal Eigenvalue, the CI and CR were computed using, respectively, Equations (7) and (8). Considering the dimension of the matrix n = 8, the CI for the matrix of this study receives the value CI = 0.02728
Given the number of items presented to the experts (eight dimensions), the RI that suits this study was extracted from Table 4 with the value RI = 1.41
Thus, \( CR = \frac{CI}{RI} = \frac{0.02728}{1.41} = 0.01935 \)  (1.93% < 10%)
Because the value of CR is substantially lower than the reference threshold (10%), as explained in the methods section, the CR rule is largely met, and the experts’ priorities are regarded as consistent and valid.
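For completeness, the reported consistency figures can be reproduced directly from Equations (7) and (8); the snippet below is a simple arithmetic check of the values above, not additional analysis.

lam_max, n, ri = 8.19096, 8, 1.41
ci = (lam_max - n) / (n - 1)   # Equation (7): 0.19096 / 7 = 0.02728
cr = ci / ri                   # Equation (8): 0.02728 / 1.41 = 0.01935, i.e. 1.93% < 10%
print(round(ci, 5), round(cr, 5))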
As presented in Table 9 (priority column), the results show that People and Society (sixth row) was the highest scoring dimension (19.4%) and ranked first in priority over all other dimensions according to the experts’ judgment. This score contrasts with the content analysis of the literature review, in which it scored 9% (with 13 occurrences) and ranked sixth (ex aequo with Technology and ICT). Infrastructure was ranked second (13.8%), followed by Living (13.4%), in contrast to their previous performances in the content analysis: respectively, fifth (9.7%, with 14 occurrences) and first (24.3%, with 35 occurrences). This shows a sharp contrast between the experts’ opinion and the literature co-occurrence regarding People and Society, which gained five positions, whereas Living lost two levels, as did the Environment, which fell by two stages from second position (15.9%) in the literature co-occurrence analysis to fourth (12.7%) in the AHP priority ranking.
The finding that Technology and ICT was adjudged the least important dimension by the experts may cause controversy because it is at odds with the widely held conviction that the SC is judged on its ability to exploit new technologies. Surprisingly, this dimension also performed poorly, scoring 9% and ranking sixth in the literature co-occurrence study. As shown in Table 10, the AHP analysis improved the rank for three dimensions (People and Society, Infrastructure, and Mobility and Transportation). It generated a rank decrease for five others (Living, Environment, Governance, Economy and productivity, and Technology and ICT), showing a contrasting output of the two methods and confirming the choice made by this study to combine the rankings rather than relying on a single ranking.
When the two analyses are considered separately, they contradict on several aspects. For example, while the co-occurrence analysis determined that the Living dimension was the most significant (due to its frequency of use), the AHP technique allocated this rank to People and Society rather than the Living, which ranked within the top three. To a large extent, however, both perspectives agreed on the critical relevance of these dimensions in which the human being is at the center of attention. These two dimensions encompass a collection of indicators geared toward human quality of life, health condition, comfort, social aspects, etc.
Environmental considerations, on the other hand, were not agreed upon by the two analyses. Notably, the literature continues to prioritize this aspect (ranked second) even after 2015, and the predicted influence of the SDGS on broadening concerns beyond environmental factors to human and social considerations. Environmental concerns remain deeply rooted in the minds of those who develop the SSC assessment systems and tools. Experts, in contrast, ranked this dimension fourth, preferring aspects such as People and Society, Infrastructure, and Living, which indicates their awareness and commitment to the perspective adopted by the SDGs.

4.3. Combined Ranking of the SSC Dimensions

The co-occurrence and the AHP rankings were converted to a scoring system using the results of Table 10. Based on relative rankings, each dimension obtains a score ranging from 0 to 10, as described in Section 3.3. All local scores for all eight dimensions (S1-8) are presented in Table 11.
For example, in the AHP ranking, Environment (i = 4) received a score of X4 = 12.75%, and the BPD (People and Society) was credited a score of Xmax = 19.40%. The local score can be calculated using Equation (9) as follows:
S4 = (12.75 × 10)/19.40 = 6.57 out of 10.
Similarly, in the co-occurrence ranking, Environment (i = 2) was attributed a score of X2 = 15.97%, and the BPD (Living) was credited a score of Xmax = 24.31%. The local score can be obtained using Equation (9) as follows:
S2 = (15.97 × 10)/24.31 = 6.57 out of 10.
Although the two scores are equal, this cannot be attributed to a concordance between the two methods, because Environment was ranked second and fourth, respectively. The equality results from its relative distance to the BPD in each study.
Column 6 of Table 11 displays the final combined score, which represents the priority rank for each dimension and is obtained from the two individual scores (Si) generated by the co-occurrence analysis and the AHP approach. The final combined ranking in column 7 highlights the unexpected position attributed to Technology and ICT, literally pushed to the bottom of the list. When utilized as a Decision Support Tool, seeking to determine which dimension to prioritize in the event of constraints, Technology and ICT will not be favored.
Figure 6, below, shows the final combined ranking with a proposed list of indicators under each dimension, allowing for generating a specific rubric for measuring elements such as SSC deployment, progress, and compliance.
As revealed by the final combined ranking, the findings contrast with the common understanding of the concept of SSCs concerning the importance of Technology and ICT. Expected to be highly ranked, this dimension received the least priority, opening an opportunity for discussion among scholars: why did the widely held conviction that technologies are at the core of an SSC model receive such a low assessment, especially from the panel of experts? As previously mentioned, the definitions of an SC were concordant regarding the ICT-oriented aspect as forming the essence of the SSC [18,21,22,23,25,27,28,29,30,31,115,116]. Furthermore, ICT has always been regarded as a key component of smart cities, which involve digital devices and infrastructures streaming big data in space and time [117,118]. Because of the general agreement on the significance of this dimension, it was expected that both the experts and the literature would rank it highly within the selected set. Several thoughts may arise in an attempt to explain these results.
A lack of adequate knowledge about the role of technological advances, visible or embedded, in the construction of SSCs cannot plausibly explain the low rank assigned to this important dimension by the panel of experts, whose judgments were consistent with the literature co-occurrence analysis. In addition, the experts’ profile presented in Section 4.2.1, including area of expertise, years of experience, and education level, is likely to justify a high confidence level in their ability to provide scientifically reliable judgments. It may also be that this dimension is so self-evidently tied to smartness that it cannot be meaningfully evaluated or prioritized within a collection of dimensions. However, the multidisciplinary and multicultural profile of the panel of experts could be further investigated as a possible cause of this surprising rank. Various scenarios can then be envisaged, seeking the opinion of different expert groups with similar knowledge (expertise-wise homogeneous groups) and comparing the priorities to examine whether an expert’s field of expertise affects their answers. Would a panel of exclusively Network Infrastructure specialists, for example, rank the Technology and ICT dimension higher? Would a board formed entirely of Public Law specialists or Human Rights activists rank the Governance dimension higher? Would a group consisting solely of World Health Organization experts put Living first? And would specialists in urban and regional economics rate Economy and Productivity as the most important dimension? In other words, would homogeneous groups of experts be tempted to select the dimension with which they were most familiar?
Moreover, further investigation into the effect of language sequencing on experts' judgment is needed. It would be interesting to examine whether the experts' responses would remain unchanged if the word "Sustainable" were removed while preserving "Smart City" (or vice versa). Would the brain focus on the word "smart" if the query were "Smart City" rather than "Sustainable Smart City"? Would the sequence of the words ("Sustainable—Smart—City" or "Smart—Sustainable—City") affect the results? Would the experts focus on the first or the later words?
However, there is more to this issue than bias.
Aside from bias considerations, exploring other aspects of the underlying causes of this ranking opens broader possibilities, based on the premise of a paradigm shift concerning SSCs and how they are viewed in a highly digitalized future. The findings challenge the assumption that using ICT in urban management and planning is sufficient to make a city more efficient and sustainable. They suggest that assertions about the importance of Technology and ICT for SSCs do not mean this dimension takes priority. It is self-evident that without ICT and advanced technologies a city cannot be smart; on the other hand, it cannot function as a city unless the other dimensions are running efficiently. Although average citizens expect good living conditions, clean air and energy, and convenient transportation systems, they might think first of 5G internet, AI-operated buildings, and flying taxis. Fascinated by cutting-edge technologies and technology-mediated services, they might initially overlook the other dimensions. Experts, in contrast, consider all aspects as a system. They may tend to set aside the obvious precisely because it is obvious and known to all, and emphasize often neglected and less evident factors. They consider that infusing technology into each of the city's subsystems one by one (transportation, energy, education, healthcare, physical infrastructure, etc.) does not necessarily make the city smarter. The city should be regarded as an organic system, a network, and a linked structure [119]. Rios [120] considered the SC to be a city that inspires, shares culture and knowledge, and encourages people to create and prosper. Experts may diverge from what seems to be known to all because they can see beyond technology, considering a city to be sustainable and smart when investments in human and social capital (and IT infrastructure) promote long-term prosperity through participatory governance [121]. Whether this explains why experts favored dimensions such as Living and People and Society over others requires more in-depth analysis.
The discrepancy between the common belief and the experts' judgment can also be linked to the experts' view of the city's holistic and long-term sustainability. While Technology and ICT play a crucial role in the development of an SC, they are not the only factors determining its sustainability. The experts gave more weight to dimensions such as Living and People and Society in light of their direct impact on the well-being of citizens and the environment. Their judgment may therefore have considered a broader range of factors that influence the overall sustainability of an SC, leading to a ranking different from the common belief. The vision of a futuristic city as a utopia, where a form of science fiction finds room for imagination, may also shape public opinion; this penchant would explain people's spontaneous preference for Technology and ICT. Experts, on the other hand, as informed observers, tend to consider all the components that make up the city, regardless of their relative prominence, and this logic can set aside what is widely deemed vital for an SC. Experts believe that the SSC goes beyond the use of digital technologies and pursues better resource management with low emissions, goals that can be achieved through smarter urban transport networks, improved water supply and waste disposal facilities, and more efficient building lighting and heating systems.
As previously stated, the hybrid concept of the SSC arose from the convergence of, and in response to the weaknesses of, the concepts of sustainability and smartness. The new concept suggests that, despite its evident relevance, the Technology and ICT dimension no longer monopolizes attention in the city of the future, according to informed specialists. Not deeming this dimension a priority expresses the shift to a systems approach to the SSC; it does not mean the dimension was considered unimportant. Experts look at the SSC holistically, as a whole system comprising various subsystems and the relationships between them.

5. Conclusions

The current study explored and combined two ways of ranking a set of SSC dimensions: (1) content analysis of selected related literature and (2) expert judgments as a means of prioritizing the previously selected dimensions. The analysis highlighted the effectiveness of using an AHP-based method to investigate the priority/importance of each dimension. Soliciting experts' judgments on SSC indicators is believed to generate valuable output for decision-makers, practitioners, and stakeholders working on SSC assessment. Prioritizing actions when they cannot be undertaken simultaneously is crucial for a seamless decision-making process and helps avoid mistakes, delays, and setbacks. Similarly, this study demonstrated the value of scanning the literature for assessment tools, which are particularly useful for understanding complex urban issues and enabling efficient, transparent decision-making processes for governance, city management, and monitoring progress toward sustainability and smartness targets. Targeting precisely such challenges, this study has developed a set of prioritized indicators for SSCs, providing a Decision Support Tool to streamline the decision-making process. This ready-to-use prioritized set can help in real-world situations; for instance, it can be used to favor Living over the other dimensions if only a single dimension can be deployed, monitored, and controlled. Under more favorable circumstances (technical, financial, etc.) that allow more than one dimension to be addressed, the decision-maker can conclude that, according to this tool, dimensions such as Living, People and Society, and Environment are to be favored over Infrastructure and Economy and Productivity.
However, the study's findings have revealed an interesting paradox that could initiate a fruitful debate: in contrast to the common understanding of the concept of SSCs, the Technology and ICT dimension was attributed the least priority. The significance of this study lies in the choice to give the panel of experts the opportunity to validate (or refute) this low score rather than relying solely on the content analysis results. Moreover, the combination of the AHP and co-occurrence analyses is this study's novel contribution, providing a more robust ranking of the SSC dimensions than either methodology can provide alone.
Because this study sought the opinion of experts from various disciplinary and cultural backgrounds, future research should dig deeper into this paradox through the scenarios of "expertise-wise homogeneous groups" discussed in Section 4 and investigate whether, and to what extent, the experts' field of expertise impacts their judgment. The study also examined the possible effect of language sequencing on experts' opinions. In addition, it showed that the discrepancy between common belief and the experts' judgment involves more than bias, and it analyzed different facets of the experts' opinion as opposed to the average citizen's view. Factors such as the experts' holistic, long-term view of the city's sustainability, socio-cultural aspects, and the well-being of citizens were also identified as lying behind the greater weight attributed to dimensions such as Living and People and Society, at the expense of others such as Technology and ICT.
This study’s findings open an opportunity for discussion concerning the concept of SSCs and its impact on society. It appears to be a natural progression for contemporary society, in general, to interconnect the various current “smart silos” such as smart homes, smart cars, smart transportation, etc., where technology is central, under an inclusive umbrella that focuses on the various combinations and connectedness to achieve sustainability and smartness that none of those single areas can achieve in isolation.
Many decision-makers, planners, and scholars have been convinced for the last decade that the future of the city will embrace significant digitalization, allowing the rise of a hyper-connected city. However, as the outcomes of this study demonstrate, this Smart City model is far from achieving consensus between urban dwellers' wishes and experts' priorities. Although experts are eager for a better-connected city, they do not believe this stake to be central: the digital may be acclaimed, but in limited doses. We may imagine the future city as digital, but it must primarily be more pleasant to live in, safer, and more inclusive. Planning for SSCs must prioritize urban outcomes over technological advances [122,123,124,125]. Khan and Bele [126] expressed a similar opinion, concluding that to transform cities into SSCs, people must be at the center of development efforts in addition to the use of ICT.
In the near future, a broader range of stakeholders, and the public, may be expected to embrace the system approach to SSCs and align with the experts’ opinion.

Funding

This research received no external funding.

Data Availability Statement

Data sharing is not applicable to this article.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Cohen, B. Urbanization in developing countries: Current trends, future projections, and key challenges for sustainability. Technol. Soc. 2006, 28, 63–80. [Google Scholar] [CrossRef]
  2. Ramaprasad, A.; Sánchez-Ortiz, A.; Syn, T. Ontological review of smart city research. In Proceedings of the Twenty-Third Americas Conference on Information Systems, Boston, MA, USA, 10–12 August 2017. [Google Scholar]
  3. Ramaprasad, A.; Sánchez-Ortiz, A.; Syn, T. A unified definition of a smart city. In Proceedings of the Electronic Government: 16th IFIP WG 8.5 International Conference, EGOV 2017, St. Petersburg, Russia, 4–7 September 2017; pp. 13–24. [Google Scholar]
  4. Lee, N.; Rodríguez-Pose, A. Creativity, cities, and innovation. Environ. Plan. A 2014, 46, 1139–1159. [Google Scholar] [CrossRef]
  5. World Economic Forum. Global Risks 2015 Report; World Economic Forum: Cologny, Switzerland, 2015. [Google Scholar]
  6. Anderson, M. Smart Cities Examples: The Smart, The Not-So-Smart, And the Ugly. 2022. Available online: https://onekeyresources.milwaukeetool.com/en/smart-cities (accessed on 15 May 2023).
  7. Liotine, M.; Ramaprasad, A.; Syn, T. Managing a Smart City’s Resilience to Ebola: An Ontological Framework. In Proceedings of the 49th Hawaii International Conference on System Sciences (HICSS), Washington, DC, USA, 5–8 January 2016; pp. 2935–2943. [Google Scholar]
  8. Hiremath, R.; Balachandra, P.; Kumar, B.; Bansode, S.; Murali, J. Indicator-based urban sustainability—A review. Energy Sustain. Dev. 2013, 17, 555–563. [Google Scholar] [CrossRef]
  9. Holden, M. Sustainability indicator systems within urban governance: Usability analysis of sustainability indicator systems as boundary objects. Ecol. Indic. 2013, 32, 89–96. [Google Scholar] [CrossRef]
  10. Albino, V.; Berardi, U.; Dangelico, R.M. Smart cities: Definitions, dimensions, performance, and initiatives. J. Urban Technol. 2015, 22, 3–21. [Google Scholar] [CrossRef]
  11. Dameri, R.P. Smart city implementation. In Creating Economic and Public Value in Innovative Urban Systems; Springer: Genoa, Italy, 2017. [Google Scholar] [CrossRef]
  12. Kourtit, K.; Nijkamp, P. Big data dashboards as smart decision support tools for i-cities–An experiment on stockholm. Land Use Policy 2018, 71, 24–35. [Google Scholar] [CrossRef]
  13. Mensah, J. Sustainable development: Meaning, history, principles, pillars, and implications for human action: Literature review. Cogent Soc. Sci. 2019, 5. [Google Scholar] [CrossRef]
  14. World Commission on Environment and Development (WCED). Our Common Future; Oxford University Press: Oxford, UK, 1987. [Google Scholar]
  15. United Nations. Report of the United Nations Conference on Environment and Development. 1992. Available online: https://digitallibrary.un.org/record/152955 (accessed on 15 May 2023).
  16. Huovila, A.; Bosch, P.; Airaksinen, M. Comparative analysis of standardized indicators for Smart sustainable cities: What indicators and standards to use and when? Cities 2019, 89, 141–153. [Google Scholar] [CrossRef]
  17. Barkemeyer, R.; Holt, D.; Preuss, L.; Tsang, S. What happened to the ‘development’ in sustainable development? Business guidelines two decades after Brundtland. Sustain. Dev. 2014, 22, 15–32. [Google Scholar] [CrossRef]
  18. The Welding Institute (TWI). What Is a Smart City?—Definition and Examples. 2023. Available online: https://www.twi-global.com/technical-knowledge/faqs/what-is-a-smart-city (accessed on 15 May 2023).
  19. Diller, E. Smart Cities: Art Museums and Urban Renewal; Scofidio + Renfro Architects LLP: New York, NY, USA, 1984. [Google Scholar]
  20. Global Data Thematic Intelligence. History of Smart Cities: Timeline 2020. 2020. Available online: https://www.verdict.co.uk/smart-cities-timeline (accessed on 15 May 2023).
  21. European Commission. City Initiatives—What Are Smart Cities? 2023. Available online: https://commission.europa.eu/eu-regional-and-urban-development/topics/cities-and-urban-development/city-initiatives/smart-cities_en (accessed on 15 May 2023).
  22. CISCO. What Is a Smart City? 2023. Available online: https://www.cisco.com/c/en/us/solutions/industries/smart-connected-communities/what-is-a-smart-city.html#:~:text=A%20smart%20city%20uses%20digital%20technology%20to%20connect%2C,constant%20feedback%20so%20they%20can%20make%20informed%20decisions (accessed on 15 May 2023).
  23. Eger, J.M. Smart growth, smart cities, and the crisis at the pump a worldwide phenomenon. I-WAYS-J. E-Gov. Policy Regul. 2009, 32, 47–53. [Google Scholar] [CrossRef]
  24. Washburn, D.; Sindhu, U.; Balaouras, S.; Dines, R.A.; Hayes, N.; Nelson, L.E. Helping CIOs Understand “Smart City” Initiatives, Defining the Smart City, its Drivers, and the role of the CIO. Growth 2009, 17, 1–17. [Google Scholar]
  25. Harrison, C.; Eckman, B.; Hamilton, R.; Hartswick, P.; Kalagnanam, J.; Paraszczak, J.; Williams, P. Foundations for smarter cities. IBM J. Res. Dev. 2010, 54, 1–16. [Google Scholar] [CrossRef]
  26. Batty, M.; Axhausen, K.W.; Giannotti, F.; Pozdnoukhov, A.; Bazzani, A.; Wachowicz, M.; Ouzounis, G.; Portugali, Y. Smart cities of the future. Eur. Phys. J. Spec. Top. 2012, 214, 481–518. [Google Scholar] [CrossRef]
  27. Lazaroiu, G.C.; Roscia, M. Definition methodology for the smart cities model. Energy 2012, 47, 326–332. [Google Scholar] [CrossRef]
  28. Lombardi, P.; Giordano, S.; Farouh, H.; Yousef, W. Modelling the smart city performance. Innov. Eur. J. Soc. Sci. Res. 2012, 25, 137–149. [Google Scholar] [CrossRef]
  29. Bakıcı, T.; Almirall, E.; Wareham, J. A smart city initiative: The case of Barcelona. J. Knowl. Econ. 2013, 4, 135–148. [Google Scholar] [CrossRef]
  30. Mulligan, C.E.; Olsson, M. Architectural implications of smart city business models: An evolutionary perspective. IEEE Commun. Mag. 2013, 51, 80–85. [Google Scholar] [CrossRef]
  31. Townsend, A.M. Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia; WW Norton & Company: New York, NY, USA, 2013. [Google Scholar]
  32. Shapiro, J.M. Smart cities: Quality of life, productivity, and the growth effects of human capital. Rev. Econ. Stat. 2006, 88, 324–335. [Google Scholar] [CrossRef]
  33. Herrschel, T. Competitiveness and sustainability: Can ‘smart city regionalism’square the circle? Urban Stud. 2013, 50, 2332–2348. [Google Scholar] [CrossRef]
  34. Lee, J.; Lee, H. Developing and validating a citizen-centric typology for smart city services. Gov. Inf. Q. 2014, 31, S93–S105. [Google Scholar] [CrossRef]
  35. Anthopoulos, L. Defining smart city architecture for sustainability. In Proceedings of the 14th Electronic Government and 7th Electronic Participation Conference (IFIP2015), Thessaloniki, Greece, 30 August–2 September 2015; pp. 140–147. [Google Scholar]
  36. Huston, S.; Rahimzad, R.; Parsa, A. Smart’sustainable urban regeneration: Institutions, quality and financial innovation. Cities 2015, 48, 66–75. [Google Scholar] [CrossRef]
  37. Hara, M.; Nagao, T.; Hannoe, S.; Nakamura, J. New key performance indicators for a smart sustainable city. Sustainability 2016, 8, 206. [Google Scholar] [CrossRef]
  38. Lee, J.; Kim, D.; Ryoo, H.-Y.; Shin, B.-S. Sustainable wearables: Wearable technology for enhancing the quality of human life. Sustainability 2016, 8, 466. [Google Scholar] [CrossRef]
  39. Ahvenniemi, H.; Huovila, A.; Pinto-Seppä, I.; Airaksinen, M. What are the differences between sustainable and smart cities? Cities 2017, 60, 234–245. [Google Scholar] [CrossRef]
  40. Caragliu, A.; Del Bo, C.; Nijkamp, P. Smart cities in Europe. J. Urban Technol. 2011, 18, 65–82. [Google Scholar] [CrossRef]
  41. Cocchia, A. Smart and digital city: A systematic literature review. In Smart City: How to Create Public and Economic Value with High Technology in Urban Space; Dameri, C.-S., Ed.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 13–44. [Google Scholar]
  42. Hamman, P.; Anquetin, V.; Monicolle, C. Contemporary meanings of the ‘Sustainable City’: A comparative review of the French and English language literature. Sustain. Dev. 2017, 25, 336–355. [Google Scholar] [CrossRef]
  43. Anthopoulos, L.; Fitsilis, P. Evolution roadmaps for smart cities: Determining viable paths. In Proceedings of the European Conference on e-Government (ECEG 2013), Como, Italy, 13–14 June 2013; pp. 27–36. [Google Scholar]
  44. Hao, J.; Zhu, J.; Zhong, R. The rise of big data on urban studies and planning practices in China: Review and open research issues. J. Urban Manag. 2015, 4, 92–124. [Google Scholar] [CrossRef]
  45. Samarajiva, R.; Lokanathan, S.; Madhawa, K.; Kreindler, G.; Maldeniya, D. Big data to improve urban planning. Econ. Political Wkly. 2015, 1, 42–48. [Google Scholar]
  46. Yigitcanlar, T.; Kamruzzaman, M.; Buys, L.; Ioppolo, G.; Sabatini-Marques, J.; da Costa, E.M.; Yun, J.J. Understanding ‘smart cities’: Intertwining development drivers with desired outcomes in a multidimensional framework. Cities 2018, 81, 145–160. [Google Scholar] [CrossRef]
  47. Hollands, R.G. Critical interventions into the corporate smart city. Camb. J. Reg. Econ. Soc. 2015, 8, 61–77. [Google Scholar] [CrossRef]
  48. Yigitcanlar, T.; Kamruzzaman, M. Does smart city policy lead to sustainability of cities? Land Use Policy 2018, 73, 49–58. [Google Scholar] [CrossRef]
  49. Patrão, C.; Moura, P.; de Almeida, A.T. Review of smart city assessment tools. Smart Cities. Smart Cities 2020, 3, 1117–1132. [Google Scholar] [CrossRef]
  50. Temenos, C. The sustainable development paradox: Urban political economy in the United States and Europe. J. Econ. Geogr. 2009, 9, 140–142. [Google Scholar] [CrossRef]
  51. Kambites, C.J. Sustainable development’: The ‘unsustainable’ development of a concept in political discourse. Sustain. Dev. 2014, 22, 336–348. [Google Scholar] [CrossRef]
  52. Höjer, M.; Wangel, J. Smart sustainable cities: Definition and challenges. In ICT Innovations for Sustainability; Springer International Publishing: Berlin/Heidelberg, Germany, 2015; pp. 333–349. [Google Scholar]
  53. Bibri, S.E.; Krogstie, J. Smart sustainable cities of the future: An extensive interdisciplinary literature review. Sustain. Cities Soc. 2017, 31, 183–212. [Google Scholar] [CrossRef]
  54. Elgazzar, R.F.; El-Gazzar, R. Smart cities, sustainable cities, or both: A Critical Review and Synthesis of Success and Failure Factors. In Proceedings of the 6th International Conference on Smart Cities and Green ICT Systems, Porto, Portugal, 22–24 April 2017. [Google Scholar]
  55. International Telecommunication Union (ITU). Measuring the Information Society Report 2016. 2016. Available online: https://www.itu.int/en/ITU-D/Statistics/Pages/publications/mis2016.aspx (accessed on 15 June 2023).
  56. Marsal-Llacuna, M.L. City indicators on social sustainability as standardization technologies for smarter (citizen-centered) governance of cities. Soc. Indic. Res. 2016, 128, 1193–1216. [Google Scholar] [CrossRef]
  57. Pandey, M.; Albert, S. A review of commonly used indicators of community sustainability. In Performance Metrics for Sustainable Cities; Routledge: London, UK, 2021; pp. 19–34. [Google Scholar]
  58. Giffinger, R.; Kramar, H. Benchmarking, profiling, and ranking of cities: The “European smart cities” approach. In Performance Metrics for Sustainable Cities; Routledge: London, UK, 2021; pp. 35–52. [Google Scholar]
  59. Komninos, N.; Mora, L. Exploring the big picture of smart city research. J. Reg. Sci. 2018, 17, 15–38. [Google Scholar]
  60. Sharifi, A. A critical review of selected smart city assessment tools and indicator sets. J. Clean. Prod. 2019, 233, 1269–1283. [Google Scholar] [CrossRef]
  61. Lombardi, P.; Giordano, S. Evaluating the Smart and Sustainable Built Environment in Urban Planning. In Smart Cities and Smart Spaces: Concepts, Methodologies, Tools, and Applications; IGI Global: Hershey, PA, USA, 2019; pp. 1217–1232. [Google Scholar] [CrossRef]
  62. Stratigea, A.; Leka, A.; Panagiotopoulou, M. In search of indicators for assessing smart and sustainable cities and communities’ performance. In Smart Cities and Smart Spaces: Concepts, Methodologies, Tools, and Applications; IGI Global: Hershey, PA, USA, 2019; pp. 265–295. [Google Scholar] [CrossRef]
  63. Sharifi, A.; Murayama, A. A critical review of seven selected neighborhood sustainability assessment tools. Environ. Impact Assess. Rev. 2013, 38, 73–87. [Google Scholar] [CrossRef]
  64. Petrova-Antonova, D.; Ilieva, S. Smart cities evaluation—A survey of performance and sustainability indicators. In Proceedings of the 44th Euromicro Conference on Software Engineering and Advanced Applications (SEAA), Prague, Czech Republic, 29–31 August 2018; pp. 486–493. [Google Scholar]
  65. Caird, S. City approaches to smart city evaluation and reporting: Case studies in the United Kingdom. Urban Res. Pract. 2018, 11, 159–179. [Google Scholar] [CrossRef]
  66. Anand, A.; Rufuss, D.D.W.; Rajkumar, V.; Suganthi, L. Evaluation of sustainability indicators in smart cities for India using MCDM approach. Energy Proc. 2017, 141, 211–215. [Google Scholar] [CrossRef]
  67. Phillis, Y.A.; Kouikoglou, V.S.; Verdugo, C. Urban sustainability assessment and ranking of cities. Comput. Environ. Urban Syst. 2017, 64, 254–265. [Google Scholar] [CrossRef]
  68. Ichikawa, H.; Yamato, N.; Dustan, P. Competitiveness of global cities from the perspective of the global power city index. Procedia Eng. 2017, 198, 736–742. [Google Scholar] [CrossRef]
  69. Purnomo, F.; Prabowo, H. Smart city indicators: A systematic literature review. J. Telecommun. Electron. Comput. Eng. 2016, 8, 161–164. [Google Scholar]
  70. Ameen, R.F.M.; Mourshed, M.; Li, H. A critical review of environmental assessment tools for sustainable urban design. Environ. Impact Assess. Rev. 2015, 55, 110–125. [Google Scholar] [CrossRef]
  71. Castanheira, G.; Bragança, L. The evolution of the sustainability assessment tool: From buildings to the built environment. Sci. World J. 2014, 2014, 491791. [Google Scholar] [CrossRef]
  72. Laurini, R. Geographic Knowledge Infrastructure: Applications to Territorial Intelligence and Smart Cities; ISTE-Elsevier: London, UK, 2017. [Google Scholar]
  73. Meijer, A.; Bolívar, M.P. Governing the smart city: A review of the literature on smart urban governance. Int. Rev. Adm. Sci. 2016, 82, 392–408. [Google Scholar] [CrossRef]
  74. Schaffers, H.; Komninos, N.; Pallot, M.; Aguas, M.; Almirall, E. Smart Cities as Innovation Ecosystems Sustained by Future Internet. Technical Report. 2012. Available online: https://www.diva-portal.org/smash/get/diva2:1241231/FULLTEXT01.pdf (accessed on 15 May 2023).
  75. Nam, T.; Prado, T.A. Conceptualizing smart city with dimensions of technology, people, and institutions. In Proceedings of the 12th Annual International Digital Government Research Conference: Digital Government Innovation in Challenging Times, College Park, TX, USA, 12–15 June 2011; pp. 282–291. [Google Scholar]
  76. United Nations Economic Commission for Europe (UNECE). Collection Methodology for Key Performance Indicators for Smart Sustainable Cities, (U4SSC) initiative. 2017. Available online: https://unece.org/DAM/hlm/documents/Publications/U4SSC-CollectionMethodologyforKPIfoSSC-2017.pdf (accessed on 15 July 2023).
  77. Guelzim, T.; Obaidat, M.S.; Sadoun, B. Introduction and overview of key enabling technologies for smart cities and homes. In Smart Cities and Homes; Morgan Kaufmann: Cambridge, MA, USA, 2016; pp. 1–16. [Google Scholar]
  78. Skvarciany, V.; Jurevičienė, D.; Žitkienė, R.; Lapinskaitė, I.; Dudė, U. A different approach to the evaluation of smart cities’ indicators. TalTech J. Eur. Stud. 2021, 11, 130–147. [Google Scholar] [CrossRef]
  79. Abba, A.H.; Noor, Z.Z.; Yusuf, R.O.; Din, M.F.M.; Abu Hassan, M.A. Assessing environmental impacts of municipal solid waste of Johor by analytical hierarchy process. Resour. Conserv. Recycl. 2013, 73, 188–196. [Google Scholar] [CrossRef]
  80. Saaty, T.L.; Kearns, K.P. Analytical Planning: The Organization of System; Elsevier: Amsterdam, The Netherlands; Pergamon Press: New York, NY, USA, 2014; Volume 7. [Google Scholar]
  81. Darko, A.; Chan, A.P.; Ameyaw, E.E.; Owusu, E.K.; Pärn, E.; Edwards, D.J. Review of application of analytic hierarchy process (AHP) in construction. Int. J. Constr. Manag. 2019, 19, 436–452. [Google Scholar] [CrossRef]
  82. Ishizaka, A.; Labib, A. Review of the main developments in the analytic hierarchy process. Expert Syst. Appl. 2011, 38, 14336–14345. [Google Scholar] [CrossRef]
  83. Tavares, R.M.; Tavares, J.L.; Parry-Jones, S. The use of a mathematical multicriteria decision-making model for selecting the fire origin room. Build. Environ. 2008, 43, 2090–2100. [Google Scholar] [CrossRef]
  84. Saaty, T.L.; Özdemir, M.S. How many judges should there be in a group? Ann. Data Sci. 2014, 1, 359–368. [Google Scholar] [CrossRef]
  85. Dano, U.L. An AHP-based assessment of flood triggering factors to enhance resiliency in Dammam, Saudi Arabia. GeoJournal 2022, 87, 1945–1960. [Google Scholar] [CrossRef]
  86. Gigović, L.; Pamučar, D.; Bajić, Z.; Drobnjak, S. Application of GIS-interval rough AHP methodology for flood hazard mapping in urban areas. Water 2017, 9, 360. [Google Scholar] [CrossRef]
  87. Dahri, N.; Abida, H. Monte Carlo simulation-aided analytical hierarchy process (AHP) for flood susceptibility mapping in Gabes Basin (southeastern Tunisia). Environ. Earth Sci. 2017, 76, 302. [Google Scholar] [CrossRef]
  88. Danumah, J.H.; Odai, S.N.; Saley, B.M.; Szarzynski, J.; Thiel, M.; Kwaku, A.; Kouame, F.K.; Akpa, L.Y. Flood risk assessment and mapping in Abidjan district using multi-criteria analysis (AHP) model and geoinformation techniques (Cote d’Ivoire). Geo-Environ. Disasters 2016, 3, 10. [Google Scholar] [CrossRef]
  89. Papaioannou, G.; Vasiliades, L.; Loukas, A. Multicriteria analysis framework for potential flood prone areas mapping. Water Resour. Manag. 2015, 29, 399–418. [Google Scholar] [CrossRef]
  90. Abbas, H.B.; Routray, J.K. Assessing factors affecting flood-induced public health risks in Kassala State of Sudan. Oper. Res. Health Care 2014, 3, 215–225. [Google Scholar] [CrossRef]
  91. Ouma, Y.; Tateishi, R. Urban flood vulnerability and risk mapping using integrated multi-parametric AHP and GIS: Methodological overview and case study assessment. Water 2014, 6, 1515–1545. [Google Scholar] [CrossRef]
  92. Saaty, R. Decision Making in Complex Environment: The Analytic Hierarchy Process (AHP) for Decision Making and the Analytic Network Process (ANP) for Decision Making with Dependence and Feedback; Super Decisions: Pittsburgh, PA, USA, 2003. [Google Scholar]
  93. Balogun, A.; Matori, A.; Hamid-Mosaku, A.I.; Dano, U.L.; Chandio, I.A. Fuzzy MCDM-based GIS model for subsea oil pipeline route optimization: An integrated approach. Mar. Georesources Geotechnol. 2017, 35, 961–969. [Google Scholar] [CrossRef]
  94. Şener, Ş.; Şener, E.; Nas, B.; Karagüzel, R. Combining AHP with GIS for landfill site selection: A case study in the Lake Beyşehir catchment area (Konya, Turkey). Waste Manag. 2010, 30, 2037–2046. [Google Scholar] [CrossRef] [PubMed]
  95. Forman, E.; Peniwati, K. Aggregating individual judgments and priorities with the analytic hierarchy process. Eur. J. Oper. Res. 1998, 108, 165–169. [Google Scholar] [CrossRef]
  96. Saaty, T.L.; Wong, M.M. Projecting average family size in rural India by the analytic hierarchy process. J. Math. Sociol. 1983, 9, 181–209. [Google Scholar] [CrossRef] [PubMed]
  97. Boucher, T.O.; MacStravic, E.L. Multiattribute evaluation within a present value framework and its relation to the analytic hierarchy process. Eng. Econ. 1991, 37, 1–32. [Google Scholar] [CrossRef]
  98. Muralidharan, C.; Anantharaman, N.; Deshmukh, S.G. Confidence interval approach to consistency ratio rule in the applications of analytic hierarchy process. West Indian J. Eng. 2003, 26, 17–28. [Google Scholar]
  99. Willett, K.; Sharda, R. Using the analytic hierarchy process in water resources planning: Selection of flood control projects. Socio-Econ. Plan. Sci. 1991, 25, 103–112. [Google Scholar] [CrossRef]
  100. Levy, J.K. Multiple criteria decision making and decision support systems for flood risk management. Stoch. Environ. Res. Risk Assess. 2005, 19, 438–447. [Google Scholar] [CrossRef]
  101. Kuenz Murphy, C. Limits on the analytic hierarchy process from its consistency index. Eur. J. Oper. Res. 1993, 65, 138–139. [Google Scholar] [CrossRef]
  102. Olson, D.L.; Venkataramanan, M.; Mote, J.L. A technique using analytical hierarchy process in multiobjective planning models. Socio-Econ. Plan. Sci. 1986, 20, 361–368. [Google Scholar] [CrossRef]
  103. Dadkhah, K.M.; Zahedi, F. A mathematical treatment of inconsistency in the analytic hierarchy process. Math. Comput. Model. 1993, 17, 111–122. [Google Scholar] [CrossRef]
  104. Saaty, T.L. A scaling method for priorities in hierarchical structures. J. Math. Psychol. 1977, 15, 234–281. [Google Scholar] [CrossRef]
  105. Saaty, T.L. Rank generation, preservation, and reversal in the analytic hierarchy decision process. Decis. Sci. 1987, 18, 157–177. [Google Scholar] [CrossRef]
  106. Saaty, T.L. What Is the Analytic Hierarchy Process? Springer: Berlin/Heidelberg, Germany, 1988. [Google Scholar]
  107. Saaty, T.L.; Vargas, L.G. Inconsistency and rank preservation. J. Math. Psychol. 1984, 28, 205–214. [Google Scholar] [CrossRef]
  108. Boucher, T.O.; Luxhoj, J.T.; Descovich, T.; Litman, N. Multicriteria evaluation of automated filling systems: A case study. J. Manuf. Syst. 1993, 12, 357–378. [Google Scholar] [CrossRef]
  109. Islei, G.; Lockett, A.G. Judgemental modelling based on geometric least square. Eur. J. Oper. Res. 1988, 36, 27–35. [Google Scholar] [CrossRef]
  110. Karapetrovic, S.; Rosenbloom, E.S. A quality control approach to consistency paradoxes in AHP. Eur. J. Oper. Res. 1999, 119, 704–718. [Google Scholar] [CrossRef]
  111. CSER (Center for Strategic Economic Research). 2013 Prosperity Index, Measuring the Sacramento Region’s Competitive Position, 9th ed.; CSER: Sacramento, CA, USA, 2013. [Google Scholar]
  112. Gazzeh, K.; Abubakar, I.R. Regional disparity in access to basic public services in Saudi Arabia: A sustainability challenge. Util. Policy 2018, 52, 70–80. [Google Scholar] [CrossRef]
  113. Sharifi, A. A global dataset on tools, frameworks, and indicator sets for smart city assessment. Data Brief 2020, 29, 105364. [Google Scholar] [CrossRef]
  114. IESE Business School. Cities in Motion Index. 2015. Available online: https://media.iese.edu/research/pdfs/ST-0366-E.pdf (accessed on 15 April 2023).
  115. Söderström, O.; Paasche, T.; Klauser, F. Smart cities as corporate storytelling. City 2014, 18, 307–320. [Google Scholar] [CrossRef]
  116. Coe, A.; Paquet, G.; Roy, J. E-governance and smart communities: A social learning challenge. Soc. Sci. Comput. Rev. 2001, 19, 80–93. [Google Scholar] [CrossRef]
  117. Kitchin, R. The real-time city? Big data and smart urbanism. GeoJournal 2014, 79, 1–14. [Google Scholar] [CrossRef]
  118. Suma, S.; Mehmood, R.; Albeshri, A. Automatic event detection in smart cities using big data analytics. In Smart Societies, Infrastructure, Technologies and Applications: First International Conference, SCITA; Springer International Publishing: Jeddah, Saudi Arabia, 2018; pp. 111–122. [Google Scholar]
  119. Moss Kanter, R.; Litow, S.S. Informed and Interconnected: A Manifesto for Smarter Cities; General Management Unit Working Paper, 09-141; Harvard Business School: Boston, MA, USA, 2009. [Google Scholar]
  120. Rios, P. Creating “The Smart City” (Doctoral Dissertation). 2012. Available online: https://archive.udmercy.edu/handle/10429/393 (accessed on 15 July 2023).
  121. Caragliu, A.; Del Bo, C.; Nijkamp, P. Smart cities in Europe. In Proceedings of the 3rd Central European Conference in Regional Science, Kosice, Slovak Republic, 7–9 October 2009; pp. 7–9. [Google Scholar]
  122. Allam, Z.; Newman, P. Redefining the smart city: Culture, metabolism and governance. Smart Cities 2018, 1, 4–25. [Google Scholar] [CrossRef]
  123. Anthopoulos, L. Smart utopia VS smart reality: Learning by experience from 10 smart city cases. Cities 2017, 63, 128–148. [Google Scholar] [CrossRef]
  124. Ibrahim, M.; El-Zaart, A.; Adams, C. Smart sustainable cities roadmap: Readiness for transformation towards urban sustainability. Sustain. Cities Soc. 2018, 37, 530–540. [Google Scholar] [CrossRef]
  125. Lee, J.H.; Hancock, M.G.; Hu, M.C. Towards an effective framework for building smart cities: Lessons from Seoul and San Francisco. Technol. Forecast. Soc. Change 2014, 89, 80–99. [Google Scholar] [CrossRef]
  126. Khan, S.; Bele, A. Transforming lifestyles and evolving housing patterns: A comparative case study. Open House Int. 2016, 41, 76–86. [Google Scholar] [CrossRef]
Figure 1. Methodology flow chart for ranking SSC indicators.
Figure 2. Combination, renaming, and harmonization of indicators/dimensions: Example of transformation of the list from Skvarciany et al. [78].
Figure 3. Combination, renaming, and harmonization of indicators/dimensions: Example of transformation of the list from Huovila et al. [16].
Figure 4. Most frequently used SSC dimensions (harmonized list) from selected relevant literature published after 2015.
Figure 5. AHP respondents' profile analysis.
Figure 6. Final combined ranking and proposed list of indicators under the selected set of SSC dimensions. © List of indicators compiled by the author based on [16,58,60,61,64,66,69,76,77,78,113,114].
Table 1. Examples of the number of respondents in AHP surveys from recent studies.
No. | Study | Number of Experts (Respondents)
1 | [85] | 18
2 | [78] | 7
3 | [86] | 10
4 | [87] | 8
5 | [88] | 9
6 | [89] | 9
7 | [90] | 10
8 | [91] | 16
Max = 18; Average = 11; Min = 7.
Table 2. Scale of relative importance in AHP analysis [92].
Scale of Importance | Definition | Interpretation
1 | The same importance | Two factors producing the same input to the goal
3 | Having little importance | Somewhat important over its compared factor
5 | Having more importance | Strongly important
7 | Having very high importance | Very strong importance
9 | Extremely important | Extreme significance
2, 4, 6 and 8 | Middle values | The middle values are used to compare two neighboring judgments whenever required
Reciprocals | | If (v) is the decision value when (i) is compared with (j), then 1/v is the decision value when (j) is compared with (i)
Table 3. Number of pairs according to the rule "n choose 2" ("at least n" statements).
Number of Statements (Dimensions/Themes) | Number of Pairs | Number of Statements (Dimensions/Themes) | Number of Pairs
1 | 0 | 9 | 36
2 | 1 | 10 | 45
3 | 3 | 11 | 55
4 | 6 | 12 | 66
5 | 10 | 13 | 78
6 | 15 | 14 | 91
7 | 21 | 15 | 105
8 | 28 | 16 | 120
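As a quick check of the values in Table 3, the number of pairwise comparisons for n statements is the binomial coefficient C(n, 2) = n(n − 1)/2; the tiny sketch below (illustrative only, with a hypothetical helper name) reproduces the table.

```python
# Number of pairwise comparisons required for n statements: C(n, 2) = n*(n-1)/2.
def number_of_pairs(n: int) -> int:
    return n * (n - 1) // 2

# Reproduces Table 3, e.g. 8 statements -> 28 pairs, 16 statements -> 120 pairs.
print([(n, number_of_pairs(n)) for n in range(1, 17)])
```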
Table 4. Random Consistency Index (RI) values for different matrix sizes in AHP analysis [80].
Matrix Size (Number of Items) | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10
RI | 0 | 0 | 0.58 | 0.90 | 1.12 | 1.24 | 1.32 | 1.41 | 1.45 | 1.49
Table 5. Summary of relevant literature on sustainability indicators for smart cities.
No. | Study | No. of Themes | No. of Indicators | Comments
1 | [16] | 6 | 413 | Analysis of a set of 16 Smart City Assessment tools. Investigation of two ISO * standards, one ETSI ** standard, three ITU *** standards, and the SDG 11 monitoring framework.
2 | [58] | 6 | 29 |
3 | [60,113] | 7 | 44 | Derived from the investigation of 34 sets of indicators. Although Sharifi (2019) ended up with seven themes and 44 indicators, these studies examined 80 themes and 902 indicators with a global geographic focus.
4 | [61] | 10 | 38 | Based on the UN-HABITAT's Global Urban Indicators Database.
5 | [64] | 6 | 35 |
6 | [66] | 7 | 20 |
7 | [69] | 6 | 18 | Derived from an investigation of 30 related studies.
8 | [76] | 3 | 91 | Based on the United for Smart Sustainable Cities (U4SSC) initiative.
9 | [77] | 13 | 13 |
10 | [78] | 5 | 10 |
11 | [114] | 10 | 57 | Cities in Motion Index (CMI).
Total | | 79 | 768 |
* International Organization for Standardization. ** European Telecommunications Standards Institute, a European Standards Organization (ESO). *** International Telecommunication Union.
Table 6. Weight of SSC dimensions (harmonized list) in selected relevant studies published after 2015 (studies considered: [16], [58], [60,113], [61], [64], [66], [69], [76], [77], [78], [114]).
Dimension | Living | Environment | Economy & Productivity | Governance | Infrastructure | People & Society | Technology & ICT | Mobility & Transportation | Urban Planning | Nature
Total Weight | 35 | 23 | 19 | 15 | 14 | 13 | 13 | 12 | 5 | 1
Table 7. AHP PCM.
 | Living | Environment | Economy & Productivity | Governance | Mobility & Transportation | People & Society | Infrastructure | Technology & ICT
Living | 1.000 | 1.275 | 1.417 | 1.172 | 1.094 | 0.866 | 0.871 | 1.215
Environment | 0.784 | 1.000 | 1.440 | 1.495 | 1.760 | 0.364 | 0.867 | 1.551
Economy & Productivity | 0.706 | 0.695 | 1.000 | 1.129 | 1.179 | 0.407 | 0.728 | 1.592
Governance | 0.854 | 0.669 | 0.885 | 1.000 | 1.094 | 0.831 | 1.229 | 1.223
Mobility & Transportation | 0.914 | 0.568 | 0.848 | 0.914 | 1.000 | 0.422 | 0.687 | 1.388
People & Society | 1.154 | 2.744 | 2.458 | 1.204 | 2.369 | 1.000 | 1.249 | 1.511
Infrastructure | 1.148 | 1.153 | 1.374 | 0.814 | 1.455 | 0.801 | 1.000 | 1.584
Technology & ICT | 0.823 | 0.645 | 0.628 | 0.818 | 0.721 | 0.662 | 0.631 | 1.000
Table 8. Generating the priority vector of the aggregated AHP experts' judgment.
Pair-wise Comparison Matrix | Priority Vector (GeoM)
1.000 1.275 1.417 1.172 1.094 0.866 0.871 1.215 | 1.099
0.784 1.000 1.440 1.495 1.760 0.364 0.867 1.551 | 1.048
0.706 0.695 1.000 1.129 1.179 0.407 0.728 1.592 | 0.863
0.854 0.669 0.885 1.000 1.094 0.831 1.229 1.223 | 0.955
0.914 0.568 0.848 0.914 1.000 0.422 0.687 1.388 | 0.797
1.154 2.744 2.458 1.204 2.369 1.000 1.249 1.511 | 1.595
1.148 1.153 1.374 0.814 1.455 0.801 1.000 1.584 | 1.134
0.823 0.645 0.628 0.818 0.721 0.662 0.631 1.000 | 0.732
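The sketch below is an illustrative reconstruction (not the author's code) of how the GeoM column of Table 8 and the normalized priority vector used in Table 9 can be obtained from the pairwise comparison matrix of Table 7: each entry is the geometric mean of the corresponding matrix row, and normalizing these means to sum to 1 yields the priority weights.

```python
import math

# Pairwise comparison matrix from Table 7 (row/column order: Living, Environment,
# Economy & Productivity, Governance, Mobility & Transportation, People & Society,
# Infrastructure, Technology & ICT).
pcm = [
    [1.000, 1.275, 1.417, 1.172, 1.094, 0.866, 0.871, 1.215],
    [0.784, 1.000, 1.440, 1.495, 1.760, 0.364, 0.867, 1.551],
    [0.706, 0.695, 1.000, 1.129, 1.179, 0.407, 0.728, 1.592],
    [0.854, 0.669, 0.885, 1.000, 1.094, 0.831, 1.229, 1.223],
    [0.914, 0.568, 0.848, 0.914, 1.000, 0.422, 0.687, 1.388],
    [1.154, 2.744, 2.458, 1.204, 2.369, 1.000, 1.249, 1.511],
    [1.148, 1.153, 1.374, 0.814, 1.455, 0.801, 1.000, 1.584],
    [0.823, 0.645, 0.628, 0.818, 0.721, 0.662, 0.631, 1.000],
]

# Row geometric means (the "GeoM" column of Table 8).
geo_means = [math.prod(row) ** (1 / len(row)) for row in pcm]

# Normalized priority vector (the priority column of Table 9).
total = sum(geo_means)
priorities = [g / total for g in geo_means]

print([round(g, 3) for g in geo_means])   # ~[1.099, 1.048, 0.863, 0.955, 0.797, 1.595, 1.134, 0.732]
print([round(p, 3) for p in priorities])  # ~[0.134, 0.127, 0.105, 0.116, 0.097, 0.194, 0.138, 0.089]
```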
Table 9. Generating the Eigenvector from the PCM and the priority vector.
Pair-wise Comparison Matrix | × Normalized Priority Vector | = Weighted Sum Vector | Eigenvalue (λ)
1.000 1.275 1.417 1.172 1.094 0.866 0.871 1.215 | 0.134 | 1.083 | 8.106
0.784 1.000 1.440 1.495 1.760 0.364 0.867 1.551 | 0.127 | 1.056 | 8.283
0.706 0.695 1.000 1.129 1.179 0.407 0.728 1.592 | 0.105 | 0.854 | 8.137
0.854 0.669 0.885 1.000 1.094 0.831 1.229 1.223 | 0.116 | 0.954 | 8.214
0.914 0.568 0.848 0.914 1.000 0.422 0.687 1.388 | 0.097 | 0.787 | 8.121
1.154 2.744 2.458 1.204 2.369 1.000 1.249 1.511 | 0.194 | 1.632 | 8.413
1.148 1.153 1.374 0.814 1.455 0.801 1.000 1.584 | 0.138 | 1.114 | 8.080
0.823 0.645 0.628 0.818 0.721 0.662 0.631 1.000 | 0.089 | 0.727 | 8.174
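Under the standard AHP consistency procedure (a sketch of the usual formulas, not necessarily the author's exact computation), the values in the last column of Table 9 are the ratios of the weighted sums to the priorities; their average estimates λmax, from which the Consistency Index CI = (λmax - n)/(n - 1) and the Consistency Ratio CR = CI/RI follow, using the RI value for n = 8 from Table 4. With the figures shown above, λmax is roughly 8.19 and CR roughly 0.02, well below the usual 0.10 acceptance threshold.

```python
# Illustrative consistency check for the matrix in Tables 7-9 (standard AHP formulas;
# variable names are this sketch's, not the author's).
lambdas = [8.106, 8.283, 8.137, 8.214, 8.121, 8.413, 8.080, 8.174]  # last column of Table 9
n = 8
ri = 1.41  # Random Consistency Index for n = 8 (Table 4)

lambda_max = sum(lambdas) / n        # ~8.19
ci = (lambda_max - n) / (n - 1)      # Consistency Index, ~0.027
cr = ci / ri                         # Consistency Ratio, ~0.019

print(f"lambda_max = {lambda_max:.3f}, CI = {ci:.3f}, CR = {cr:.3f}")
assert cr < 0.10  # judgments are acceptably consistent by the usual threshold
```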
Table 10. Co-occurrence and AHP ranking comparison of the selected SSC dimensions.
Phase 1: Content Analysis (Co-Occurrence)
Dimension | Co-Occurrence * | Score (%) | Rank
Living | 0.2431 | 24.31 | 1
Environment | 0.1597 | 15.97 | 2
Economy & Productivity | 0.1319 | 13.19 | 3
Governance | 0.1042 | 10.42 | 4
Infrastructure | 0.0972 | 9.72 | 5
Technology & ICT | 0.0903 | 9.03 | 6
People & Society | 0.0903 | 9.03 | 6
Mobility & Transportation | 0.0833 | 8.33 | 7
Total | 1.000 | 100.00 |
Phase 2: AHP Analysis
Dimension | Priority | Score (%) | Rank
People & Society | 0.194 | 19.40 | 1
Infrastructure | 0.138 | 13.79 | 2
Living | 0.134 | 13.36 | 3
Environment | 0.127 | 12.75 | 4
Governance | 0.116 | 11.61 | 5
Economy & Productivity | 0.105 | 10.50 | 6
Mobility & Transportation | 0.097 | 9.69 | 7
Technology & ICT | 0.089 | 8.90 | 8
Total | 1.000 | 100.00 |
* Normalized score.
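The Phase 1 scores in Table 10 can be reproduced by normalizing the total weights from Table 6 over the eight retained dimensions, i.e., dividing each dimension's frequency weight by the sum of those eight weights. The sketch below illustrates this under that assumption; it is not the author's code.

```python
# Total weights from Table 6 for the eight retained dimensions.
weights = {
    "Living": 35, "Environment": 23, "Economy & Productivity": 19,
    "Governance": 15, "Infrastructure": 14, "Technology & ICT": 13,
    "People & Society": 13, "Mobility & Transportation": 12,
}

total = sum(weights.values())  # 144
co_occurrence = {dim: w / total for dim, w in weights.items()}

# Reproduces Table 10, Phase 1: e.g. Living -> 0.2431 (24.31%), Mobility -> 0.0833 (8.33%).
for dim, score in sorted(co_occurrence.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{dim}: {score:.4f} ({100 * score:.2f}%)")
```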
Table 11. Combined Ranking of SSC dimensions from co-occurrence and AHP rankings.
SSC Dimensions | Co-Occurrence Study: Score (%) | Co-Occurrence Study: Local Score Si (out of 10) | AHP Study: Score (%) | AHP Study: Local Score Si (out of 10) | Combined Score | Final Combined Ranking
Living | 24.31 | 10.00 (BPD) | 13.36 | 6.89 | 8.44 | 1
Environment | 15.97 | 6.57 | 12.75 | 6.57 | 6.57 | 3
Economy & Productivity | 13.19 | 5.43 | 10.50 | 5.41 | 5.42 | 5
Governance | 10.42 | 4.29 | 11.61 | 5.98 | 5.14 | 6
Infrastructure | 9.72 | 4.00 | 13.79 | 7.11 | 5.55 | 4
Technology & ICT | 9.03 | 3.71 | 8.90 | 4.59 | 4.15 | 8
People & Society | 9.03 | 3.71 | 19.40 | 10.00 (BPD) | 6.86 | 2
Mobility & Transportation | 8.33 | 3.43 | 9.69 | 4.99 | 4.21 | 7