Article

The Relationships between Organizational Factors and Systems Engineering Process Performance in Launching Space Vehicles

1
National Aeronautics and Space Administration’s Launch Services Program at Kennedy Space Center, Kennedy Space Center, FL 32899, USA
2
Department of Industrial Engineering and Management Systems, University of Central Florida, Orlando, FL 32816, USA
3
Kern Technology Group, Virginia Beach, VA 23462, USA
*
Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(22), 11541; https://doi.org/10.3390/app122211541
Submission received: 13 October 2022 / Revised: 7 November 2022 / Accepted: 11 November 2022 / Published: 14 November 2022
(This article belongs to the Section Aerospace Science and Engineering)

Abstract

The launch vehicle industry has long been considered a pioneering industry in systems engineering. Launch vehicles are large complex systems that require a methodical multi-disciplinary approach to design, build, and launch. Launch vehicles are used to deliver payloads—such as humans, robotic science missions, or national security payloads—to desired locations in space. Previous research has identified deficient or underperforming systems engineering as a leading contributor to launch vehicle failures. Launch vehicle failures can negatively affect national security, the economy, science, and society, thus highlighting the importance of understanding the factors that influence systems engineering in launch vehicle organizations in the United States. The purpose of this study was to identify and evaluate the relationships between organizational factors and systems engineering process performance. Structural equation modeling was used to develop a model of the relationships of these factors and test hypotheses. The results showed that organizational commitment, top management support, the perceived value of systems engineering, and systems engineering support significantly influence systems engineering process performance in the launch vehicle industry. Implications of this study for improving the performance of systems engineering in launch vehicle organizations are discussed.

1. Introduction

A launch vehicle is any rocket-propelled system used to deliver a payload to a desired location in space. Launch vehicles are used to transport spacecraft such as communications satellites, deep space robotic missions, and crewed space capsules for both private and government organizations. The National Aeronautics and Space Administration (NASA) and United States Space Force (USSF) are the major governmental users of launch vehicles in the United States (US). Many private telecommunications companies use launch vehicles to deliver satellites into orbit. Most private companies and governmental organizations do not own or operate the launch vehicles that they use. They instead outsource the development and operation of launch vehicles to private companies to deliver their spacecraft to space [1,2]. Spacecraft missions include communication, television services, science discovery, national security, and transporting humans. Satellite costs can range from several million dollars to several billion dollars [3]. Given the importance and costs of these satellites, the loss of satellites can have significant economic, scientific, societal, and national security impacts [4,5].
Spacecraft are most often lost or destroyed because of launch vehicle failures, thus indicating the need for the US Government and private satellite owners to minimize the risk of launch vehicle failure. The risk of launch failure is also a major concern for launch vehicle service providers, because a launch failure could detrimentally affect profits. Launch failures can also significantly affect the launch vehicle’s economic viability [4]. A study by J. Steven Newman, in 2001, examined 50 space flight failures and found that inadequate systems engineering contributed to all of them [6]. Newman’s study highlights the need to identify the factors affecting systems engineering in launch vehicle organizations.
Systems engineering is widely defined as a methodical interdisciplinary approach to design, build, operate, and manage complex systems [7,8,9,10,11]. In the context of launch vehicles, systems engineering is an interdisciplinary approach to design, manufacture, integrate, test, manage, and launch vehicles. Recognizing the criticality of systems engineering in the launch vehicle industry, in 2012, NASA added a systems engineering evaluation element to its launch vehicle certification policy. NASA’s launch vehicle certification policy is a plan to thoroughly evaluate the risk associated with a given commercial launch vehicle before trusting the launch vehicle to deliver a NASA spacecraft into orbit [12]. The US Space Force, another major user of commercial launch vehicles, follows a similar policy of evaluating the systems engineering of a launch vehicle provider as part of the launch vehicle certification [13].
Despite the substantial recognition that systems engineering is critical to the success of a launch vehicle, little research has examined the factors affecting systems engineering in launch vehicle organizations. Many launch vehicle failure studies have focused on the direct and proximate causes of failure, whereas little attention has been paid to systems engineering, including the underlying factors affecting systems engineering, in launch vehicle organizations. This study focused on filling that knowledge gap by identifying the underlying factors affecting systems engineering in US launch vehicle organizations. The next section of this paper presents a review of the literature on systems engineering and launch vehicle failures. The subsequent section presents a review of the methods. Analysis of the data collected from a survey is presented in the fourth section of this paper. The conclusions drawn from the data analysis are shown in the final section.

2. Literature Review

2.1. Systems Engineering and Launch Vehicle Failures

A large body of literature has been published on launch vehicle failures; however, very little of it has focused on how systems engineering has influenced these failures. A study by J. Steven Newman at NASA examined 50 spaceflight failures and identified systems engineering as a contributor to all of them. In Newman’s study, 41 of the 50 spaceflight failures were launch vehicle failures [6]. A second study, conducted by NASA in 2005, focused on engineering lessons learned and systems engineering applications in space systems [14]. The results of that study confirmed many of Newman’s findings. Other launch vehicle failure studies by Chang [15], Harland and Lorenz [16], Isakowitz and Hopkins [17], and Leung [18] did not specifically identify systems engineering as contributing to failures; however, the identified causes of failure were associated with systems engineering. Although several studies have identified systems engineering or elements of systems engineering as contributors to launch vehicle failures, studies identifying the factors that influence systems engineering performance in launch vehicle organizations are scarce.

2.2. Value of Systems Engineering

The value of systems engineering has been a matter of much debate, because numerous complex systems were developed before the systems engineering discipline was established. Over the years, many studies have attempted to identify the value or effectiveness of systems engineering. One of the earliest studies was completed by Werner Gruhl at NASA. The study examined 32 NASA projects and aimed to identify the relationship between the investment in systems engineering and cost overruns. Projects that invested more in systems engineering were found to have smaller cost overruns, whereas projects that invested less in systems engineering had larger cost overruns [19,20]. In another study surveying many prominent engineering firms worldwide, including Boeing, Northrop Grumman, Lockheed Martin, Airbus, and Ford Motor Company, most survey respondents indicated that project costs were reduced when systems engineering was applied [21].
In the late 1990s, Boeing performed a study of three similar projects that they were conducting. The first project was to be completed without the use of systems engineering, whereas the other two used systems engineering. All three projects started simultaneously, and the two projects that used systems engineering were completed in half the time of the project that did not use systems engineering [20,22]. A similar study on automobile prototypes by Kamal Malek has shown that the development of prototypes is completed much faster when a systems engineering approach is used [23].
Eric Honour has performed extensive studies on the value of systems engineering, showing that both the presence and the quality of systems engineering affect development quality [20]. Honour performed a second study on the return on investment of eight systems engineering activities: mission definition, requirements engineering, systems architecting, system implementation, technical analysis, technical management, scope management, and verification and validation. The results indicated that each activity significantly influenced project success [24].

2.3. Project Success Factors

Numerous studies have examined the critical success factors for projects. Project critical success factors are elements that lead to meeting cost, schedule, quality, and customer satisfaction objectives; the presence of these factors improves the likelihood of a project’s success [25,26,27,28,29,30,31]. An extensive literature review revealed that some of the most frequently cited critical success factors are support from top management, a current and well-thought-out plan, good communication, adequate staff, competent leadership, sufficient and well-allocated resources, effective control and monitoring, and an adequate organizational structure and culture [26].

2.4. Key Systems Engineering Best Practices

Systems engineering best practices often result from lessons learned in numerous projects. The knowledge gained through experience in past projects is critical to improving systems engineering capabilities [9,10,14,32,33]. After a literature review of numerous studies on the best practices and lessons learned in systems engineering, the following themes emerged:
  • Systems engineering infrastructure is critical to project success [9,10,14,32,33].
  • Requirements should be unambiguous, up-to-date, and vetted with project stakeholders [9,10,14,32,33].
  • An effective systems engineering plan should be implemented as early as possible [9,10,14,32,33].
  • Failure to adhere to good systems engineering could lead to cost and schedule overruns [9,10,14,32,33].
  • People are the primary resource for successful systems engineering [9,10,14,32,33].
  • Communication is critical for successful systems engineering [9,10,14,32,33].
The following systems engineering handbooks and standards were compared: INCOSE Systems Engineering Handbook [9], NASA Systems Engineering Handbook [10], MIL-STD-499C—Systems Engineering [34], IEEE-1220—Application and Management of the Systems Engineering Process [35], EIA-632—Process for Engineering a System [36], ISO/IEC 15288—Systems and software Engineering System Life Cycle Process [37], and Capability Maturity Model Integration for Development [38]. The themes regarding best practices were found in each of these systems engineering standards and guidelines.

2.5. Systems Engineering Support (SES)

For the purposes of this study, systems engineering support is considered the framework provided by an organization for the systems engineering process. Systems engineering support includes planning, control and assessment, communication, tools and infrastructure, as well as the personnel provided by the organization to execute the systems engineering process. Systems engineering support is a major element enabling the systems engineering process in organizations [9,32,39].

2.5.1. Planning

Most of the systems engineering standards reviewed identified planning as a key element for implementing systems engineering in an organization. Leading systems engineering organizations such as INCOSE, Systems Engineering Body of Knowledge (SEBoK), NASA, and IEEE consider planning to be one of the most important aspects of implementing systems engineering. Systems engineering experts agree that planning should be implemented and well documented as early as possible. An adequate systems engineering plan should describe the systems being developed, the technical management process, the tailoring of the systems engineering life-cycle approach to be used, and integration of the technical disciplines into the systems engineering process.

2.5.2. Control and Assessment (C&A)

Systems engineering standards also identify control and assessment of the systems engineering process as another key element of implementing systems engineering. The purpose of control and assessment is to determine the effectiveness of the systems engineering process in meeting cost, schedule, and technical performance requirements for the systems under development [10,32,36,37]. Most systems engineering researchers believe that systems engineering assessment should evaluate the project’s performance on cost and schedule [19,24,40,41,42]. Consistent with this view, a 2009 study by Bruff linking systems engineering best practices to project cost and schedule performance showed a strong correlation between best practices and project performance [43]. Another study by Componation in 2009 sought to relate the success of NASA projects to the systems engineering process [44]; however, Componation found a correlation only with cost and schedule, not with the systems engineering process.

2.5.3. Communication

Good communication has been identified as a critical factor for project success [14]. Communication has also been identified as a major systems engineering best practice. Systems engineers are expected to communicate throughout various levels of the launch vehicle organization. Communication is a key characteristic of high technology organizations [45].

2.5.4. Tools and Infrastructure (T&I)

Another systems engineering enabler identified through the literature is tools and infrastructure [32]. The instruments provided by an organization to execute the systems engineering process are considered systems engineering tools. For instance, a tool could be a process or software application. The infrastructure is described as the framework or environment in which the systems engineering tools are applied. Adequate tools and infrastructure enable good systems engineering [9,32,39].

2.5.5. Personnel

Numerous researchers have identified adequate personnel as a key systems engineering best practice and lesson learned. Personnel are the human capital provided for the systems engineering process: “the people are the primary resource for successfully developing a system” [46]. A sufficient number of competent personnel has also been identified as a critical success factor for projects [26].

2.6. Systems Engineering Culture

The culture of an organization forms the basis for the systems engineering process [43]. Culture, as described by systems engineering researchers, comprises the values, beliefs, and common practices of the organization [32,47,48,49]. Oppenheim has described systems engineering culture as “a pervasive mental state and bias for systems engineering methods applied to problem solving across the development lifecycle and all levels of enterprise processes” [39]. A healthy systems engineering culture promotes effective systems engineering and encourages systems thinking. According to SEBoK, a healthy systems engineering culture is strong in the areas of leadership, trust and morale, cooperation and teamwork, empowering employees, confidence in processes, and job security.

2.6.1. Top Management Support

Research on organizational leadership and culture has shown that the leaders of an organization impart their values, assumptions, and beliefs on the organization [50,51]. The top or senior management of an organization is typically responsible for setting the organization’s goals, directions, and strategies. This aspect is also true in launch vehicle organizations. Numerous researchers have identified top management support as a critical success factor for projects [25,26,27,28,29,30,31].

2.6.2. Organizational Commitment

Another key element of organizational culture is organizational commitment: i.e., how loyal or committed employees are to the organization. Several studies in fields similar to systems engineering, such as safety culture, have shown that organizational commitment significantly affects process performance [52,53,54]. Organizational commitment is a key element of the systems engineering culture of an organization.

2.7. Literature Review Summary

The literature review identified key factors for implementing and enabling systems engineering. Factors critical to the success of systems engineering were also identified. However, no studies examining the effects of these factors on the performance of the systems engineering process in launch vehicle organizations were found. This study addresses this knowledge gap by focusing on answering the following questions:
  • RQ1: What factors affect the performance of the systems engineering process in launch vehicle organizations?
  • RQ2: What are the effects of systems engineering culture on the performance of the systems engineering process in launch vehicle organizations?
  • RQ3: What are the effects of systems engineering support on the performance of the systems engineering process in launch vehicle organizations?

3. Materials and Methods

Systems engineering in launch vehicle organizations can be very complex because of the complexity of the launch vehicles themselves. Other systems engineering studies have shown that identifying the factors that influence the systems engineering process can be equally complex [20,21,22,24]. However, sufficient information from the literature review and prior experience allowed us to hypothesize a model describing the relationships of the factors affecting the performance of the systems engineering process in launch vehicle organizations (Figure 1). The constructs of this study were developed using the research conceptualization process. Through this process, definitions and theories were applied to each construct of the study [55]. Establishing preliminary construct definitions was critical for providing a starting point for the inquiry of this study [56].

3.1. Research Hypotheses

Systems engineering deficiencies have been identified as a contributor to launch failures [6,14,16,17,18]. Understanding the factors that affect systems engineering in launch vehicle organizations is critical to improve the performance of the systems engineering process. By studying these factors, insight could be gained to improve the performance of the systems engineering process, thus increasing the chance of a successful space launch. To test the relationships of the constructs identified in this study, the research hypotheses in Table 1 were developed.

3.2. Research Design

Studies that investigate critical factors are often qualitative and conducted via survey instruments [26,57]; survey instruments are also the primary data collection tool in systems engineering studies. A survey instrument was developed based on the constructs of this study. The survey consisted of two parts: the first part collected demographic information, and the second part collected information on the study variables. The survey statements addressing the dimensions of the study variables are shown in Table 2.
Each construct had three or more corresponding survey statements. The survey statements were developed and approved as a part of doctoral research at the University of Central Florida, Orlando, FL, USA [11]. The survey was approved by the Institutional Review Board. The survey statements corresponding to the study variables were measured on a five-point Likert scale ranging from 1 = strongly disagree to 5 = strongly agree.
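The coding of responses was performed in SPSS; purely as an illustration of the five-point scale described above, a minimal sketch (with hypothetical responses) of mapping textual Likert answers to 1-5 scores:

```python
# Five-point Likert scale used for the study variables (1 = strongly
# disagree ... 5 = strongly agree). Response labels here are assumptions.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def code_responses(raw):
    """Map a list of textual Likert responses to numeric 1-5 scores."""
    return [LIKERT[r.strip().lower()] for r in raw]

scores = code_responses(["Agree", "Strongly Agree", "Neutral"])
print(scores)  # [4, 5, 3]
```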

3.3. Data Source

This study required a population with intimate knowledge of the systems engineering processes in launch vehicle organizations. For this reason, launch vehicle organizations in the US were targeted. In addition, US launch vehicle organizations were specifically targeted to avoid any language barriers. Surveys were distributed to launch vehicle organizations throughout the US. The surveys targeted various people throughout the organizations involved in the systems engineering process. The target population included systems engineers, project managers, discipline and component level engineers, technical managers, program managers, and any other people involved in the systems engineering process. No restrictions were placed on the sizes of the organizations targeted for the survey. A minimum of 200 valid survey responses was required to complete a valid structural equation modeling (SEM) study [58,59,60].

3.4. Data Analysis Method

In this study, IBM SPSS Statistics 25 and Amos 26 were the software tools used for data analysis. IBM SPSS Statistics 25 was used to format, code, and run descriptive statistics analysis on the survey data. IBM SPSS Amos 26 was used to develop and test the measurement and research models, and identify significant factors. SEM was used to test the research hypotheses. The study followed the data analysis process below. The data analysis process is also illustrated in Figure 2.

3.4.1. Validate Survey Responses

Each survey was examined to ensure that no data were missing. Surveys with missing data would have led to different sample sizes for each variable dimension during data analysis and would not have been suitable for regression and correlation analyses [61,62]. Invalid surveys were reviewed to ascertain whether any information could be inferred from the missing responses. Invalid surveys were retained for recordkeeping but were not included in the data analysis.

3.4.2. Descriptive Statistics Analysis

Descriptive statistics for each variable, including the mean, frequency, and standard deviation, were calculated. This analysis provides a basic understanding of the characteristics of the collected data. It is also useful in testing assumptions for confirmatory factor analysis (CFA) and SEM. Descriptive statistics plots were also visually examined to determine whether any information could be gained from the visual representation of the data.
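The study computed these statistics in IBM SPSS Statistics 25; purely as an illustration, the same quantities can be sketched with Python's standard library. The item scores below are hypothetical, not the study's data.

```python
import statistics
from collections import Counter

# Hypothetical 1-5 Likert scores for one survey item (illustration only).
item_scores = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

mean = statistics.mean(item_scores)
stdev = statistics.stdev(item_scores)     # sample standard deviation
frequency = Counter(item_scores)          # response frequency table

print(f"mean={mean:.2f} stdev={stdev:.2f} "
      f"freq={dict(sorted(frequency.items()))}")
```

Frequency tables such as `frequency` are also what a visual inspection (bar plots of response distributions) would be based on.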

3.4.3. Test Assumptions

Most statistical data analyses assume that the data are normal, linear, and homoscedastic, without multicollinearity. Verifying these assumptions is critical so that any violations can be accounted for in the remainder of the data analysis. In cases in which the data are not normal, bootstrapping can be used during SEM analysis. If assumption testing shows evidence of multicollinearity, variables are removed from the model to satisfy that assumption.
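As an illustrative sketch of one common normality screen, skewness and excess kurtosis can be computed directly from the data. The sample data below are hypothetical, and the cutoffs noted in the comment are a commonly cited rule of thumb rather than criteria taken from this study.

```python
import math

def skewness(xs):
    """Population skewness: third standardized moment."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

def excess_kurtosis(xs):
    """Population kurtosis minus 3 (0 for a normal distribution)."""
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 4 for x in xs) / (n * s2 ** 2) - 3.0

# Roughly symmetric hypothetical sample; one rule of thumb treats
# |skewness| < 2 and |excess kurtosis| < 7 as acceptably normal.
data = [1, 2, 2, 3, 3, 3, 4, 4, 5]
print(round(skewness(data), 3), round(excess_kurtosis(data), 3))  # 0.0 -0.75
```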

3.4.4. Confirmatory Factor Analysis

CFA was used to evaluate the relationships between the observed dimensions and the study variables. The measurement model represented the relationship between each study variable and its corresponding survey statements (variable dimensions). A CFA was used because prior knowledge of these relationships existed [63]. CFA is an essential part of SEM. As part of the CFA, Cronbach’s alpha was calculated during model identification to determine the reliability of the survey statements for each corresponding study variable. A Cronbach’s alpha value greater than 0.70 is considered adequate for CFA [64]. To complete the CFA, the following steps were performed [60]:
  • Specify the model to be evaluated.
  • Check whether the model is identified.
  • If the model is identified, determine whether the model fit is adequate.
  • If the model fit is not adequate, revise the model to improve the model fit.
  • If the model fit is adequate, validate the measurement model.
The goodness of fit is a measure of how well the hypothesized model fits the data. Numerous model fit indices may be calculated to determine goodness of fit. The goodness of fit indices and criteria selected for this study are shown in Table 3.
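Cronbach's alpha, used above as the reliability measure, can be computed directly from the item-level variances and the variance of the summed scores. The sketch below uses hypothetical item scores; the study itself computed reliability within the SPSS/Amos workflow.

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists
    (one inner list per survey statement, respondents in the same order)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var_sum = sum(var(item) for item in items)
    totals = [sum(item[i] for item in items) for i in range(n)]
    return (k / (k - 1)) * (1 - item_var_sum / var(totals))

# Three hypothetical survey statements answered by five respondents:
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 3, 4, 1],
]
alpha = cronbach_alpha(items)
print(f"alpha = {alpha:.3f}")   # > 0.70 is considered adequate [64]
```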

3.4.5. Hypothesis Testing

SEM is a robust statistical method that incorporates factor analysis, canonical correlation, and multiple regression [65,66]. SEM illustrates the relationships between observed variables and latent variables by combining various model types. This feature of SEM provided a method to quantitatively test the hypothesized structural model.

4. Results

More than 500 surveys were distributed to launch vehicle organizations throughout the US. A total of 203 valid survey responses were received, which yielded a response rate of approximately 40%. Furthermore, 27.7% of the respondents identified as systems engineers, which made up the largest percentage of systems engineering roles. The remaining 72.3% of respondents were distributed across other roles in the systems engineering process, thus mitigating sampling bias concerns. Approximately 40.9% of respondents had more than 20 years of experience. The number of projects worked on by the respondents varied, and the largest proportion of respondents (31%), by a small margin, had worked on 30 or more projects. Most of the respondents’ systems engineering experience (71.4%) was in the launch vehicle industry. Survey participants were from small, medium, and large organizations, and 60.1% came from large organizations. Participants had a range of levels of government involvement, and most (71.4%) came from government agencies. The details of the profiles of respondents are shown in Table 4.
Before CFA and SEM, the multivariate assumptions of normality, linearity, and homoscedasticity were verified. When residuals were plotted and examined, the data were found to violate these assumptions. Bootstrapping is a widely accepted technique for SEM when assumptions are violated [60,64]. Although research by Byrne [63] has suggested that validity can still be achieved when the data do not follow a normal distribution, we nevertheless used bootstrapping to bolster the analysis results.
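The core idea of nonparametric bootstrapping is to resample the data with replacement many times and use the empirical distribution of the statistic across resamples, rather than relying on normality assumptions. A minimal sketch with hypothetical data (Amos implements its own bootstrap for SEM estimates):

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible
sample = [4, 5, 3, 4, 2, 5, 4, 3, 4, 4]   # hypothetical item scores

# Draw 2000 bootstrap resamples and record the mean of each.
boot_means = []
for _ in range(2000):
    resample = [random.choice(sample) for _ in sample]
    boot_means.append(statistics.mean(resample))

# Approximate 95% percentile confidence interval for the mean.
boot_means.sort()
lo, hi = boot_means[49], boot_means[1949]
print(f"bootstrap 95% CI for the mean: [{lo:.2f}, {hi:.2f}]")
```

The same resampling logic applies to SEM parameter estimates: each resample yields one set of estimates, and the spread of those estimates gives standard errors that do not depend on normality.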
Multicollinearity was the next assumption tested. This test identifies observed variables that are strongly correlated. When two observed variables are highly correlated, they essentially measure the same thing and can cause problems later in the data analysis. SEM researchers have suggested that a correlation above 0.90 between variables is considered evidence of multicollinearity [66]. No observed variables were found to have a correlation greater than 0.90; therefore, no observed variables were recommended for removal on the basis of multicollinearity.
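The 0.90 correlation screen described above amounts to computing pairwise Pearson correlations and flagging any pair over the threshold. A sketch with hypothetical observed-variable scores (one variable is deliberately duplicated to show a flagged pair):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def flag_multicollinearity(variables, threshold=0.90):
    """Return variable pairs whose |correlation| exceeds the threshold."""
    names = list(variables)
    flagged = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            r = pearson_r(variables[names[i]], variables[names[j]])
            if abs(r) > threshold:
                flagged.append((names[i], names[j], round(r, 3)))
    return flagged

# Hypothetical observed-variable scores:
obs = {
    "planning":  [4, 5, 3, 4, 2, 5],
    "comms":     [4, 4, 3, 5, 2, 4],
    "comms_dup": [4, 4, 3, 5, 2, 4],   # identical on purpose -> r = 1.0
}
print(flag_multicollinearity(obs))
```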

4.1. Model Validity Test Results

Measurement models were developed and evaluated for each latent variable of this study. CFA was completed for each of the measurement models. The measurement models were revised until adequate model fit was achieved. After achieving adequate model fit of each of the measurement models, we constructed the hypothesized model (Figure 3). Final model fit results are shown in Table 5.
To test the validity of the latent variables in the study, we used CFA. In SEM, validity is the degree to which the model results accurately measure the construct that they were designed to measure [60,64,65,67]. Two types of validity were evaluated: convergent validity and discriminant validity. Convergent validity is present when evidence shows sufficient overlap among the variables measuring a particular construct. The CFA results and convergent validity results are shown in Table 6.
Three items were used to evaluate convergent validity: factor loading (item reliability), average variance extracted (AVE), and construct reliability (CR). Factor loadings above 0.5 are considered strong evidence of convergent validity [64]. All indicators in this study had factor loadings of 0.586 or greater. An AVE value greater than 0.5 shows strong evidence of convergent validity [64]. All constructs in this study had an AVE of 0.537 or greater. The final method of examining convergent validity is calculating CR for each construct. Researchers generally accept that a CR greater than 0.7 shows strong evidence of convergent validity [58]. Each construct in this study had a CR of 0.773 or greater. All three indicators, as shown in Table 6, showed strong evidence of convergent validity for the hypothesized model.
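AVE and CR follow directly from the standardized factor loadings; a sketch with hypothetical loadings (the study's actual loadings are reported in Table 6), using the thresholds cited above:

```python
def ave(loadings):
    """Average variance extracted: mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

def construct_reliability(loadings):
    """Composite (construct) reliability, taking each indicator's error
    variance as 1 - loading**2 for standardized loadings."""
    sum_l = sum(loadings)
    sum_err = sum(1 - l ** 2 for l in loadings)
    return sum_l ** 2 / (sum_l ** 2 + sum_err)

loadings = [0.72, 0.81, 0.65, 0.77]   # hypothetical indicators of one construct
print(f"AVE = {ave(loadings):.3f}, CR = {construct_reliability(loadings):.3f}")
# Convergent validity evidence: loadings > 0.5, AVE > 0.5, CR > 0.7.
```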
Discriminant validity is a necessary component for validating constructs. High discriminant validity is present when a construct uniquely measures a phenomenon that other constructs do not measure. A rigorous test for identifying discriminant validity is performed by comparing a construct’s square root of the AVE with its correlation with another construct. Discriminant validity is present when the correlation is less than the square root of the AVE [64]. As shown in Table 7, all correlation pairs were less than the associated square root of the AVE; therefore, the hypothesized model showed adequate discriminant validity.
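This comparison of each correlation against the square root of the AVE (often attributed to Fornell and Larcker) can be sketched as follows; the AVEs and correlations below are hypothetical, and the study's values appear in Table 7.

```python
import math

def discriminant_failures(ave_by_construct, correlations):
    """Return construct pairs whose correlation is NOT smaller than the
    square root of each construct's AVE (an empty list means the
    discriminant validity criterion holds)."""
    failures = []
    for (a, b), r in correlations.items():
        if abs(r) >= math.sqrt(ave_by_construct[a]) or \
           abs(r) >= math.sqrt(ave_by_construct[b]):
            failures.append((a, b, r))
    return failures

# Hypothetical AVEs and inter-construct correlations:
aves = {"SES": 0.58, "TMS": 0.61, "SEPP": 0.55}
corrs = {("SES", "TMS"): 0.62, ("SES", "SEPP"): 0.70, ("TMS", "SEPP"): 0.48}
print(discriminant_failures(aves, corrs))   # [] -> criterion satisfied
```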

4.2. Structural Equation Modeling Results

After successfully validating and achieving adequate model fit for the measurement model of the latent constructs by using CFA, we evaluated the structural model. The hypothesized structural model is illustrated in Figure 3. The structural model regression estimates are shown in Table 8. Goodness of fit of the model was also evaluated, and the revised structural model satisfied all model fit criteria. The model fit results are shown in Table 9.

4.3. Hypothesis Testing Results

After achieving adequate goodness of fit of the revised hypothesized structural model, we completed hypothesis testing by using SEM. The hypothesis test results are shown in Table 10. H1, H3, H4, H5, H6, H8, and H9 were all supported by the data. The model provided sufficient information to test the hypotheses.

5. Discussion

The purpose of this study was to identify and evaluate the factors affecting systems engineering process performance in launch vehicle organizations in the US. A secondary goal of this study was to develop a model explaining the relationships among systems engineering support, systems engineering culture, and systems engineering process performance. The model developed in the study identified factors that influenced systems engineering performance. Because the model showed a good fit and allowed the hypotheses to be tested, both goals were achieved.
The hypotheses that organizational commitment has a direct (H1) and an indirect (H8) effect on systems engineering process performance were supported by the data. Organizational commitment also directly affected systems engineering support (H5). Thus, the more committed employees are to the organization, the better the systems engineering process is perceived to perform. In addition, the more committed employees are to an organization, the more support is provided to systems engineering. Several studies have shown that organizational commitment is critical to the culture of an organization, and organizational culture is the context within which the systems engineering process is executed. Therefore, it is reasonable to conclude that greater commitment among launch vehicle organization personnel is associated with greater support provided to the systems engineering process, thus resulting in better process performance.
The hypothesis testing results did not support a direct influence of top management support on systems engineering process performance (H2), indicating that top management support alone cannot be used as a predictor of systems engineering process performance. However, the data confirmed that top management support directly affects systems engineering support (H6). The hypothesis testing also indicated that top management support has an indirect effect on systems engineering process performance (H9), mediated by systems engineering support. Numerous studies have shown that top management support is a key factor in organizational culture and a critical success factor in many projects. Therefore, we conclude that as recognition of senior management's support for systems engineering increases, the organization's support for systems engineering increases, thereby increasing the performance of the systems engineering process.
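The indirect effects discussed here follow the usual product-of-coefficients logic: multiply the path into the mediator by the path from the mediator to the outcome. The sketch below uses the standardized estimates from Table 8; note that these simple products are illustrative and differ somewhat from the mediated estimate reported in Table 10 (0.186 for H9), which comes from the full SEM estimation.

```python
# Product-of-coefficients approximation of indirect (mediated) effects.
# Standardized path estimates taken from Table 8.
beta_ses_tms = 0.466   # SES <- TMS (H6 path)
beta_ses_oc = 0.457    # SES <- OC (H5 path)
beta_sepp_ses = 0.459  # SEPP <- SES (H4 path)
beta_sepp_oc = 0.162   # SEPP <- OC direct path (H1)

# Indirect effect of TMS on SEPP, transmitted through SES (H9).
indirect_tms = beta_ses_tms * beta_sepp_ses

# OC has both a significant direct path and an indirect path through SES,
# which is why H8 is reported as partial mediation.
indirect_oc = beta_ses_oc * beta_sepp_ses
total_oc = beta_sepp_oc + indirect_oc

print(round(indirect_tms, 3), round(indirect_oc, 3), round(total_oc, 3))
```

With these inputs, the TMS indirect effect works out to roughly 0.21 and the total OC effect (direct plus indirect) to roughly 0.37.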
The value of systems engineering was also identified as a factor affecting systems engineering process performance. The data supported the hypothesis that the perceived value of systems engineering directly affects systems engineering process performance (H3), implying that as the perceived value of systems engineering increases, so does the performance of the process. Therefore, helping personnel perceive and understand the value of the process is likely to enhance its performance.
The results of this study suggest several implications for launch vehicle organizations and for systems engineering in general. Prior research has shown that systems engineering affects cost, schedules, technical performance, and customer satisfaction [9,10,20,32,41]. The present study identified that systems engineering process performance in launch vehicle organizations is influenced by organizational commitment, the value of systems engineering, top management support, and systems engineering support. Improving these factors could improve systems engineering process performance. For launch vehicle organizations, improving systems engineering process performance could decrease technical issues and launch failures, and ultimately result in time and cost savings.
After recognizing these factors influencing systems engineering, leaders of an organization could target these areas for improvement to enhance the performance of their organization’s systems engineering process. For example, the value of systems engineering has a significant direct effect on systems engineering process performance. The leader of an organization could choose to implement protocols to educate and ensure that the individuals in the organization recognize the value of systems engineering. This would in turn improve the performance of the organization’s systems engineering and ultimately improve the launch vehicle organization’s ability to meet cost, schedule, and technical performance goals.

6. Conclusions

Launch vehicles are highly complex systems that often require equally complex systems engineering approaches to design, build, test, and launch. Systems engineering can be implemented in diverse ways; regardless of the style or model adopted, this study identified factors that affect systems engineering process performance in launch vehicle organizations. Deficient or low-performing systems engineering has been identified in several studies as a leading contributor to launch vehicle failures. Given the significant influence of launch vehicle failures on human life, the economy, science, and national security, identifying factors that could enhance systems engineering process performance in launch vehicle organizations is imperative.
A systems engineering process performance model was developed to examine the relationships among perceived top management support, organizational commitment, the value of systems engineering, systems engineering support, and systems engineering process performance in US launch vehicle organizations. SEM and hypothesis testing results showed that organizational commitment, the value of systems engineering, and systems engineering support directly or indirectly influence systems engineering process performance, whereas top management support influences it only indirectly.
The data set for this study was provided voluntarily by participants in US launch vehicle organizations. The survey responses that assessed the latent variables are based on participants' perceptions; responses may therefore reflect what participants believe is ideal rather than what they observed or experienced in their respective organizations. The survey also did not take into account the risk tolerance of individual organizations, and differences in risk tolerance could lead to different perceptions of the systems engineering process.
Despite these limitations, the results of this study identified factors that present implications for systems engineering and U.S. launch vehicle organizations. Past research has shown that systems engineering significantly impacts technical performance, cost, and schedule [7,9,10,19,20,21,22,24,41]. Prior research has also identified deficiencies in the systems engineering process as a major contributor to launch vehicle failures [6,16,17,18]. Improving the factors that affect systems engineering process performance could improve technical performance of the system, reduce cost, improve schedule, and reduce the number of launch vehicle failures.
This study’s results underscore the important roles of organizational factors in systems engineering process performance. Organizational leaders should emphasize their support for systems engineering, highlight the value of systems engineering, enhance systems engineering support, and bolster employee commitment to organizations. Improving on these factors may increase the ability of organizations to meet cost, schedule, technical performance, and customer satisfaction goals, thus ultimately improving launch vehicle success.

Author Contributions

Conceptualization, D.G., W.K. and T.K.; methodology, D.G.; software, D.G.; validation, D.G.; formal analysis, D.G.; investigation, D.G.; resources, D.G.; data curation, D.G.; writing—original draft preparation, D.G.; writing—review and editing, D.G. and W.K.; visualization, D.G.; supervision, W.K., T.K., L.R. and D.K.; project administration, D.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was determined to be exempt human research and was approved by the Institutional Review Board of the University of Central Florida (SBE-18-14051, 30 May 2018). Exempt status was granted because participation in the survey was completely voluntary, the target population was working adults in the launch vehicle industry, and the survey statements concerned the organization of employment.

Informed Consent Statement

Participation in this study was completely voluntary and informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to privacy concerns.

Acknowledgments

The authors would like to thank the US launch vehicle industry for participating in the survey.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. FAA. Origins of the Commercial Space Industry; FAA: Washington, DC, USA, 2016. [Google Scholar]
  2. Gibson, D.K. Launch vehicle systems engineering life-cycle evolution and comparison. Syst. Eng. 2019, 22, 330–334. [Google Scholar] [CrossRef]
  3. Pawlikowski, E.M. Mission Assurance—A Key Part of Space; Senior Leader Perspective: 2010; NRO: Washington, DC, USA, 2010; pp. 6–9. [Google Scholar]
  4. Sauvageau, D.R.; Allen, B.D. Launch Vehicle Historical Reliability. In Proceedings of the 34th AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, Cleveland, OH, USA, 13–15 July 1998; AIAA: Reno, Nevada. [Google Scholar]
  5. Gydesen, P.W. What Is the Impact to National Security without Commercial Space Applications; Air University: Maxwell Air Force Base, AL, USA, 2006. [Google Scholar]
  6. Newman, J.S. Failure-space—A systems engineering look at 50 space system failures. Acta Astronaut. 2001, 48, 517–527. [Google Scholar] [CrossRef]
  7. BKCASE Editorial Board. Guide to the Systems Engineering Body of Knowledge (SEBoK), v. 2.7, R.J Cloutier (Editor in Chief); The Trustees of the Stevens Institute of Technology: Hoboken, NJ, USA, 2022. [Google Scholar]
  8. Brill, J. Systems Engineering—A Retrospective View. Syst. Eng. 1999, 1, 258–266. [Google Scholar] [CrossRef]
  9. INCOSE. Systems Engineering Handbook; INCOSE: San Diego, CA, USA, 2011. [Google Scholar]
  10. NASA. NASA Systems Engineering Handbook; NASA: Washington, DC, USA, 2007.
  11. Gibson, D.K. Factors Affecting Systems Engineering in Launch Vehicle Organizations. In Industrial Engineering and Management Systems; University of Central Florida: Orlando, FL, USA, 2019; p. 243. [Google Scholar]
  12. NASA. Launch Services Risk Mitigation Policy for NASA-Owned and/or NASA-Sponsored Payloads/Missions; NASA: Washington, DC, USA, 2012.
  13. USAF; NRO; NASA. Memorandum of Understanding between USAF, NRO & NASA for New Entrant Launch Vehicle Certification; USAF, NRO, and NASA: Washington, DC, USA, 2011.
  14. Gill, P.S.; Garcia, D.; Vaughan, W.W. Engineering Lessons Learned and Systems Engineering Applications. In Proceedings of the 43rd AIAA Aerospace Sciences Meeting and Exhibit, Reno, Nevada, 10–13 January 2005; AIAA: Reno, Nevada. [Google Scholar]
  15. Chang, I.S. Investigation of space launch vehicle catastrophic failures. J. Spacecr. Rocket. 1996, 33, 198–205. [Google Scholar] [CrossRef]
  16. Harland, D.M.; Lorenz, R. Space Systems Failures: Disasters and Rescues of Satellites, Rocket and Space Probes, 1st ed.; Praxis: Omaha, NE, USA, 2005. [Google Scholar]
  17. Isakowitz, S.J.; Hopkins, J.B.; Hopkins, J., Jr. International Reference Guide to Space Launch Systems, 4th ed.; American Institute of Aeronautics and Astronautics: Reston, VA, USA, 2004. [Google Scholar]
  18. Leung, R. The Effect of Mission Assurance on ELV Launch Success Rate: An Analysis of Two Management Systems for Launch Vehicles. In The School of Engineering and Applied Science; The George Washington University: Ann Arbor, MI, USA, 2014. [Google Scholar]
  19. Gruhl, W. Lessons Learned, Cost/Schedule Assessment Guide; NASA Headquarters: Washington, DC, USA, 1992. [Google Scholar]
  20. Honour, E.C. Understanding the Value of Systems Engineering; INCOSE: San Diego, CA, USA, 2004. [Google Scholar]
  21. Kludze, A.-K.K.P. Engineering of Complex Systems: The Impact of Systems Engineering at NASA; The George Washington University: Ann Arbor, MI, USA, 2004; p. 346. [Google Scholar]
  22. Vanek, F.; Jackson, P.; Grzybowski, R. Systems engineering metrics and applications in product development: A critical literature review and agenda for further research. Syst. Eng. 2008, 11, 107–124. [Google Scholar] [CrossRef]
  23. Malek, K. A Case Study in Product Development: The 1984-199x Chevrolet Corvette; MIT Industrial Performance Center: Cambridge, MA, USA, 1996. [Google Scholar]
  24. Honour, E.C. Systems Engineering Return on Investment. In Proceedings of the 20th Annual International Symposium of the International Council on Systems Engineering (INCOSE), Chicago, IL, USA, 2010. [Google Scholar]
  25. Belassi, W.; Tukel, O.I. A new framework for determining critical success/failure factors in projects. Int. J. Proj. Manag. 1996, 14, 141–151. [Google Scholar] [CrossRef]
  26. Fortune, J.; White, D. Framing of project critical success factors by a systems model. Int. J. Proj. Manag. 2006, 24, 53–65. [Google Scholar] [CrossRef]
  27. Müller, R.; Söderland, J.; Jugdev, K. Critical success factors in projects. Int. J. Manag. Proj. Bus. 2012, 5, 757–775. [Google Scholar] [CrossRef]
  28. Randt, F.J.d.; Waveren, C.C.v.; Chan, K.-Y. An empirical study on the critical success factors of small- to medium-sized projects. S. Afr. J. Ind. Eng. 2014, 25, 13–28. [Google Scholar]
  29. Shenhar, A.J.; Tishler, A.; Dvir, D.; Lipovetsky, S.; Lechler, T. Refining the search for project success factors: A multivariate, typological approach. R D Manag. 2002, 32, 111–126. [Google Scholar] [CrossRef]
  30. Pinto, J.; Slevin, D. Critical Factors in Successful Project Implementation. IEEE Trans. Eng. Manag. 1987, 34, 22–27. [Google Scholar] [CrossRef]
  31. Westerveld, E. The Project Excellence Model: Linking success criteria and critical success factors. Int. J. Proj. Manag. 2003, 21, 411–418. [Google Scholar] [CrossRef]
  32. SEBoK. Guide to the Systems Engineering Body of Knowledge (SEBoK); Board, B.E., Ed.; The Trustees of the Stevens Institute of Technology: Hoboken, NJ, USA, 2016. [Google Scholar]
  33. Blair, J.C.; Ryan, R.S.; Schutzenhofer, L.A. Lessons Learned in Engineering; NASA: Washington, DC, USA, 2011. [Google Scholar]
  34. Pennell, L.W.; Knight, F.L. USAF-Aerospace Corp Systems Engineering Handbook; USAF: El Segundo, CA, USA, 2005. [Google Scholar]
  35. ISO. ISO/IEC Standard for Systems Engineering-Application and Management of the Systems Engineering Process. In ISO/IEC 26702 IEEE Std 1220-2005 First Edition 2007-07-15; IEEE: New York, NY, USA, 2007; pp. c1–c88. [Google Scholar]
  36. EIA. Processes for Engineering a System; EIA: Arlington, VA, USA, 1999. [Google Scholar]
  37. IEEE. Systems and Software Engineering-System Life Cycle Processes; IEEE: New York, NY, USA, 2008. [Google Scholar]
  38. SEI. Capability Maturity Model Integrated for Development; SEI: Hanscom AFB, MA, USA, 2010. [Google Scholar]
  39. Oppenheim, B.W.; Murman, E.M.; Secor, D.A. Lean Enablers for Systems Engineering. Syst. Eng. 2011, 14, 29–55. [Google Scholar] [CrossRef]
  40. Valerdi, R. The Constructive Systems Engineering Cost Model (COSYSMO); University of Southern California: Ann Arbor, MI, USA, 2005; p. 137. [Google Scholar]
  41. Elm, J.P.; Goldenson, D.R. The Business Case for Systems Engineering Study-Results of Effectiveness Survey; SEI: Hanscom AFB, MA, USA, 2012. [Google Scholar]
  42. Son, S.K.; Kim, S.-K. The Impact of Systems Engineering on Program Performance: A Case Study of the Boeing Company in Computer Science and its Applications; Springer: Dordrecht, The Netherlands, 2012; Volume 203, pp. 537–545. [Google Scholar]
  43. Bruff, R.S. Systems Engineering Best Practices as Measures for Successful Outcomes in Selected United States Defense Industry Aerospace Programs; Walden University: Ann Arbor, MI, USA, 2008; p. 269. [Google Scholar]
  44. Componation, P.J.; Utley, D.R.; Farrington, P.A.; Youngblood, A.D. Assessing the Relationships between Project Success and Systems Engineering Processes at NASA. In Proceedings of the 2009 Industrial Engineering Research Conference, Miami, FL, USA, 30 May 2009; pp. 456–461. [Google Scholar]
  45. Reigle, R.F. Measuring Organic and Mechanistic Cultures. Eng. Manag. J. 2015, 13, 3–8. [Google Scholar] [CrossRef]
  46. Abrahamsson, P. Is Management Commitment a Necessity After All in Software Process Improvement? In Proceedings of the 26th Euromicro Conference (EUROMICRO 2000), Informatics: Inventing the Future, Maastricht, The Netherlands, 5–7 September 2000. [Google Scholar]
  47. Iivari, J.; Huisman, M. The Relationship between Organizational Culture and the Deployment of Systems Development Methodologies. MIS Q. 2007, 31, 35–58. [Google Scholar] [CrossRef]
  48. Carroll, E. Systems Engineering Cultural Transformation. In Socorro Systems Summit; INCOSE: Socorro, NM, USA, 2016. [Google Scholar]
  49. NASA. Columbia Accident Investigation Report; NASA: Washington, DC, USA, 2003.
  50. Hogan, S.J.; Coote, L.V. Organizational culture, innovation, and performance: A test of Schein's model. J. Bus. Res. 2014, 67, 1609–1621. [Google Scholar] [CrossRef]
  51. Chatman, J.A.; O’Reilly, C.A. Paradigm lost: Reinvigorating the study of organizational culture. Res. Organ. Behav. 2016, 36, 199–224. [Google Scholar] [CrossRef] [Green Version]
  52. Alnoaimi, M. Safety Climate and Safety Outcomes in Aircraft Maintenance: A Mediating Effect of Employee Turnover and Safety Motivation. In Department of Industrial Engineering and Management Systems; University of Central Florida: Orlando, FL, USA, 2015. [Google Scholar]
  53. Fogarty, G.J. The Role of Organizational and Individual Variables in Aircraft Maintenance Performance. Int. J. Appl. Aviat. Stud. 2004, 4, 73–90. [Google Scholar]
  54. Alsowayigh, M. Assessing safety culture among pilots in Saudi Airlines: Quantitative Study Approach. In Industrial Engineering and Management System; University of Central Florida: Orlando, FL, USA, 2014. [Google Scholar]
  55. Mueller, C.W. Conceptualization, Operationalization, and Measurement: The SAGE Encyclopedia of Social Science Research Methods; Sage Publications: Thousand Oaks, CA, USA, 2004. [Google Scholar]
  56. Yin, R.K. Case Study Research: Design and Methods, 4th ed.; Sage Publications: Thousand Oaks, CA, USA, 2009. [Google Scholar]
  57. Niazi, M.; Wilson, D.; Zowghi, D. A framework for assisting the design of effective software process improvement implementation strategies. J. Syst. Softw. 2005, 78, 204–222. [Google Scholar] [CrossRef]
  58. Boomsma, A.; Hoogland, J.J. The Robustness of LISREL Modeling Revisited. In Structural Equation Modeling: Present and Future: A Festschrift in Honor of Karl Joreskog; Stam, L., Ed.; Scientific Software International, Inc.: Lincolnwood, IL, USA, 2001. [Google Scholar]
  59. Fabrigar, L.R.; Wegener, D.T. Exploratory Factor Analysis; Oxford: New York, NY, USA, 2011. [Google Scholar]
  60. Kline, R. Principles and Practices of Structural Equation Modeling, 3rd ed.; The Guilford Press: New York, NY, USA, 2011. [Google Scholar]
  61. Statistical Services Centre. Approaches to the Analysis of Survey Data; The University of Reading: Reading, UK, 2001. [Google Scholar]
  62. Kitchenham, B.; Pfleeger, S.L. Principles of Survey Research Part 6: Data Analysis. Softw. Eng. Notes 2003, 28, 24–27. [Google Scholar] [CrossRef]
  63. Byrne, B.M. Structural Equation Modeling with Amos, 3rd ed.; Multivariate Applications Series; Routledge: New York, NY, USA, 2016. [Google Scholar]
  64. Hair, J.F. Multivariate Data Analysis, 7th ed.; Pearson: London, UK, 2014. [Google Scholar]
  65. Hoyle, R. (Ed.) Handbook of Structural Equation Modeling; The Guilford Press: New York, NY, USA, 2012. [Google Scholar]
  66. Tabachnick, B.G.; Fidell, L.S. Using Multivariate Statistics, 6th ed.; Pearson: Boston, MA, USA, 2013. [Google Scholar]
  67. Schumacker, R.E.; Lomax, R.G. A Beginner's Guide to Structural Equation Modeling, 3rd ed.; Routledge: New York, NY, USA, 2010. [Google Scholar]
Figure 1. The hypothesized conceptual model.
Figure 2. Survey Data Analysis Process.
Figure 3. Hypothesized Model.
Table 1. Research Hypotheses.
Hypothesis | Description
H1 | Organizational commitment has a direct effect on systems engineering process performance.
H2 | Top management support has a direct effect on systems engineering process performance.
H3 | The value of systems engineering has a direct effect on systems engineering process performance.
H4 | Systems engineering support has a direct effect on systems engineering process performance.
H5 | Organizational commitment has a direct effect on systems engineering support.
H6 | Top management support has a direct effect on systems engineering support.
H7 | The value of systems engineering has a direct effect on systems engineering support.
H8 | Systems engineering support mediates the relationship between organizational commitment and systems engineering process performance.
H9 | Systems engineering support mediates the relationship between top management support and systems engineering process performance.
H10 | Systems engineering support mediates the relationship between the value of systems engineering and systems engineering process performance.
Table 2. Study Dimensions and Survey Statements.
Top Management Support
1. Senior management strongly supports the systems engineering process.
2. Senior management believes that a strong systems engineering process adds value to the organization.
3. Senior management communicates its support for systems engineering to the organization.
4. Senior management supports skipping a systems engineering step if it will help the organization save money.
5. Senior management supports skipping a systems engineering step if it will help the organization meet schedule goals.
Value of Systems Engineering
1. Practicing good systems engineering reduces launch vehicle cost.
2. Practicing good systems engineering reduces launch vehicle schedule delays.
3. Practicing good systems engineering improves launch vehicle performance.
Organizational Commitment
1. I am willing to put in a great amount of effort beyond what is normally expected in order to help my organization be successful.
2. I speak highly of this organization to my friends and family as a great place to work.
3. I find that my values and my organization’s values are very similar.
4. I am proud to tell others that I work for this organization.
5. I really care about the fate of this organization.
6. This is the best launch vehicle organization to work for.
Planning
1. My organization has a documented plan on how systems engineering should be implemented.
2. My role in systems engineering is clearly identified.
3. My organization identifies how all technical engineering disciplines are integrated.
4. There was a systems engineering plan in place at the beginning of launch vehicle development.
Communication
1. My organization emphasizes effective communication between departments such as design, manufacturing, and operations.
2. My organization emphasizes effective communication among the various engineering disciplines (disciplines such as avionics, structures, propulsion, environments, software, etc.).
3. Management has an open door policy for discussing systems engineering issues.
4. There is good communication about systems engineering items in the workplace.
5. Documenting detailed rationale for technical decisions is highly encouraged.
Tools and Infrastructure
1. My organization follows an established systems engineering model such as Waterfall, V Model, Spiral, Agile, or Iterative.
2. I have appropriate tools to successfully execute systems engineering in my organization.
3. Appropriate training and guidance are provided for the systems engineering tools.
4. The systems engineering tools provided are regularly used by my organization.
Personnel
1. My organization has employees whose sole responsibility is to facilitate the systems engineering process.
2. My organization has the right people involved to successfully implement systems engineering.
3. My organization has a sufficient number of people to successfully implement systems engineering.
4. My organization understands the skills needed to successfully execute systems engineering.
5. My organization provides access to systems engineering training.
6. Training provided by my organization has prepared me well for my systems engineering duties.
Control and Assessment
1. There are performance measures or metrics used to evaluate the performance of systems engineering in my organization.
2. Technical reviews are held at regular intervals to evaluate the performance of the systems engineering process, such as system requirements reviews, preliminary design reviews, critical design reviews, etc.
3. All stakeholders are informed of the project’s progress.
4. Resources allocated to a project are evaluated to determine whether they are adequate to achieve project success.
Systems Engineering Process Performance
1. Applying a thorough systems engineering process in my organization reduces the number of launch vehicle manufacturing problems.
2. Applying a thorough systems engineering process in my organization reduces the severity of launch vehicle manufacturing problems.
3. Applying a thorough systems engineering process in my organization reduces the number of launch vehicle integration and test problems.
4. Applying a thorough systems engineering process in my organization reduces the severity of launch vehicle integration and test problems.
5. Applying a thorough systems engineering process in my organization reduces the number of launch vehicle problems during flight.
6. Applying a thorough systems engineering process in my organization reduces the severity of launch vehicle problems during flight.
Table 3. Goodness of Fit Indices.
Model Fit Index | Criteria
Chi-square (χ²) | Low
Degrees of freedom (df) | >0
Probability value (p) | >0.05
χ²/df | <5
Goodness-of-fit index (GFI) | >0.90
Tucker–Lewis index (TLI) | >0.90
Comparative fit index (CFI) | >0.90
Root mean square error of approximation (RMSEA) | <0.08
90% confidence interval (lo90–hi90) | <0.05–0.08
Probability of close fit (Pclose) | >0.5
Table 4. Data Set (n = 203).
Characteristic | Category | Frequency | %
Role | Engineering support | 7 | 3.4
 | Manager | 19 | 9.4
 | Design Engineer | 5 | 2.5
 | Test Engineer | 4 | 2.0
 | Manufacturing Engineer | 3 | 1.5
 | Operations Engineer | 7 | 3.4
 | Component Engineer | 13 | 6.4
 | Analyst | 25 | 12.3
 | Integration Engineer | 14 | 6.9
 | Project Manager | 28 | 13.8
 | Systems Engineer | 78 | 38.4
Years of Experience | 1–5 years | 31 | 15.3
 | 5–10 years | 29 | 14.3
 | 10–15 years | 30 | 14.8
 | 15–20 years | 30 | 14.8
 | 20 years or more | 83 | 40.9
# of Projects | 1–5 projects | 47 | 23.2
 | 6–10 projects | 49 | 24.1
 | 10–15 projects | 26 | 12.8
 | 15–20 projects | 18 | 8.9
 | 20 or more projects | 63 | 31.0
LV Experience | Not all LV experience | 58 | 28.6
 | All LV experience | 145 | 71.4
Organization Size | Small | 14 | 6.9
 | Medium | 67 | 33.0
 | Large | 122 | 60.1
Government Involvement | Little | 2 | 1.0
 | Some | 15 | 7.4
 | A lot | 41 | 20.2
 | Government agency | 145 | 71.4
Table 5. Model Fit of Hypothesized Model.
Model Fit Index | Criteria | Initial Model
χ²/df | <5 | 1.764
TLI | >0.90 | 0.904
CFI | >0.90 | 0.915
RMSEA | <0.08 | 0.061
Table 6. Confirmatory Factor Analysis and Validity Test.
Construct | Indicator | Factor Loading | α | CR | AVE
Systems Engineering Process Performance | SEPP1 | 0.753 | --- | 0.947 | 0.748
 | SEPP2 | 0.848 | ---
 | SEPP3 | 0.941 | ---
 | SEPP4 | 0.896 | ---
 | SEPP5 | 0.871 | ---
 | SEPP6 | 0.880 | ---
SE Support | Communication | 0.881 | 0.740 | 0.949 | 0.790
 | Planning | 0.868 | 0.726
 | Personnel | 0.977 | 0.794
 | T&I | 0.811 | 0.802
 | C&A | 0.900 | 0.732
Top Management Support | TMS1 | 0.973 | --- | 0.933 | 0.874
 | TMS2 | 0.894 | ---
Organizational Commitment | OC2 | 0.834 | --- | 0.889 | 0.730
 | OC4 | 0.938 | ---
 | OC5 | 0.783 | ---
Value of SE | VSE1 | 0.840 | --- | 0.773 | 0.537
 | VSE2 | 0.749 | ---
 | VSE3 | 0.586 | ---
NOTE: All factor loadings were statistically significant at the p < 0.001 level.
Table 7. Discriminant Validity.
Constructs | VSE | TMS | OC | SES | SEPP
VSE | 0.733
TMS | −0.048 | 0.934
OC | 0.282 | 0.262 | 0.857
SES | 0.126 | 0.548 | 0.540 | 0.890
SEPP | 0.401 | 0.345 | 0.507 | 0.577 | 0.868
Factor correlations. Square root of AVE on the diagonal. VSE = value of systems engineering, TMS = top management support, OC = organizational commitment, SES = systems engineering support, SEPP = systems engineering process performance.
Table 8. Structural Model Regression Estimates.
Path | Std. Estimate (β) | S.E. | C.R. | p
OC ← Number of Projects | 0.164 | 0.026 | 2.557 | 0.011
VSE ← Career Level | 0.189 | 0.029 | 2.872 | 0.004
SES ← OC | 0.457 | 0.042 | 9.209 | ***
SES ← TMS | 0.466 | 0.030 | 9.374 | ***
SEPP ← OC | 0.162 | 0.063 | 2.572 | 0.010
SEPP ← VSE | 0.334 | 0.051 | 6.403 | ***
SEPP ← SES | 0.459 | 0.071 | 7.682 | ***
*** p < 0.001.
Table 9. Revised Structural Model Goodness of Fit.
Model Fit Index | Criteria | Final Revised Model
Chi-square (χ²) | Low | 11.279
Degrees of freedom (df) | >0 | 11
Probability value (p) | >0.05 | 0.420
χ²/df | <5 | 1.025
Goodness-of-fit index (GFI) | >0.90 | 0.985
Tucker–Lewis index (TLI) | >0.90 | 0.999
Comparative fit index (CFI) | >0.90 | 0.999
Root mean square error of approximation (RMSEA) | <0.08 | 0.011
90% confidence interval (lo90–hi90) | <0.05–0.08 | 0.00–0.075
Probability of close fit (Pclose) | >0.5 | 0.775
Table 10. Hypothesis Testing Results.
Hypothesis | Description | β | t | Supported?
H1 | Organizational commitment has a direct effect on systems engineering process performance. | 0.162 | 2.572 * | Yes
H2 | Top management support has a direct effect on systems engineering process performance. | 0.067 | 1.583 | No
H3 | The value of systems engineering has a direct effect on systems engineering process performance. | 0.334 | 6.403 ** | Yes
H4 | Systems engineering support has a direct effect on systems engineering process performance. | 0.459 | 7.682 ** | Yes
H5 | Organizational commitment has a direct effect on systems engineering support. | 0.457 | 9.209 ** | Yes
H6 | Top management support has a direct effect on systems engineering support. | 0.466 | 9.374 ** | Yes
H7 | The value of systems engineering has a direct effect on systems engineering support. | 0.028 | 0.679 | No
H8 | Systems engineering support mediates the relationship between organizational commitment and systems engineering process performance. | | | Partial Mediation
H9 | Systems engineering support mediates the relationship between top management support and systems engineering process performance. | 0.186 | 2.889 * | Yes
H10 | Systems engineering support mediates the relationship between the value of systems engineering and systems engineering process performance. | | | No Mediation
β = standardized path coefficient, t = critical ratio, * p ≤ 0.01, ** p < 0.001.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

